Mar 13 01:11:01.644526 master-0 systemd[1]: Starting Kubernetes Kubelet...
Mar 13 01:11:02.350080 master-0 kubenswrapper[4055]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 13 01:11:02.350080 master-0 kubenswrapper[4055]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 13 01:11:02.350080 master-0 kubenswrapper[4055]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 13 01:11:02.350080 master-0 kubenswrapper[4055]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 13 01:11:02.350080 master-0 kubenswrapper[4055]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 13 01:11:02.350080 master-0 kubenswrapper[4055]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 13 01:11:02.351458 master-0 kubenswrapper[4055]: I0313 01:11:02.351124 4055 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 13 01:11:02.361783 master-0 kubenswrapper[4055]: W0313 01:11:02.361678 4055 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 13 01:11:02.361783 master-0 kubenswrapper[4055]: W0313 01:11:02.361744 4055 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 13 01:11:02.361783 master-0 kubenswrapper[4055]: W0313 01:11:02.361761 4055 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 13 01:11:02.361783 master-0 kubenswrapper[4055]: W0313 01:11:02.361776 4055 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 13 01:11:02.361783 master-0 kubenswrapper[4055]: W0313 01:11:02.361788 4055 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 13 01:11:02.362258 master-0 kubenswrapper[4055]: W0313 01:11:02.361799 4055 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 13 01:11:02.362258 master-0 kubenswrapper[4055]: W0313 01:11:02.361810 4055 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 13 01:11:02.362258 master-0 kubenswrapper[4055]: W0313 01:11:02.361831 4055 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 13 01:11:02.362258 master-0 kubenswrapper[4055]: W0313 01:11:02.361840 4055 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 13 01:11:02.362258 master-0 kubenswrapper[4055]: W0313 01:11:02.361852 4055 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 13 01:11:02.362258 master-0 kubenswrapper[4055]: W0313 01:11:02.361867 4055 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 13 01:11:02.362258 master-0 kubenswrapper[4055]: W0313 01:11:02.361880 4055 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 13 01:11:02.362258 master-0 kubenswrapper[4055]: W0313 01:11:02.361890 4055 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 13 01:11:02.362258 master-0 kubenswrapper[4055]: W0313 01:11:02.361908 4055 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 13 01:11:02.362258 master-0 kubenswrapper[4055]: W0313 01:11:02.361923 4055 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 13 01:11:02.362258 master-0 kubenswrapper[4055]: W0313 01:11:02.361936 4055 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 13 01:11:02.362982 master-0 kubenswrapper[4055]: W0313 01:11:02.362364 4055 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 13 01:11:02.362982 master-0 kubenswrapper[4055]: W0313 01:11:02.362398 4055 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 13 01:11:02.362982 master-0 kubenswrapper[4055]: W0313 01:11:02.362410 4055 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 13 01:11:02.362982 master-0 kubenswrapper[4055]: W0313 01:11:02.362422 4055 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 13 01:11:02.362982 master-0 kubenswrapper[4055]: W0313 01:11:02.362434 4055 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 13 01:11:02.362982 master-0 kubenswrapper[4055]: W0313 01:11:02.362450 4055 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 13 01:11:02.362982 master-0 kubenswrapper[4055]: W0313 01:11:02.362461 4055 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 13 01:11:02.362982 master-0 kubenswrapper[4055]: W0313 01:11:02.362478 4055 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 13 01:11:02.362982 master-0 kubenswrapper[4055]: W0313 01:11:02.362492 4055 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 13 01:11:02.362982 master-0 kubenswrapper[4055]: W0313 01:11:02.362503 4055 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 13 01:11:02.362982 master-0 kubenswrapper[4055]: W0313 01:11:02.362514 4055 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 13 01:11:02.362982 master-0 kubenswrapper[4055]: W0313 01:11:02.362525 4055 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 13 01:11:02.362982 master-0 kubenswrapper[4055]: W0313 01:11:02.362535 4055 feature_gate.go:330] unrecognized feature gate: Example
Mar 13 01:11:02.362982 master-0 kubenswrapper[4055]: W0313 01:11:02.362545 4055 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 13 01:11:02.362982 master-0 kubenswrapper[4055]: W0313 01:11:02.362556 4055 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 13 01:11:02.362982 master-0 kubenswrapper[4055]: W0313 01:11:02.362571 4055 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 13 01:11:02.362982 master-0 kubenswrapper[4055]: W0313 01:11:02.362588 4055 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 13 01:11:02.362982 master-0 kubenswrapper[4055]: W0313 01:11:02.362599 4055 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 13 01:11:02.362982 master-0 kubenswrapper[4055]: W0313 01:11:02.362610 4055 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 13 01:11:02.363994 master-0 kubenswrapper[4055]: W0313 01:11:02.362621 4055 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 13 01:11:02.363994 master-0 kubenswrapper[4055]: W0313 01:11:02.362664 4055 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 13 01:11:02.363994 master-0 kubenswrapper[4055]: W0313 01:11:02.362677 4055 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 13 01:11:02.363994 master-0 kubenswrapper[4055]: W0313 01:11:02.362696 4055 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 13 01:11:02.363994 master-0 kubenswrapper[4055]: W0313 01:11:02.362707 4055 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 13 01:11:02.363994 master-0 kubenswrapper[4055]: W0313 01:11:02.362718 4055 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 13 01:11:02.363994 master-0 kubenswrapper[4055]: W0313 01:11:02.362728 4055 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 13 01:11:02.363994 master-0 kubenswrapper[4055]: W0313 01:11:02.362739 4055 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 13 01:11:02.363994 master-0 kubenswrapper[4055]: W0313 01:11:02.362748 4055 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 13 01:11:02.363994 master-0 kubenswrapper[4055]: W0313 01:11:02.362758 4055 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 13 01:11:02.363994 master-0 kubenswrapper[4055]: W0313 01:11:02.362769 4055 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 13 01:11:02.363994 master-0 kubenswrapper[4055]: W0313 01:11:02.362780 4055 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 13 01:11:02.363994 master-0 kubenswrapper[4055]: W0313 01:11:02.362802 4055 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 13 01:11:02.363994 master-0 kubenswrapper[4055]: W0313 01:11:02.362818 4055 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 13 01:11:02.363994 master-0 kubenswrapper[4055]: W0313 01:11:02.362833 4055 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 13 01:11:02.363994 master-0 kubenswrapper[4055]: W0313 01:11:02.362845 4055 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 13 01:11:02.363994 master-0 kubenswrapper[4055]: W0313 01:11:02.362857 4055 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 13 01:11:02.363994 master-0 kubenswrapper[4055]: W0313 01:11:02.362868 4055 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 13 01:11:02.363994 master-0 kubenswrapper[4055]: W0313 01:11:02.362879 4055 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 13 01:11:02.365077 master-0 kubenswrapper[4055]: W0313 01:11:02.362889 4055 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 13 01:11:02.365077 master-0 kubenswrapper[4055]: W0313 01:11:02.362900 4055 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 13 01:11:02.365077 master-0 kubenswrapper[4055]: W0313 01:11:02.362911 4055 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 13 01:11:02.365077 master-0 kubenswrapper[4055]: W0313 01:11:02.362921 4055 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 13 01:11:02.365077 master-0 kubenswrapper[4055]: W0313 01:11:02.362931 4055 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 13 01:11:02.365077 master-0 kubenswrapper[4055]: W0313 01:11:02.362942 4055 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 13 01:11:02.365077 master-0 kubenswrapper[4055]: W0313 01:11:02.362952 4055 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 13 01:11:02.365077 master-0 kubenswrapper[4055]: W0313 01:11:02.362963 4055 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 13 01:11:02.365077 master-0 kubenswrapper[4055]: W0313 01:11:02.362973 4055 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 13 01:11:02.365077 master-0 kubenswrapper[4055]: W0313 01:11:02.362983 4055 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 13 01:11:02.365077 master-0 kubenswrapper[4055]: W0313 01:11:02.362993 4055 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 13 01:11:02.365077 master-0 kubenswrapper[4055]: W0313 01:11:02.363002 4055 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 13 01:11:02.365077 master-0 kubenswrapper[4055]: W0313 01:11:02.363012 4055 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 13 01:11:02.365077 master-0 kubenswrapper[4055]: W0313 01:11:02.363023 4055 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 13 01:11:02.365077 master-0 kubenswrapper[4055]: W0313 01:11:02.363033 4055 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 13 01:11:02.365077 master-0 kubenswrapper[4055]: W0313 01:11:02.363043 4055 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 13 01:11:02.365077 master-0 kubenswrapper[4055]: W0313 01:11:02.363053 4055 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 13 01:11:02.365077 master-0 kubenswrapper[4055]: W0313 01:11:02.363063 4055 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 13 01:11:02.365077 master-0 kubenswrapper[4055]: I0313 01:11:02.364453 4055 flags.go:64] FLAG: --address="0.0.0.0"
Mar 13 01:11:02.365077 master-0 kubenswrapper[4055]: I0313 01:11:02.364489 4055 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 13 01:11:02.365077 master-0 kubenswrapper[4055]: I0313 01:11:02.364528 4055 flags.go:64] FLAG: --anonymous-auth="true"
Mar 13 01:11:02.366434 master-0 kubenswrapper[4055]: I0313 01:11:02.364545 4055 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 13 01:11:02.366434 master-0 kubenswrapper[4055]: I0313 01:11:02.364563 4055 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 13 01:11:02.366434 master-0 kubenswrapper[4055]: I0313 01:11:02.364577 4055 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 13 01:11:02.366434 master-0 kubenswrapper[4055]: I0313 01:11:02.364594 4055 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 13 01:11:02.366434 master-0 kubenswrapper[4055]: I0313 01:11:02.364609 4055 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 13 01:11:02.366434 master-0 kubenswrapper[4055]: I0313 01:11:02.364622 4055 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 13 01:11:02.366434 master-0 kubenswrapper[4055]: I0313 01:11:02.364667 4055 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 13 01:11:02.366434 master-0 kubenswrapper[4055]: I0313 01:11:02.364683 4055 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 13 01:11:02.366434 master-0 kubenswrapper[4055]: I0313 01:11:02.364697 4055 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 13 01:11:02.366434 master-0 kubenswrapper[4055]: I0313 01:11:02.364709 4055 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 13 01:11:02.366434 master-0 kubenswrapper[4055]: I0313 01:11:02.364721 4055 flags.go:64] FLAG: --cgroup-root=""
Mar 13 01:11:02.366434 master-0 kubenswrapper[4055]: I0313 01:11:02.364733 4055 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 13 01:11:02.366434 master-0 kubenswrapper[4055]: I0313 01:11:02.364748 4055 flags.go:64] FLAG: --client-ca-file=""
Mar 13 01:11:02.366434 master-0 kubenswrapper[4055]: I0313 01:11:02.364760 4055 flags.go:64] FLAG: --cloud-config=""
Mar 13 01:11:02.366434 master-0 kubenswrapper[4055]: I0313 01:11:02.364771 4055 flags.go:64] FLAG: --cloud-provider=""
Mar 13 01:11:02.366434 master-0 kubenswrapper[4055]: I0313 01:11:02.364783 4055 flags.go:64] FLAG: --cluster-dns="[]"
Mar 13 01:11:02.366434 master-0 kubenswrapper[4055]: I0313 01:11:02.364808 4055 flags.go:64] FLAG: --cluster-domain=""
Mar 13 01:11:02.366434 master-0 kubenswrapper[4055]: I0313 01:11:02.364820 4055 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 13 01:11:02.366434 master-0 kubenswrapper[4055]: I0313 01:11:02.364833 4055 flags.go:64] FLAG: --config-dir=""
Mar 13 01:11:02.366434 master-0 kubenswrapper[4055]: I0313 01:11:02.364844 4055 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 13 01:11:02.366434 master-0 kubenswrapper[4055]: I0313 01:11:02.364857 4055 flags.go:64] FLAG: --container-log-max-files="5"
Mar 13 01:11:02.366434 master-0 kubenswrapper[4055]: I0313 01:11:02.364873 4055 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 13 01:11:02.366434 master-0 kubenswrapper[4055]: I0313 01:11:02.364884 4055 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 13 01:11:02.366434 master-0 kubenswrapper[4055]: I0313 01:11:02.364899 4055 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 13 01:11:02.367999 master-0 kubenswrapper[4055]: I0313 01:11:02.364911 4055 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 13 01:11:02.367999 master-0 kubenswrapper[4055]: I0313 01:11:02.364923 4055 flags.go:64] FLAG: --contention-profiling="false"
Mar 13 01:11:02.367999 master-0 kubenswrapper[4055]: I0313 01:11:02.364934 4055 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 13 01:11:02.367999 master-0 kubenswrapper[4055]: I0313 01:11:02.364946 4055 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 13 01:11:02.367999 master-0 kubenswrapper[4055]: I0313 01:11:02.364959 4055 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 13 01:11:02.367999 master-0 kubenswrapper[4055]: I0313 01:11:02.364971 4055 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 13 01:11:02.367999 master-0 kubenswrapper[4055]: I0313 01:11:02.364987 4055 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 13 01:11:02.367999 master-0 kubenswrapper[4055]: I0313 01:11:02.364999 4055 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 13 01:11:02.367999 master-0 kubenswrapper[4055]: I0313 01:11:02.365011 4055 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 13 01:11:02.367999 master-0 kubenswrapper[4055]: I0313 01:11:02.365022 4055 flags.go:64] FLAG: --enable-load-reader="false"
Mar 13 01:11:02.367999 master-0 kubenswrapper[4055]: I0313 01:11:02.365034 4055 flags.go:64] FLAG: --enable-server="true"
Mar 13 01:11:02.367999 master-0 kubenswrapper[4055]: I0313 01:11:02.365045 4055 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 13 01:11:02.367999 master-0 kubenswrapper[4055]: I0313 01:11:02.365075 4055 flags.go:64] FLAG: --event-burst="100"
Mar 13 01:11:02.367999 master-0 kubenswrapper[4055]: I0313 01:11:02.365089 4055 flags.go:64] FLAG: --event-qps="50"
Mar 13 01:11:02.367999 master-0 kubenswrapper[4055]: I0313 01:11:02.365100 4055 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 13 01:11:02.367999 master-0 kubenswrapper[4055]: I0313 01:11:02.365112 4055 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 13 01:11:02.367999 master-0 kubenswrapper[4055]: I0313 01:11:02.365124 4055 flags.go:64] FLAG: --eviction-hard=""
Mar 13 01:11:02.367999 master-0 kubenswrapper[4055]: I0313 01:11:02.365139 4055 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 13 01:11:02.367999 master-0 kubenswrapper[4055]: I0313 01:11:02.365151 4055 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 13 01:11:02.367999 master-0 kubenswrapper[4055]: I0313 01:11:02.365167 4055 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 13 01:11:02.367999 master-0 kubenswrapper[4055]: I0313 01:11:02.365183 4055 flags.go:64] FLAG: --eviction-soft=""
Mar 13 01:11:02.367999 master-0 kubenswrapper[4055]: I0313 01:11:02.365196 4055 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 13 01:11:02.367999 master-0 kubenswrapper[4055]: I0313 01:11:02.365207 4055 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 13 01:11:02.367999 master-0 kubenswrapper[4055]: I0313 01:11:02.365219 4055 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 13 01:11:02.367999 master-0 kubenswrapper[4055]: I0313 01:11:02.365267 4055 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 13 01:11:02.369910 master-0 kubenswrapper[4055]: I0313 01:11:02.365280 4055 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 13 01:11:02.369910 master-0 kubenswrapper[4055]: I0313 01:11:02.365292 4055 flags.go:64] FLAG: --fail-swap-on="true"
Mar 13 01:11:02.369910 master-0 kubenswrapper[4055]: I0313 01:11:02.365303 4055 flags.go:64] FLAG: --feature-gates=""
Mar 13 01:11:02.369910 master-0 kubenswrapper[4055]: I0313 01:11:02.365318 4055 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 13 01:11:02.369910 master-0 kubenswrapper[4055]: I0313 01:11:02.365330 4055 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 13 01:11:02.369910 master-0 kubenswrapper[4055]: I0313 01:11:02.365343 4055 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 13 01:11:02.369910 master-0 kubenswrapper[4055]: I0313 01:11:02.365355 4055 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 13 01:11:02.369910 master-0 kubenswrapper[4055]: I0313 01:11:02.365366 4055 flags.go:64] FLAG: --healthz-port="10248"
Mar 13 01:11:02.369910 master-0 kubenswrapper[4055]: I0313 01:11:02.365379 4055 flags.go:64] FLAG: --help="false"
Mar 13 01:11:02.369910 master-0 kubenswrapper[4055]: I0313 01:11:02.365390 4055 flags.go:64] FLAG: --hostname-override=""
Mar 13 01:11:02.369910 master-0 kubenswrapper[4055]: I0313 01:11:02.365401 4055 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 13 01:11:02.369910 master-0 kubenswrapper[4055]: I0313 01:11:02.365413 4055 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 13 01:11:02.369910 master-0 kubenswrapper[4055]: I0313 01:11:02.365425 4055 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 13 01:11:02.369910 master-0 kubenswrapper[4055]: I0313 01:11:02.365436 4055 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 13 01:11:02.369910 master-0 kubenswrapper[4055]: I0313 01:11:02.365447 4055 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 13 01:11:02.369910 master-0 kubenswrapper[4055]: I0313 01:11:02.365460 4055 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 13 01:11:02.369910 master-0 kubenswrapper[4055]: I0313 01:11:02.365471 4055 flags.go:64] FLAG: --image-service-endpoint=""
Mar 13 01:11:02.369910 master-0 kubenswrapper[4055]: I0313 01:11:02.365483 4055 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 13 01:11:02.369910 master-0 kubenswrapper[4055]: I0313 01:11:02.365494 4055 flags.go:64] FLAG: --kube-api-burst="100"
Mar 13 01:11:02.369910 master-0 kubenswrapper[4055]: I0313 01:11:02.365506 4055 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 13 01:11:02.369910 master-0 kubenswrapper[4055]: I0313 01:11:02.365519 4055 flags.go:64] FLAG: --kube-api-qps="50"
Mar 13 01:11:02.369910 master-0 kubenswrapper[4055]: I0313 01:11:02.365531 4055 flags.go:64] FLAG: --kube-reserved=""
Mar 13 01:11:02.369910 master-0 kubenswrapper[4055]: I0313 01:11:02.365543 4055 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 13 01:11:02.369910 master-0 kubenswrapper[4055]: I0313 01:11:02.365554 4055 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 13 01:11:02.369910 master-0 kubenswrapper[4055]: I0313 01:11:02.365566 4055 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 13 01:11:02.369910 master-0 kubenswrapper[4055]: I0313 01:11:02.365577 4055 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 13 01:11:02.371101 master-0 kubenswrapper[4055]: I0313 01:11:02.365593 4055 flags.go:64] FLAG: --lock-file=""
Mar 13 01:11:02.371101 master-0 kubenswrapper[4055]: I0313 01:11:02.365605 4055 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 13 01:11:02.371101 master-0 kubenswrapper[4055]: I0313 01:11:02.365617 4055 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 13 01:11:02.371101 master-0 kubenswrapper[4055]: I0313 01:11:02.365629 4055 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 13 01:11:02.371101 master-0 kubenswrapper[4055]: I0313 01:11:02.365685 4055 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 13 01:11:02.371101 master-0 kubenswrapper[4055]: I0313 01:11:02.365699 4055 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 13 01:11:02.371101 master-0 kubenswrapper[4055]: I0313 01:11:02.365711 4055 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 13 01:11:02.371101 master-0 kubenswrapper[4055]: I0313 01:11:02.365722 4055 flags.go:64] FLAG: --logging-format="text"
Mar 13 01:11:02.371101 master-0 kubenswrapper[4055]: I0313 01:11:02.365734 4055 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 13 01:11:02.371101 master-0 kubenswrapper[4055]: I0313 01:11:02.365747 4055 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 13 01:11:02.371101 master-0 kubenswrapper[4055]: I0313 01:11:02.365758 4055 flags.go:64] FLAG: --manifest-url=""
Mar 13 01:11:02.371101 master-0 kubenswrapper[4055]: I0313 01:11:02.365770 4055 flags.go:64] FLAG: --manifest-url-header=""
Mar 13 01:11:02.371101 master-0 kubenswrapper[4055]: I0313 01:11:02.365787 4055 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 13 01:11:02.371101 master-0 kubenswrapper[4055]: I0313 01:11:02.365800 4055 flags.go:64] FLAG: --max-open-files="1000000"
Mar 13 01:11:02.371101 master-0 kubenswrapper[4055]: I0313 01:11:02.365815 4055 flags.go:64] FLAG: --max-pods="110"
Mar 13 01:11:02.371101 master-0 kubenswrapper[4055]: I0313 01:11:02.365827 4055 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 13 01:11:02.371101 master-0 kubenswrapper[4055]: I0313 01:11:02.365839 4055 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 13 01:11:02.371101 master-0 kubenswrapper[4055]: I0313 01:11:02.365851 4055 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 13 01:11:02.371101 master-0 kubenswrapper[4055]: I0313 01:11:02.365863 4055 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 13 01:11:02.371101 master-0 kubenswrapper[4055]: I0313 01:11:02.365875 4055 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 13 01:11:02.371101 master-0 kubenswrapper[4055]: I0313 01:11:02.365887 4055 flags.go:64] FLAG: --node-ip="192.168.32.10"
Mar 13 01:11:02.371101 master-0 kubenswrapper[4055]: I0313 01:11:02.365900 4055 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 13 01:11:02.371101 master-0 kubenswrapper[4055]: I0313 01:11:02.365935 4055 flags.go:64] FLAG: --node-status-max-images="50"
Mar 13 01:11:02.371101 master-0 kubenswrapper[4055]: I0313 01:11:02.365946 4055 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 13 01:11:02.372253 master-0 kubenswrapper[4055]: I0313 01:11:02.365959 4055 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 13 01:11:02.372253 master-0 kubenswrapper[4055]: I0313 01:11:02.365971 4055 flags.go:64] FLAG: --pod-cidr=""
Mar 13 01:11:02.372253 master-0 kubenswrapper[4055]: I0313 01:11:02.365983 4055 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1d605384f31a8085f78a96145c2c3dc51afe22721144196140a2699b7c07ebe3"
Mar 13 01:11:02.372253 master-0 kubenswrapper[4055]: I0313 01:11:02.366000 4055 flags.go:64] FLAG: --pod-manifest-path=""
Mar 13 01:11:02.372253 master-0 kubenswrapper[4055]: I0313 01:11:02.366012 4055 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 13 01:11:02.372253 master-0 kubenswrapper[4055]: I0313 01:11:02.366024 4055 flags.go:64] FLAG: --pods-per-core="0"
Mar 13 01:11:02.372253 master-0 kubenswrapper[4055]: I0313 01:11:02.366038 4055 flags.go:64] FLAG: --port="10250"
Mar 13 01:11:02.372253 master-0 kubenswrapper[4055]: I0313 01:11:02.366051 4055 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 13 01:11:02.372253 master-0 kubenswrapper[4055]: I0313 01:11:02.366062 4055 flags.go:64] FLAG: --provider-id=""
Mar 13 01:11:02.372253 master-0 kubenswrapper[4055]: I0313 01:11:02.366076 4055 flags.go:64] FLAG: --qos-reserved=""
Mar 13 01:11:02.372253 master-0 kubenswrapper[4055]: I0313 01:11:02.366089 4055 flags.go:64] FLAG: --read-only-port="10255"
Mar 13 01:11:02.372253 master-0 kubenswrapper[4055]: I0313 01:11:02.366101 4055 flags.go:64] FLAG: --register-node="true"
Mar 13 01:11:02.372253 master-0 kubenswrapper[4055]: I0313 01:11:02.366113 4055 flags.go:64] FLAG: --register-schedulable="true"
Mar 13 01:11:02.372253 master-0 kubenswrapper[4055]: I0313 01:11:02.366125 4055 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 13 01:11:02.372253 master-0 kubenswrapper[4055]: I0313 01:11:02.366148 4055 flags.go:64] FLAG: --registry-burst="10"
Mar 13 01:11:02.372253 master-0 kubenswrapper[4055]: I0313 01:11:02.366160 4055 flags.go:64] FLAG: --registry-qps="5"
Mar 13 01:11:02.372253 master-0 kubenswrapper[4055]: I0313 01:11:02.366174 4055 flags.go:64] FLAG: --reserved-cpus=""
Mar 13 01:11:02.372253 master-0 kubenswrapper[4055]: I0313 01:11:02.366187 4055 flags.go:64] FLAG: --reserved-memory=""
Mar 13 01:11:02.372253 master-0 kubenswrapper[4055]: I0313 01:11:02.366202 4055 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 13 01:11:02.372253 master-0 kubenswrapper[4055]: I0313 01:11:02.366214 4055 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 13 01:11:02.372253 master-0 kubenswrapper[4055]: I0313 01:11:02.366226 4055 flags.go:64] FLAG: --rotate-certificates="false"
Mar 13 01:11:02.372253 master-0 kubenswrapper[4055]: I0313 01:11:02.366238 4055 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 13 01:11:02.372253 master-0 kubenswrapper[4055]: I0313 01:11:02.366250 4055 flags.go:64] FLAG: --runonce="false"
Mar 13 01:11:02.372253 master-0 kubenswrapper[4055]: I0313 01:11:02.366263 4055 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 13 01:11:02.372253 master-0 kubenswrapper[4055]: I0313 01:11:02.366275 4055 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 13 01:11:02.373483 master-0 kubenswrapper[4055]: I0313 01:11:02.366287 4055 flags.go:64] FLAG: --seccomp-default="false"
Mar 13 01:11:02.373483 master-0 kubenswrapper[4055]: I0313 01:11:02.366298 4055 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 13 01:11:02.373483 master-0 kubenswrapper[4055]: I0313 01:11:02.366310 4055 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 13 01:11:02.373483 master-0 kubenswrapper[4055]: I0313 01:11:02.366322 4055 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 13 01:11:02.373483 master-0 kubenswrapper[4055]: I0313 01:11:02.366334 4055 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 13 01:11:02.373483 master-0 kubenswrapper[4055]: I0313 01:11:02.366345 4055 flags.go:64] FLAG: --storage-driver-password="root"
Mar 13 01:11:02.373483 master-0 kubenswrapper[4055]: I0313 01:11:02.366357 4055 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 13 01:11:02.373483 master-0 kubenswrapper[4055]: I0313 01:11:02.366368 4055 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 13 01:11:02.373483 master-0 kubenswrapper[4055]: I0313 01:11:02.366380 4055 flags.go:64] FLAG: --storage-driver-user="root"
Mar 13 01:11:02.373483 master-0 kubenswrapper[4055]: I0313 01:11:02.366391 4055 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 13 01:11:02.373483 master-0 kubenswrapper[4055]: I0313 01:11:02.366403 4055 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 13 01:11:02.373483 master-0 kubenswrapper[4055]: I0313 01:11:02.366415 4055 flags.go:64] FLAG: --system-cgroups=""
Mar 13 01:11:02.373483 master-0 kubenswrapper[4055]: I0313 01:11:02.366426 4055 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Mar 13 01:11:02.373483 master-0 kubenswrapper[4055]: I0313 01:11:02.366447 4055 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 13 01:11:02.373483 master-0 kubenswrapper[4055]: I0313 01:11:02.366459 4055 flags.go:64] FLAG: --tls-cert-file=""
Mar 13 01:11:02.373483 master-0 kubenswrapper[4055]: I0313 01:11:02.366470 4055 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 13 01:11:02.373483 master-0 kubenswrapper[4055]: I0313 01:11:02.366497 4055 flags.go:64] FLAG: --tls-min-version=""
Mar 13 01:11:02.373483 master-0 kubenswrapper[4055]: I0313 01:11:02.366516 4055 flags.go:64] FLAG: --tls-private-key-file=""
Mar 13 01:11:02.373483 master-0 kubenswrapper[4055]: I0313 01:11:02.366529 4055 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 13 01:11:02.373483 master-0 kubenswrapper[4055]: I0313 01:11:02.366540 4055 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 13 01:11:02.373483 master-0 kubenswrapper[4055]: I0313 01:11:02.366551 4055 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 13 01:11:02.373483 master-0 kubenswrapper[4055]: I0313 01:11:02.366563 4055 flags.go:64] FLAG: --v="2"
Mar 13 01:11:02.373483 master-0 kubenswrapper[4055]: I0313 01:11:02.366579 4055 flags.go:64] FLAG: --version="false"
Mar 13 01:11:02.373483 master-0 kubenswrapper[4055]: I0313 01:11:02.366594 4055 flags.go:64] FLAG: --vmodule=""
Mar 13 01:11:02.373483 master-0 kubenswrapper[4055]: I0313 01:11:02.366609 4055 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 13 01:11:02.374591 master-0 kubenswrapper[4055]: I0313 01:11:02.366621 4055 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 13 01:11:02.374591 master-0 kubenswrapper[4055]: W0313 01:11:02.366950 4055 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 13 01:11:02.374591 master-0 kubenswrapper[4055]: W0313 01:11:02.366974 4055 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 13 01:11:02.374591 master-0 kubenswrapper[4055]: W0313 01:11:02.366989 4055 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 13 01:11:02.374591 master-0 kubenswrapper[4055]: W0313 01:11:02.367001 4055 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 13 01:11:02.374591 master-0 kubenswrapper[4055]: W0313 01:11:02.367012 4055 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 13 01:11:02.374591 master-0 kubenswrapper[4055]: W0313 01:11:02.367023 4055 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 13 01:11:02.374591 master-0 kubenswrapper[4055]: W0313 01:11:02.367034 4055 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 13 01:11:02.374591 master-0 kubenswrapper[4055]: W0313 01:11:02.367044 4055 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 13 01:11:02.374591 master-0 kubenswrapper[4055]: W0313 01:11:02.367056 4055 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 13 01:11:02.374591 master-0 kubenswrapper[4055]: W0313 01:11:02.367067 4055 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 13 01:11:02.374591 master-0 kubenswrapper[4055]: W0313 01:11:02.367078 4055 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 13 01:11:02.374591 master-0 kubenswrapper[4055]: W0313 01:11:02.367089 4055 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 13 01:11:02.374591 master-0 kubenswrapper[4055]: W0313 01:11:02.367099 4055 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 13 01:11:02.374591 master-0 kubenswrapper[4055]: W0313 01:11:02.367109 4055 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 13 01:11:02.374591 master-0 kubenswrapper[4055]: W0313 01:11:02.367119 4055 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 13 01:11:02.374591 master-0 kubenswrapper[4055]: W0313 01:11:02.367130 4055 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 13 01:11:02.374591 master-0 kubenswrapper[4055]: W0313 01:11:02.367140 4055 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 13 01:11:02.374591 master-0 kubenswrapper[4055]: W0313 01:11:02.367149 4055 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 13 01:11:02.374591 master-0 kubenswrapper[4055]: W0313 01:11:02.367159 4055 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 13 01:11:02.375511 master-0 kubenswrapper[4055]: W0313 01:11:02.367170 4055 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 13 01:11:02.375511 master-0 kubenswrapper[4055]: W0313 01:11:02.367181 4055 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 13 01:11:02.375511 master-0 kubenswrapper[4055]: W0313 01:11:02.367191 4055 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 13 01:11:02.375511 master-0 kubenswrapper[4055]: W0313 01:11:02.367205 4055 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 13 01:11:02.375511 master-0 kubenswrapper[4055]: W0313 01:11:02.367222 4055 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 13 01:11:02.375511 master-0 kubenswrapper[4055]: W0313 01:11:02.367235 4055 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 13 01:11:02.375511 master-0 kubenswrapper[4055]: W0313 01:11:02.367246 4055 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 13 01:11:02.375511 master-0 kubenswrapper[4055]: W0313 01:11:02.367256 4055 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 13 01:11:02.375511 master-0 kubenswrapper[4055]: W0313 01:11:02.367267 4055 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 13 01:11:02.375511 master-0 kubenswrapper[4055]: W0313 01:11:02.367278 4055 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 13 01:11:02.375511 master-0 kubenswrapper[4055]: W0313 01:11:02.367288 4055 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 13 01:11:02.375511 master-0 kubenswrapper[4055]: W0313 01:11:02.367299 4055 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 13 01:11:02.375511 master-0 kubenswrapper[4055]: W0313 01:11:02.367309 4055 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 13 01:11:02.375511 master-0 kubenswrapper[4055]: W0313 01:11:02.367320 4055 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 13 01:11:02.375511 master-0 kubenswrapper[4055]: W0313 01:11:02.367330 4055 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 13 01:11:02.375511 master-0 kubenswrapper[4055]: W0313 01:11:02.367344 4055 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 13 01:11:02.375511 master-0 kubenswrapper[4055]: W0313 01:11:02.367357 4055 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 13 01:11:02.375511 master-0 kubenswrapper[4055]: W0313 01:11:02.367367 4055 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 13 01:11:02.375511 master-0 kubenswrapper[4055]: W0313 01:11:02.367379 4055 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 13 01:11:02.376431 master-0 kubenswrapper[4055]: W0313 01:11:02.367390 4055 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 13 01:11:02.376431 master-0 kubenswrapper[4055]: W0313 01:11:02.367400 4055 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 13 01:11:02.376431 master-0 kubenswrapper[4055]: W0313 01:11:02.367410 4055 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 13 01:11:02.376431 master-0 kubenswrapper[4055]: W0313 01:11:02.367420 4055 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 13 01:11:02.376431 master-0 kubenswrapper[4055]: W0313 01:11:02.367429 4055 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 13 01:11:02.376431 master-0 kubenswrapper[4055]: W0313 01:11:02.367439 4055 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 13 01:11:02.376431 master-0 kubenswrapper[4055]: W0313 01:11:02.367449 4055 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 13 01:11:02.376431 master-0 kubenswrapper[4055]: W0313 01:11:02.367460 4055 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 13 01:11:02.376431 master-0 kubenswrapper[4055]: W0313 01:11:02.367473 4055 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 13 01:11:02.376431 master-0 kubenswrapper[4055]: W0313 01:11:02.367487 4055 feature_gate.go:330] unrecognized feature gate: Example Mar 13 01:11:02.376431 master-0 kubenswrapper[4055]: W0313 01:11:02.367498 4055 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 13 01:11:02.376431 master-0 kubenswrapper[4055]: W0313 01:11:02.367510 4055 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 13 01:11:02.376431 master-0 kubenswrapper[4055]: W0313 01:11:02.367522 4055 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 13 01:11:02.376431 master-0 kubenswrapper[4055]: W0313 01:11:02.367533 4055 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 13 01:11:02.376431 master-0 kubenswrapper[4055]: W0313 01:11:02.367543 4055 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 13 01:11:02.376431 master-0 kubenswrapper[4055]: W0313 01:11:02.367553 4055 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 13 01:11:02.376431 master-0 kubenswrapper[4055]: W0313 01:11:02.367564 4055 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 13 01:11:02.376431 master-0 kubenswrapper[4055]: W0313 01:11:02.367578 4055 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 13 01:11:02.376431 master-0 kubenswrapper[4055]: W0313 01:11:02.367588 4055 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 13 01:11:02.376431 master-0 kubenswrapper[4055]: W0313 01:11:02.367598 4055 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 13 01:11:02.377360 master-0 kubenswrapper[4055]: W0313 01:11:02.367608 4055 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 13 01:11:02.377360 master-0 kubenswrapper[4055]: W0313 01:11:02.367618 4055 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 13 01:11:02.377360 master-0 kubenswrapper[4055]: W0313 01:11:02.367629 4055 
feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 13 01:11:02.377360 master-0 kubenswrapper[4055]: W0313 01:11:02.367678 4055 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 13 01:11:02.377360 master-0 kubenswrapper[4055]: W0313 01:11:02.367688 4055 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 13 01:11:02.377360 master-0 kubenswrapper[4055]: W0313 01:11:02.367698 4055 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 13 01:11:02.377360 master-0 kubenswrapper[4055]: W0313 01:11:02.367708 4055 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 13 01:11:02.377360 master-0 kubenswrapper[4055]: W0313 01:11:02.367719 4055 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 13 01:11:02.377360 master-0 kubenswrapper[4055]: W0313 01:11:02.367729 4055 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 13 01:11:02.377360 master-0 kubenswrapper[4055]: W0313 01:11:02.367740 4055 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 13 01:11:02.377360 master-0 kubenswrapper[4055]: W0313 01:11:02.367750 4055 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 13 01:11:02.377360 master-0 kubenswrapper[4055]: W0313 01:11:02.367760 4055 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 13 01:11:02.377360 master-0 kubenswrapper[4055]: W0313 01:11:02.367771 4055 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 13 01:11:02.377360 master-0 kubenswrapper[4055]: W0313 01:11:02.367781 4055 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 13 01:11:02.377360 master-0 kubenswrapper[4055]: I0313 01:11:02.367815 4055 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true 
MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 13 01:11:02.381048 master-0 kubenswrapper[4055]: I0313 01:11:02.380988 4055 server.go:491] "Kubelet version" kubeletVersion="v1.31.14" Mar 13 01:11:02.381048 master-0 kubenswrapper[4055]: I0313 01:11:02.381032 4055 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 13 01:11:02.381207 master-0 kubenswrapper[4055]: W0313 01:11:02.381165 4055 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 13 01:11:02.381207 master-0 kubenswrapper[4055]: W0313 01:11:02.381178 4055 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 13 01:11:02.381207 master-0 kubenswrapper[4055]: W0313 01:11:02.381187 4055 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 13 01:11:02.381207 master-0 kubenswrapper[4055]: W0313 01:11:02.381196 4055 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 13 01:11:02.381207 master-0 kubenswrapper[4055]: W0313 01:11:02.381205 4055 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 13 01:11:02.381207 master-0 kubenswrapper[4055]: W0313 01:11:02.381213 4055 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 13 01:11:02.381502 master-0 kubenswrapper[4055]: W0313 01:11:02.381223 4055 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 13 01:11:02.381502 master-0 kubenswrapper[4055]: W0313 01:11:02.381231 4055 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 13 01:11:02.381502 master-0 kubenswrapper[4055]: W0313 01:11:02.381240 4055 feature_gate.go:330] 
unrecognized feature gate: MachineAPIMigration Mar 13 01:11:02.381502 master-0 kubenswrapper[4055]: W0313 01:11:02.381249 4055 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 13 01:11:02.381502 master-0 kubenswrapper[4055]: W0313 01:11:02.381257 4055 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 13 01:11:02.381502 master-0 kubenswrapper[4055]: W0313 01:11:02.381265 4055 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 13 01:11:02.381502 master-0 kubenswrapper[4055]: W0313 01:11:02.381273 4055 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 13 01:11:02.381502 master-0 kubenswrapper[4055]: W0313 01:11:02.381281 4055 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 13 01:11:02.381502 master-0 kubenswrapper[4055]: W0313 01:11:02.381292 4055 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 13 01:11:02.381502 master-0 kubenswrapper[4055]: W0313 01:11:02.381304 4055 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 13 01:11:02.381502 master-0 kubenswrapper[4055]: W0313 01:11:02.381313 4055 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 13 01:11:02.381502 master-0 kubenswrapper[4055]: W0313 01:11:02.381324 4055 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 13 01:11:02.381502 master-0 kubenswrapper[4055]: W0313 01:11:02.381334 4055 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 13 01:11:02.381502 master-0 kubenswrapper[4055]: W0313 01:11:02.381377 4055 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 13 01:11:02.381502 master-0 kubenswrapper[4055]: W0313 01:11:02.381387 4055 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 13 01:11:02.381502 master-0 kubenswrapper[4055]: W0313 01:11:02.381396 4055 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 13 01:11:02.381502 master-0 kubenswrapper[4055]: W0313 01:11:02.381404 4055 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 13 01:11:02.381502 master-0 kubenswrapper[4055]: W0313 01:11:02.381413 4055 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 13 01:11:02.381502 master-0 kubenswrapper[4055]: W0313 01:11:02.381421 4055 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 13 01:11:02.382353 master-0 kubenswrapper[4055]: W0313 01:11:02.381430 4055 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 13 01:11:02.382353 master-0 kubenswrapper[4055]: W0313 01:11:02.381438 4055 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 13 01:11:02.382353 master-0 kubenswrapper[4055]: W0313 01:11:02.381446 4055 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 13 01:11:02.382353 master-0 kubenswrapper[4055]: W0313 01:11:02.381454 4055 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 13 01:11:02.382353 master-0 kubenswrapper[4055]: W0313 01:11:02.381463 4055 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 13 01:11:02.382353 master-0 kubenswrapper[4055]: W0313 01:11:02.381470 4055 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 
13 01:11:02.382353 master-0 kubenswrapper[4055]: W0313 01:11:02.381479 4055 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 13 01:11:02.382353 master-0 kubenswrapper[4055]: W0313 01:11:02.381487 4055 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 13 01:11:02.382353 master-0 kubenswrapper[4055]: W0313 01:11:02.381495 4055 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 13 01:11:02.382353 master-0 kubenswrapper[4055]: W0313 01:11:02.381511 4055 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 13 01:11:02.382353 master-0 kubenswrapper[4055]: W0313 01:11:02.381520 4055 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 13 01:11:02.382353 master-0 kubenswrapper[4055]: W0313 01:11:02.381529 4055 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 13 01:11:02.382353 master-0 kubenswrapper[4055]: W0313 01:11:02.381538 4055 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 13 01:11:02.382353 master-0 kubenswrapper[4055]: W0313 01:11:02.381547 4055 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 13 01:11:02.382353 master-0 kubenswrapper[4055]: W0313 01:11:02.381555 4055 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 13 01:11:02.382353 master-0 kubenswrapper[4055]: W0313 01:11:02.381563 4055 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 13 01:11:02.382353 master-0 kubenswrapper[4055]: W0313 01:11:02.381571 4055 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 13 01:11:02.382353 master-0 kubenswrapper[4055]: W0313 01:11:02.381579 4055 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 13 01:11:02.382353 master-0 kubenswrapper[4055]: W0313 01:11:02.381587 4055 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 13 01:11:02.382353 master-0 
kubenswrapper[4055]: W0313 01:11:02.381595 4055 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 13 01:11:02.383893 master-0 kubenswrapper[4055]: W0313 01:11:02.381603 4055 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 13 01:11:02.383893 master-0 kubenswrapper[4055]: W0313 01:11:02.381611 4055 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 13 01:11:02.383893 master-0 kubenswrapper[4055]: W0313 01:11:02.381618 4055 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 13 01:11:02.383893 master-0 kubenswrapper[4055]: W0313 01:11:02.381626 4055 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 13 01:11:02.383893 master-0 kubenswrapper[4055]: W0313 01:11:02.381634 4055 feature_gate.go:330] unrecognized feature gate: Example Mar 13 01:11:02.383893 master-0 kubenswrapper[4055]: W0313 01:11:02.381672 4055 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 13 01:11:02.383893 master-0 kubenswrapper[4055]: W0313 01:11:02.381683 4055 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 13 01:11:02.383893 master-0 kubenswrapper[4055]: W0313 01:11:02.381710 4055 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 13 01:11:02.383893 master-0 kubenswrapper[4055]: W0313 01:11:02.381719 4055 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 13 01:11:02.383893 master-0 kubenswrapper[4055]: W0313 01:11:02.381727 4055 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 13 01:11:02.383893 master-0 kubenswrapper[4055]: W0313 01:11:02.381735 4055 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 13 01:11:02.383893 master-0 kubenswrapper[4055]: W0313 01:11:02.381743 4055 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 13 01:11:02.383893 master-0 kubenswrapper[4055]: W0313 01:11:02.381750 4055 feature_gate.go:330] 
unrecognized feature gate: UpgradeStatus Mar 13 01:11:02.383893 master-0 kubenswrapper[4055]: W0313 01:11:02.381760 4055 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 13 01:11:02.383893 master-0 kubenswrapper[4055]: W0313 01:11:02.381771 4055 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 13 01:11:02.383893 master-0 kubenswrapper[4055]: W0313 01:11:02.381780 4055 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 13 01:11:02.383893 master-0 kubenswrapper[4055]: W0313 01:11:02.381790 4055 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 13 01:11:02.383893 master-0 kubenswrapper[4055]: W0313 01:11:02.381801 4055 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 13 01:11:02.383893 master-0 kubenswrapper[4055]: W0313 01:11:02.381809 4055 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 13 01:11:02.383893 master-0 kubenswrapper[4055]: W0313 01:11:02.381818 4055 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 13 01:11:02.384939 master-0 kubenswrapper[4055]: W0313 01:11:02.381827 4055 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 13 01:11:02.384939 master-0 kubenswrapper[4055]: W0313 01:11:02.381835 4055 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 13 01:11:02.384939 master-0 kubenswrapper[4055]: W0313 01:11:02.381843 4055 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 13 01:11:02.384939 master-0 kubenswrapper[4055]: W0313 01:11:02.381851 4055 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 13 01:11:02.384939 master-0 kubenswrapper[4055]: W0313 01:11:02.381861 4055 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 13 01:11:02.384939 master-0 kubenswrapper[4055]: W0313 01:11:02.381871 4055 
feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 13 01:11:02.384939 master-0 kubenswrapper[4055]: W0313 01:11:02.381879 4055 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 13 01:11:02.384939 master-0 kubenswrapper[4055]: I0313 01:11:02.381892 4055 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 13 01:11:02.384939 master-0 kubenswrapper[4055]: W0313 01:11:02.382127 4055 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 13 01:11:02.384939 master-0 kubenswrapper[4055]: W0313 01:11:02.382138 4055 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 13 01:11:02.384939 master-0 kubenswrapper[4055]: W0313 01:11:02.382148 4055 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 13 01:11:02.384939 master-0 kubenswrapper[4055]: W0313 01:11:02.382156 4055 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 13 01:11:02.384939 master-0 kubenswrapper[4055]: W0313 01:11:02.382164 4055 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 13 01:11:02.384939 master-0 kubenswrapper[4055]: W0313 01:11:02.382172 4055 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 13 01:11:02.384939 master-0 kubenswrapper[4055]: W0313 01:11:02.382181 4055 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 13 01:11:02.385628 master-0 kubenswrapper[4055]: W0313 01:11:02.382188 4055 feature_gate.go:330] 
unrecognized feature gate: AdditionalRoutingCapabilities Mar 13 01:11:02.385628 master-0 kubenswrapper[4055]: W0313 01:11:02.382197 4055 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 13 01:11:02.385628 master-0 kubenswrapper[4055]: W0313 01:11:02.382205 4055 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 13 01:11:02.385628 master-0 kubenswrapper[4055]: W0313 01:11:02.382212 4055 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 13 01:11:02.385628 master-0 kubenswrapper[4055]: W0313 01:11:02.382220 4055 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 13 01:11:02.385628 master-0 kubenswrapper[4055]: W0313 01:11:02.382228 4055 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 13 01:11:02.385628 master-0 kubenswrapper[4055]: W0313 01:11:02.382236 4055 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 13 01:11:02.385628 master-0 kubenswrapper[4055]: W0313 01:11:02.382244 4055 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 13 01:11:02.385628 master-0 kubenswrapper[4055]: W0313 01:11:02.382252 4055 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 13 01:11:02.385628 master-0 kubenswrapper[4055]: W0313 01:11:02.382260 4055 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 13 01:11:02.385628 master-0 kubenswrapper[4055]: W0313 01:11:02.382268 4055 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 13 01:11:02.385628 master-0 kubenswrapper[4055]: W0313 01:11:02.382276 4055 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 13 01:11:02.385628 master-0 kubenswrapper[4055]: W0313 01:11:02.382283 4055 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 13 01:11:02.385628 master-0 kubenswrapper[4055]: W0313 01:11:02.382291 4055 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 13 
01:11:02.385628 master-0 kubenswrapper[4055]: W0313 01:11:02.382299 4055 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 13 01:11:02.385628 master-0 kubenswrapper[4055]: W0313 01:11:02.382308 4055 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 13 01:11:02.385628 master-0 kubenswrapper[4055]: W0313 01:11:02.382317 4055 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 13 01:11:02.385628 master-0 kubenswrapper[4055]: W0313 01:11:02.382327 4055 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 13 01:11:02.385628 master-0 kubenswrapper[4055]: W0313 01:11:02.382336 4055 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 13 01:11:02.385628 master-0 kubenswrapper[4055]: W0313 01:11:02.382344 4055 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 13 01:11:02.386542 master-0 kubenswrapper[4055]: W0313 01:11:02.382352 4055 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 13 01:11:02.386542 master-0 kubenswrapper[4055]: W0313 01:11:02.382360 4055 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 13 01:11:02.386542 master-0 kubenswrapper[4055]: W0313 01:11:02.382369 4055 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 13 01:11:02.386542 master-0 kubenswrapper[4055]: W0313 01:11:02.382377 4055 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 13 01:11:02.386542 master-0 kubenswrapper[4055]: W0313 01:11:02.382385 4055 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 13 01:11:02.386542 master-0 kubenswrapper[4055]: W0313 01:11:02.382393 4055 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 13 01:11:02.386542 master-0 kubenswrapper[4055]: W0313 01:11:02.382416 4055 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 13 01:11:02.386542 master-0 
kubenswrapper[4055]: W0313 01:11:02.382424 4055 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 13 01:11:02.386542 master-0 kubenswrapper[4055]: W0313 01:11:02.382431 4055 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 13 01:11:02.386542 master-0 kubenswrapper[4055]: W0313 01:11:02.382439 4055 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 13 01:11:02.386542 master-0 kubenswrapper[4055]: W0313 01:11:02.382449 4055 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 13 01:11:02.386542 master-0 kubenswrapper[4055]: W0313 01:11:02.382459 4055 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 13 01:11:02.386542 master-0 kubenswrapper[4055]: W0313 01:11:02.382467 4055 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 13 01:11:02.386542 master-0 kubenswrapper[4055]: W0313 01:11:02.382475 4055 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 13 01:11:02.386542 master-0 kubenswrapper[4055]: W0313 01:11:02.382483 4055 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 13 01:11:02.386542 master-0 kubenswrapper[4055]: W0313 01:11:02.382491 4055 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 13 01:11:02.386542 master-0 kubenswrapper[4055]: W0313 01:11:02.382498 4055 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 13 01:11:02.386542 master-0 kubenswrapper[4055]: W0313 01:11:02.382507 4055 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 13 01:11:02.386542 master-0 kubenswrapper[4055]: W0313 01:11:02.382514 4055 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 13 01:11:02.387427 master-0 kubenswrapper[4055]: W0313 01:11:02.382523 4055 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 13 01:11:02.387427 
master-0 kubenswrapper[4055]: W0313 01:11:02.382531 4055 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 13 01:11:02.387427 master-0 kubenswrapper[4055]: W0313 01:11:02.382538 4055 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 13 01:11:02.387427 master-0 kubenswrapper[4055]: W0313 01:11:02.382546 4055 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 13 01:11:02.387427 master-0 kubenswrapper[4055]: W0313 01:11:02.382554 4055 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 13 01:11:02.387427 master-0 kubenswrapper[4055]: W0313 01:11:02.382561 4055 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 13 01:11:02.387427 master-0 kubenswrapper[4055]: W0313 01:11:02.382569 4055 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 13 01:11:02.387427 master-0 kubenswrapper[4055]: W0313 01:11:02.382577 4055 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 13 01:11:02.387427 master-0 kubenswrapper[4055]: W0313 01:11:02.382584 4055 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 13 01:11:02.387427 master-0 kubenswrapper[4055]: W0313 01:11:02.382592 4055 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 13 01:11:02.387427 master-0 kubenswrapper[4055]: W0313 01:11:02.382600 4055 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 13 01:11:02.387427 master-0 kubenswrapper[4055]: W0313 01:11:02.382607 4055 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 13 01:11:02.387427 master-0 kubenswrapper[4055]: W0313 01:11:02.382618 4055 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 13 01:11:02.387427 master-0 kubenswrapper[4055]: W0313 01:11:02.382627 4055 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 13 01:11:02.387427 master-0 kubenswrapper[4055]: W0313 01:11:02.382668 4055 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 13 01:11:02.387427 master-0 kubenswrapper[4055]: W0313 01:11:02.382679 4055 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 13 01:11:02.387427 master-0 kubenswrapper[4055]: W0313 01:11:02.382689 4055 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 13 01:11:02.387427 master-0 kubenswrapper[4055]: W0313 01:11:02.382699 4055 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 13 01:11:02.387427 master-0 kubenswrapper[4055]: W0313 01:11:02.382706 4055 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 13 01:11:02.388302 master-0 kubenswrapper[4055]: W0313 01:11:02.382714 4055 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 13 01:11:02.388302 master-0 kubenswrapper[4055]: W0313 01:11:02.382722 4055 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 13 01:11:02.388302 master-0 kubenswrapper[4055]: W0313 01:11:02.382732 4055 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 13 01:11:02.388302 master-0 kubenswrapper[4055]: W0313 01:11:02.382743 4055 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 13 01:11:02.388302 master-0 kubenswrapper[4055]: W0313 01:11:02.382754 4055 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 13 01:11:02.388302 master-0 kubenswrapper[4055]: W0313 01:11:02.382763 4055 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 13 01:11:02.388302 master-0 kubenswrapper[4055]: W0313 01:11:02.382771 4055 feature_gate.go:330] unrecognized feature gate: Example Mar 13 01:11:02.388302 master-0 kubenswrapper[4055]: I0313 01:11:02.382783 4055 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 13 01:11:02.388302 master-0 kubenswrapper[4055]: I0313 01:11:02.384799 4055 server.go:940] "Client rotation is on, will bootstrap in background" Mar 13 01:11:02.388846 master-0 kubenswrapper[4055]: I0313 01:11:02.388795 4055 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 13 01:11:02.390278 master-0 kubenswrapper[4055]: I0313 01:11:02.390235 4055 server.go:997] "Starting client certificate rotation" Mar 13 01:11:02.390278 master-0 kubenswrapper[4055]: I0313 01:11:02.390278 4055 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 13 01:11:02.390528 master-0 kubenswrapper[4055]: I0313 01:11:02.390472 4055 
certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 13 01:11:02.419325 master-0 kubenswrapper[4055]: I0313 01:11:02.419220 4055 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 13 01:11:02.422302 master-0 kubenswrapper[4055]: I0313 01:11:02.422237 4055 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 13 01:11:02.425694 master-0 kubenswrapper[4055]: E0313 01:11:02.425572 4055 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 13 01:11:02.441226 master-0 kubenswrapper[4055]: I0313 01:11:02.441155 4055 log.go:25] "Validated CRI v1 runtime API" Mar 13 01:11:02.447553 master-0 kubenswrapper[4055]: I0313 01:11:02.447488 4055 log.go:25] "Validated CRI v1 image API" Mar 13 01:11:02.450974 master-0 kubenswrapper[4055]: I0313 01:11:02.450922 4055 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 13 01:11:02.458489 master-0 kubenswrapper[4055]: I0313 01:11:02.458407 4055 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4 93b4eca9-1357-4499-ad5f-ae90bf0d6f4a:/dev/vda3] Mar 13 01:11:02.458579 master-0 kubenswrapper[4055]: I0313 01:11:02.458480 4055 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs 
blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0}] Mar 13 01:11:02.483973 master-0 kubenswrapper[4055]: I0313 01:11:02.483533 4055 manager.go:217] Machine: {Timestamp:2026-03-13 01:11:02.481943817 +0000 UTC m=+0.645002925 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:d267e942efb840478924173af659d9c8 SystemUUID:d267e942-efb8-4047-8924-173af659d9c8 BootID:beebd46b-80cb-4497-a098-674e9838eb1c Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:7e:ba:68 Speed:-1 Mtu:9000} {Name:eth2 MacAddress:fa:16:3e:e0:00:01 Speed:-1 Mtu:9000} {Name:ovs-system MacAddress:26:df:a4:ac:f6:fc Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] 
Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 
Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 13 01:11:02.483973 master-0 kubenswrapper[4055]: I0313 01:11:02.483912 4055 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 13 01:11:02.484246 master-0 kubenswrapper[4055]: I0313 01:11:02.484073 4055 manager.go:233] Version: {KernelVersion:5.14.0-427.111.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202602172219-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 13 01:11:02.484527 master-0 kubenswrapper[4055]: I0313 01:11:02.484482 4055 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 13 01:11:02.484894 master-0 kubenswrapper[4055]: I0313 01:11:02.484824 4055 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 13 01:11:02.485215 master-0 kubenswrapper[4055]: I0313 01:11:02.484885 4055 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentag
e":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 13 01:11:02.486189 master-0 kubenswrapper[4055]: I0313 01:11:02.486148 4055 topology_manager.go:138] "Creating topology manager with none policy" Mar 13 01:11:02.486189 master-0 kubenswrapper[4055]: I0313 01:11:02.486181 4055 container_manager_linux.go:303] "Creating device plugin manager" Mar 13 01:11:02.486307 master-0 kubenswrapper[4055]: I0313 01:11:02.486251 4055 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 13 01:11:02.486307 master-0 kubenswrapper[4055]: I0313 01:11:02.486288 4055 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 13 01:11:02.486495 master-0 kubenswrapper[4055]: I0313 01:11:02.486454 4055 state_mem.go:36] "Initialized new in-memory state store" Mar 13 01:11:02.486665 master-0 kubenswrapper[4055]: I0313 01:11:02.486599 4055 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 13 01:11:02.492332 master-0 kubenswrapper[4055]: I0313 01:11:02.492290 4055 kubelet.go:418] "Attempting to sync node with API server" Mar 13 01:11:02.492332 master-0 kubenswrapper[4055]: I0313 01:11:02.492328 4055 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 13 01:11:02.492475 master-0 kubenswrapper[4055]: I0313 01:11:02.492375 4055 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 13 01:11:02.492475 master-0 kubenswrapper[4055]: I0313 01:11:02.492395 4055 kubelet.go:324] "Adding apiserver pod source" Mar 13 01:11:02.492475 master-0 
kubenswrapper[4055]: I0313 01:11:02.492419 4055 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 13 01:11:02.499523 master-0 kubenswrapper[4055]: I0313 01:11:02.499468 4055 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-8.rhaos4.18.gitd78977c.el9" apiVersion="v1" Mar 13 01:11:02.502554 master-0 kubenswrapper[4055]: W0313 01:11:02.502456 4055 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 13 01:11:02.502554 master-0 kubenswrapper[4055]: W0313 01:11:02.502475 4055 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 13 01:11:02.502748 master-0 kubenswrapper[4055]: E0313 01:11:02.502588 4055 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 13 01:11:02.502748 master-0 kubenswrapper[4055]: E0313 01:11:02.502604 4055 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 13 01:11:02.503307 master-0 kubenswrapper[4055]: I0313 01:11:02.503257 4055 
kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 13 01:11:02.503687 master-0 kubenswrapper[4055]: I0313 01:11:02.503596 4055 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 13 01:11:02.503687 master-0 kubenswrapper[4055]: I0313 01:11:02.503667 4055 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 13 01:11:02.503687 master-0 kubenswrapper[4055]: I0313 01:11:02.503688 4055 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 13 01:11:02.504057 master-0 kubenswrapper[4055]: I0313 01:11:02.503708 4055 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 13 01:11:02.504057 master-0 kubenswrapper[4055]: I0313 01:11:02.503727 4055 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 13 01:11:02.504057 master-0 kubenswrapper[4055]: I0313 01:11:02.503746 4055 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 13 01:11:02.504057 master-0 kubenswrapper[4055]: I0313 01:11:02.503762 4055 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 13 01:11:02.504057 master-0 kubenswrapper[4055]: I0313 01:11:02.503776 4055 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 13 01:11:02.504057 master-0 kubenswrapper[4055]: I0313 01:11:02.503791 4055 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 13 01:11:02.504057 master-0 kubenswrapper[4055]: I0313 01:11:02.503804 4055 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 13 01:11:02.504057 master-0 kubenswrapper[4055]: I0313 01:11:02.503847 4055 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 13 01:11:02.504456 master-0 kubenswrapper[4055]: I0313 01:11:02.504095 4055 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 13 01:11:02.506582 master-0 
kubenswrapper[4055]: I0313 01:11:02.506533 4055 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 13 01:11:02.507299 master-0 kubenswrapper[4055]: I0313 01:11:02.507258 4055 server.go:1280] "Started kubelet" Mar 13 01:11:02.509039 master-0 systemd[1]: Started Kubernetes Kubelet. Mar 13 01:11:02.509352 master-0 kubenswrapper[4055]: I0313 01:11:02.509088 4055 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 13 01:11:02.509352 master-0 kubenswrapper[4055]: I0313 01:11:02.509096 4055 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 13 01:11:02.513812 master-0 kubenswrapper[4055]: I0313 01:11:02.513746 4055 server_v1.go:47] "podresources" method="list" useActivePods=true Mar 13 01:11:02.514518 master-0 kubenswrapper[4055]: I0313 01:11:02.514456 4055 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 13 01:11:02.514738 master-0 kubenswrapper[4055]: I0313 01:11:02.514683 4055 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 13 01:11:02.517961 master-0 kubenswrapper[4055]: I0313 01:11:02.517903 4055 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 13 01:11:02.518037 master-0 kubenswrapper[4055]: I0313 01:11:02.517970 4055 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 13 01:11:02.518194 master-0 kubenswrapper[4055]: E0313 01:11:02.518136 4055 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 01:11:02.518275 master-0 kubenswrapper[4055]: I0313 01:11:02.518242 4055 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 13 01:11:02.518334 master-0 kubenswrapper[4055]: 
I0313 01:11:02.518313 4055 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 13 01:11:02.518526 master-0 kubenswrapper[4055]: I0313 01:11:02.518474 4055 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Mar 13 01:11:02.521240 master-0 kubenswrapper[4055]: I0313 01:11:02.519869 4055 reconstruct.go:97] "Volume reconstruction finished" Mar 13 01:11:02.521240 master-0 kubenswrapper[4055]: I0313 01:11:02.519907 4055 reconciler.go:26] "Reconciler: start to sync state" Mar 13 01:11:02.521240 master-0 kubenswrapper[4055]: E0313 01:11:02.518537 4055 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189c4166ac1d8706 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:02.507218694 +0000 UTC m=+0.670277772,LastTimestamp:2026-03-13 01:11:02.507218694 +0000 UTC m=+0.670277772,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:02.521913 master-0 kubenswrapper[4055]: W0313 01:11:02.521662 4055 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 13 01:11:02.521913 master-0 kubenswrapper[4055]: E0313 01:11:02.521829 4055 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 13 01:11:02.522101 master-0 kubenswrapper[4055]: E0313 01:11:02.521998 4055 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms" Mar 13 01:11:02.523613 master-0 kubenswrapper[4055]: I0313 01:11:02.523297 4055 server.go:449] "Adding debug handlers to kubelet server" Mar 13 01:11:02.523960 master-0 kubenswrapper[4055]: I0313 01:11:02.523896 4055 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 13 01:11:02.523960 master-0 kubenswrapper[4055]: I0313 01:11:02.523940 4055 factory.go:55] Registering systemd factory Mar 13 01:11:02.523960 master-0 kubenswrapper[4055]: I0313 01:11:02.523956 4055 factory.go:221] Registration of the systemd container factory successfully Mar 13 01:11:02.529671 master-0 kubenswrapper[4055]: I0313 01:11:02.529603 4055 factory.go:153] Registering CRI-O factory Mar 13 01:11:02.529788 master-0 kubenswrapper[4055]: I0313 01:11:02.529672 4055 factory.go:221] Registration of the crio container factory successfully Mar 13 01:11:02.529788 master-0 kubenswrapper[4055]: I0313 01:11:02.529765 4055 factory.go:103] Registering Raw factory Mar 13 01:11:02.529894 master-0 kubenswrapper[4055]: I0313 01:11:02.529794 4055 manager.go:1196] Started watching for new ooms in manager Mar 13 01:11:02.531433 master-0 kubenswrapper[4055]: I0313 01:11:02.531397 4055 manager.go:319] Starting recovery of all containers Mar 13 01:11:02.535787 master-0 kubenswrapper[4055]: E0313 
01:11:02.535734 4055 kubelet.go:1495] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Mar 13 01:11:02.558306 master-0 kubenswrapper[4055]: I0313 01:11:02.558227 4055 manager.go:324] Recovery completed Mar 13 01:11:02.576984 master-0 kubenswrapper[4055]: I0313 01:11:02.576945 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 01:11:02.580106 master-0 kubenswrapper[4055]: I0313 01:11:02.580039 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 01:11:02.580106 master-0 kubenswrapper[4055]: I0313 01:11:02.580110 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 01:11:02.580272 master-0 kubenswrapper[4055]: I0313 01:11:02.580129 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 01:11:02.581502 master-0 kubenswrapper[4055]: I0313 01:11:02.581465 4055 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 13 01:11:02.581611 master-0 kubenswrapper[4055]: I0313 01:11:02.581506 4055 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 13 01:11:02.581611 master-0 kubenswrapper[4055]: I0313 01:11:02.581550 4055 state_mem.go:36] "Initialized new in-memory state store" Mar 13 01:11:02.586693 master-0 kubenswrapper[4055]: I0313 01:11:02.586629 4055 policy_none.go:49] "None policy: Start" Mar 13 01:11:02.587579 master-0 kubenswrapper[4055]: I0313 01:11:02.587549 4055 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 13 01:11:02.587851 master-0 kubenswrapper[4055]: I0313 01:11:02.587823 4055 state_mem.go:35] "Initializing new in-memory state store" Mar 13 01:11:02.618926 master-0 kubenswrapper[4055]: E0313 01:11:02.618853 4055 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"master-0\" not found" Mar 13 01:11:02.678788 master-0 kubenswrapper[4055]: I0313 01:11:02.664410 4055 manager.go:334] "Starting Device Plugin manager" Mar 13 01:11:02.678788 master-0 kubenswrapper[4055]: I0313 01:11:02.664560 4055 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 13 01:11:02.678788 master-0 kubenswrapper[4055]: I0313 01:11:02.664582 4055 server.go:79] "Starting device plugin registration server" Mar 13 01:11:02.678788 master-0 kubenswrapper[4055]: I0313 01:11:02.664996 4055 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 13 01:11:02.678788 master-0 kubenswrapper[4055]: I0313 01:11:02.665020 4055 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 13 01:11:02.678788 master-0 kubenswrapper[4055]: I0313 01:11:02.666473 4055 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 13 01:11:02.678788 master-0 kubenswrapper[4055]: I0313 01:11:02.666726 4055 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 13 01:11:02.678788 master-0 kubenswrapper[4055]: I0313 01:11:02.666782 4055 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 13 01:11:02.678788 master-0 kubenswrapper[4055]: E0313 01:11:02.667288 4055 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 13 01:11:02.678788 master-0 kubenswrapper[4055]: I0313 01:11:02.674628 4055 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 13 01:11:02.678788 master-0 kubenswrapper[4055]: I0313 01:11:02.678128 4055 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6"
Mar 13 01:11:02.678788 master-0 kubenswrapper[4055]: I0313 01:11:02.678201 4055 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 13 01:11:02.678788 master-0 kubenswrapper[4055]: I0313 01:11:02.678229 4055 kubelet.go:2335] "Starting kubelet main sync loop"
Mar 13 01:11:02.678788 master-0 kubenswrapper[4055]: E0313 01:11:02.678293 4055 kubelet.go:2359] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Mar 13 01:11:02.680090 master-0 kubenswrapper[4055]: W0313 01:11:02.679889 4055 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 13 01:11:02.680090 master-0 kubenswrapper[4055]: E0313 01:11:02.679962 4055 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 13 01:11:02.724318 master-0 kubenswrapper[4055]: E0313 01:11:02.723933 4055 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms"
Mar 13 01:11:02.766080 master-0 kubenswrapper[4055]: I0313 01:11:02.766013 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 01:11:02.767686 master-0 kubenswrapper[4055]: I0313 01:11:02.767626 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 01:11:02.767686 master-0 kubenswrapper[4055]: I0313 01:11:02.767682 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 01:11:02.767889 master-0 kubenswrapper[4055]: I0313 01:11:02.767695 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 01:11:02.767889 master-0 kubenswrapper[4055]: I0313 01:11:02.767727 4055 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 13 01:11:02.768666 master-0 kubenswrapper[4055]: E0313 01:11:02.768569 4055 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 13 01:11:02.778704 master-0 kubenswrapper[4055]: I0313 01:11:02.778625 4055 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0-master-0","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","kube-system/bootstrap-kube-controller-manager-master-0","kube-system/bootstrap-kube-scheduler-master-0"]
Mar 13 01:11:02.778823 master-0 kubenswrapper[4055]: I0313 01:11:02.778716 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 01:11:02.779798 master-0 kubenswrapper[4055]: I0313 01:11:02.779756 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 01:11:02.779857 master-0 kubenswrapper[4055]: I0313 01:11:02.779811 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 01:11:02.779857 master-0 kubenswrapper[4055]: I0313 01:11:02.779829 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 01:11:02.780051 master-0 kubenswrapper[4055]: I0313 01:11:02.780018 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 01:11:02.781015 master-0 kubenswrapper[4055]: I0313 01:11:02.780985 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 13 01:11:02.781071 master-0 kubenswrapper[4055]: I0313 01:11:02.781058 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 01:11:02.781232 master-0 kubenswrapper[4055]: I0313 01:11:02.781190 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 01:11:02.781275 master-0 kubenswrapper[4055]: I0313 01:11:02.781237 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 01:11:02.781275 master-0 kubenswrapper[4055]: I0313 01:11:02.781264 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 01:11:02.781517 master-0 kubenswrapper[4055]: I0313 01:11:02.781485 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 01:11:02.781702 master-0 kubenswrapper[4055]: I0313 01:11:02.781686 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0"
Mar 13 01:11:02.781754 master-0 kubenswrapper[4055]: I0313 01:11:02.781726 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 01:11:02.782137 master-0 kubenswrapper[4055]: I0313 01:11:02.782083 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 01:11:02.782202 master-0 kubenswrapper[4055]: I0313 01:11:02.782161 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 01:11:02.782202 master-0 kubenswrapper[4055]: I0313 01:11:02.782179 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 01:11:02.782755 master-0 kubenswrapper[4055]: I0313 01:11:02.782711 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 01:11:02.782755 master-0 kubenswrapper[4055]: I0313 01:11:02.782755 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 01:11:02.782882 master-0 kubenswrapper[4055]: I0313 01:11:02.782771 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 01:11:02.782882 master-0 kubenswrapper[4055]: I0313 01:11:02.782846 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 01:11:02.782882 master-0 kubenswrapper[4055]: I0313 01:11:02.782882 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 01:11:02.782983 master-0 kubenswrapper[4055]: I0313 01:11:02.782888 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 01:11:02.782983 master-0 kubenswrapper[4055]: I0313 01:11:02.782898 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 01:11:02.783127 master-0 kubenswrapper[4055]: I0313 01:11:02.783097 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 01:11:02.783178 master-0 kubenswrapper[4055]: I0313 01:11:02.783138 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 01:11:02.783808 master-0 kubenswrapper[4055]: I0313 01:11:02.783771 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 01:11:02.783808 master-0 kubenswrapper[4055]: I0313 01:11:02.783785 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 01:11:02.783808 master-0 kubenswrapper[4055]: I0313 01:11:02.783800 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 01:11:02.783984 master-0 kubenswrapper[4055]: I0313 01:11:02.783820 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 01:11:02.783984 master-0 kubenswrapper[4055]: I0313 01:11:02.783801 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 01:11:02.783984 master-0 kubenswrapper[4055]: I0313 01:11:02.783901 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 01:11:02.784147 master-0 kubenswrapper[4055]: I0313 01:11:02.784120 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 01:11:02.784330 master-0 kubenswrapper[4055]: I0313 01:11:02.784294 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 01:11:02.784403 master-0 kubenswrapper[4055]: I0313 01:11:02.784334 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 01:11:02.785092 master-0 kubenswrapper[4055]: I0313 01:11:02.785053 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 01:11:02.785092 master-0 kubenswrapper[4055]: I0313 01:11:02.785081 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 01:11:02.785235 master-0 kubenswrapper[4055]: I0313 01:11:02.785096 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 01:11:02.785235 master-0 kubenswrapper[4055]: I0313 01:11:02.785161 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 01:11:02.785235 master-0 kubenswrapper[4055]: I0313 01:11:02.785178 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 01:11:02.785235 master-0 kubenswrapper[4055]: I0313 01:11:02.785191 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 01:11:02.785440 master-0 kubenswrapper[4055]: I0313 01:11:02.785327 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 13 01:11:02.785440 master-0 kubenswrapper[4055]: I0313 01:11:02.785354 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 01:11:02.786114 master-0 kubenswrapper[4055]: I0313 01:11:02.786077 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 01:11:02.786114 master-0 kubenswrapper[4055]: I0313 01:11:02.786109 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 01:11:02.786248 master-0 kubenswrapper[4055]: I0313 01:11:02.786123 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 01:11:02.822184 master-0 kubenswrapper[4055]: I0313 01:11:02.822153 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 01:11:02.822292 master-0 kubenswrapper[4055]: I0313 01:11:02.822202 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 01:11:02.822292 master-0 kubenswrapper[4055]: I0313 01:11:02.822237 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 13 01:11:02.822292 master-0 kubenswrapper[4055]: I0313 01:11:02.822274 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 13 01:11:02.822552 master-0 kubenswrapper[4055]: I0313 01:11:02.822305 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 01:11:02.822552 master-0 kubenswrapper[4055]: I0313 01:11:02.822338 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 01:11:02.822552 master-0 kubenswrapper[4055]: I0313 01:11:02.822370 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 01:11:02.822552 master-0 kubenswrapper[4055]: I0313 01:11:02.822400 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 01:11:02.822552 master-0 kubenswrapper[4055]: I0313 01:11:02.822456 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 13 01:11:02.822924 master-0 kubenswrapper[4055]: I0313 01:11:02.822719 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 01:11:02.822924 master-0 kubenswrapper[4055]: I0313 01:11:02.822830 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 01:11:02.823059 master-0 kubenswrapper[4055]: I0313 01:11:02.822898 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 01:11:02.823059 master-0 kubenswrapper[4055]: I0313 01:11:02.823035 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 01:11:02.823182 master-0 kubenswrapper[4055]: I0313 01:11:02.823107 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 13 01:11:02.823182 master-0 kubenswrapper[4055]: I0313 01:11:02.823139 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 13 01:11:02.823182 master-0 kubenswrapper[4055]: I0313 01:11:02.823170 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 01:11:02.823604 master-0 kubenswrapper[4055]: I0313 01:11:02.823545 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 13 01:11:02.924790 master-0 kubenswrapper[4055]: I0313 01:11:02.924683 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 01:11:02.924790 master-0 kubenswrapper[4055]: I0313 01:11:02.924767 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 01:11:02.925162 master-0 kubenswrapper[4055]: I0313 01:11:02.924816 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 13 01:11:02.925162 master-0 kubenswrapper[4055]: I0313 01:11:02.924890 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 13 01:11:02.925162 master-0 kubenswrapper[4055]: I0313 01:11:02.924914 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 01:11:02.925162 master-0 kubenswrapper[4055]: I0313 01:11:02.925014 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 13 01:11:02.925162 master-0 kubenswrapper[4055]: I0313 01:11:02.925089 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 01:11:02.925162 master-0 kubenswrapper[4055]: I0313 01:11:02.925133 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 01:11:02.925162 master-0 kubenswrapper[4055]: I0313 01:11:02.925168 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 01:11:02.925862 master-0 kubenswrapper[4055]: I0313 01:11:02.925216 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 01:11:02.925862 master-0 kubenswrapper[4055]: I0313 01:11:02.925262 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 01:11:02.925862 master-0 kubenswrapper[4055]: I0313 01:11:02.925331 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 13 01:11:02.925862 master-0 kubenswrapper[4055]: I0313 01:11:02.925385 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 01:11:02.925862 master-0 kubenswrapper[4055]: I0313 01:11:02.925438 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 13 01:11:02.925862 master-0 kubenswrapper[4055]: I0313 01:11:02.925480 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 13 01:11:02.925862 master-0 kubenswrapper[4055]: I0313 01:11:02.925507 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 01:11:02.925862 master-0 kubenswrapper[4055]: I0313 01:11:02.925524 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 13 01:11:02.925862 master-0 kubenswrapper[4055]: I0313 01:11:02.925547 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 13 01:11:02.925862 master-0 kubenswrapper[4055]: I0313 01:11:02.925580 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 13 01:11:02.925862 master-0 kubenswrapper[4055]: I0313 01:11:02.925606 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 01:11:02.925862 master-0 kubenswrapper[4055]: I0313 01:11:02.925604 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 13 01:11:02.925862 master-0 kubenswrapper[4055]: I0313 01:11:02.925680 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 01:11:02.925862 master-0 kubenswrapper[4055]: I0313 01:11:02.925744 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 01:11:02.926788 master-0 kubenswrapper[4055]: I0313 01:11:02.925865 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 01:11:02.926788 master-0 kubenswrapper[4055]: I0313 01:11:02.925876 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 01:11:02.926788 master-0 kubenswrapper[4055]: I0313 01:11:02.925915 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 01:11:02.926788 master-0 kubenswrapper[4055]: I0313 01:11:02.925953 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 01:11:02.926788 master-0 kubenswrapper[4055]: I0313 01:11:02.925983 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 01:11:02.926788 master-0 kubenswrapper[4055]: I0313 01:11:02.926001 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 01:11:02.926788 master-0 kubenswrapper[4055]: I0313 01:11:02.926061 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 01:11:02.926788 master-0 kubenswrapper[4055]: I0313 01:11:02.926106 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 01:11:02.926788 master-0 kubenswrapper[4055]: I0313 01:11:02.926137 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 13 01:11:02.926788 master-0 kubenswrapper[4055]: I0313 01:11:02.926180 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 13 01:11:02.926788 master-0 kubenswrapper[4055]: I0313 01:11:02.926179 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 01:11:02.969135 master-0 kubenswrapper[4055]: I0313 01:11:02.969056 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 01:11:02.970453 master-0 kubenswrapper[4055]: I0313 01:11:02.970399 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 01:11:02.970551 master-0 kubenswrapper[4055]: I0313 01:11:02.970457 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 01:11:02.970551 master-0 kubenswrapper[4055]: I0313 01:11:02.970479 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 01:11:02.970551 master-0 kubenswrapper[4055]: I0313 01:11:02.970543 4055 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 13 01:11:02.971710 master-0 kubenswrapper[4055]: E0313 01:11:02.971615 4055 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 13 01:11:03.115037 master-0 kubenswrapper[4055]: I0313 01:11:03.114884 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 13 01:11:03.125540 master-0 kubenswrapper[4055]: E0313 01:11:03.125470 4055 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms"
Mar 13 01:11:03.144998 master-0 kubenswrapper[4055]: I0313 01:11:03.144922 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0"
Mar 13 01:11:03.160011 master-0 kubenswrapper[4055]: I0313 01:11:03.159931 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 01:11:03.187367 master-0 kubenswrapper[4055]: I0313 01:11:03.187293 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 01:11:03.193290 master-0 kubenswrapper[4055]: I0313 01:11:03.193218 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 13 01:11:03.372364 master-0 kubenswrapper[4055]: I0313 01:11:03.372203 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 01:11:03.373808 master-0 kubenswrapper[4055]: I0313 01:11:03.373746 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 01:11:03.373938 master-0 kubenswrapper[4055]: I0313 01:11:03.373818 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 01:11:03.373938 master-0 kubenswrapper[4055]: I0313 01:11:03.373842 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 01:11:03.373938 master-0 kubenswrapper[4055]: I0313 01:11:03.373918 4055 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 13 01:11:03.375290 master-0 kubenswrapper[4055]: E0313 01:11:03.375233 4055 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 13 01:11:03.516080 master-0 kubenswrapper[4055]: I0313 01:11:03.515966 4055 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 13 01:11:03.744402 master-0 kubenswrapper[4055]: W0313 01:11:03.744071 4055 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod354f29997baa583b6238f7de9108ee10.slice/crio-5ea6cbe1e49665391bd7e8ee21e44a11b44e5195668056f83fc89f2c9fdb94e3 WatchSource:0}: Error finding container 5ea6cbe1e49665391bd7e8ee21e44a11b44e5195668056f83fc89f2c9fdb94e3: Status 404 returned error can't find the container with id 5ea6cbe1e49665391bd7e8ee21e44a11b44e5195668056f83fc89f2c9fdb94e3
Mar 13 01:11:03.755876 master-0 kubenswrapper[4055]: I0313 01:11:03.755826 4055 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 13 01:11:03.777890 master-0 kubenswrapper[4055]: W0313 01:11:03.777813 4055 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1a56802af72ce1aac6b5077f1695ac0.slice/crio-331ce433a4bf4a47427394a7b370083a4bac521b9ca9a23033c6c4de736ca40b WatchSource:0}: Error finding container 331ce433a4bf4a47427394a7b370083a4bac521b9ca9a23033c6c4de736ca40b: Status 404 returned error can't find the container with id 331ce433a4bf4a47427394a7b370083a4bac521b9ca9a23033c6c4de736ca40b
Mar 13 01:11:03.790900 master-0 kubenswrapper[4055]: W0313 01:11:03.790839 4055 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f77c8e18b751d90bc0dfe2d4e304050.slice/crio-cfc87a91992c83de0be19f6bf8022aaa5b7c2b27eb42c5753a91971d10b3622e WatchSource:0}: Error finding container cfc87a91992c83de0be19f6bf8022aaa5b7c2b27eb42c5753a91971d10b3622e: Status 404 returned error can't find the container with id cfc87a91992c83de0be19f6bf8022aaa5b7c2b27eb42c5753a91971d10b3622e
Mar 13 01:11:03.828881 master-0 kubenswrapper[4055]: W0313 01:11:03.828783 4055 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 13 01:11:03.829047 master-0 kubenswrapper[4055]: E0313 01:11:03.828895 4055 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 13 01:11:03.834204 master-0 kubenswrapper[4055]: W0313 01:11:03.834140 4055 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9add8df47182fc2eaf8cd78016ebe72.slice/crio-9984727ecfd751906a7163096c1bf7f2f5c369a40b443c586dd90122b84df23e WatchSource:0}: Error finding container 9984727ecfd751906a7163096c1bf7f2f5c369a40b443c586dd90122b84df23e: Status 404 returned error can't find the container with id 9984727ecfd751906a7163096c1bf7f2f5c369a40b443c586dd90122b84df23e
Mar 13 01:11:03.858590 master-0 kubenswrapper[4055]: W0313 01:11:03.858524 4055 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf78c05e1499b533b83f091333d61f045.slice/crio-89451c54a480759544b3c9bb8a4009d254601dcef79b111432b9e309f8ee32df WatchSource:0}: Error finding container 89451c54a480759544b3c9bb8a4009d254601dcef79b111432b9e309f8ee32df: Status 404 returned error can't find the container with id 89451c54a480759544b3c9bb8a4009d254601dcef79b111432b9e309f8ee32df
Mar 13 01:11:03.927126 master-0 kubenswrapper[4055]: E0313 01:11:03.927023 4055 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp
192.168.32.10:6443: connect: connection refused" interval="1.6s" Mar 13 01:11:03.981922 master-0 kubenswrapper[4055]: W0313 01:11:03.981732 4055 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 13 01:11:03.981922 master-0 kubenswrapper[4055]: E0313 01:11:03.981861 4055 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 13 01:11:04.015021 master-0 kubenswrapper[4055]: W0313 01:11:04.014890 4055 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 13 01:11:04.015113 master-0 kubenswrapper[4055]: E0313 01:11:04.015017 4055 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 13 01:11:04.044071 master-0 kubenswrapper[4055]: W0313 01:11:04.043950 4055 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 13 01:11:04.044071 master-0 
kubenswrapper[4055]: E0313 01:11:04.044058 4055 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 13 01:11:04.176165 master-0 kubenswrapper[4055]: I0313 01:11:04.176084 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 01:11:04.177461 master-0 kubenswrapper[4055]: I0313 01:11:04.177395 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 01:11:04.177461 master-0 kubenswrapper[4055]: I0313 01:11:04.177452 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 01:11:04.177461 master-0 kubenswrapper[4055]: I0313 01:11:04.177469 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 01:11:04.177731 master-0 kubenswrapper[4055]: I0313 01:11:04.177525 4055 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 13 01:11:04.178678 master-0 kubenswrapper[4055]: E0313 01:11:04.178558 4055 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 13 01:11:04.435019 master-0 kubenswrapper[4055]: I0313 01:11:04.434976 4055 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 13 01:11:04.436265 master-0 kubenswrapper[4055]: E0313 01:11:04.436235 4055 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: 
cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 13 01:11:04.515773 master-0 kubenswrapper[4055]: I0313 01:11:04.515742 4055 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 13 01:11:04.684266 master-0 kubenswrapper[4055]: I0313 01:11:04.684174 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"89451c54a480759544b3c9bb8a4009d254601dcef79b111432b9e309f8ee32df"} Mar 13 01:11:04.685220 master-0 kubenswrapper[4055]: I0313 01:11:04.685168 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"9984727ecfd751906a7163096c1bf7f2f5c369a40b443c586dd90122b84df23e"} Mar 13 01:11:04.685904 master-0 kubenswrapper[4055]: I0313 01:11:04.685874 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerStarted","Data":"cfc87a91992c83de0be19f6bf8022aaa5b7c2b27eb42c5753a91971d10b3622e"} Mar 13 01:11:04.686790 master-0 kubenswrapper[4055]: I0313 01:11:04.686757 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"331ce433a4bf4a47427394a7b370083a4bac521b9ca9a23033c6c4de736ca40b"} Mar 13 01:11:04.687779 master-0 kubenswrapper[4055]: I0313 01:11:04.687761 4055 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"5ea6cbe1e49665391bd7e8ee21e44a11b44e5195668056f83fc89f2c9fdb94e3"} Mar 13 01:11:05.516878 master-0 kubenswrapper[4055]: I0313 01:11:05.516774 4055 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 13 01:11:05.528522 master-0 kubenswrapper[4055]: E0313 01:11:05.528425 4055 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s" Mar 13 01:11:05.723170 master-0 kubenswrapper[4055]: W0313 01:11:05.723126 4055 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 13 01:11:05.723170 master-0 kubenswrapper[4055]: E0313 01:11:05.723174 4055 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 13 01:11:05.779767 master-0 kubenswrapper[4055]: I0313 01:11:05.779681 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 01:11:05.780691 master-0 kubenswrapper[4055]: I0313 01:11:05.780664 4055 kubelet_node_status.go:724] "Recording event message for 
node" node="master-0" event="NodeHasSufficientMemory" Mar 13 01:11:05.780739 master-0 kubenswrapper[4055]: I0313 01:11:05.780707 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 01:11:05.780739 master-0 kubenswrapper[4055]: I0313 01:11:05.780719 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 01:11:05.780795 master-0 kubenswrapper[4055]: I0313 01:11:05.780760 4055 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 13 01:11:05.781655 master-0 kubenswrapper[4055]: E0313 01:11:05.781595 4055 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 13 01:11:06.184047 master-0 kubenswrapper[4055]: W0313 01:11:06.183993 4055 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 13 01:11:06.184203 master-0 kubenswrapper[4055]: E0313 01:11:06.184087 4055 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 13 01:11:06.475743 master-0 kubenswrapper[4055]: W0313 01:11:06.475701 4055 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 13 
01:11:06.475815 master-0 kubenswrapper[4055]: E0313 01:11:06.475750 4055 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 13 01:11:06.493884 master-0 kubenswrapper[4055]: E0313 01:11:06.493776 4055 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189c4166ac1d8706 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:02.507218694 +0000 UTC m=+0.670277772,LastTimestamp:2026-03-13 01:11:02.507218694 +0000 UTC m=+0.670277772,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:06.516110 master-0 kubenswrapper[4055]: I0313 01:11:06.516070 4055 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 13 01:11:06.693961 master-0 kubenswrapper[4055]: I0313 01:11:06.693915 4055 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="ef509e1db76d33faa725783e475ec527d0db77ed62b28bc3717960d043c585e4" exitCode=0 Mar 13 01:11:06.694409 master-0 kubenswrapper[4055]: I0313 
01:11:06.693998 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"ef509e1db76d33faa725783e475ec527d0db77ed62b28bc3717960d043c585e4"} Mar 13 01:11:06.695829 master-0 kubenswrapper[4055]: I0313 01:11:06.695792 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"a9c29d8936371344a1ccbc38b9ff3375977741408746ec28a0f2c19b38ce561a"} Mar 13 01:11:07.026722 master-0 kubenswrapper[4055]: W0313 01:11:07.026563 4055 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 13 01:11:07.026722 master-0 kubenswrapper[4055]: E0313 01:11:07.026667 4055 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 13 01:11:07.515656 master-0 kubenswrapper[4055]: I0313 01:11:07.515582 4055 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 13 01:11:07.700967 master-0 kubenswrapper[4055]: I0313 01:11:07.700887 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" 
event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"68874048256ef5174e01b3364fffcb240e7ca45d30a09a97ed7d81f508369059"} Mar 13 01:11:07.700967 master-0 kubenswrapper[4055]: I0313 01:11:07.700963 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 01:11:07.701728 master-0 kubenswrapper[4055]: I0313 01:11:07.700933 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 01:11:07.702250 master-0 kubenswrapper[4055]: I0313 01:11:07.702223 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 01:11:07.702344 master-0 kubenswrapper[4055]: I0313 01:11:07.702262 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 01:11:07.702344 master-0 kubenswrapper[4055]: I0313 01:11:07.702274 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 01:11:07.702344 master-0 kubenswrapper[4055]: I0313 01:11:07.702288 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 01:11:07.702344 master-0 kubenswrapper[4055]: I0313 01:11:07.702320 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 01:11:07.702344 master-0 kubenswrapper[4055]: I0313 01:11:07.702332 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 01:11:08.515819 master-0 kubenswrapper[4055]: I0313 01:11:08.515765 4055 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 13 01:11:08.576581 master-0 
kubenswrapper[4055]: I0313 01:11:08.576467 4055 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 13 01:11:08.578014 master-0 kubenswrapper[4055]: E0313 01:11:08.577959 4055 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 13 01:11:08.704776 master-0 kubenswrapper[4055]: I0313 01:11:08.704733 4055 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/0.log" Mar 13 01:11:08.705301 master-0 kubenswrapper[4055]: I0313 01:11:08.705249 4055 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="e4728382c8a93cf22790c5d3518c37f9a3ebf03203c4caa3c739791f4f44fbea" exitCode=1 Mar 13 01:11:08.705422 master-0 kubenswrapper[4055]: I0313 01:11:08.705395 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 01:11:08.705706 master-0 kubenswrapper[4055]: I0313 01:11:08.705377 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"e4728382c8a93cf22790c5d3518c37f9a3ebf03203c4caa3c739791f4f44fbea"} Mar 13 01:11:08.705951 master-0 kubenswrapper[4055]: I0313 01:11:08.705921 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 01:11:08.706435 master-0 kubenswrapper[4055]: I0313 01:11:08.706390 4055 kubelet_node_status.go:724] "Recording event message for node" 
node="master-0" event="NodeHasSufficientMemory" Mar 13 01:11:08.706475 master-0 kubenswrapper[4055]: I0313 01:11:08.706455 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 01:11:08.706505 master-0 kubenswrapper[4055]: I0313 01:11:08.706479 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 01:11:08.706692 master-0 kubenswrapper[4055]: I0313 01:11:08.706656 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 01:11:08.706736 master-0 kubenswrapper[4055]: I0313 01:11:08.706699 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 01:11:08.706736 master-0 kubenswrapper[4055]: I0313 01:11:08.706710 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 01:11:08.707094 master-0 kubenswrapper[4055]: I0313 01:11:08.707065 4055 scope.go:117] "RemoveContainer" containerID="e4728382c8a93cf22790c5d3518c37f9a3ebf03203c4caa3c739791f4f44fbea" Mar 13 01:11:08.730091 master-0 kubenswrapper[4055]: E0313 01:11:08.730011 4055 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s" Mar 13 01:11:08.982237 master-0 kubenswrapper[4055]: I0313 01:11:08.982149 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 01:11:08.983695 master-0 kubenswrapper[4055]: I0313 01:11:08.983617 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 01:11:08.983802 master-0 kubenswrapper[4055]: I0313 01:11:08.983709 4055 
kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 01:11:08.983802 master-0 kubenswrapper[4055]: I0313 01:11:08.983730 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 01:11:08.983940 master-0 kubenswrapper[4055]: I0313 01:11:08.983883 4055 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 13 01:11:08.985234 master-0 kubenswrapper[4055]: E0313 01:11:08.985165 4055 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 13 01:11:09.518048 master-0 kubenswrapper[4055]: I0313 01:11:09.517906 4055 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 13 01:11:10.046141 master-0 kubenswrapper[4055]: W0313 01:11:10.046044 4055 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 13 01:11:10.046592 master-0 kubenswrapper[4055]: E0313 01:11:10.046171 4055 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 13 01:11:10.516199 master-0 kubenswrapper[4055]: I0313 01:11:10.516103 4055 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 13 01:11:11.033772 master-0 kubenswrapper[4055]: W0313 01:11:11.033659 4055 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 13 01:11:11.033952 master-0 kubenswrapper[4055]: E0313 01:11:11.033786 4055 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 13 01:11:11.516488 master-0 kubenswrapper[4055]: I0313 01:11:11.516109 4055 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 13 01:11:11.712774 master-0 kubenswrapper[4055]: I0313 01:11:11.712661 4055 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/1.log" Mar 13 01:11:11.713254 master-0 kubenswrapper[4055]: I0313 01:11:11.713208 4055 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/0.log" Mar 13 01:11:11.713782 master-0 kubenswrapper[4055]: I0313 01:11:11.713708 4055 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" 
containerID="7269f2b718cb983fc5bf661b55ffd012dccc71af3ba230b7c7a56b051c9d7328" exitCode=1 Mar 13 01:11:11.713782 master-0 kubenswrapper[4055]: I0313 01:11:11.713739 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"7269f2b718cb983fc5bf661b55ffd012dccc71af3ba230b7c7a56b051c9d7328"} Mar 13 01:11:11.713987 master-0 kubenswrapper[4055]: I0313 01:11:11.713816 4055 scope.go:117] "RemoveContainer" containerID="e4728382c8a93cf22790c5d3518c37f9a3ebf03203c4caa3c739791f4f44fbea" Mar 13 01:11:11.713987 master-0 kubenswrapper[4055]: I0313 01:11:11.713880 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 01:11:11.714791 master-0 kubenswrapper[4055]: I0313 01:11:11.714728 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 01:11:11.714895 master-0 kubenswrapper[4055]: I0313 01:11:11.714801 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 01:11:11.714895 master-0 kubenswrapper[4055]: I0313 01:11:11.714825 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 01:11:11.715772 master-0 kubenswrapper[4055]: I0313 01:11:11.715578 4055 scope.go:117] "RemoveContainer" containerID="7269f2b718cb983fc5bf661b55ffd012dccc71af3ba230b7c7a56b051c9d7328" Mar 13 01:11:11.715870 master-0 kubenswrapper[4055]: I0313 01:11:11.715817 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"d17962d23b6425584e2488789adc4521ea90d5d79405cef63024d09b7df17fd9"} Mar 13 01:11:11.715870 master-0 kubenswrapper[4055]: E0313 01:11:11.715842 4055 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="e9add8df47182fc2eaf8cd78016ebe72" Mar 13 01:11:11.716041 master-0 kubenswrapper[4055]: I0313 01:11:11.715953 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 01:11:11.717030 master-0 kubenswrapper[4055]: I0313 01:11:11.716967 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 01:11:11.717140 master-0 kubenswrapper[4055]: I0313 01:11:11.717036 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 01:11:11.717140 master-0 kubenswrapper[4055]: I0313 01:11:11.717055 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 01:11:11.948883 master-0 kubenswrapper[4055]: W0313 01:11:11.948748 4055 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 13 01:11:11.949089 master-0 kubenswrapper[4055]: E0313 01:11:11.948866 4055 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 13 01:11:12.133512 master-0 kubenswrapper[4055]: W0313 01:11:12.133361 4055 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 13 01:11:12.133512 master-0 kubenswrapper[4055]: E0313 01:11:12.133502 4055 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 13 01:11:12.515404 master-0 kubenswrapper[4055]: I0313 01:11:12.515332 4055 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 13 01:11:12.667912 master-0 kubenswrapper[4055]: E0313 01:11:12.667850 4055 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 13 01:11:12.719161 master-0 kubenswrapper[4055]: I0313 01:11:12.719112 4055 generic.go:334] "Generic (PLEG): container finished" podID="5f77c8e18b751d90bc0dfe2d4e304050" containerID="7af43aa64c145f54e06604757091c85cae4a917cad1277500eafd26947523186" exitCode=0 Mar 13 01:11:12.719296 master-0 kubenswrapper[4055]: I0313 01:11:12.719164 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerDied","Data":"7af43aa64c145f54e06604757091c85cae4a917cad1277500eafd26947523186"} Mar 13 01:11:12.719296 master-0 kubenswrapper[4055]: I0313 01:11:12.719231 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 13 01:11:12.720448 master-0 kubenswrapper[4055]: I0313 01:11:12.720392 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 01:11:12.720530 master-0 kubenswrapper[4055]: I0313 01:11:12.720461 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 01:11:12.720530 master-0 kubenswrapper[4055]: I0313 01:11:12.720487 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 01:11:12.721820 master-0 kubenswrapper[4055]: I0313 01:11:12.721440 4055 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="3b9e2f7b212305e97c078f4756a44dc73d31954cbfa09820c5689a8f4a927568" exitCode=1 Mar 13 01:11:12.721820 master-0 kubenswrapper[4055]: I0313 01:11:12.721528 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerDied","Data":"3b9e2f7b212305e97c078f4756a44dc73d31954cbfa09820c5689a8f4a927568"} Mar 13 01:11:12.726548 master-0 kubenswrapper[4055]: I0313 01:11:12.726492 4055 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/1.log" Mar 13 01:11:12.727080 master-0 kubenswrapper[4055]: I0313 01:11:12.727044 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 01:11:12.727252 master-0 kubenswrapper[4055]: I0313 01:11:12.727224 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 01:11:12.727989 master-0 kubenswrapper[4055]: I0313 01:11:12.727956 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 
01:11:12.727989 master-0 kubenswrapper[4055]: I0313 01:11:12.727987 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 01:11:12.728093 master-0 kubenswrapper[4055]: I0313 01:11:12.727997 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 01:11:12.728093 master-0 kubenswrapper[4055]: I0313 01:11:12.728002 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 01:11:12.728093 master-0 kubenswrapper[4055]: I0313 01:11:12.728030 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 01:11:12.728093 master-0 kubenswrapper[4055]: I0313 01:11:12.728041 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 01:11:13.732368 master-0 kubenswrapper[4055]: I0313 01:11:13.732315 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerStarted","Data":"0445b88e121211b2f5fee8d8974ce092dbbbfd137fd485f9bdd95d5e9a6bd172"} Mar 13 01:11:14.390177 master-0 kubenswrapper[4055]: I0313 01:11:14.389393 4055 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 01:11:14.520012 master-0 kubenswrapper[4055]: I0313 01:11:14.519948 4055 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 01:11:14.737830 master-0 kubenswrapper[4055]: I0313 01:11:14.737768 
4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"be52d87237e2c88231046564bd2dfcdbd780faa45f3647245e1d0a9837eb7182"} Mar 13 01:11:14.738437 master-0 kubenswrapper[4055]: I0313 01:11:14.737892 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 01:11:14.738573 master-0 kubenswrapper[4055]: I0313 01:11:14.738544 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 01:11:14.738573 master-0 kubenswrapper[4055]: I0313 01:11:14.738570 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 01:11:14.738662 master-0 kubenswrapper[4055]: I0313 01:11:14.738582 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 01:11:14.738873 master-0 kubenswrapper[4055]: I0313 01:11:14.738849 4055 scope.go:117] "RemoveContainer" containerID="3b9e2f7b212305e97c078f4756a44dc73d31954cbfa09820c5689a8f4a927568" Mar 13 01:11:14.822666 master-0 kubenswrapper[4055]: I0313 01:11:14.815826 4055 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:11:15.135722 master-0 kubenswrapper[4055]: E0313 01:11:15.135560 4055 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 13 01:11:15.385378 master-0 kubenswrapper[4055]: I0313 01:11:15.385325 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 01:11:15.390195 master-0 kubenswrapper[4055]: I0313 
01:11:15.390087 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 01:11:15.390195 master-0 kubenswrapper[4055]: I0313 01:11:15.390151 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 01:11:15.390195 master-0 kubenswrapper[4055]: I0313 01:11:15.390171 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 01:11:15.390326 master-0 kubenswrapper[4055]: I0313 01:11:15.390244 4055 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 13 01:11:15.409189 master-0 kubenswrapper[4055]: E0313 01:11:15.409152 4055 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0" Mar 13 01:11:15.521525 master-0 kubenswrapper[4055]: I0313 01:11:15.521456 4055 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 01:11:15.742152 master-0 kubenswrapper[4055]: I0313 01:11:15.742070 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"578253e69f6e09bf433efeaf51fe4123dcb489a1bcfe88e292f1fbad219b25ee"} Mar 13 01:11:15.742720 master-0 kubenswrapper[4055]: I0313 01:11:15.742185 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 01:11:15.743154 master-0 kubenswrapper[4055]: I0313 01:11:15.743106 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 01:11:15.743218 
master-0 kubenswrapper[4055]: I0313 01:11:15.743155 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 01:11:15.743218 master-0 kubenswrapper[4055]: I0313 01:11:15.743173 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 01:11:16.502710 master-0 kubenswrapper[4055]: E0313 01:11:16.502479 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c4166ac1d8706 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:02.507218694 +0000 UTC m=+0.670277772,LastTimestamp:2026-03-13 01:11:02.507218694 +0000 UTC m=+0.670277772,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.510313 master-0 kubenswrapper[4055]: E0313 01:11:16.510068 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c4166b0756587 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:02.580086151 +0000 UTC m=+0.743145229,LastTimestamp:2026-03-13 
01:11:02.580086151 +0000 UTC m=+0.743145229,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.517558 master-0 kubenswrapper[4055]: I0313 01:11:16.517499 4055 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 01:11:16.517999 master-0 kubenswrapper[4055]: E0313 01:11:16.517841 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c4166b075f2ae default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:02.580122286 +0000 UTC m=+0.743181354,LastTimestamp:2026-03-13 01:11:02.580122286 +0000 UTC m=+0.743181354,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.525142 master-0 kubenswrapper[4055]: E0313 01:11:16.524981 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c4166b076351b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node 
master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:02.580139291 +0000 UTC m=+0.743198359,LastTimestamp:2026-03-13 01:11:02.580139291 +0000 UTC m=+0.743198359,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.532396 master-0 kubenswrapper[4055]: E0313 01:11:16.532264 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c4166b5d347a0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:02.67012496 +0000 UTC m=+0.833184028,LastTimestamp:2026-03-13 01:11:02.67012496 +0000 UTC m=+0.833184028,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.539090 master-0 kubenswrapper[4055]: E0313 01:11:16.538900 4055 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c4166b0756587\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c4166b0756587 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: 
NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:02.580086151 +0000 UTC m=+0.743145229,LastTimestamp:2026-03-13 01:11:02.767666704 +0000 UTC m=+0.930725752,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.545967 master-0 kubenswrapper[4055]: E0313 01:11:16.545836 4055 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c4166b075f2ae\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c4166b075f2ae default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:02.580122286 +0000 UTC m=+0.743181354,LastTimestamp:2026-03-13 01:11:02.767690397 +0000 UTC m=+0.930749445,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.552904 master-0 kubenswrapper[4055]: E0313 01:11:16.552777 4055 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c4166b076351b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c4166b076351b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: 
NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:02.580139291 +0000 UTC m=+0.743198359,LastTimestamp:2026-03-13 01:11:02.767701065 +0000 UTC m=+0.930760113,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.559807 master-0 kubenswrapper[4055]: E0313 01:11:16.559666 4055 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c4166b0756587\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c4166b0756587 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:02.580086151 +0000 UTC m=+0.743145229,LastTimestamp:2026-03-13 01:11:02.77979234 +0000 UTC m=+0.942851408,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.566917 master-0 kubenswrapper[4055]: E0313 01:11:16.566777 4055 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c4166b075f2ae\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c4166b075f2ae default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: 
NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:02.580122286 +0000 UTC m=+0.743181354,LastTimestamp:2026-03-13 01:11:02.779823061 +0000 UTC m=+0.942882139,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.573297 master-0 kubenswrapper[4055]: E0313 01:11:16.573094 4055 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c4166b076351b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c4166b076351b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:02.580139291 +0000 UTC m=+0.743198359,LastTimestamp:2026-03-13 01:11:02.779838652 +0000 UTC m=+0.942897731,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.580544 master-0 kubenswrapper[4055]: E0313 01:11:16.580397 4055 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c4166b0756587\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c4166b0756587 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: 
NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:02.580086151 +0000 UTC m=+0.743145229,LastTimestamp:2026-03-13 01:11:02.781227344 +0000 UTC m=+0.944286413,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.587114 master-0 kubenswrapper[4055]: E0313 01:11:16.586978 4055 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c4166b075f2ae\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c4166b075f2ae default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:02.580122286 +0000 UTC m=+0.743181354,LastTimestamp:2026-03-13 01:11:02.781255308 +0000 UTC m=+0.944314376,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.594004 master-0 kubenswrapper[4055]: E0313 01:11:16.593880 4055 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c4166b076351b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c4166b076351b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: 
NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:02.580139291 +0000 UTC m=+0.743198359,LastTimestamp:2026-03-13 01:11:02.781273185 +0000 UTC m=+0.944332264,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.600787 master-0 kubenswrapper[4055]: E0313 01:11:16.600664 4055 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c4166b0756587\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c4166b0756587 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:02.580086151 +0000 UTC m=+0.743145229,LastTimestamp:2026-03-13 01:11:02.782141666 +0000 UTC m=+0.945200734,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.607130 master-0 kubenswrapper[4055]: E0313 01:11:16.607007 4055 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c4166b075f2ae\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c4166b075f2ae default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: 
NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:02.580122286 +0000 UTC m=+0.743181354,LastTimestamp:2026-03-13 01:11:02.782172417 +0000 UTC m=+0.945231485,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.613748 master-0 kubenswrapper[4055]: E0313 01:11:16.613484 4055 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c4166b076351b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c4166b076351b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:02.580139291 +0000 UTC m=+0.743198359,LastTimestamp:2026-03-13 01:11:02.78218883 +0000 UTC m=+0.945247908,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.620547 master-0 kubenswrapper[4055]: E0313 01:11:16.620426 4055 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c4166b0756587\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c4166b0756587 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: 
NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:02.580086151 +0000 UTC m=+0.743145229,LastTimestamp:2026-03-13 01:11:02.782739412 +0000 UTC m=+0.945798490,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.626297 master-0 kubenswrapper[4055]: E0313 01:11:16.626101 4055 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c4166b075f2ae\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c4166b075f2ae default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:02.580122286 +0000 UTC m=+0.743181354,LastTimestamp:2026-03-13 01:11:02.78276507 +0000 UTC m=+0.945824138,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.633443 master-0 kubenswrapper[4055]: E0313 01:11:16.633246 4055 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c4166b076351b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c4166b076351b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: 
NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:02.580139291 +0000 UTC m=+0.743198359,LastTimestamp:2026-03-13 01:11:02.78278052 +0000 UTC m=+0.945839588,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.640251 master-0 kubenswrapper[4055]: E0313 01:11:16.640087 4055 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c4166b0756587\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c4166b0756587 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:02.580086151 +0000 UTC m=+0.743145229,LastTimestamp:2026-03-13 01:11:02.782873445 +0000 UTC m=+0.945932493,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.647078 master-0 kubenswrapper[4055]: E0313 01:11:16.646929 4055 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c4166b075f2ae\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c4166b075f2ae default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: 
NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:02.580122286 +0000 UTC m=+0.743181354,LastTimestamp:2026-03-13 01:11:02.782889648 +0000 UTC m=+0.945948696,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.654381 master-0 kubenswrapper[4055]: E0313 01:11:16.654212 4055 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c4166b076351b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c4166b076351b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:02.580139291 +0000 UTC m=+0.743198359,LastTimestamp:2026-03-13 01:11:02.782905179 +0000 UTC m=+0.945964227,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.661376 master-0 kubenswrapper[4055]: E0313 01:11:16.661208 4055 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c4166b0756587\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c4166b0756587 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: 
NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:02.580086151 +0000 UTC m=+0.743145229,LastTimestamp:2026-03-13 01:11:02.783790113 +0000 UTC m=+0.946849161,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.668278 master-0 kubenswrapper[4055]: E0313 01:11:16.668128 4055 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c4166b0756587\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c4166b0756587 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:02.580086151 +0000 UTC m=+0.743145229,LastTimestamp:2026-03-13 01:11:02.783797041 +0000 UTC m=+0.946856089,Count:9,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.677887 master-0 kubenswrapper[4055]: E0313 01:11:16.677728 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189c4166f6884f08 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulling,Message:Pulling image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:03.755730696 +0000 UTC m=+1.918789774,LastTimestamp:2026-03-13 01:11:03.755730696 +0000 UTC m=+1.918789774,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.685680 master-0 kubenswrapper[4055]: E0313 01:11:16.685461 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189c4166f8213fcd kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:a1a56802af72ce1aac6b5077f1695ac0,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:03.782531021 +0000 UTC m=+1.945590099,LastTimestamp:2026-03-13 01:11:03.782531021 +0000 UTC m=+1.945590099,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.693116 master-0 kubenswrapper[4055]: E0313 01:11:16.693033 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189c4166f8c82544 openshift-kube-apiserver 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:03.79346874 +0000 UTC m=+1.956527818,LastTimestamp:2026-03-13 01:11:03.79346874 +0000 UTC m=+1.956527818,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.699948 master-0 kubenswrapper[4055]: E0313 01:11:16.699754 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c4166fb9550a8 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:03.84046916 +0000 UTC m=+2.003528238,LastTimestamp:2026-03-13 01:11:03.84046916 +0000 UTC m=+2.003528238,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.706535 master-0 kubenswrapper[4055]: 
E0313 01:11:16.706370 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189c4166fcd47d4f kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:03.861386575 +0000 UTC m=+2.024445653,LastTimestamp:2026-03-13 01:11:03.861386575 +0000 UTC m=+2.024445653,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.714096 master-0 kubenswrapper[4055]: E0313 01:11:16.713922 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c41678d2a9597 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\" in 2.442s (2.442s including waiting). 
Image size: 465086330 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:06.282947991 +0000 UTC m=+4.446007059,LastTimestamp:2026-03-13 01:11:06.282947991 +0000 UTC m=+4.446007059,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.720859 master-0 kubenswrapper[4055]: E0313 01:11:16.720736 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189c41678dcf701d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b\" in 2.537s (2.538s including waiting). 
Image size: 529324693 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:06.293751837 +0000 UTC m=+4.456810915,LastTimestamp:2026-03-13 01:11:06.293751837 +0000 UTC m=+4.456810915,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.727533 master-0 kubenswrapper[4055]: E0313 01:11:16.727431 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c416798498430 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:06.469524528 +0000 UTC m=+4.632583586,LastTimestamp:2026-03-13 01:11:06.469524528 +0000 UTC m=+4.632583586,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.734815 master-0 kubenswrapper[4055]: E0313 01:11:16.734601 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189c4167987fdcad openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container: etcdctl,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:06.473086125 +0000 UTC m=+4.636145163,LastTimestamp:2026-03-13 01:11:06.473086125 +0000 UTC m=+4.636145163,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.741961 master-0 kubenswrapper[4055]: E0313 01:11:16.741778 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189c416799379aef openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:06.485127919 +0000 UTC m=+4.648186967,LastTimestamp:2026-03-13 01:11:06.485127919 +0000 UTC m=+4.648186967,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.747247 master-0 kubenswrapper[4055]: I0313 01:11:16.747120 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerStarted","Data":"2d7298385249f13020b6f4aba996517ed9ff1e2913b22038f141e521d9ed6c60"} Mar 13 01:11:16.747247 master-0 kubenswrapper[4055]: I0313 
01:11:16.747191 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 01:11:16.747247 master-0 kubenswrapper[4055]: I0313 01:11:16.747149 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 01:11:16.748386 master-0 kubenswrapper[4055]: I0313 01:11:16.748289 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 01:11:16.748386 master-0 kubenswrapper[4055]: I0313 01:11:16.748328 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 01:11:16.748386 master-0 kubenswrapper[4055]: I0313 01:11:16.748380 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 01:11:16.748832 master-0 kubenswrapper[4055]: I0313 01:11:16.748768 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 01:11:16.748832 master-0 kubenswrapper[4055]: I0313 01:11:16.748825 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 01:11:16.749038 master-0 kubenswrapper[4055]: I0313 01:11:16.748848 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 01:11:16.749855 master-0 kubenswrapper[4055]: E0313 01:11:16.749707 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189c41679990ef64 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:06.490982244 +0000 UTC m=+4.654041282,LastTimestamp:2026-03-13 01:11:06.490982244 +0000 UTC m=+4.654041282,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.756954 master-0 kubenswrapper[4055]: E0313 01:11:16.756726 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189c4167a58563f3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container: etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:06.691552243 +0000 UTC m=+4.854611281,LastTimestamp:2026-03-13 01:11:06.691552243 +0000 UTC m=+4.854611281,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.764303 master-0 kubenswrapper[4055]: E0313 01:11:16.764183 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-master-0-master-0.189c4167a68accc9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:06.708683977 +0000 UTC m=+4.871743015,LastTimestamp:2026-03-13 01:11:06.708683977 +0000 UTC m=+4.871743015,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.771253 master-0 kubenswrapper[4055]: E0313 01:11:16.771086 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c4167bc7f8cb9 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:07.077045433 +0000 UTC m=+5.240104511,LastTimestamp:2026-03-13 01:11:07.077045433 +0000 UTC m=+5.240104511,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.778880 master-0 kubenswrapper[4055]: E0313 01:11:16.778747 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c4167e213bdcc openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:07.707514316 +0000 UTC m=+5.870573354,LastTimestamp:2026-03-13 01:11:07.707514316 +0000 UTC m=+5.870573354,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.785785 master-0 kubenswrapper[4055]: E0313 01:11:16.785559 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c416805c0b3a7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:08.306052007 +0000 UTC m=+6.469111045,LastTimestamp:2026-03-13 01:11:08.306052007 +0000 UTC 
m=+6.469111045,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.792727 master-0 kubenswrapper[4055]: E0313 01:11:16.792561 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c41680e41b0bc openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:08.448723132 +0000 UTC m=+6.611782180,LastTimestamp:2026-03-13 01:11:08.448723132 +0000 UTC m=+6.611782180,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.799873 master-0 kubenswrapper[4055]: E0313 01:11:16.799736 4055 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189c4167e213bdcc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c4167e213bdcc openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:07.707514316 +0000 UTC m=+5.870573354,LastTimestamp:2026-03-13 01:11:08.70926289 +0000 UTC m=+6.872321968,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.806671 master-0 kubenswrapper[4055]: E0313 01:11:16.806511 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189c4168c1e60851 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:a1a56802af72ce1aac6b5077f1695ac0,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\" in 7.68s (7.68s including waiting). 
Image size: 943837171 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:11.462615121 +0000 UTC m=+9.625674169,LastTimestamp:2026-03-13 01:11:11.462615121 +0000 UTC m=+9.625674169,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.814091 master-0 kubenswrapper[4055]: E0313 01:11:16.813960 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189c4168c2ebcefb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\" in 7.686s (7.686s including waiting). 
Image size: 943837171 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:11.479770875 +0000 UTC m=+9.642829933,LastTimestamp:2026-03-13 01:11:11.479770875 +0000 UTC m=+9.642829933,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.821081 master-0 kubenswrapper[4055]: E0313 01:11:16.820949 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189c4168c3bd1005 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\" in 7.632s (7.632s including waiting). 
Image size: 943837171 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:11.493484549 +0000 UTC m=+9.656543597,LastTimestamp:2026-03-13 01:11:11.493484549 +0000 UTC m=+9.656543597,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.827673 master-0 kubenswrapper[4055]: E0313 01:11:16.827520 4055 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189c416805c0b3a7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c416805c0b3a7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:08.306052007 +0000 UTC m=+6.469111045,LastTimestamp:2026-03-13 01:11:11.598396873 +0000 UTC m=+9.761455941,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.834298 master-0 kubenswrapper[4055]: E0313 01:11:16.834172 4055 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189c41680e41b0bc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c41680e41b0bc openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 
UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:08.448723132 +0000 UTC m=+6.611782180,LastTimestamp:2026-03-13 01:11:11.613832812 +0000 UTC m=+9.776891850,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.840877 master-0 kubenswrapper[4055]: E0313 01:11:16.840755 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189c4168ce5d1ebb kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:a1a56802af72ce1aac6b5077f1695ac0,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container: kube-scheduler,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:11.671746235 +0000 UTC m=+9.834805273,LastTimestamp:2026-03-13 01:11:11.671746235 +0000 UTC m=+9.834805273,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.847560 master-0 kubenswrapper[4055]: E0313 01:11:16.847383 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" 
event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189c4168cefb0fa4 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:a1a56802af72ce1aac6b5077f1695ac0,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:11.68209706 +0000 UTC m=+9.845156098,LastTimestamp:2026-03-13 01:11:11.68209706 +0000 UTC m=+9.845156098,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.854423 master-0 kubenswrapper[4055]: E0313 01:11:16.854312 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c4168d0fd6b08 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:11.71580596 +0000 UTC m=+9.878865038,LastTimestamp:2026-03-13 01:11:11.71580596 +0000 UTC m=+9.878865038,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.861396 
master-0 kubenswrapper[4055]: E0313 01:11:16.861265 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189c4168d4ebb3ae kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container: kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:11.781753774 +0000 UTC m=+9.944812812,LastTimestamp:2026-03-13 01:11:11.781753774 +0000 UTC m=+9.944812812,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.868238 master-0 kubenswrapper[4055]: E0313 01:11:16.868029 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189c4168d4f657ac openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:11.782451116 +0000 UTC m=+9.945510154,LastTimestamp:2026-03-13 01:11:11.782451116 +0000 UTC m=+9.945510154,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 
+0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.874868 master-0 kubenswrapper[4055]: E0313 01:11:16.874728 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189c4168d5994bfc kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:11.793130492 +0000 UTC m=+9.956189530,LastTimestamp:2026-03-13 01:11:11.793130492 +0000 UTC m=+9.956189530,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.881189 master-0 kubenswrapper[4055]: E0313 01:11:16.881064 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189c4168d5a49e16 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulling,Message:Pulling image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a324f47cf789c0480fa4bcb0812152abc3cd844318bab193108fe4349eed609\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:11.793872406 +0000 UTC m=+9.956931444,LastTimestamp:2026-03-13 01:11:11.793872406 +0000 UTC m=+9.956931444,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.887667 master-0 kubenswrapper[4055]: E0313 01:11:16.887513 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189c4168d5e05290 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:11.797785232 +0000 UTC m=+9.960844270,LastTimestamp:2026-03-13 01:11:11.797785232 +0000 UTC m=+9.960844270,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.895017 master-0 kubenswrapper[4055]: E0313 01:11:16.894903 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189c41690d458e51 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:12.727166545 +0000 UTC m=+10.890225573,LastTimestamp:2026-03-13 01:11:12.727166545 +0000 UTC m=+10.890225573,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.902758 master-0 kubenswrapper[4055]: E0313 01:11:16.902614 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189c416916b9d40b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container: kube-apiserver,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:12.885781515 +0000 UTC m=+11.048840553,LastTimestamp:2026-03-13 01:11:12.885781515 +0000 UTC m=+11.048840553,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.909821 master-0 kubenswrapper[4055]: E0313 01:11:16.909596 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189c41691788e3b2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:12.899351474 +0000 UTC m=+11.062410532,LastTimestamp:2026-03-13 01:11:12.899351474 +0000 UTC m=+11.062410532,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.916816 master-0 kubenswrapper[4055]: E0313 01:11:16.916623 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189c41691793a34f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:12.900055887 +0000 UTC m=+11.063114925,LastTimestamp:2026-03-13 01:11:12.900055887 +0000 UTC m=+11.063114925,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.923576 master-0 kubenswrapper[4055]: E0313 01:11:16.923411 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189c41694e9eab5f kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a324f47cf789c0480fa4bcb0812152abc3cd844318bab193108fe4349eed609\" in 2.029s (2.029s including waiting). Image size: 505242594 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:13.823525727 +0000 UTC m=+11.986584765,LastTimestamp:2026-03-13 01:11:13.823525727 +0000 UTC m=+11.986584765,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.930564 master-0 kubenswrapper[4055]: E0313 01:11:16.930458 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189c41695b31d720 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container: cluster-policy-controller,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:14.034497312 +0000 UTC m=+12.197556400,LastTimestamp:2026-03-13 01:11:14.034497312 +0000 UTC m=+12.197556400,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.937406 master-0 kubenswrapper[4055]: E0313 01:11:16.937272 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189c41695bf99871 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:14.047588465 +0000 UTC m=+12.210647523,LastTimestamp:2026-03-13 01:11:14.047588465 +0000 UTC m=+12.210647523,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.945189 master-0 kubenswrapper[4055]: E0313 01:11:16.945035 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" 
event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189c41698552a54a kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:14.741290314 +0000 UTC m=+12.904349372,LastTimestamp:2026-03-13 01:11:14.741290314 +0000 UTC m=+12.904349372,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.952250 master-0 kubenswrapper[4055]: E0313 01:11:16.952043 4055 event.go:359] "Server rejected event (will not retry!)" err="events \"bootstrap-kube-controller-manager-master-0.189c4168d4ebb3ae\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189c4168d4ebb3ae kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container: kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:11.781753774 +0000 UTC m=+9.944812812,LastTimestamp:2026-03-13 01:11:14.980157341 +0000 UTC m=+13.143216379,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.959404 master-0 kubenswrapper[4055]: E0313 01:11:16.959274 4055 event.go:359] "Server rejected event (will not retry!)" err="events \"bootstrap-kube-controller-manager-master-0.189c4168d5994bfc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189c4168d5994bfc kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:11.793130492 +0000 UTC m=+9.956189530,LastTimestamp:2026-03-13 01:11:14.99374389 +0000 UTC m=+13.156802928,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.966715 master-0 kubenswrapper[4055]: E0313 01:11:16.966569 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189c4169c3574c03 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Successfully pulled image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\" in 2.881s (2.881s including waiting). Image size: 514980169 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:15.781782531 +0000 UTC m=+13.944841599,LastTimestamp:2026-03-13 01:11:15.781782531 +0000 UTC m=+13.944841599,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.974356 master-0 kubenswrapper[4055]: E0313 01:11:16.974254 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189c4169d134080a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container: kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:16.014352394 +0000 UTC m=+14.177411472,LastTimestamp:2026-03-13 01:11:16.014352394 +0000 UTC m=+14.177411472,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:16.981204 master-0 kubenswrapper[4055]: E0313 01:11:16.981037 4055 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189c4169d1f92c13 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:16.027272211 +0000 UTC m=+14.190331279,LastTimestamp:2026-03-13 01:11:16.027272211 +0000 UTC m=+14.190331279,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:11:17.022236 master-0 kubenswrapper[4055]: I0313 01:11:17.022137 4055 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 01:11:17.035594 master-0 kubenswrapper[4055]: I0313 01:11:17.035490 4055 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 01:11:17.281804 master-0 kubenswrapper[4055]: I0313 01:11:17.281586 4055 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 13 01:11:17.305000 master-0 kubenswrapper[4055]: I0313 01:11:17.304929 4055 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 13 01:11:17.523425 master-0 kubenswrapper[4055]: I0313 01:11:17.523350 4055 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 01:11:17.750210 master-0 kubenswrapper[4055]: I0313 01:11:17.750157 4055 kubelet_node_status.go:401] "Setting node annotation 
to enable volume controller attach/detach" Mar 13 01:11:17.750869 master-0 kubenswrapper[4055]: I0313 01:11:17.750379 4055 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 01:11:17.751320 master-0 kubenswrapper[4055]: I0313 01:11:17.751276 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 01:11:17.751363 master-0 kubenswrapper[4055]: I0313 01:11:17.751339 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 01:11:17.751363 master-0 kubenswrapper[4055]: I0313 01:11:17.751358 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 01:11:17.756451 master-0 kubenswrapper[4055]: I0313 01:11:17.756416 4055 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 01:11:18.522718 master-0 kubenswrapper[4055]: I0313 01:11:18.522627 4055 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 01:11:18.752124 master-0 kubenswrapper[4055]: I0313 01:11:18.752058 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 01:11:18.753662 master-0 kubenswrapper[4055]: I0313 01:11:18.753586 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 01:11:18.753788 master-0 kubenswrapper[4055]: I0313 01:11:18.753669 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 01:11:18.753788 master-0 kubenswrapper[4055]: I0313 01:11:18.753688 4055 
kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 01:11:19.519660 master-0 kubenswrapper[4055]: I0313 01:11:19.519588 4055 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 01:11:19.519840 master-0 kubenswrapper[4055]: W0313 01:11:19.519724 4055 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 13 01:11:19.519840 master-0 kubenswrapper[4055]: E0313 01:11:19.519767 4055 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 13 01:11:19.755429 master-0 kubenswrapper[4055]: I0313 01:11:19.755359 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 01:11:19.756601 master-0 kubenswrapper[4055]: I0313 01:11:19.756531 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 01:11:19.756601 master-0 kubenswrapper[4055]: I0313 01:11:19.756599 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 01:11:19.756875 master-0 kubenswrapper[4055]: I0313 01:11:19.756623 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 01:11:20.443732 master-0 kubenswrapper[4055]: I0313 01:11:20.443559 4055 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:11:20.443923 master-0 kubenswrapper[4055]: I0313 01:11:20.443828 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 01:11:20.445480 master-0 kubenswrapper[4055]: I0313 01:11:20.445434 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 01:11:20.445568 master-0 kubenswrapper[4055]: I0313 01:11:20.445514 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 01:11:20.445568 master-0 kubenswrapper[4055]: I0313 01:11:20.445533 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 01:11:20.509786 master-0 kubenswrapper[4055]: I0313 01:11:20.509737 4055 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:11:20.517298 master-0 kubenswrapper[4055]: I0313 01:11:20.517261 4055 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 01:11:20.643707 master-0 kubenswrapper[4055]: W0313 01:11:20.643620 4055 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 13 01:11:20.643884 master-0 kubenswrapper[4055]: E0313 01:11:20.643718 4055 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User 
\"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 13 01:11:20.758361 master-0 kubenswrapper[4055]: I0313 01:11:20.758232 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 01:11:20.759674 master-0 kubenswrapper[4055]: I0313 01:11:20.759589 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 01:11:20.759788 master-0 kubenswrapper[4055]: I0313 01:11:20.759679 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 01:11:20.759788 master-0 kubenswrapper[4055]: I0313 01:11:20.759697 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 01:11:21.189971 master-0 kubenswrapper[4055]: I0313 01:11:21.189855 4055 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:11:21.198301 master-0 kubenswrapper[4055]: I0313 01:11:21.198231 4055 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:11:21.522076 master-0 kubenswrapper[4055]: I0313 01:11:21.522010 4055 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 01:11:21.738223 master-0 kubenswrapper[4055]: I0313 01:11:21.738148 4055 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:11:21.747734 master-0 kubenswrapper[4055]: I0313 01:11:21.747607 4055 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:11:21.760507 master-0 kubenswrapper[4055]: I0313 01:11:21.760443 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 01:11:21.761500 master-0 kubenswrapper[4055]: I0313 01:11:21.761441 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 01:11:21.761500 master-0 kubenswrapper[4055]: I0313 01:11:21.761498 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 01:11:21.761768 master-0 kubenswrapper[4055]: I0313 01:11:21.761515 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 01:11:22.145559 master-0 kubenswrapper[4055]: E0313 01:11:22.145394 4055 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 13 01:11:22.298482 master-0 kubenswrapper[4055]: I0313 01:11:22.298426 4055 csr.go:261] certificate signing request csr-fnb9j is approved, waiting to be issued Mar 13 01:11:22.409853 master-0 kubenswrapper[4055]: I0313 01:11:22.409713 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 01:11:22.411003 master-0 kubenswrapper[4055]: I0313 01:11:22.410954 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 01:11:22.411003 master-0 kubenswrapper[4055]: I0313 01:11:22.410987 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 01:11:22.411003 master-0 kubenswrapper[4055]: I0313 01:11:22.410998 4055 kubelet_node_status.go:724] "Recording event 
message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 01:11:22.411312 master-0 kubenswrapper[4055]: I0313 01:11:22.411068 4055 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 13 01:11:22.417464 master-0 kubenswrapper[4055]: E0313 01:11:22.417395 4055 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0"
Mar 13 01:11:22.521746 master-0 kubenswrapper[4055]: I0313 01:11:22.521692 4055 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 01:11:22.668130 master-0 kubenswrapper[4055]: E0313 01:11:22.668066 4055 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 13 01:11:22.767591 master-0 kubenswrapper[4055]: I0313 01:11:22.763420 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 01:11:22.768495 master-0 kubenswrapper[4055]: I0313 01:11:22.768352 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 01:11:22.768495 master-0 kubenswrapper[4055]: I0313 01:11:22.768419 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 01:11:22.768495 master-0 kubenswrapper[4055]: I0313 01:11:22.768437 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 01:11:23.328529 master-0 kubenswrapper[4055]: W0313 01:11:23.328450 4055 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "master-0" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Mar 13 01:11:23.328529 master-0 kubenswrapper[4055]: E0313 01:11:23.328520 4055 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"master-0\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 13 01:11:23.522239 master-0 kubenswrapper[4055]: I0313 01:11:23.522143 4055 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 01:11:24.031757 master-0 kubenswrapper[4055]: W0313 01:11:24.031590 4055 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope
Mar 13 01:11:24.031757 master-0 kubenswrapper[4055]: E0313 01:11:24.031739 4055 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 13 01:11:24.522787 master-0 kubenswrapper[4055]: I0313 01:11:24.522700 4055 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 01:11:25.522836 master-0 kubenswrapper[4055]: I0313 01:11:25.522748 4055 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 01:11:26.523101 master-0 kubenswrapper[4055]: I0313 01:11:26.522963 4055 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 01:11:26.679626 master-0 kubenswrapper[4055]: I0313 01:11:26.679517 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 01:11:26.681172 master-0 kubenswrapper[4055]: I0313 01:11:26.681100 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 01:11:26.681320 master-0 kubenswrapper[4055]: I0313 01:11:26.681179 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 01:11:26.681320 master-0 kubenswrapper[4055]: I0313 01:11:26.681197 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 01:11:26.681901 master-0 kubenswrapper[4055]: I0313 01:11:26.681837 4055 scope.go:117] "RemoveContainer" containerID="7269f2b718cb983fc5bf661b55ffd012dccc71af3ba230b7c7a56b051c9d7328"
Mar 13 01:11:26.694357 master-0 kubenswrapper[4055]: E0313 01:11:26.694190 4055 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189c4167e213bdcc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c4167e213bdcc openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:07.707514316 +0000 UTC m=+5.870573354,LastTimestamp:2026-03-13 01:11:26.686498215 +0000 UTC m=+24.849557283,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 13 01:11:27.010077 master-0 kubenswrapper[4055]: E0313 01:11:27.009445 4055 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189c416805c0b3a7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c416805c0b3a7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:08.306052007 +0000 UTC m=+6.469111045,LastTimestamp:2026-03-13 01:11:26.999137095 +0000 UTC m=+25.162196173,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 13 01:11:27.026975 master-0 kubenswrapper[4055]: E0313 01:11:27.026748 4055 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189c41680e41b0bc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c41680e41b0bc openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:08.448723132 +0000 UTC m=+6.611782180,LastTimestamp:2026-03-13 01:11:27.016981204 +0000 UTC m=+25.180040282,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 13 01:11:27.523922 master-0 kubenswrapper[4055]: I0313 01:11:27.523780 4055 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 01:11:27.780448 master-0 kubenswrapper[4055]: I0313 01:11:27.780209 4055 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/2.log"
Mar 13 01:11:27.781180 master-0 kubenswrapper[4055]: I0313 01:11:27.781121 4055 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/1.log"
Mar 13 01:11:27.781857 master-0 kubenswrapper[4055]: I0313 01:11:27.781789 4055 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="66d7d7db2b542e86a40857789246d4abcbc4fca4d74d92d6366f9e8e1aa401b5" exitCode=1
Mar 13 01:11:27.781965 master-0 kubenswrapper[4055]: I0313 01:11:27.781857 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"66d7d7db2b542e86a40857789246d4abcbc4fca4d74d92d6366f9e8e1aa401b5"}
Mar 13 01:11:27.781965 master-0 kubenswrapper[4055]: I0313 01:11:27.781939 4055 scope.go:117] "RemoveContainer" containerID="7269f2b718cb983fc5bf661b55ffd012dccc71af3ba230b7c7a56b051c9d7328"
Mar 13 01:11:27.782110 master-0 kubenswrapper[4055]: I0313 01:11:27.782044 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 01:11:27.783070 master-0 kubenswrapper[4055]: I0313 01:11:27.783028 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 01:11:27.783070 master-0 kubenswrapper[4055]: I0313 01:11:27.783065 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 01:11:27.783239 master-0 kubenswrapper[4055]: I0313 01:11:27.783080 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 01:11:27.783523 master-0 kubenswrapper[4055]: I0313 01:11:27.783483 4055 scope.go:117] "RemoveContainer" containerID="66d7d7db2b542e86a40857789246d4abcbc4fca4d74d92d6366f9e8e1aa401b5"
Mar 13 01:11:27.783723 master-0 kubenswrapper[4055]: E0313 01:11:27.783696 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="e9add8df47182fc2eaf8cd78016ebe72"
Mar 13 01:11:27.791247 master-0 kubenswrapper[4055]: E0313 01:11:27.791103 4055 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189c4168d0fd6b08\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c4168d0fd6b08 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:11:11.71580596 +0000 UTC m=+9.878865038,LastTimestamp:2026-03-13 01:11:27.783665153 +0000 UTC m=+25.946724181,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 13 01:11:28.521627 master-0 kubenswrapper[4055]: I0313 01:11:28.521527 4055 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 01:11:28.785935 master-0 kubenswrapper[4055]: I0313 01:11:28.785781 4055 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/2.log"
Mar 13 01:11:29.160368 master-0 kubenswrapper[4055]: E0313 01:11:29.160056 4055 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 13 01:11:29.418119 master-0 kubenswrapper[4055]: I0313 01:11:29.418029 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 01:11:29.419666 master-0 kubenswrapper[4055]: I0313 01:11:29.419594 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 01:11:29.419784 master-0 kubenswrapper[4055]: I0313 01:11:29.419763 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 01:11:29.419857 master-0 kubenswrapper[4055]: I0313 01:11:29.419785 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 01:11:29.419857 master-0 kubenswrapper[4055]: I0313 01:11:29.419836 4055 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 13 01:11:29.427746 master-0 kubenswrapper[4055]: E0313 01:11:29.427686 4055 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0"
Mar 13 01:11:29.539966 master-0 kubenswrapper[4055]: I0313 01:11:29.539840 4055 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 01:11:30.450431 master-0 kubenswrapper[4055]: I0313 01:11:30.450359 4055 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 01:11:30.452153 master-0 kubenswrapper[4055]: I0313 01:11:30.450539 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 01:11:30.452153 master-0 kubenswrapper[4055]: I0313 01:11:30.451875 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 01:11:30.452153 master-0 kubenswrapper[4055]: I0313 01:11:30.451905 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 01:11:30.452153 master-0 kubenswrapper[4055]: I0313 01:11:30.451921 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 01:11:30.459945 master-0 kubenswrapper[4055]: I0313 01:11:30.459908 4055 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 01:11:30.522548 master-0 kubenswrapper[4055]: I0313 01:11:30.522481 4055 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 01:11:30.791450 master-0 kubenswrapper[4055]: I0313 01:11:30.791366 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 01:11:30.792852 master-0 kubenswrapper[4055]: I0313 01:11:30.792803 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 01:11:30.792959 master-0 kubenswrapper[4055]: I0313 01:11:30.792856 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 01:11:30.792959 master-0 kubenswrapper[4055]: I0313 01:11:30.792873 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 01:11:31.522159 master-0 kubenswrapper[4055]: I0313 01:11:31.522076 4055 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 01:11:32.521704 master-0 kubenswrapper[4055]: I0313 01:11:32.521668 4055 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 01:11:32.669214 master-0 kubenswrapper[4055]: E0313 01:11:32.669132 4055 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 13 01:11:33.525072 master-0 kubenswrapper[4055]: I0313 01:11:33.524986 4055 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 01:11:34.524061 master-0 kubenswrapper[4055]: I0313 01:11:34.523953 4055 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 01:11:34.905152 master-0 kubenswrapper[4055]: I0313 01:11:34.905055 4055 csr.go:257] certificate signing request csr-fnb9j is issued
Mar 13 01:11:35.391710 master-0 kubenswrapper[4055]: I0313 01:11:35.391538 4055 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 13 01:11:35.524516 master-0 kubenswrapper[4055]: I0313 01:11:35.524476 4055 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 13 01:11:35.539392 master-0 kubenswrapper[4055]: I0313 01:11:35.539344 4055 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 13 01:11:35.599075 master-0 kubenswrapper[4055]: I0313 01:11:35.598993 4055 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 13 01:11:35.876229 master-0 kubenswrapper[4055]: I0313 01:11:35.876159 4055 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 13 01:11:35.876229 master-0 kubenswrapper[4055]: E0313 01:11:35.876212 4055 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found
Mar 13 01:11:35.897618 master-0 kubenswrapper[4055]: I0313 01:11:35.897509 4055 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 13 01:11:35.906760 master-0 kubenswrapper[4055]: I0313 01:11:35.906688 4055 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-03-14 01:03:12 +0000 UTC, rotation deadline is 2026-03-13 19:07:46.105985679 +0000 UTC
Mar 13 01:11:35.906760 master-0 kubenswrapper[4055]: I0313 01:11:35.906737 4055 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 17h56m10.199251986s for next certificate rotation
Mar 13 01:11:35.913586 master-0 kubenswrapper[4055]: I0313 01:11:35.913539 4055 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 13 01:11:35.971199 master-0 kubenswrapper[4055]: I0313 01:11:35.971133 4055 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 13 01:11:36.206231 master-0 kubenswrapper[4055]: E0313 01:11:36.206167 4055 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"master-0\" not found" node="master-0"
Mar 13 01:11:36.277296 master-0 kubenswrapper[4055]: I0313 01:11:36.277228 4055 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 13 01:11:36.277296 master-0 kubenswrapper[4055]: E0313 01:11:36.277277 4055 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found
Mar 13 01:11:36.428146 master-0 kubenswrapper[4055]: I0313 01:11:36.428069 4055 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 01:11:36.429301 master-0 kubenswrapper[4055]: I0313 01:11:36.429261 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 01:11:36.429488 master-0 kubenswrapper[4055]: I0313 01:11:36.429465 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 01:11:36.429660 master-0 kubenswrapper[4055]: I0313 01:11:36.429611 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 01:11:36.429854 master-0 kubenswrapper[4055]: I0313 01:11:36.429831 4055 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 13 01:11:36.479044 master-0 kubenswrapper[4055]: I0313 01:11:36.478841 4055 kubelet_node_status.go:79] "Successfully registered node" node="master-0"
Mar 13 01:11:36.479044 master-0 kubenswrapper[4055]: E0313 01:11:36.478903 4055 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": node \"master-0\" not found"
Mar 13 01:11:36.561684 master-0 kubenswrapper[4055]: I0313 01:11:36.561544 4055 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 13 01:11:37.536829 master-0 kubenswrapper[4055]: I0313 01:11:37.536759 4055 apiserver.go:52] "Watching apiserver"
Mar 13 01:11:37.539447 master-0 kubenswrapper[4055]: I0313 01:11:37.539376 4055 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates
Mar 13 01:11:37.539447 master-0 kubenswrapper[4055]: I0313 01:11:37.539435 4055 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 13 01:11:37.539675 master-0 kubenswrapper[4055]: I0313 01:11:37.539508 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=[]
Mar 13 01:11:37.619141 master-0 kubenswrapper[4055]: I0313 01:11:37.619064 4055 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
Mar 13 01:11:37.960666 master-0 kubenswrapper[4055]: I0313 01:11:37.960569 4055 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 13 01:11:38.010035 master-0 kubenswrapper[4055]: I0313 01:11:38.010035 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt"]
Mar 13 01:11:38.010375 master-0 kubenswrapper[4055]: I0313 01:11:38.010225 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt"
Mar 13 01:11:38.015275 master-0 kubenswrapper[4055]: I0313 01:11:38.015222 4055 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 13 01:11:38.016839 master-0 kubenswrapper[4055]: I0313 01:11:38.016800 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 13 01:11:38.020288 master-0 kubenswrapper[4055]: I0313 01:11:38.020250 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 13 01:11:38.075958 master-0 kubenswrapper[4055]: I0313 01:11:38.075909 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-7c649bf6d4-bdc4j"]
Mar 13 01:11:38.076303 master-0 kubenswrapper[4055]: I0313 01:11:38.076120 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-7c649bf6d4-bdc4j"
Mar 13 01:11:38.079338 master-0 kubenswrapper[4055]: I0313 01:11:38.079308 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 13 01:11:38.079590 master-0 kubenswrapper[4055]: I0313 01:11:38.079561 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 13 01:11:38.080039 master-0 kubenswrapper[4055]: I0313 01:11:38.080005 4055 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 13 01:11:38.106909 master-0 kubenswrapper[4055]: I0313 01:11:38.106710 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-zc6gt\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt"
Mar 13 01:11:38.106909 master-0 kubenswrapper[4055]: I0313 01:11:38.106768 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert\") pod \"cluster-version-operator-745944c6b7-zc6gt\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt"
Mar 13 01:11:38.106909 master-0 kubenswrapper[4055]: I0313 01:11:38.106794 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-service-ca\") pod \"cluster-version-operator-745944c6b7-zc6gt\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt"
Mar 13 01:11:38.106909 master-0 kubenswrapper[4055]: I0313 01:11:38.106821 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-kube-api-access\") pod \"cluster-version-operator-745944c6b7-zc6gt\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt"
Mar 13 01:11:38.106909 master-0 kubenswrapper[4055]: I0313 01:11:38.106842 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-zc6gt\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt"
Mar 13 01:11:38.207711 master-0 kubenswrapper[4055]: I0313 01:11:38.207663 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-zc6gt\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt"
Mar 13 01:11:38.207711 master-0 kubenswrapper[4055]: I0313 01:11:38.207719 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert\") pod \"cluster-version-operator-745944c6b7-zc6gt\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt"
Mar 13 01:11:38.207936 master-0 kubenswrapper[4055]: I0313 01:11:38.207745 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-service-ca\") pod \"cluster-version-operator-745944c6b7-zc6gt\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt"
Mar 13 01:11:38.207936 master-0 kubenswrapper[4055]: I0313 01:11:38.207828 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-zc6gt\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt"
Mar 13 01:11:38.207936 master-0 kubenswrapper[4055]: I0313 01:11:38.207901 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-kube-api-access\") pod \"cluster-version-operator-745944c6b7-zc6gt\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt"
Mar 13 01:11:38.208023 master-0 kubenswrapper[4055]: I0313 01:11:38.207946 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-zc6gt\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt"
Mar 13 01:11:38.208055 master-0 kubenswrapper[4055]: I0313 01:11:38.208021 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfc49699-9428-4bff-804d-da0e60551759-metrics-tls\") pod \"network-operator-7c649bf6d4-bdc4j\" (UID: \"bfc49699-9428-4bff-804d-da0e60551759\") " pod="openshift-network-operator/network-operator-7c649bf6d4-bdc4j"
Mar 13 01:11:38.208091 master-0 kubenswrapper[4055]: E0313 01:11:38.208050 4055 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 13 01:11:38.208091 master-0 kubenswrapper[4055]: I0313 01:11:38.208055 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdzjn\" (UniqueName: \"kubernetes.io/projected/bfc49699-9428-4bff-804d-da0e60551759-kube-api-access-zdzjn\") pod \"network-operator-7c649bf6d4-bdc4j\" (UID: \"bfc49699-9428-4bff-804d-da0e60551759\") " pod="openshift-network-operator/network-operator-7c649bf6d4-bdc4j"
Mar 13 01:11:38.208146 master-0 kubenswrapper[4055]: I0313 01:11:38.208121 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-zc6gt\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt"
Mar 13 01:11:38.208202 master-0 kubenswrapper[4055]: E0313 01:11:38.208180 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert podName:bbf0bd4d-3387-43c3-b9d5-61a044fa2138 nodeName:}" failed. No retries permitted until 2026-03-13 01:11:38.708091554 +0000 UTC m=+36.871150602 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert") pod "cluster-version-operator-745944c6b7-zc6gt" (UID: "bbf0bd4d-3387-43c3-b9d5-61a044fa2138") : secret "cluster-version-operator-serving-cert" not found
Mar 13 01:11:38.208245 master-0 kubenswrapper[4055]: I0313 01:11:38.208213 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/bfc49699-9428-4bff-804d-da0e60551759-host-etc-kube\") pod \"network-operator-7c649bf6d4-bdc4j\" (UID: \"bfc49699-9428-4bff-804d-da0e60551759\") " pod="openshift-network-operator/network-operator-7c649bf6d4-bdc4j"
Mar 13 01:11:38.209069 master-0 kubenswrapper[4055]: I0313 01:11:38.209038 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-service-ca\") pod \"cluster-version-operator-745944c6b7-zc6gt\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt"
Mar 13 01:11:38.221385 master-0 kubenswrapper[4055]: I0313 01:11:38.221321 4055 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Mar 13 01:11:38.228048 master-0 kubenswrapper[4055]: I0313 01:11:38.227993 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-kube-api-access\") pod \"cluster-version-operator-745944c6b7-zc6gt\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt"
Mar 13 01:11:38.309349 master-0 kubenswrapper[4055]: I0313 01:11:38.309251 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/bfc49699-9428-4bff-804d-da0e60551759-host-etc-kube\") pod \"network-operator-7c649bf6d4-bdc4j\" (UID: \"bfc49699-9428-4bff-804d-da0e60551759\") " pod="openshift-network-operator/network-operator-7c649bf6d4-bdc4j"
Mar 13 01:11:38.309349 master-0 kubenswrapper[4055]: I0313 01:11:38.309328 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdzjn\" (UniqueName: \"kubernetes.io/projected/bfc49699-9428-4bff-804d-da0e60551759-kube-api-access-zdzjn\") pod \"network-operator-7c649bf6d4-bdc4j\" (UID: \"bfc49699-9428-4bff-804d-da0e60551759\") " pod="openshift-network-operator/network-operator-7c649bf6d4-bdc4j"
Mar 13 01:11:38.309710 master-0 kubenswrapper[4055]: I0313 01:11:38.309612 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/bfc49699-9428-4bff-804d-da0e60551759-host-etc-kube\") pod \"network-operator-7c649bf6d4-bdc4j\" (UID: \"bfc49699-9428-4bff-804d-da0e60551759\") " pod="openshift-network-operator/network-operator-7c649bf6d4-bdc4j"
Mar 13 01:11:38.309791 master-0 kubenswrapper[4055]: I0313 01:11:38.309618 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfc49699-9428-4bff-804d-da0e60551759-metrics-tls\") pod \"network-operator-7c649bf6d4-bdc4j\" (UID: \"bfc49699-9428-4bff-804d-da0e60551759\") " pod="openshift-network-operator/network-operator-7c649bf6d4-bdc4j"
Mar 13 01:11:38.313755 master-0 kubenswrapper[4055]: I0313 01:11:38.313687 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfc49699-9428-4bff-804d-da0e60551759-metrics-tls\") pod \"network-operator-7c649bf6d4-bdc4j\" (UID: \"bfc49699-9428-4bff-804d-da0e60551759\") " pod="openshift-network-operator/network-operator-7c649bf6d4-bdc4j"
Mar 13 01:11:38.339448 master-0 kubenswrapper[4055]: I0313 01:11:38.339342 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdzjn\" (UniqueName: \"kubernetes.io/projected/bfc49699-9428-4bff-804d-da0e60551759-kube-api-access-zdzjn\") pod \"network-operator-7c649bf6d4-bdc4j\" (UID: \"bfc49699-9428-4bff-804d-da0e60551759\") " pod="openshift-network-operator/network-operator-7c649bf6d4-bdc4j"
Mar 13 01:11:38.388601 master-0 kubenswrapper[4055]: I0313 01:11:38.388461 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-7c649bf6d4-bdc4j"
Mar 13 01:11:38.405593 master-0 kubenswrapper[4055]: W0313 01:11:38.405535 4055 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfc49699_9428_4bff_804d_da0e60551759.slice/crio-89ed96ec0c29ed024929723b7fe3a674507469f0ef8e3cce0e32599fd800079d WatchSource:0}: Error finding container 89ed96ec0c29ed024929723b7fe3a674507469f0ef8e3cce0e32599fd800079d: Status 404 returned error can't find the container with id 89ed96ec0c29ed024929723b7fe3a674507469f0ef8e3cce0e32599fd800079d
Mar 13 01:11:38.663774 master-0 kubenswrapper[4055]: I0313 01:11:38.663732 4055 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 13 01:11:38.711927 master-0 kubenswrapper[4055]: I0313 01:11:38.711858 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert\") pod \"cluster-version-operator-745944c6b7-zc6gt\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt"
Mar 13 01:11:38.712160 master-0 kubenswrapper[4055]: E0313 01:11:38.712031 4055 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 13 01:11:38.712160 master-0 kubenswrapper[4055]: E0313 01:11:38.712143 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert podName:bbf0bd4d-3387-43c3-b9d5-61a044fa2138 nodeName:}" failed. No retries permitted until 2026-03-13 01:11:39.712113271 +0000 UTC m=+37.875172349 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert") pod "cluster-version-operator-745944c6b7-zc6gt" (UID: "bbf0bd4d-3387-43c3-b9d5-61a044fa2138") : secret "cluster-version-operator-serving-cert" not found Mar 13 01:11:38.811234 master-0 kubenswrapper[4055]: I0313 01:11:38.811146 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-bdc4j" event={"ID":"bfc49699-9428-4bff-804d-da0e60551759","Type":"ContainerStarted","Data":"89ed96ec0c29ed024929723b7fe3a674507469f0ef8e3cce0e32599fd800079d"} Mar 13 01:11:39.570617 master-0 kubenswrapper[4055]: I0313 01:11:39.570579 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["assisted-installer/assisted-installer-controller-qpxft"] Mar 13 01:11:39.570955 master-0 kubenswrapper[4055]: I0313 01:11:39.570936 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-qpxft" Mar 13 01:11:39.577350 master-0 kubenswrapper[4055]: I0313 01:11:39.577296 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"kube-root-ca.crt" Mar 13 01:11:39.577559 master-0 kubenswrapper[4055]: I0313 01:11:39.577371 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"openshift-service-ca.crt" Mar 13 01:11:39.577559 master-0 kubenswrapper[4055]: I0313 01:11:39.577312 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"assisted-installer-controller-config" Mar 13 01:11:39.577559 master-0 kubenswrapper[4055]: I0313 01:11:39.577430 4055 reflector.go:368] Caches populated for *v1.Secret from object-"assisted-installer"/"assisted-installer-controller-secret" Mar 13 01:11:39.719590 master-0 kubenswrapper[4055]: I0313 01:11:39.719529 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/abf2ead5-b97d-4160-8120-28cb8a3d843e-host-var-run-resolv-conf\") pod \"assisted-installer-controller-qpxft\" (UID: \"abf2ead5-b97d-4160-8120-28cb8a3d843e\") " pod="assisted-installer/assisted-installer-controller-qpxft" Mar 13 01:11:39.719590 master-0 kubenswrapper[4055]: I0313 01:11:39.719588 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/abf2ead5-b97d-4160-8120-28cb8a3d843e-host-resolv-conf\") pod \"assisted-installer-controller-qpxft\" (UID: \"abf2ead5-b97d-4160-8120-28cb8a3d843e\") " pod="assisted-installer/assisted-installer-controller-qpxft" Mar 13 01:11:39.720155 master-0 kubenswrapper[4055]: I0313 01:11:39.719676 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/abf2ead5-b97d-4160-8120-28cb8a3d843e-host-ca-bundle\") pod \"assisted-installer-controller-qpxft\" (UID: \"abf2ead5-b97d-4160-8120-28cb8a3d843e\") " pod="assisted-installer/assisted-installer-controller-qpxft" Mar 13 01:11:39.720155 master-0 kubenswrapper[4055]: I0313 01:11:39.719702 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbwdr\" (UniqueName: \"kubernetes.io/projected/abf2ead5-b97d-4160-8120-28cb8a3d843e-kube-api-access-hbwdr\") pod \"assisted-installer-controller-qpxft\" (UID: \"abf2ead5-b97d-4160-8120-28cb8a3d843e\") " pod="assisted-installer/assisted-installer-controller-qpxft" Mar 13 01:11:39.720155 master-0 kubenswrapper[4055]: I0313 01:11:39.719838 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert\") pod \"cluster-version-operator-745944c6b7-zc6gt\" (UID: 
\"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt" Mar 13 01:11:39.721256 master-0 kubenswrapper[4055]: I0313 01:11:39.721180 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/abf2ead5-b97d-4160-8120-28cb8a3d843e-sno-bootstrap-files\") pod \"assisted-installer-controller-qpxft\" (UID: \"abf2ead5-b97d-4160-8120-28cb8a3d843e\") " pod="assisted-installer/assisted-installer-controller-qpxft" Mar 13 01:11:39.721319 master-0 kubenswrapper[4055]: E0313 01:11:39.721197 4055 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 13 01:11:39.721403 master-0 kubenswrapper[4055]: E0313 01:11:39.721377 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert podName:bbf0bd4d-3387-43c3-b9d5-61a044fa2138 nodeName:}" failed. No retries permitted until 2026-03-13 01:11:41.721348867 +0000 UTC m=+39.884407935 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert") pod "cluster-version-operator-745944c6b7-zc6gt" (UID: "bbf0bd4d-3387-43c3-b9d5-61a044fa2138") : secret "cluster-version-operator-serving-cert" not found Mar 13 01:11:39.821847 master-0 kubenswrapper[4055]: I0313 01:11:39.821702 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/abf2ead5-b97d-4160-8120-28cb8a3d843e-sno-bootstrap-files\") pod \"assisted-installer-controller-qpxft\" (UID: \"abf2ead5-b97d-4160-8120-28cb8a3d843e\") " pod="assisted-installer/assisted-installer-controller-qpxft" Mar 13 01:11:39.821847 master-0 kubenswrapper[4055]: I0313 01:11:39.821765 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/abf2ead5-b97d-4160-8120-28cb8a3d843e-host-var-run-resolv-conf\") pod \"assisted-installer-controller-qpxft\" (UID: \"abf2ead5-b97d-4160-8120-28cb8a3d843e\") " pod="assisted-installer/assisted-installer-controller-qpxft" Mar 13 01:11:39.821847 master-0 kubenswrapper[4055]: I0313 01:11:39.821801 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/abf2ead5-b97d-4160-8120-28cb8a3d843e-host-ca-bundle\") pod \"assisted-installer-controller-qpxft\" (UID: \"abf2ead5-b97d-4160-8120-28cb8a3d843e\") " pod="assisted-installer/assisted-installer-controller-qpxft" Mar 13 01:11:39.822090 master-0 kubenswrapper[4055]: I0313 01:11:39.821878 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/abf2ead5-b97d-4160-8120-28cb8a3d843e-sno-bootstrap-files\") pod \"assisted-installer-controller-qpxft\" (UID: \"abf2ead5-b97d-4160-8120-28cb8a3d843e\") " 
pod="assisted-installer/assisted-installer-controller-qpxft" Mar 13 01:11:39.822090 master-0 kubenswrapper[4055]: I0313 01:11:39.821936 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/abf2ead5-b97d-4160-8120-28cb8a3d843e-host-resolv-conf\") pod \"assisted-installer-controller-qpxft\" (UID: \"abf2ead5-b97d-4160-8120-28cb8a3d843e\") " pod="assisted-installer/assisted-installer-controller-qpxft" Mar 13 01:11:39.822244 master-0 kubenswrapper[4055]: I0313 01:11:39.822151 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/abf2ead5-b97d-4160-8120-28cb8a3d843e-host-ca-bundle\") pod \"assisted-installer-controller-qpxft\" (UID: \"abf2ead5-b97d-4160-8120-28cb8a3d843e\") " pod="assisted-installer/assisted-installer-controller-qpxft" Mar 13 01:11:39.822291 master-0 kubenswrapper[4055]: I0313 01:11:39.822239 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbwdr\" (UniqueName: \"kubernetes.io/projected/abf2ead5-b97d-4160-8120-28cb8a3d843e-kube-api-access-hbwdr\") pod \"assisted-installer-controller-qpxft\" (UID: \"abf2ead5-b97d-4160-8120-28cb8a3d843e\") " pod="assisted-installer/assisted-installer-controller-qpxft" Mar 13 01:11:39.822365 master-0 kubenswrapper[4055]: I0313 01:11:39.822189 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/abf2ead5-b97d-4160-8120-28cb8a3d843e-host-var-run-resolv-conf\") pod \"assisted-installer-controller-qpxft\" (UID: \"abf2ead5-b97d-4160-8120-28cb8a3d843e\") " pod="assisted-installer/assisted-installer-controller-qpxft" Mar 13 01:11:39.822473 master-0 kubenswrapper[4055]: I0313 01:11:39.822427 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-resolv-conf\" (UniqueName: 
\"kubernetes.io/host-path/abf2ead5-b97d-4160-8120-28cb8a3d843e-host-resolv-conf\") pod \"assisted-installer-controller-qpxft\" (UID: \"abf2ead5-b97d-4160-8120-28cb8a3d843e\") " pod="assisted-installer/assisted-installer-controller-qpxft" Mar 13 01:11:39.854907 master-0 kubenswrapper[4055]: I0313 01:11:39.854852 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbwdr\" (UniqueName: \"kubernetes.io/projected/abf2ead5-b97d-4160-8120-28cb8a3d843e-kube-api-access-hbwdr\") pod \"assisted-installer-controller-qpxft\" (UID: \"abf2ead5-b97d-4160-8120-28cb8a3d843e\") " pod="assisted-installer/assisted-installer-controller-qpxft" Mar 13 01:11:39.905512 master-0 kubenswrapper[4055]: I0313 01:11:39.905457 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-qpxft" Mar 13 01:11:40.817946 master-0 kubenswrapper[4055]: I0313 01:11:40.817887 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-qpxft" event={"ID":"abf2ead5-b97d-4160-8120-28cb8a3d843e","Type":"ContainerStarted","Data":"9f9a8a98efb949846e16cd93dedb9e2d6c0bd9dc4b4fccd2e67d0c286bfd9dcb"} Mar 13 01:11:41.735460 master-0 kubenswrapper[4055]: I0313 01:11:41.735411 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert\") pod \"cluster-version-operator-745944c6b7-zc6gt\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt" Mar 13 01:11:41.735736 master-0 kubenswrapper[4055]: E0313 01:11:41.735657 4055 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 13 01:11:41.735818 master-0 kubenswrapper[4055]: E0313 01:11:41.735798 4055 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert podName:bbf0bd4d-3387-43c3-b9d5-61a044fa2138 nodeName:}" failed. No retries permitted until 2026-03-13 01:11:45.735768807 +0000 UTC m=+43.898827835 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert") pod "cluster-version-operator-745944c6b7-zc6gt" (UID: "bbf0bd4d-3387-43c3-b9d5-61a044fa2138") : secret "cluster-version-operator-serving-cert" not found Mar 13 01:11:41.807337 master-0 kubenswrapper[4055]: E0313 01:11:41.807270 4055 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 13 01:11:41.807337 master-0 kubenswrapper[4055]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3,Command:[/bin/bash -c #!/bin/bash Mar 13 01:11:41.807337 master-0 kubenswrapper[4055]: set -o allexport Mar 13 01:11:41.807337 master-0 kubenswrapper[4055]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 13 01:11:41.807337 master-0 kubenswrapper[4055]: source /etc/kubernetes/apiserver-url.env Mar 13 01:11:41.807337 master-0 kubenswrapper[4055]: else Mar 13 01:11:41.807337 master-0 kubenswrapper[4055]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 13 01:11:41.807337 master-0 kubenswrapper[4055]: exit 1 Mar 13 01:11:41.807337 master-0 kubenswrapper[4055]: fi Mar 13 01:11:41.807337 master-0 kubenswrapper[4055]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 13 01:11:41.807337 master-0 kubenswrapper[4055]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9242604e78efada5aeb232d73a7963f806b754213f5d92b1dffc9b493d7b5a65,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:381e96959e3c3b08a3e2715e6024697ae14af31bd0378b49f583e984b3b9a192,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5230462066ab36e3025524e948dd33fa6f51ee29a4f91fa469bfc268568b5fd9,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c03cb25dc6f6a865529ebc979e8d7d08492b28fd3fb93beddf30e1cb06f1245,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ace4dcd008420277d915fe983b07bbb50fb3ab0673f28d0166424a75bc2137e7,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e207c762b7802ee0e54507d21ed1f25b19eddc511a4b824934c16c163193be6a,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8f0fda36e9a2040dbe0537361dcd73658df4e669d846f8101a8f9f29f0be9a7,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b19b9d0e5437b0bb19cafc3fb516f654c911cdf11184c0de9a27b43c6b80c9ce,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f121f9d021a9843b9458f9f222c40f292f2c21dcfcf00f05daacaca8a949c0,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,}
,EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ac6f0695d3386e6d601f4ae507940981352fa3ad884b0fed6fb25698c5e6f916,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:89cb093f319eaa04acfe9431b8697bffbc71ab670546f7ed257daa332165c626,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3aa7c84e73a2a19cc9baca38b7e86dfcde579aa88221647c332c83f047d5ae6d,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1575be013a898f153cbf012aeaf28ce720022f934dc05bdffbe479e30999d460,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5bfe4d3125d98cc501d5a529d3ae2497106a2bbb5a6dd06df7c0e0930d168212,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b62afe74fdcb011a4a8c8fa5572dbab2514dda673ae4be4c6beaef92d28216ba,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceF
ieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zdzjn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-7c649bf6d4-bdc4j_openshift-network-operator(bfc49699-9428-4bff-804d-da0e60551759): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 13 01:11:41.807337 master-0 kubenswrapper[4055]: > logger="UnhandledError" Mar 13 01:11:41.809109 master-0 kubenswrapper[4055]: E0313 01:11:41.809038 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-7c649bf6d4-bdc4j" podUID="bfc49699-9428-4bff-804d-da0e60551759" Mar 13 01:11:41.824368 master-0 kubenswrapper[4055]: E0313 01:11:41.824249 4055 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 13 01:11:41.824368 master-0 kubenswrapper[4055]: container 
&Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3,Command:[/bin/bash -c #!/bin/bash Mar 13 01:11:41.824368 master-0 kubenswrapper[4055]: set -o allexport Mar 13 01:11:41.824368 master-0 kubenswrapper[4055]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 13 01:11:41.824368 master-0 kubenswrapper[4055]: source /etc/kubernetes/apiserver-url.env Mar 13 01:11:41.824368 master-0 kubenswrapper[4055]: else Mar 13 01:11:41.824368 master-0 kubenswrapper[4055]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 13 01:11:41.824368 master-0 kubenswrapper[4055]: exit 1 Mar 13 01:11:41.824368 master-0 kubenswrapper[4055]: fi Mar 13 01:11:41.824368 master-0 kubenswrapper[4055]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 13 01:11:41.824368 master-0 kubenswrapper[4055]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9242604e78efada5aeb232d73a7963f806b754213f5d92b1dffc9b493d7b5a65,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:381e96959e3c3b08a3e2715e6024697ae14af31bd0378b49f583e984b3b9a192,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5230462066ab36e3025524e948dd33fa6f51ee29a4f91fa469bfc268568b5fd9,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c03cb25dc6f6a865529ebc979e8d7d08492b28fd3fb93beddf30e1cb06f1245,ValueFrom:nil,},EnvVar{Name:BON
D_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ace4dcd008420277d915fe983b07bbb50fb3ab0673f28d0166424a75bc2137e7,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e207c762b7802ee0e54507d21ed1f25b19eddc511a4b824934c16c163193be6a,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8f0fda36e9a2040dbe0537361dcd73658df4e669d846f8101a8f9f29f0be9a7,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b19b9d0e5437b0bb19cafc3fb516f654c911cdf11184c0de9a27b43c6b80c9ce,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f121f9d021a9843b9458f9f222c40f292f2c21dcfcf00f05daacaca8a949c0,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ac6f0695d3386e6d601f4ae507940981352fa3ad884b0fed6fb25698c5e6f916,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:89cb093f319eaa04acfe9431b8697bffbc71ab670546f7ed257daa332165c626,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3aa7c84e73a2a19cc9baca38b7e86dfcde579aa88221647c332c83f047d5ae6d,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1575be013a898f153cbf012aeaf28ce720022f934dc05bdffbe479e30999d460,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5bfe4d3125d98cc501d5a529d3ae2497106a2bbb5a6dd06df7c0e0930d168212,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b62afe74fdcb011a4a8c8fa5572dbab2514dda673ae4be4c6beaef92d28216ba,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zdzjn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-7c649bf6d4-bdc4j_openshift-network-operator(bfc49699-9428-4bff-804d-da0e60551759): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 13 01:11:41.824368 master-0 kubenswrapper[4055]: > logger="UnhandledError" Mar 13 01:11:41.825661 master-0 kubenswrapper[4055]: E0313 01:11:41.825564 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-7c649bf6d4-bdc4j" podUID="bfc49699-9428-4bff-804d-da0e60551759" Mar 13 01:11:42.624786 master-0 kubenswrapper[4055]: I0313 01:11:42.624701 4055 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 13 01:11:42.695282 master-0 kubenswrapper[4055]: I0313 01:11:42.695248 4055 scope.go:117] "RemoveContainer" containerID="66d7d7db2b542e86a40857789246d4abcbc4fca4d74d92d6366f9e8e1aa401b5" Mar 13 
01:11:42.695459 master-0 kubenswrapper[4055]: I0313 01:11:42.695325 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"] Mar 13 01:11:42.695510 master-0 kubenswrapper[4055]: E0313 01:11:42.695485 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="e9add8df47182fc2eaf8cd78016ebe72" Mar 13 01:11:42.825074 master-0 kubenswrapper[4055]: I0313 01:11:42.825020 4055 scope.go:117] "RemoveContainer" containerID="66d7d7db2b542e86a40857789246d4abcbc4fca4d74d92d6366f9e8e1aa401b5" Mar 13 01:11:42.825389 master-0 kubenswrapper[4055]: E0313 01:11:42.825340 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="e9add8df47182fc2eaf8cd78016ebe72" Mar 13 01:11:43.074155 master-0 kubenswrapper[4055]: I0313 01:11:43.074095 4055 csr.go:261] certificate signing request csr-2th4k is approved, waiting to be issued Mar 13 01:11:43.079268 master-0 kubenswrapper[4055]: I0313 01:11:43.079231 4055 csr.go:257] certificate signing request csr-2th4k is issued Mar 13 01:11:43.538832 master-0 kubenswrapper[4055]: I0313 01:11:43.538755 4055 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 13 01:11:44.080951 master-0 kubenswrapper[4055]: I0313 01:11:44.080661 4055 certificate_manager.go:356] kubernetes.io/kubelet-serving: 
Certificate expiration is 2026-03-14 01:03:12 +0000 UTC, rotation deadline is 2026-03-13 20:47:36.732559734 +0000 UTC Mar 13 01:11:44.080951 master-0 kubenswrapper[4055]: I0313 01:11:44.080939 4055 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 19h35m52.651625041s for next certificate rotation Mar 13 01:11:45.081696 master-0 kubenswrapper[4055]: I0313 01:11:45.081644 4055 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-14 01:03:12 +0000 UTC, rotation deadline is 2026-03-13 18:00:50.066870395 +0000 UTC Mar 13 01:11:45.081696 master-0 kubenswrapper[4055]: I0313 01:11:45.081688 4055 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 16h49m4.985185492s for next certificate rotation Mar 13 01:11:45.763949 master-0 kubenswrapper[4055]: I0313 01:11:45.763866 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert\") pod \"cluster-version-operator-745944c6b7-zc6gt\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt" Mar 13 01:11:45.764210 master-0 kubenswrapper[4055]: E0313 01:11:45.763971 4055 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 13 01:11:45.764210 master-0 kubenswrapper[4055]: E0313 01:11:45.764023 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert podName:bbf0bd4d-3387-43c3-b9d5-61a044fa2138 nodeName:}" failed. No retries permitted until 2026-03-13 01:11:53.76400995 +0000 UTC m=+51.927068988 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert") pod "cluster-version-operator-745944c6b7-zc6gt" (UID: "bbf0bd4d-3387-43c3-b9d5-61a044fa2138") : secret "cluster-version-operator-serving-cert" not found Mar 13 01:11:46.832149 master-0 kubenswrapper[4055]: I0313 01:11:46.832069 4055 generic.go:334] "Generic (PLEG): container finished" podID="abf2ead5-b97d-4160-8120-28cb8a3d843e" containerID="7c07bc771c953fa9d34f82960a8b9fd12b63e9a86c930f999ffe77b37e0a74ef" exitCode=0 Mar 13 01:11:46.832149 master-0 kubenswrapper[4055]: I0313 01:11:46.832135 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-qpxft" event={"ID":"abf2ead5-b97d-4160-8120-28cb8a3d843e","Type":"ContainerDied","Data":"7c07bc771c953fa9d34f82960a8b9fd12b63e9a86c930f999ffe77b37e0a74ef"} Mar 13 01:11:47.866748 master-0 kubenswrapper[4055]: I0313 01:11:47.866687 4055 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="assisted-installer/assisted-installer-controller-qpxft" Mar 13 01:11:47.981197 master-0 kubenswrapper[4055]: I0313 01:11:47.981087 4055 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/abf2ead5-b97d-4160-8120-28cb8a3d843e-host-var-run-resolv-conf\") pod \"abf2ead5-b97d-4160-8120-28cb8a3d843e\" (UID: \"abf2ead5-b97d-4160-8120-28cb8a3d843e\") " Mar 13 01:11:47.981197 master-0 kubenswrapper[4055]: I0313 01:11:47.981150 4055 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/abf2ead5-b97d-4160-8120-28cb8a3d843e-host-resolv-conf\") pod \"abf2ead5-b97d-4160-8120-28cb8a3d843e\" (UID: \"abf2ead5-b97d-4160-8120-28cb8a3d843e\") " Mar 13 01:11:47.981197 master-0 kubenswrapper[4055]: I0313 01:11:47.981194 4055 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbwdr\" (UniqueName: \"kubernetes.io/projected/abf2ead5-b97d-4160-8120-28cb8a3d843e-kube-api-access-hbwdr\") pod \"abf2ead5-b97d-4160-8120-28cb8a3d843e\" (UID: \"abf2ead5-b97d-4160-8120-28cb8a3d843e\") " Mar 13 01:11:47.981598 master-0 kubenswrapper[4055]: I0313 01:11:47.981211 4055 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/abf2ead5-b97d-4160-8120-28cb8a3d843e-host-var-run-resolv-conf" (OuterVolumeSpecName: "host-var-run-resolv-conf") pod "abf2ead5-b97d-4160-8120-28cb8a3d843e" (UID: "abf2ead5-b97d-4160-8120-28cb8a3d843e"). InnerVolumeSpecName "host-var-run-resolv-conf". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:11:47.981598 master-0 kubenswrapper[4055]: I0313 01:11:47.981228 4055 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/abf2ead5-b97d-4160-8120-28cb8a3d843e-host-ca-bundle\") pod \"abf2ead5-b97d-4160-8120-28cb8a3d843e\" (UID: \"abf2ead5-b97d-4160-8120-28cb8a3d843e\") " Mar 13 01:11:47.981598 master-0 kubenswrapper[4055]: I0313 01:11:47.981297 4055 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/abf2ead5-b97d-4160-8120-28cb8a3d843e-host-ca-bundle" (OuterVolumeSpecName: "host-ca-bundle") pod "abf2ead5-b97d-4160-8120-28cb8a3d843e" (UID: "abf2ead5-b97d-4160-8120-28cb8a3d843e"). InnerVolumeSpecName "host-ca-bundle". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:11:47.981598 master-0 kubenswrapper[4055]: I0313 01:11:47.981313 4055 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/abf2ead5-b97d-4160-8120-28cb8a3d843e-host-resolv-conf" (OuterVolumeSpecName: "host-resolv-conf") pod "abf2ead5-b97d-4160-8120-28cb8a3d843e" (UID: "abf2ead5-b97d-4160-8120-28cb8a3d843e"). InnerVolumeSpecName "host-resolv-conf". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:11:47.981598 master-0 kubenswrapper[4055]: I0313 01:11:47.981367 4055 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/abf2ead5-b97d-4160-8120-28cb8a3d843e-sno-bootstrap-files" (OuterVolumeSpecName: "sno-bootstrap-files") pod "abf2ead5-b97d-4160-8120-28cb8a3d843e" (UID: "abf2ead5-b97d-4160-8120-28cb8a3d843e"). InnerVolumeSpecName "sno-bootstrap-files". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:11:47.981598 master-0 kubenswrapper[4055]: I0313 01:11:47.981336 4055 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/abf2ead5-b97d-4160-8120-28cb8a3d843e-sno-bootstrap-files\") pod \"abf2ead5-b97d-4160-8120-28cb8a3d843e\" (UID: \"abf2ead5-b97d-4160-8120-28cb8a3d843e\") " Mar 13 01:11:47.981598 master-0 kubenswrapper[4055]: I0313 01:11:47.981546 4055 reconciler_common.go:293] "Volume detached for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/abf2ead5-b97d-4160-8120-28cb8a3d843e-host-var-run-resolv-conf\") on node \"master-0\" DevicePath \"\"" Mar 13 01:11:47.981598 master-0 kubenswrapper[4055]: I0313 01:11:47.981577 4055 reconciler_common.go:293] "Volume detached for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/abf2ead5-b97d-4160-8120-28cb8a3d843e-host-resolv-conf\") on node \"master-0\" DevicePath \"\"" Mar 13 01:11:47.981598 master-0 kubenswrapper[4055]: I0313 01:11:47.981597 4055 reconciler_common.go:293] "Volume detached for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/abf2ead5-b97d-4160-8120-28cb8a3d843e-host-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:11:47.981598 master-0 kubenswrapper[4055]: I0313 01:11:47.981615 4055 reconciler_common.go:293] "Volume detached for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/abf2ead5-b97d-4160-8120-28cb8a3d843e-sno-bootstrap-files\") on node \"master-0\" DevicePath \"\"" Mar 13 01:11:47.986818 master-0 kubenswrapper[4055]: I0313 01:11:47.986755 4055 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abf2ead5-b97d-4160-8120-28cb8a3d843e-kube-api-access-hbwdr" (OuterVolumeSpecName: "kube-api-access-hbwdr") pod "abf2ead5-b97d-4160-8120-28cb8a3d843e" (UID: "abf2ead5-b97d-4160-8120-28cb8a3d843e"). 
InnerVolumeSpecName "kube-api-access-hbwdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:11:48.082623 master-0 kubenswrapper[4055]: I0313 01:11:48.082582 4055 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbwdr\" (UniqueName: \"kubernetes.io/projected/abf2ead5-b97d-4160-8120-28cb8a3d843e-kube-api-access-hbwdr\") on node \"master-0\" DevicePath \"\"" Mar 13 01:11:48.839703 master-0 kubenswrapper[4055]: I0313 01:11:48.839591 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-qpxft" event={"ID":"abf2ead5-b97d-4160-8120-28cb8a3d843e","Type":"ContainerDied","Data":"9f9a8a98efb949846e16cd93dedb9e2d6c0bd9dc4b4fccd2e67d0c286bfd9dcb"} Mar 13 01:11:48.839703 master-0 kubenswrapper[4055]: I0313 01:11:48.839698 4055 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f9a8a98efb949846e16cd93dedb9e2d6c0bd9dc4b4fccd2e67d0c286bfd9dcb" Mar 13 01:11:48.840127 master-0 kubenswrapper[4055]: I0313 01:11:48.840097 4055 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="assisted-installer/assisted-installer-controller-qpxft" Mar 13 01:11:53.821866 master-0 kubenswrapper[4055]: I0313 01:11:53.821755 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert\") pod \"cluster-version-operator-745944c6b7-zc6gt\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt" Mar 13 01:11:53.822577 master-0 kubenswrapper[4055]: E0313 01:11:53.821920 4055 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 13 01:11:53.822577 master-0 kubenswrapper[4055]: E0313 01:11:53.822003 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert podName:bbf0bd4d-3387-43c3-b9d5-61a044fa2138 nodeName:}" failed. No retries permitted until 2026-03-13 01:12:09.821978343 +0000 UTC m=+67.985037411 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert") pod "cluster-version-operator-745944c6b7-zc6gt" (UID: "bbf0bd4d-3387-43c3-b9d5-61a044fa2138") : secret "cluster-version-operator-serving-cert" not found Mar 13 01:11:54.679944 master-0 kubenswrapper[4055]: I0313 01:11:54.679899 4055 scope.go:117] "RemoveContainer" containerID="66d7d7db2b542e86a40857789246d4abcbc4fca4d74d92d6366f9e8e1aa401b5" Mar 13 01:11:55.863037 master-0 kubenswrapper[4055]: I0313 01:11:55.862950 4055 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/2.log" Mar 13 01:11:55.864382 master-0 kubenswrapper[4055]: I0313 01:11:55.863780 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"c03513c285f73dedbe67dedcdcf65f9b9e4e6c146f0a64e7433f278ca1844469"} Mar 13 01:11:56.868440 master-0 kubenswrapper[4055]: I0313 01:11:56.868333 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-bdc4j" event={"ID":"bfc49699-9428-4bff-804d-da0e60551759","Type":"ContainerStarted","Data":"ea2d52d5a4050a1c6648d96fe621d92a024e84b0306d25332de51586b15ec9dd"} Mar 13 01:11:56.900128 master-0 kubenswrapper[4055]: I0313 01:11:56.900041 4055 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podStartSLOduration=14.900016791 podStartE2EDuration="14.900016791s" podCreationTimestamp="2026-03-13 01:11:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:11:55.884187181 +0000 UTC m=+54.047246249" 
watchObservedRunningTime="2026-03-13 01:11:56.900016791 +0000 UTC m=+55.063075859" Mar 13 01:11:58.820429 master-0 kubenswrapper[4055]: I0313 01:11:58.820331 4055 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/network-operator-7c649bf6d4-bdc4j" podStartSLOduration=18.422227292 podStartE2EDuration="21.820307557s" podCreationTimestamp="2026-03-13 01:11:37 +0000 UTC" firstStartedPulling="2026-03-13 01:11:38.408233726 +0000 UTC m=+36.571292794" lastFinishedPulling="2026-03-13 01:11:41.806314011 +0000 UTC m=+39.969373059" observedRunningTime="2026-03-13 01:11:56.899906118 +0000 UTC m=+55.062965186" watchObservedRunningTime="2026-03-13 01:11:58.820307557 +0000 UTC m=+56.983366605" Mar 13 01:11:58.821763 master-0 kubenswrapper[4055]: I0313 01:11:58.820592 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/mtu-prober-9psj4"] Mar 13 01:11:58.821763 master-0 kubenswrapper[4055]: E0313 01:11:58.820677 4055 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abf2ead5-b97d-4160-8120-28cb8a3d843e" containerName="assisted-installer-controller" Mar 13 01:11:58.821763 master-0 kubenswrapper[4055]: I0313 01:11:58.820690 4055 state_mem.go:107] "Deleted CPUSet assignment" podUID="abf2ead5-b97d-4160-8120-28cb8a3d843e" containerName="assisted-installer-controller" Mar 13 01:11:58.821763 master-0 kubenswrapper[4055]: I0313 01:11:58.820714 4055 memory_manager.go:354] "RemoveStaleState removing state" podUID="abf2ead5-b97d-4160-8120-28cb8a3d843e" containerName="assisted-installer-controller" Mar 13 01:11:58.821763 master-0 kubenswrapper[4055]: I0313 01:11:58.820880 4055 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/mtu-prober-9psj4" Mar 13 01:11:58.904940 master-0 kubenswrapper[4055]: I0313 01:11:58.904816 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzvrd\" (UniqueName: \"kubernetes.io/projected/d0725849-af6c-4399-9beb-8df68d80963f-kube-api-access-tzvrd\") pod \"mtu-prober-9psj4\" (UID: \"d0725849-af6c-4399-9beb-8df68d80963f\") " pod="openshift-network-operator/mtu-prober-9psj4" Mar 13 01:11:59.005615 master-0 kubenswrapper[4055]: I0313 01:11:59.005404 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzvrd\" (UniqueName: \"kubernetes.io/projected/d0725849-af6c-4399-9beb-8df68d80963f-kube-api-access-tzvrd\") pod \"mtu-prober-9psj4\" (UID: \"d0725849-af6c-4399-9beb-8df68d80963f\") " pod="openshift-network-operator/mtu-prober-9psj4" Mar 13 01:11:59.034976 master-0 kubenswrapper[4055]: I0313 01:11:59.034879 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzvrd\" (UniqueName: \"kubernetes.io/projected/d0725849-af6c-4399-9beb-8df68d80963f-kube-api-access-tzvrd\") pod \"mtu-prober-9psj4\" (UID: \"d0725849-af6c-4399-9beb-8df68d80963f\") " pod="openshift-network-operator/mtu-prober-9psj4" Mar 13 01:11:59.141927 master-0 kubenswrapper[4055]: I0313 01:11:59.141744 4055 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/mtu-prober-9psj4" Mar 13 01:11:59.157438 master-0 kubenswrapper[4055]: W0313 01:11:59.157365 4055 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0725849_af6c_4399_9beb_8df68d80963f.slice/crio-38505749c8c0e6dcdac2e9baa8166fa2449cdebc9769720c9163d7ae1a7ea23b WatchSource:0}: Error finding container 38505749c8c0e6dcdac2e9baa8166fa2449cdebc9769720c9163d7ae1a7ea23b: Status 404 returned error can't find the container with id 38505749c8c0e6dcdac2e9baa8166fa2449cdebc9769720c9163d7ae1a7ea23b Mar 13 01:11:59.879584 master-0 kubenswrapper[4055]: I0313 01:11:59.879393 4055 generic.go:334] "Generic (PLEG): container finished" podID="d0725849-af6c-4399-9beb-8df68d80963f" containerID="2d1ba7ec4846defd3b04a175ca5a3b9796ffce1a2ede0d1ea47e737fb6974a90" exitCode=0 Mar 13 01:11:59.879584 master-0 kubenswrapper[4055]: I0313 01:11:59.879462 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-9psj4" event={"ID":"d0725849-af6c-4399-9beb-8df68d80963f","Type":"ContainerDied","Data":"2d1ba7ec4846defd3b04a175ca5a3b9796ffce1a2ede0d1ea47e737fb6974a90"} Mar 13 01:11:59.879584 master-0 kubenswrapper[4055]: I0313 01:11:59.879500 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-9psj4" event={"ID":"d0725849-af6c-4399-9beb-8df68d80963f","Type":"ContainerStarted","Data":"38505749c8c0e6dcdac2e9baa8166fa2449cdebc9769720c9163d7ae1a7ea23b"} Mar 13 01:12:00.905277 master-0 kubenswrapper[4055]: I0313 01:12:00.905211 4055 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/mtu-prober-9psj4" Mar 13 01:12:01.024263 master-0 kubenswrapper[4055]: I0313 01:12:01.024175 4055 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzvrd\" (UniqueName: \"kubernetes.io/projected/d0725849-af6c-4399-9beb-8df68d80963f-kube-api-access-tzvrd\") pod \"d0725849-af6c-4399-9beb-8df68d80963f\" (UID: \"d0725849-af6c-4399-9beb-8df68d80963f\") " Mar 13 01:12:01.029103 master-0 kubenswrapper[4055]: I0313 01:12:01.029010 4055 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0725849-af6c-4399-9beb-8df68d80963f-kube-api-access-tzvrd" (OuterVolumeSpecName: "kube-api-access-tzvrd") pod "d0725849-af6c-4399-9beb-8df68d80963f" (UID: "d0725849-af6c-4399-9beb-8df68d80963f"). InnerVolumeSpecName "kube-api-access-tzvrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:12:01.125521 master-0 kubenswrapper[4055]: I0313 01:12:01.125447 4055 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzvrd\" (UniqueName: \"kubernetes.io/projected/d0725849-af6c-4399-9beb-8df68d80963f-kube-api-access-tzvrd\") on node \"master-0\" DevicePath \"\"" Mar 13 01:12:01.887183 master-0 kubenswrapper[4055]: I0313 01:12:01.887103 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-9psj4" event={"ID":"d0725849-af6c-4399-9beb-8df68d80963f","Type":"ContainerDied","Data":"38505749c8c0e6dcdac2e9baa8166fa2449cdebc9769720c9163d7ae1a7ea23b"} Mar 13 01:12:01.887183 master-0 kubenswrapper[4055]: I0313 01:12:01.887160 4055 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38505749c8c0e6dcdac2e9baa8166fa2449cdebc9769720c9163d7ae1a7ea23b" Mar 13 01:12:01.887183 master-0 kubenswrapper[4055]: I0313 01:12:01.887180 4055 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/mtu-prober-9psj4" Mar 13 01:12:03.832123 master-0 kubenswrapper[4055]: I0313 01:12:03.832030 4055 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-network-operator/mtu-prober-9psj4"] Mar 13 01:12:03.836775 master-0 kubenswrapper[4055]: I0313 01:12:03.836720 4055 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-network-operator/mtu-prober-9psj4"] Mar 13 01:12:04.685499 master-0 kubenswrapper[4055]: I0313 01:12:04.685400 4055 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0725849-af6c-4399-9beb-8df68d80963f" path="/var/lib/kubelet/pods/d0725849-af6c-4399-9beb-8df68d80963f/volumes" Mar 13 01:12:08.719851 master-0 kubenswrapper[4055]: I0313 01:12:08.719746 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-rvt5h"] Mar 13 01:12:08.720679 master-0 kubenswrapper[4055]: E0313 01:12:08.719899 4055 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0725849-af6c-4399-9beb-8df68d80963f" containerName="prober" Mar 13 01:12:08.720679 master-0 kubenswrapper[4055]: I0313 01:12:08.719928 4055 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0725849-af6c-4399-9beb-8df68d80963f" containerName="prober" Mar 13 01:12:08.720679 master-0 kubenswrapper[4055]: I0313 01:12:08.719976 4055 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0725849-af6c-4399-9beb-8df68d80963f" containerName="prober" Mar 13 01:12:08.720679 master-0 kubenswrapper[4055]: I0313 01:12:08.720282 4055 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-rvt5h" Mar 13 01:12:08.724064 master-0 kubenswrapper[4055]: I0313 01:12:08.723992 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 13 01:12:08.724064 master-0 kubenswrapper[4055]: I0313 01:12:08.724052 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 13 01:12:08.726836 master-0 kubenswrapper[4055]: I0313 01:12:08.726545 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 13 01:12:08.730475 master-0 kubenswrapper[4055]: I0313 01:12:08.730419 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 13 01:12:08.786067 master-0 kubenswrapper[4055]: I0313 01:12:08.785981 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-run-multus-certs\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:12:08.786067 master-0 kubenswrapper[4055]: I0313 01:12:08.786055 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2937cbe2-3125-4c3f-96f8-2febeb5942cc-cni-binary-copy\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:12:08.786380 master-0 kubenswrapper[4055]: I0313 01:12:08.786093 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-run-netns\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " 
pod="openshift-multus/multus-rvt5h" Mar 13 01:12:08.786380 master-0 kubenswrapper[4055]: I0313 01:12:08.786125 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-var-lib-cni-bin\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:12:08.786380 master-0 kubenswrapper[4055]: I0313 01:12:08.786246 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spxfj\" (UniqueName: \"kubernetes.io/projected/2937cbe2-3125-4c3f-96f8-2febeb5942cc-kube-api-access-spxfj\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:12:08.786380 master-0 kubenswrapper[4055]: I0313 01:12:08.786305 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-cnibin\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:12:08.786380 master-0 kubenswrapper[4055]: I0313 01:12:08.786340 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-hostroot\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:12:08.786380 master-0 kubenswrapper[4055]: I0313 01:12:08.786378 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-multus-cni-dir\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " 
pod="openshift-multus/multus-rvt5h" Mar 13 01:12:08.786777 master-0 kubenswrapper[4055]: I0313 01:12:08.786413 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-multus-socket-dir-parent\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:12:08.786777 master-0 kubenswrapper[4055]: I0313 01:12:08.786446 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-run-k8s-cni-cncf-io\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:12:08.786777 master-0 kubenswrapper[4055]: I0313 01:12:08.786477 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-var-lib-kubelet\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:12:08.786777 master-0 kubenswrapper[4055]: I0313 01:12:08.786505 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-system-cni-dir\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:12:08.786777 master-0 kubenswrapper[4055]: I0313 01:12:08.786536 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-var-lib-cni-multus\") pod \"multus-rvt5h\" 
(UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:12:08.786777 master-0 kubenswrapper[4055]: I0313 01:12:08.786602 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-os-release\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:12:08.786777 master-0 kubenswrapper[4055]: I0313 01:12:08.786660 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-multus-conf-dir\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:12:08.786777 master-0 kubenswrapper[4055]: I0313 01:12:08.786730 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2937cbe2-3125-4c3f-96f8-2febeb5942cc-multus-daemon-config\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:12:08.787185 master-0 kubenswrapper[4055]: I0313 01:12:08.786799 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-etc-kubernetes\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:12:08.887772 master-0 kubenswrapper[4055]: I0313 01:12:08.887679 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-run-k8s-cni-cncf-io\") pod \"multus-rvt5h\" (UID: 
\"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:12:08.887772 master-0 kubenswrapper[4055]: I0313 01:12:08.887778 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-multus-socket-dir-parent\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:12:08.888088 master-0 kubenswrapper[4055]: I0313 01:12:08.887828 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-var-lib-kubelet\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:12:08.888088 master-0 kubenswrapper[4055]: I0313 01:12:08.887873 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-run-k8s-cni-cncf-io\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:12:08.888088 master-0 kubenswrapper[4055]: I0313 01:12:08.887946 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-var-lib-kubelet\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:12:08.888088 master-0 kubenswrapper[4055]: I0313 01:12:08.888013 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-system-cni-dir\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " 
pod="openshift-multus/multus-rvt5h"
Mar 13 01:12:08.888322 master-0 kubenswrapper[4055]: I0313 01:12:08.888104 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-system-cni-dir\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h"
Mar 13 01:12:08.888322 master-0 kubenswrapper[4055]: I0313 01:12:08.888145 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-var-lib-cni-multus\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h"
Mar 13 01:12:08.888322 master-0 kubenswrapper[4055]: I0313 01:12:08.888195 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-os-release\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h"
Mar 13 01:12:08.888322 master-0 kubenswrapper[4055]: I0313 01:12:08.888225 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-multus-conf-dir\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h"
Mar 13 01:12:08.888322 master-0 kubenswrapper[4055]: I0313 01:12:08.888234 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-multus-socket-dir-parent\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h"
Mar 13 01:12:08.888322 master-0 kubenswrapper[4055]: I0313 01:12:08.888261 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2937cbe2-3125-4c3f-96f8-2febeb5942cc-multus-daemon-config\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h"
Mar 13 01:12:08.888322 master-0 kubenswrapper[4055]: I0313 01:12:08.888303 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-var-lib-cni-multus\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h"
Mar 13 01:12:08.888765 master-0 kubenswrapper[4055]: I0313 01:12:08.888389 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-os-release\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h"
Mar 13 01:12:08.888765 master-0 kubenswrapper[4055]: I0313 01:12:08.888442 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-multus-conf-dir\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h"
Mar 13 01:12:08.888765 master-0 kubenswrapper[4055]: I0313 01:12:08.888489 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-etc-kubernetes\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h"
Mar 13 01:12:08.888765 master-0 kubenswrapper[4055]: I0313 01:12:08.888530 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-run-multus-certs\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h"
Mar 13 01:12:08.888765 master-0 kubenswrapper[4055]: I0313 01:12:08.888543 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-etc-kubernetes\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h"
Mar 13 01:12:08.888765 master-0 kubenswrapper[4055]: I0313 01:12:08.888574 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-var-lib-cni-bin\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h"
Mar 13 01:12:08.888765 master-0 kubenswrapper[4055]: I0313 01:12:08.888587 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-run-multus-certs\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h"
Mar 13 01:12:08.888765 master-0 kubenswrapper[4055]: I0313 01:12:08.888614 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2937cbe2-3125-4c3f-96f8-2febeb5942cc-cni-binary-copy\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h"
Mar 13 01:12:08.888765 master-0 kubenswrapper[4055]: I0313 01:12:08.888665 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-var-lib-cni-bin\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h"
Mar 13 01:12:08.888765 master-0 kubenswrapper[4055]: I0313 01:12:08.888678 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-run-netns\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h"
Mar 13 01:12:08.888765 master-0 kubenswrapper[4055]: I0313 01:12:08.888715 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spxfj\" (UniqueName: \"kubernetes.io/projected/2937cbe2-3125-4c3f-96f8-2febeb5942cc-kube-api-access-spxfj\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h"
Mar 13 01:12:08.888765 master-0 kubenswrapper[4055]: I0313 01:12:08.888751 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-multus-cni-dir\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h"
Mar 13 01:12:08.889515 master-0 kubenswrapper[4055]: I0313 01:12:08.888783 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-cnibin\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h"
Mar 13 01:12:08.889515 master-0 kubenswrapper[4055]: I0313 01:12:08.888796 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-run-netns\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h"
Mar 13 01:12:08.889515 master-0 kubenswrapper[4055]: I0313 01:12:08.888876 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-hostroot\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h"
Mar 13 01:12:08.889515 master-0 kubenswrapper[4055]: I0313 01:12:08.888877 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-multus-cni-dir\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h"
Mar 13 01:12:08.889515 master-0 kubenswrapper[4055]: I0313 01:12:08.888924 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-hostroot\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h"
Mar 13 01:12:08.889515 master-0 kubenswrapper[4055]: I0313 01:12:08.888979 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-cnibin\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h"
Mar 13 01:12:08.890053 master-0 kubenswrapper[4055]: I0313 01:12:08.889830 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2937cbe2-3125-4c3f-96f8-2febeb5942cc-cni-binary-copy\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h"
Mar 13 01:12:08.890053 master-0 kubenswrapper[4055]: I0313 01:12:08.889933 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2937cbe2-3125-4c3f-96f8-2febeb5942cc-multus-daemon-config\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h"
Mar 13 01:12:08.910196 master-0 kubenswrapper[4055]: I0313 01:12:08.910119 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-xn5t5"]
Mar 13 01:12:08.910987 master-0 kubenswrapper[4055]: I0313 01:12:08.910945 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xn5t5"
Mar 13 01:12:08.913692 master-0 kubenswrapper[4055]: I0313 01:12:08.913618 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config"
Mar 13 01:12:08.913835 master-0 kubenswrapper[4055]: I0313 01:12:08.913730 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 13 01:12:08.917186 master-0 kubenswrapper[4055]: I0313 01:12:08.917130 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spxfj\" (UniqueName: \"kubernetes.io/projected/2937cbe2-3125-4c3f-96f8-2febeb5942cc-kube-api-access-spxfj\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h"
Mar 13 01:12:08.989881 master-0 kubenswrapper[4055]: I0313 01:12:08.989541 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4738c93d-62e6-44ce-a289-e646b9302e71-system-cni-dir\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5"
Mar 13 01:12:08.990041 master-0 kubenswrapper[4055]: I0313 01:12:08.989906 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4738c93d-62e6-44ce-a289-e646b9302e71-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5"
Mar 13 01:12:08.990041 master-0 kubenswrapper[4055]: I0313 01:12:08.989947 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4738c93d-62e6-44ce-a289-e646b9302e71-os-release\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5"
Mar 13 01:12:08.990163 master-0 kubenswrapper[4055]: I0313 01:12:08.990022 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4738c93d-62e6-44ce-a289-e646b9302e71-cni-binary-copy\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5"
Mar 13 01:12:08.990163 master-0 kubenswrapper[4055]: I0313 01:12:08.990092 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4738c93d-62e6-44ce-a289-e646b9302e71-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5"
Mar 13 01:12:08.990280 master-0 kubenswrapper[4055]: I0313 01:12:08.990161 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4738c93d-62e6-44ce-a289-e646b9302e71-cnibin\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5"
Mar 13 01:12:08.990280 master-0 kubenswrapper[4055]: I0313 01:12:08.990201 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/4738c93d-62e6-44ce-a289-e646b9302e71-whereabouts-configmap\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5"
Mar 13 01:12:08.990280 master-0 kubenswrapper[4055]: I0313 01:12:08.990242 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gt69\" (UniqueName: \"kubernetes.io/projected/4738c93d-62e6-44ce-a289-e646b9302e71-kube-api-access-9gt69\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5"
Mar 13 01:12:09.045244 master-0 kubenswrapper[4055]: I0313 01:12:09.045182 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rvt5h"
Mar 13 01:12:09.062481 master-0 kubenswrapper[4055]: W0313 01:12:09.062409 4055 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2937cbe2_3125_4c3f_96f8_2febeb5942cc.slice/crio-609f306bf915c2fd338574a4571df8d99042d408bdae6c3187fb345e69136829 WatchSource:0}: Error finding container 609f306bf915c2fd338574a4571df8d99042d408bdae6c3187fb345e69136829: Status 404 returned error can't find the container with id 609f306bf915c2fd338574a4571df8d99042d408bdae6c3187fb345e69136829
Mar 13 01:12:09.091595 master-0 kubenswrapper[4055]: I0313 01:12:09.091485 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4738c93d-62e6-44ce-a289-e646b9302e71-system-cni-dir\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5"
Mar 13 01:12:09.091929 master-0 kubenswrapper[4055]: I0313 01:12:09.091684 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4738c93d-62e6-44ce-a289-e646b9302e71-system-cni-dir\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5"
Mar 13 01:12:09.091929 master-0 kubenswrapper[4055]: I0313 01:12:09.091799 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4738c93d-62e6-44ce-a289-e646b9302e71-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5"
Mar 13 01:12:09.091929 master-0 kubenswrapper[4055]: I0313 01:12:09.091894 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4738c93d-62e6-44ce-a289-e646b9302e71-os-release\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5"
Mar 13 01:12:09.092124 master-0 kubenswrapper[4055]: I0313 01:12:09.091983 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4738c93d-62e6-44ce-a289-e646b9302e71-cni-binary-copy\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5"
Mar 13 01:12:09.092188 master-0 kubenswrapper[4055]: I0313 01:12:09.092119 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4738c93d-62e6-44ce-a289-e646b9302e71-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5"
Mar 13 01:12:09.092188 master-0 kubenswrapper[4055]: I0313 01:12:09.092118 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4738c93d-62e6-44ce-a289-e646b9302e71-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5"
Mar 13 01:12:09.092309 master-0 kubenswrapper[4055]: I0313 01:12:09.092166 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4738c93d-62e6-44ce-a289-e646b9302e71-os-release\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5"
Mar 13 01:12:09.092309 master-0 kubenswrapper[4055]: I0313 01:12:09.092229 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4738c93d-62e6-44ce-a289-e646b9302e71-cnibin\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5"
Mar 13 01:12:09.092309 master-0 kubenswrapper[4055]: I0313 01:12:09.092193 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4738c93d-62e6-44ce-a289-e646b9302e71-cnibin\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5"
Mar 13 01:12:09.092466 master-0 kubenswrapper[4055]: I0313 01:12:09.092317 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/4738c93d-62e6-44ce-a289-e646b9302e71-whereabouts-configmap\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5"
Mar 13 01:12:09.092466 master-0 kubenswrapper[4055]: I0313 01:12:09.092407 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gt69\" (UniqueName: \"kubernetes.io/projected/4738c93d-62e6-44ce-a289-e646b9302e71-kube-api-access-9gt69\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5"
Mar 13 01:12:09.093609 master-0 kubenswrapper[4055]: I0313 01:12:09.093567 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4738c93d-62e6-44ce-a289-e646b9302e71-cni-binary-copy\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5"
Mar 13 01:12:09.093762 master-0 kubenswrapper[4055]: I0313 01:12:09.093675 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4738c93d-62e6-44ce-a289-e646b9302e71-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5"
Mar 13 01:12:09.094703 master-0 kubenswrapper[4055]: I0313 01:12:09.093759 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/4738c93d-62e6-44ce-a289-e646b9302e71-whereabouts-configmap\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5"
Mar 13 01:12:09.120240 master-0 kubenswrapper[4055]: I0313 01:12:09.120207 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gt69\" (UniqueName: \"kubernetes.io/projected/4738c93d-62e6-44ce-a289-e646b9302e71-kube-api-access-9gt69\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5"
Mar 13 01:12:09.237928 master-0 kubenswrapper[4055]: I0313 01:12:09.237818 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xn5t5"
Mar 13 01:12:09.252039 master-0 kubenswrapper[4055]: W0313 01:12:09.251980 4055 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4738c93d_62e6_44ce_a289_e646b9302e71.slice/crio-71ba48ea25a9442b7ddaf6a81ce73c503ec65916615ad4703b66f198c2ddd8c0 WatchSource:0}: Error finding container 71ba48ea25a9442b7ddaf6a81ce73c503ec65916615ad4703b66f198c2ddd8c0: Status 404 returned error can't find the container with id 71ba48ea25a9442b7ddaf6a81ce73c503ec65916615ad4703b66f198c2ddd8c0
Mar 13 01:12:09.701974 master-0 kubenswrapper[4055]: I0313 01:12:09.701851 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-zh5fh"]
Mar 13 01:12:09.703305 master-0 kubenswrapper[4055]: I0313 01:12:09.702442 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh5fh"
Mar 13 01:12:09.703305 master-0 kubenswrapper[4055]: E0313 01:12:09.702585 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zh5fh" podUID="e68ab3cb-c372-45d9-a758-beaf4c213714"
Mar 13 01:12:09.799056 master-0 kubenswrapper[4055]: I0313 01:12:09.798930 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs\") pod \"network-metrics-daemon-zh5fh\" (UID: \"e68ab3cb-c372-45d9-a758-beaf4c213714\") " pod="openshift-multus/network-metrics-daemon-zh5fh"
Mar 13 01:12:09.799056 master-0 kubenswrapper[4055]: I0313 01:12:09.799002 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjhjj\" (UniqueName: \"kubernetes.io/projected/e68ab3cb-c372-45d9-a758-beaf4c213714-kube-api-access-zjhjj\") pod \"network-metrics-daemon-zh5fh\" (UID: \"e68ab3cb-c372-45d9-a758-beaf4c213714\") " pod="openshift-multus/network-metrics-daemon-zh5fh"
Mar 13 01:12:09.899933 master-0 kubenswrapper[4055]: I0313 01:12:09.899841 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert\") pod \"cluster-version-operator-745944c6b7-zc6gt\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt"
Mar 13 01:12:09.899933 master-0 kubenswrapper[4055]: I0313 01:12:09.899888 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs\") pod \"network-metrics-daemon-zh5fh\" (UID: \"e68ab3cb-c372-45d9-a758-beaf4c213714\") " pod="openshift-multus/network-metrics-daemon-zh5fh"
Mar 13 01:12:09.899933 master-0 kubenswrapper[4055]: I0313 01:12:09.899911 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjhjj\" (UniqueName: \"kubernetes.io/projected/e68ab3cb-c372-45d9-a758-beaf4c213714-kube-api-access-zjhjj\") pod \"network-metrics-daemon-zh5fh\" (UID: \"e68ab3cb-c372-45d9-a758-beaf4c213714\") " pod="openshift-multus/network-metrics-daemon-zh5fh"
Mar 13 01:12:09.900479 master-0 kubenswrapper[4055]: E0313 01:12:09.900391 4055 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 13 01:12:09.900584 master-0 kubenswrapper[4055]: E0313 01:12:09.900507 4055 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 13 01:12:09.900584 master-0 kubenswrapper[4055]: E0313 01:12:09.900560 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert podName:bbf0bd4d-3387-43c3-b9d5-61a044fa2138 nodeName:}" failed. No retries permitted until 2026-03-13 01:12:41.900520441 +0000 UTC m=+100.063579519 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert") pod "cluster-version-operator-745944c6b7-zc6gt" (UID: "bbf0bd4d-3387-43c3-b9d5-61a044fa2138") : secret "cluster-version-operator-serving-cert" not found
Mar 13 01:12:09.900837 master-0 kubenswrapper[4055]: E0313 01:12:09.900596 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs podName:e68ab3cb-c372-45d9-a758-beaf4c213714 nodeName:}" failed. No retries permitted until 2026-03-13 01:12:10.400579553 +0000 UTC m=+68.563638631 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs") pod "network-metrics-daemon-zh5fh" (UID: "e68ab3cb-c372-45d9-a758-beaf4c213714") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 13 01:12:09.910806 master-0 kubenswrapper[4055]: I0313 01:12:09.910736 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rvt5h" event={"ID":"2937cbe2-3125-4c3f-96f8-2febeb5942cc","Type":"ContainerStarted","Data":"609f306bf915c2fd338574a4571df8d99042d408bdae6c3187fb345e69136829"}
Mar 13 01:12:09.912381 master-0 kubenswrapper[4055]: I0313 01:12:09.912296 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xn5t5" event={"ID":"4738c93d-62e6-44ce-a289-e646b9302e71","Type":"ContainerStarted","Data":"71ba48ea25a9442b7ddaf6a81ce73c503ec65916615ad4703b66f198c2ddd8c0"}
Mar 13 01:12:09.925073 master-0 kubenswrapper[4055]: I0313 01:12:09.925021 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjhjj\" (UniqueName: \"kubernetes.io/projected/e68ab3cb-c372-45d9-a758-beaf4c213714-kube-api-access-zjhjj\") pod \"network-metrics-daemon-zh5fh\" (UID: \"e68ab3cb-c372-45d9-a758-beaf4c213714\") " pod="openshift-multus/network-metrics-daemon-zh5fh"
Mar 13 01:12:10.403537 master-0 kubenswrapper[4055]: I0313 01:12:10.403428 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs\") pod \"network-metrics-daemon-zh5fh\" (UID: \"e68ab3cb-c372-45d9-a758-beaf4c213714\") " pod="openshift-multus/network-metrics-daemon-zh5fh"
Mar 13 01:12:10.403910 master-0 kubenswrapper[4055]: E0313 01:12:10.403692 4055 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 13 01:12:10.403910 master-0 kubenswrapper[4055]: E0313 01:12:10.403842 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs podName:e68ab3cb-c372-45d9-a758-beaf4c213714 nodeName:}" failed. No retries permitted until 2026-03-13 01:12:11.403800497 +0000 UTC m=+69.566859595 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs") pod "network-metrics-daemon-zh5fh" (UID: "e68ab3cb-c372-45d9-a758-beaf4c213714") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 13 01:12:11.413697 master-0 kubenswrapper[4055]: I0313 01:12:11.412986 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs\") pod \"network-metrics-daemon-zh5fh\" (UID: \"e68ab3cb-c372-45d9-a758-beaf4c213714\") " pod="openshift-multus/network-metrics-daemon-zh5fh"
Mar 13 01:12:11.413697 master-0 kubenswrapper[4055]: E0313 01:12:11.413159 4055 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 13 01:12:11.413697 master-0 kubenswrapper[4055]: E0313 01:12:11.413224 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs podName:e68ab3cb-c372-45d9-a758-beaf4c213714 nodeName:}" failed. No retries permitted until 2026-03-13 01:12:13.413202388 +0000 UTC m=+71.576261466 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs") pod "network-metrics-daemon-zh5fh" (UID: "e68ab3cb-c372-45d9-a758-beaf4c213714") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 13 01:12:11.678806 master-0 kubenswrapper[4055]: I0313 01:12:11.678690 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh5fh"
Mar 13 01:12:11.679008 master-0 kubenswrapper[4055]: E0313 01:12:11.678820 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zh5fh" podUID="e68ab3cb-c372-45d9-a758-beaf4c213714"
Mar 13 01:12:13.430072 master-0 kubenswrapper[4055]: I0313 01:12:13.430010 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs\") pod \"network-metrics-daemon-zh5fh\" (UID: \"e68ab3cb-c372-45d9-a758-beaf4c213714\") " pod="openshift-multus/network-metrics-daemon-zh5fh"
Mar 13 01:12:13.430499 master-0 kubenswrapper[4055]: E0313 01:12:13.430211 4055 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 13 01:12:13.430499 master-0 kubenswrapper[4055]: E0313 01:12:13.430335 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs podName:e68ab3cb-c372-45d9-a758-beaf4c213714 nodeName:}" failed. No retries permitted until 2026-03-13 01:12:17.430307525 +0000 UTC m=+75.593366583 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs") pod "network-metrics-daemon-zh5fh" (UID: "e68ab3cb-c372-45d9-a758-beaf4c213714") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 13 01:12:13.678625 master-0 kubenswrapper[4055]: I0313 01:12:13.678567 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh5fh"
Mar 13 01:12:13.678830 master-0 kubenswrapper[4055]: E0313 01:12:13.678711 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zh5fh" podUID="e68ab3cb-c372-45d9-a758-beaf4c213714"
Mar 13 01:12:15.679437 master-0 kubenswrapper[4055]: I0313 01:12:15.679375 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh5fh"
Mar 13 01:12:15.680400 master-0 kubenswrapper[4055]: E0313 01:12:15.679624 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zh5fh" podUID="e68ab3cb-c372-45d9-a758-beaf4c213714"
Mar 13 01:12:16.930559 master-0 kubenswrapper[4055]: I0313 01:12:16.930493 4055 generic.go:334] "Generic (PLEG): container finished" podID="4738c93d-62e6-44ce-a289-e646b9302e71" containerID="f1844314bd4c14c44c294275c228ee201df2f8be5daa877db9d32b69fb506d82" exitCode=0
Mar 13 01:12:16.930559 master-0 kubenswrapper[4055]: I0313 01:12:16.930544 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xn5t5" event={"ID":"4738c93d-62e6-44ce-a289-e646b9302e71","Type":"ContainerDied","Data":"f1844314bd4c14c44c294275c228ee201df2f8be5daa877db9d32b69fb506d82"}
Mar 13 01:12:17.465013 master-0 kubenswrapper[4055]: I0313 01:12:17.463388 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs\") pod \"network-metrics-daemon-zh5fh\" (UID: \"e68ab3cb-c372-45d9-a758-beaf4c213714\") " pod="openshift-multus/network-metrics-daemon-zh5fh"
Mar 13 01:12:17.465013 master-0 kubenswrapper[4055]: E0313 01:12:17.463577 4055 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 13 01:12:17.465013 master-0 kubenswrapper[4055]: E0313 01:12:17.463680 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs podName:e68ab3cb-c372-45d9-a758-beaf4c213714 nodeName:}" failed. No retries permitted until 2026-03-13 01:12:25.463654521 +0000 UTC m=+83.626713589 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs") pod "network-metrics-daemon-zh5fh" (UID: "e68ab3cb-c372-45d9-a758-beaf4c213714") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 13 01:12:17.679333 master-0 kubenswrapper[4055]: I0313 01:12:17.679278 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh5fh"
Mar 13 01:12:17.679601 master-0 kubenswrapper[4055]: E0313 01:12:17.679403 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zh5fh" podUID="e68ab3cb-c372-45d9-a758-beaf4c213714"
Mar 13 01:12:19.680011 master-0 kubenswrapper[4055]: I0313 01:12:19.679941 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh5fh"
Mar 13 01:12:19.680487 master-0 kubenswrapper[4055]: E0313 01:12:19.680116 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zh5fh" podUID="e68ab3cb-c372-45d9-a758-beaf4c213714"
Mar 13 01:12:21.123833 master-0 kubenswrapper[4055]: I0313 01:12:21.123769 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd"]
Mar 13 01:12:21.124947 master-0 kubenswrapper[4055]: I0313 01:12:21.124179 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd"
Mar 13 01:12:21.131824 master-0 kubenswrapper[4055]: I0313 01:12:21.131003 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 13 01:12:21.131824 master-0 kubenswrapper[4055]: I0313 01:12:21.131367 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 13 01:12:21.131824 master-0 kubenswrapper[4055]: I0313 01:12:21.131530 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 13 01:12:21.131824 master-0 kubenswrapper[4055]: I0313 01:12:21.131548 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 13 01:12:21.131824 master-0 kubenswrapper[4055]: I0313 01:12:21.131708 4055 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 13 01:12:21.206122 master-0 kubenswrapper[4055]: I0313 01:12:21.206053 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv2rb\" (UniqueName: \"kubernetes.io/projected/1308fba1-a50d-48b3-b272-7bef44727b7f-kube-api-access-zv2rb\") pod \"ovnkube-control-plane-66b55d57d-cjmvd\" (UID: \"1308fba1-a50d-48b3-b272-7bef44727b7f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd"
Mar 13 01:12:21.206122 master-0 kubenswrapper[4055]: I0313 01:12:21.206130 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1308fba1-a50d-48b3-b272-7bef44727b7f-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-cjmvd\" (UID: \"1308fba1-a50d-48b3-b272-7bef44727b7f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd"
Mar 13
01:12:21.206447 master-0 kubenswrapper[4055]: I0313 01:12:21.206163 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1308fba1-a50d-48b3-b272-7bef44727b7f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-cjmvd\" (UID: \"1308fba1-a50d-48b3-b272-7bef44727b7f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd" Mar 13 01:12:21.206447 master-0 kubenswrapper[4055]: I0313 01:12:21.206183 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1308fba1-a50d-48b3-b272-7bef44727b7f-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-cjmvd\" (UID: \"1308fba1-a50d-48b3-b272-7bef44727b7f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd" Mar 13 01:12:21.306690 master-0 kubenswrapper[4055]: I0313 01:12:21.306597 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv2rb\" (UniqueName: \"kubernetes.io/projected/1308fba1-a50d-48b3-b272-7bef44727b7f-kube-api-access-zv2rb\") pod \"ovnkube-control-plane-66b55d57d-cjmvd\" (UID: \"1308fba1-a50d-48b3-b272-7bef44727b7f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd" Mar 13 01:12:21.306899 master-0 kubenswrapper[4055]: I0313 01:12:21.306720 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1308fba1-a50d-48b3-b272-7bef44727b7f-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-cjmvd\" (UID: \"1308fba1-a50d-48b3-b272-7bef44727b7f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd" Mar 13 01:12:21.306899 master-0 kubenswrapper[4055]: I0313 01:12:21.306781 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/1308fba1-a50d-48b3-b272-7bef44727b7f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-cjmvd\" (UID: \"1308fba1-a50d-48b3-b272-7bef44727b7f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd" Mar 13 01:12:21.306899 master-0 kubenswrapper[4055]: I0313 01:12:21.306822 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1308fba1-a50d-48b3-b272-7bef44727b7f-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-cjmvd\" (UID: \"1308fba1-a50d-48b3-b272-7bef44727b7f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd" Mar 13 01:12:21.311905 master-0 kubenswrapper[4055]: I0313 01:12:21.308068 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1308fba1-a50d-48b3-b272-7bef44727b7f-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-cjmvd\" (UID: \"1308fba1-a50d-48b3-b272-7bef44727b7f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd" Mar 13 01:12:21.311905 master-0 kubenswrapper[4055]: I0313 01:12:21.308689 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1308fba1-a50d-48b3-b272-7bef44727b7f-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-cjmvd\" (UID: \"1308fba1-a50d-48b3-b272-7bef44727b7f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd" Mar 13 01:12:21.311905 master-0 kubenswrapper[4055]: I0313 01:12:21.311289 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1308fba1-a50d-48b3-b272-7bef44727b7f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-cjmvd\" (UID: \"1308fba1-a50d-48b3-b272-7bef44727b7f\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd" Mar 13 01:12:21.313128 master-0 kubenswrapper[4055]: I0313 01:12:21.313096 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5h8l9"] Mar 13 01:12:21.313802 master-0 kubenswrapper[4055]: I0313 01:12:21.313770 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.317559 master-0 kubenswrapper[4055]: I0313 01:12:21.317532 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 13 01:12:21.317559 master-0 kubenswrapper[4055]: I0313 01:12:21.317534 4055 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 13 01:12:21.339143 master-0 kubenswrapper[4055]: I0313 01:12:21.339100 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv2rb\" (UniqueName: \"kubernetes.io/projected/1308fba1-a50d-48b3-b272-7bef44727b7f-kube-api-access-zv2rb\") pod \"ovnkube-control-plane-66b55d57d-cjmvd\" (UID: \"1308fba1-a50d-48b3-b272-7bef44727b7f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd" Mar 13 01:12:21.407804 master-0 kubenswrapper[4055]: I0313 01:12:21.407647 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/33108f6a-7f8c-46c6-b93d-e5e820035677-env-overrides\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.407804 master-0 kubenswrapper[4055]: I0313 01:12:21.407694 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-etc-openvswitch\") pod 
\"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.407804 master-0 kubenswrapper[4055]: I0313 01:12:21.407715 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/33108f6a-7f8c-46c6-b93d-e5e820035677-ovn-node-metrics-cert\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.407804 master-0 kubenswrapper[4055]: I0313 01:12:21.407741 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-kubelet\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.407804 master-0 kubenswrapper[4055]: I0313 01:12:21.407758 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-cni-netd\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.407804 master-0 kubenswrapper[4055]: I0313 01:12:21.407772 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/33108f6a-7f8c-46c6-b93d-e5e820035677-ovnkube-config\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.407804 master-0 kubenswrapper[4055]: I0313 01:12:21.407789 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-var-lib-openvswitch\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.407804 master-0 kubenswrapper[4055]: I0313 01:12:21.407803 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-run-openvswitch\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.407804 master-0 kubenswrapper[4055]: I0313 01:12:21.407819 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-run-systemd\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.408450 master-0 kubenswrapper[4055]: I0313 01:12:21.407834 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-log-socket\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.408450 master-0 kubenswrapper[4055]: I0313 01:12:21.407848 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/33108f6a-7f8c-46c6-b93d-e5e820035677-ovnkube-script-lib\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.408450 master-0 kubenswrapper[4055]: I0313 01:12:21.407862 4055 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-systemd-units\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.408450 master-0 kubenswrapper[4055]: I0313 01:12:21.407875 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-run-netns\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.408450 master-0 kubenswrapper[4055]: I0313 01:12:21.407892 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-run-ovn\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.408450 master-0 kubenswrapper[4055]: I0313 01:12:21.407907 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-cni-bin\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.408450 master-0 kubenswrapper[4055]: I0313 01:12:21.407921 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-slash\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.408450 master-0 kubenswrapper[4055]: I0313 
01:12:21.407934 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-node-log\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.408450 master-0 kubenswrapper[4055]: I0313 01:12:21.407951 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-run-ovn-kubernetes\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.408450 master-0 kubenswrapper[4055]: I0313 01:12:21.407982 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.408450 master-0 kubenswrapper[4055]: I0313 01:12:21.407998 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd2x5\" (UniqueName: \"kubernetes.io/projected/33108f6a-7f8c-46c6-b93d-e5e820035677-kube-api-access-nd2x5\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.464956 master-0 kubenswrapper[4055]: I0313 01:12:21.464908 4055 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd" Mar 13 01:12:21.508511 master-0 kubenswrapper[4055]: I0313 01:12:21.508452 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-run-ovn\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.508511 master-0 kubenswrapper[4055]: I0313 01:12:21.508491 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-cni-bin\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.508806 master-0 kubenswrapper[4055]: I0313 01:12:21.508597 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-run-ovn\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.508806 master-0 kubenswrapper[4055]: I0313 01:12:21.508749 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-slash\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.508923 master-0 kubenswrapper[4055]: I0313 01:12:21.508781 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-cni-bin\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.508923 master-0 kubenswrapper[4055]: I0313 01:12:21.508839 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-slash\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.508923 master-0 kubenswrapper[4055]: I0313 01:12:21.508894 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-node-log\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.509097 master-0 kubenswrapper[4055]: I0313 01:12:21.508954 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-run-ovn-kubernetes\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.509097 master-0 kubenswrapper[4055]: I0313 01:12:21.509014 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-node-log\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.509097 master-0 kubenswrapper[4055]: I0313 01:12:21.509057 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-run-ovn-kubernetes\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.509097 master-0 kubenswrapper[4055]: I0313 01:12:21.509067 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.509325 master-0 kubenswrapper[4055]: I0313 01:12:21.509106 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd2x5\" (UniqueName: \"kubernetes.io/projected/33108f6a-7f8c-46c6-b93d-e5e820035677-kube-api-access-nd2x5\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.509325 master-0 kubenswrapper[4055]: I0313 01:12:21.509145 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/33108f6a-7f8c-46c6-b93d-e5e820035677-env-overrides\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.509445 master-0 kubenswrapper[4055]: I0313 01:12:21.509318 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-etc-openvswitch\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.509445 master-0 kubenswrapper[4055]: I0313 01:12:21.509366 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.509445 master-0 kubenswrapper[4055]: I0313 01:12:21.509372 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/33108f6a-7f8c-46c6-b93d-e5e820035677-ovn-node-metrics-cert\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.509445 master-0 kubenswrapper[4055]: I0313 01:12:21.509400 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-kubelet\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.509699 master-0 kubenswrapper[4055]: I0313 01:12:21.509474 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-etc-openvswitch\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.509699 master-0 kubenswrapper[4055]: I0313 01:12:21.509530 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-cni-netd\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.509699 master-0 kubenswrapper[4055]: I0313 01:12:21.509551 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-kubelet\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.509699 master-0 kubenswrapper[4055]: I0313 01:12:21.509555 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/33108f6a-7f8c-46c6-b93d-e5e820035677-ovnkube-config\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.509935 master-0 kubenswrapper[4055]: I0313 01:12:21.509697 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-cni-netd\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.509935 master-0 kubenswrapper[4055]: I0313 01:12:21.509819 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-run-systemd\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.509935 master-0 kubenswrapper[4055]: I0313 01:12:21.509860 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-var-lib-openvswitch\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.509935 master-0 kubenswrapper[4055]: I0313 01:12:21.509893 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-run-openvswitch\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.509935 master-0 kubenswrapper[4055]: I0313 01:12:21.509914 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-run-systemd\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.514133 master-0 kubenswrapper[4055]: I0313 01:12:21.509970 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-var-lib-openvswitch\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.514133 master-0 kubenswrapper[4055]: I0313 01:12:21.509926 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-systemd-units\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.514133 master-0 kubenswrapper[4055]: I0313 01:12:21.510011 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-run-openvswitch\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.514133 master-0 kubenswrapper[4055]: I0313 01:12:21.510027 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-run-netns\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.514133 master-0 kubenswrapper[4055]: I0313 01:12:21.509975 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-systemd-units\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.514133 master-0 kubenswrapper[4055]: I0313 01:12:21.510522 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-run-netns\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.514133 master-0 kubenswrapper[4055]: I0313 01:12:21.510078 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-log-socket\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.514133 master-0 kubenswrapper[4055]: I0313 01:12:21.510654 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/33108f6a-7f8c-46c6-b93d-e5e820035677-env-overrides\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.514133 master-0 kubenswrapper[4055]: I0313 01:12:21.510666 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/33108f6a-7f8c-46c6-b93d-e5e820035677-ovnkube-script-lib\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.514133 master-0 kubenswrapper[4055]: I0313 01:12:21.510739 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-log-socket\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.514133 master-0 kubenswrapper[4055]: I0313 01:12:21.511737 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/33108f6a-7f8c-46c6-b93d-e5e820035677-ovnkube-config\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.517042 master-0 kubenswrapper[4055]: I0313 01:12:21.516954 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/33108f6a-7f8c-46c6-b93d-e5e820035677-ovn-node-metrics-cert\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.526261 master-0 kubenswrapper[4055]: I0313 01:12:21.525840 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/33108f6a-7f8c-46c6-b93d-e5e820035677-ovnkube-script-lib\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.528761 master-0 kubenswrapper[4055]: I0313 01:12:21.528697 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd2x5\" (UniqueName: 
\"kubernetes.io/projected/33108f6a-7f8c-46c6-b93d-e5e820035677-kube-api-access-nd2x5\") pod \"ovnkube-node-5h8l9\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.629286 master-0 kubenswrapper[4055]: I0313 01:12:21.629100 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:21.679542 master-0 kubenswrapper[4055]: I0313 01:12:21.679351 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh5fh" Mar 13 01:12:21.679542 master-0 kubenswrapper[4055]: E0313 01:12:21.679480 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zh5fh" podUID="e68ab3cb-c372-45d9-a758-beaf4c213714" Mar 13 01:12:23.679510 master-0 kubenswrapper[4055]: I0313 01:12:23.679459 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh5fh" Mar 13 01:12:23.679982 master-0 kubenswrapper[4055]: E0313 01:12:23.679655 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zh5fh" podUID="e68ab3cb-c372-45d9-a758-beaf4c213714" Mar 13 01:12:24.289916 master-0 kubenswrapper[4055]: I0313 01:12:24.289843 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xs8pt"] Mar 13 01:12:24.290224 master-0 kubenswrapper[4055]: I0313 01:12:24.290204 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xs8pt" Mar 13 01:12:24.290381 master-0 kubenswrapper[4055]: E0313 01:12:24.290348 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xs8pt" podUID="d5456c8b-3c98-4824-8700-a04e9c12fb2e" Mar 13 01:12:24.435447 master-0 kubenswrapper[4055]: I0313 01:12:24.435395 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnxgm\" (UniqueName: \"kubernetes.io/projected/d5456c8b-3c98-4824-8700-a04e9c12fb2e-kube-api-access-mnxgm\") pod \"network-check-target-xs8pt\" (UID: \"d5456c8b-3c98-4824-8700-a04e9c12fb2e\") " pod="openshift-network-diagnostics/network-check-target-xs8pt" Mar 13 01:12:24.536801 master-0 kubenswrapper[4055]: I0313 01:12:24.536739 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnxgm\" (UniqueName: \"kubernetes.io/projected/d5456c8b-3c98-4824-8700-a04e9c12fb2e-kube-api-access-mnxgm\") pod \"network-check-target-xs8pt\" (UID: \"d5456c8b-3c98-4824-8700-a04e9c12fb2e\") " pod="openshift-network-diagnostics/network-check-target-xs8pt" Mar 13 01:12:24.555670 master-0 kubenswrapper[4055]: E0313 01:12:24.555538 4055 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 01:12:24.555670 master-0 kubenswrapper[4055]: E0313 01:12:24.555577 4055 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 01:12:24.555670 master-0 kubenswrapper[4055]: E0313 01:12:24.555593 4055 projected.go:194] Error preparing data for projected volume kube-api-access-mnxgm for pod openshift-network-diagnostics/network-check-target-xs8pt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 01:12:24.555911 master-0 kubenswrapper[4055]: E0313 01:12:24.555683 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d5456c8b-3c98-4824-8700-a04e9c12fb2e-kube-api-access-mnxgm podName:d5456c8b-3c98-4824-8700-a04e9c12fb2e nodeName:}" failed. No retries permitted until 2026-03-13 01:12:25.055662623 +0000 UTC m=+83.218721661 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-mnxgm" (UniqueName: "kubernetes.io/projected/d5456c8b-3c98-4824-8700-a04e9c12fb2e-kube-api-access-mnxgm") pod "network-check-target-xs8pt" (UID: "d5456c8b-3c98-4824-8700-a04e9c12fb2e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 01:12:25.141128 master-0 kubenswrapper[4055]: I0313 01:12:25.140979 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnxgm\" (UniqueName: \"kubernetes.io/projected/d5456c8b-3c98-4824-8700-a04e9c12fb2e-kube-api-access-mnxgm\") pod \"network-check-target-xs8pt\" (UID: \"d5456c8b-3c98-4824-8700-a04e9c12fb2e\") " pod="openshift-network-diagnostics/network-check-target-xs8pt" Mar 13 01:12:25.141128 master-0 kubenswrapper[4055]: E0313 01:12:25.141107 4055 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 01:12:25.141128 master-0 kubenswrapper[4055]: E0313 01:12:25.141122 4055 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 01:12:25.141128 master-0 kubenswrapper[4055]: E0313 01:12:25.141133 4055 projected.go:194] Error preparing data for projected volume kube-api-access-mnxgm for pod openshift-network-diagnostics/network-check-target-xs8pt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 01:12:25.141996 master-0 kubenswrapper[4055]: E0313 01:12:25.141172 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d5456c8b-3c98-4824-8700-a04e9c12fb2e-kube-api-access-mnxgm podName:d5456c8b-3c98-4824-8700-a04e9c12fb2e nodeName:}" 
failed. No retries permitted until 2026-03-13 01:12:26.14115956 +0000 UTC m=+84.304218608 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-mnxgm" (UniqueName: "kubernetes.io/projected/d5456c8b-3c98-4824-8700-a04e9c12fb2e-kube-api-access-mnxgm") pod "network-check-target-xs8pt" (UID: "d5456c8b-3c98-4824-8700-a04e9c12fb2e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 01:12:25.544581 master-0 kubenswrapper[4055]: I0313 01:12:25.544547 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs\") pod \"network-metrics-daemon-zh5fh\" (UID: \"e68ab3cb-c372-45d9-a758-beaf4c213714\") " pod="openshift-multus/network-metrics-daemon-zh5fh" Mar 13 01:12:25.544723 master-0 kubenswrapper[4055]: E0313 01:12:25.544689 4055 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 01:12:25.544767 master-0 kubenswrapper[4055]: E0313 01:12:25.544737 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs podName:e68ab3cb-c372-45d9-a758-beaf4c213714 nodeName:}" failed. No retries permitted until 2026-03-13 01:12:41.544723154 +0000 UTC m=+99.707782192 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs") pod "network-metrics-daemon-zh5fh" (UID: "e68ab3cb-c372-45d9-a758-beaf4c213714") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 01:12:25.679507 master-0 kubenswrapper[4055]: I0313 01:12:25.678960 4055 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xs8pt" Mar 13 01:12:25.679507 master-0 kubenswrapper[4055]: I0313 01:12:25.678988 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh5fh" Mar 13 01:12:25.679507 master-0 kubenswrapper[4055]: E0313 01:12:25.679121 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xs8pt" podUID="d5456c8b-3c98-4824-8700-a04e9c12fb2e" Mar 13 01:12:25.679507 master-0 kubenswrapper[4055]: E0313 01:12:25.679259 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zh5fh" podUID="e68ab3cb-c372-45d9-a758-beaf4c213714" Mar 13 01:12:26.006223 master-0 kubenswrapper[4055]: W0313 01:12:26.006123 4055 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1308fba1_a50d_48b3_b272_7bef44727b7f.slice/crio-f7ec163ebceef5cfb5ab7329a8164e22ce5f45f24019f15e580acb9f1392f8cb WatchSource:0}: Error finding container f7ec163ebceef5cfb5ab7329a8164e22ce5f45f24019f15e580acb9f1392f8cb: Status 404 returned error can't find the container with id f7ec163ebceef5cfb5ab7329a8164e22ce5f45f24019f15e580acb9f1392f8cb Mar 13 01:12:26.149578 master-0 kubenswrapper[4055]: I0313 01:12:26.149543 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnxgm\" (UniqueName: \"kubernetes.io/projected/d5456c8b-3c98-4824-8700-a04e9c12fb2e-kube-api-access-mnxgm\") pod \"network-check-target-xs8pt\" (UID: \"d5456c8b-3c98-4824-8700-a04e9c12fb2e\") " pod="openshift-network-diagnostics/network-check-target-xs8pt" Mar 13 01:12:26.149917 master-0 kubenswrapper[4055]: E0313 01:12:26.149687 4055 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 01:12:26.149917 master-0 kubenswrapper[4055]: E0313 01:12:26.149711 4055 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 01:12:26.149917 master-0 kubenswrapper[4055]: E0313 01:12:26.149723 4055 projected.go:194] Error preparing data for projected volume kube-api-access-mnxgm for pod openshift-network-diagnostics/network-check-target-xs8pt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 01:12:26.149917 master-0 
kubenswrapper[4055]: E0313 01:12:26.149763 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d5456c8b-3c98-4824-8700-a04e9c12fb2e-kube-api-access-mnxgm podName:d5456c8b-3c98-4824-8700-a04e9c12fb2e nodeName:}" failed. No retries permitted until 2026-03-13 01:12:28.149749698 +0000 UTC m=+86.312808736 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-mnxgm" (UniqueName: "kubernetes.io/projected/d5456c8b-3c98-4824-8700-a04e9c12fb2e-kube-api-access-mnxgm") pod "network-check-target-xs8pt" (UID: "d5456c8b-3c98-4824-8700-a04e9c12fb2e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 01:12:26.893155 master-0 kubenswrapper[4055]: I0313 01:12:26.893091 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-znqwc"] Mar 13 01:12:26.898425 master-0 kubenswrapper[4055]: I0313 01:12:26.897052 4055 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-znqwc" Mar 13 01:12:26.901517 master-0 kubenswrapper[4055]: I0313 01:12:26.901426 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 13 01:12:26.901690 master-0 kubenswrapper[4055]: I0313 01:12:26.901670 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 13 01:12:26.902056 master-0 kubenswrapper[4055]: I0313 01:12:26.901803 4055 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 13 01:12:26.902056 master-0 kubenswrapper[4055]: I0313 01:12:26.901845 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 13 01:12:26.902056 master-0 kubenswrapper[4055]: I0313 01:12:26.901931 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 13 01:12:26.960883 master-0 kubenswrapper[4055]: I0313 01:12:26.960788 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rvt5h" event={"ID":"2937cbe2-3125-4c3f-96f8-2febeb5942cc","Type":"ContainerStarted","Data":"af254e609bac0d7dd38fb4c0fff04b08f4827ca415e3d02f5dce013a9d0ee8c7"} Mar 13 01:12:26.962485 master-0 kubenswrapper[4055]: I0313 01:12:26.962442 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" event={"ID":"33108f6a-7f8c-46c6-b93d-e5e820035677","Type":"ContainerStarted","Data":"51c9158b00e418db6c5c785afa10ff3ffa900bc70e33d3ed53306b1cd40de43f"} Mar 13 01:12:26.964446 master-0 kubenswrapper[4055]: I0313 01:12:26.964406 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd" 
event={"ID":"1308fba1-a50d-48b3-b272-7bef44727b7f","Type":"ContainerStarted","Data":"b1a3f0aaa5bfa331e385a253d06037bc576451a7d72651b9001c746da55121ba"} Mar 13 01:12:26.964446 master-0 kubenswrapper[4055]: I0313 01:12:26.964433 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd" event={"ID":"1308fba1-a50d-48b3-b272-7bef44727b7f","Type":"ContainerStarted","Data":"f7ec163ebceef5cfb5ab7329a8164e22ce5f45f24019f15e580acb9f1392f8cb"} Mar 13 01:12:26.966666 master-0 kubenswrapper[4055]: I0313 01:12:26.966622 4055 generic.go:334] "Generic (PLEG): container finished" podID="4738c93d-62e6-44ce-a289-e646b9302e71" containerID="03b1433799f1c9507de93fbd689d37d0b962300c0b8274b036071bcf3cc09941" exitCode=0 Mar 13 01:12:26.966811 master-0 kubenswrapper[4055]: I0313 01:12:26.966672 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xn5t5" event={"ID":"4738c93d-62e6-44ce-a289-e646b9302e71","Type":"ContainerDied","Data":"03b1433799f1c9507de93fbd689d37d0b962300c0b8274b036071bcf3cc09941"} Mar 13 01:12:26.976771 master-0 kubenswrapper[4055]: I0313 01:12:26.976693 4055 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-rvt5h" podStartSLOduration=1.95446002 podStartE2EDuration="18.97661967s" podCreationTimestamp="2026-03-13 01:12:08 +0000 UTC" firstStartedPulling="2026-03-13 01:12:09.068431162 +0000 UTC m=+67.231490240" lastFinishedPulling="2026-03-13 01:12:26.090590852 +0000 UTC m=+84.253649890" observedRunningTime="2026-03-13 01:12:26.976349763 +0000 UTC m=+85.139408801" watchObservedRunningTime="2026-03-13 01:12:26.97661967 +0000 UTC m=+85.139678708" Mar 13 01:12:27.059650 master-0 kubenswrapper[4055]: I0313 01:12:27.059579 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srlst\" (UniqueName: 
\"kubernetes.io/projected/d1153bb3-30dd-458f-b0a4-c05358a8b3f8-kube-api-access-srlst\") pod \"network-node-identity-znqwc\" (UID: \"d1153bb3-30dd-458f-b0a4-c05358a8b3f8\") " pod="openshift-network-node-identity/network-node-identity-znqwc" Mar 13 01:12:27.059650 master-0 kubenswrapper[4055]: I0313 01:12:27.059626 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d1153bb3-30dd-458f-b0a4-c05358a8b3f8-env-overrides\") pod \"network-node-identity-znqwc\" (UID: \"d1153bb3-30dd-458f-b0a4-c05358a8b3f8\") " pod="openshift-network-node-identity/network-node-identity-znqwc" Mar 13 01:12:27.059886 master-0 kubenswrapper[4055]: I0313 01:12:27.059686 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d1153bb3-30dd-458f-b0a4-c05358a8b3f8-webhook-cert\") pod \"network-node-identity-znqwc\" (UID: \"d1153bb3-30dd-458f-b0a4-c05358a8b3f8\") " pod="openshift-network-node-identity/network-node-identity-znqwc" Mar 13 01:12:27.059886 master-0 kubenswrapper[4055]: I0313 01:12:27.059704 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/d1153bb3-30dd-458f-b0a4-c05358a8b3f8-ovnkube-identity-cm\") pod \"network-node-identity-znqwc\" (UID: \"d1153bb3-30dd-458f-b0a4-c05358a8b3f8\") " pod="openshift-network-node-identity/network-node-identity-znqwc" Mar 13 01:12:27.160854 master-0 kubenswrapper[4055]: I0313 01:12:27.160742 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srlst\" (UniqueName: \"kubernetes.io/projected/d1153bb3-30dd-458f-b0a4-c05358a8b3f8-kube-api-access-srlst\") pod \"network-node-identity-znqwc\" (UID: \"d1153bb3-30dd-458f-b0a4-c05358a8b3f8\") " pod="openshift-network-node-identity/network-node-identity-znqwc" Mar 
13 01:12:27.161340 master-0 kubenswrapper[4055]: I0313 01:12:27.161020 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d1153bb3-30dd-458f-b0a4-c05358a8b3f8-env-overrides\") pod \"network-node-identity-znqwc\" (UID: \"d1153bb3-30dd-458f-b0a4-c05358a8b3f8\") " pod="openshift-network-node-identity/network-node-identity-znqwc" Mar 13 01:12:27.161340 master-0 kubenswrapper[4055]: I0313 01:12:27.161114 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d1153bb3-30dd-458f-b0a4-c05358a8b3f8-webhook-cert\") pod \"network-node-identity-znqwc\" (UID: \"d1153bb3-30dd-458f-b0a4-c05358a8b3f8\") " pod="openshift-network-node-identity/network-node-identity-znqwc" Mar 13 01:12:27.161340 master-0 kubenswrapper[4055]: I0313 01:12:27.161149 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/d1153bb3-30dd-458f-b0a4-c05358a8b3f8-ovnkube-identity-cm\") pod \"network-node-identity-znqwc\" (UID: \"d1153bb3-30dd-458f-b0a4-c05358a8b3f8\") " pod="openshift-network-node-identity/network-node-identity-znqwc" Mar 13 01:12:27.161658 master-0 kubenswrapper[4055]: E0313 01:12:27.161579 4055 secret.go:189] Couldn't get secret openshift-network-node-identity/network-node-identity-cert: secret "network-node-identity-cert" not found Mar 13 01:12:27.161773 master-0 kubenswrapper[4055]: E0313 01:12:27.161735 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1153bb3-30dd-458f-b0a4-c05358a8b3f8-webhook-cert podName:d1153bb3-30dd-458f-b0a4-c05358a8b3f8 nodeName:}" failed. No retries permitted until 2026-03-13 01:12:27.66169968 +0000 UTC m=+85.824758748 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/d1153bb3-30dd-458f-b0a4-c05358a8b3f8-webhook-cert") pod "network-node-identity-znqwc" (UID: "d1153bb3-30dd-458f-b0a4-c05358a8b3f8") : secret "network-node-identity-cert" not found Mar 13 01:12:27.161814 master-0 kubenswrapper[4055]: I0313 01:12:27.161798 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d1153bb3-30dd-458f-b0a4-c05358a8b3f8-env-overrides\") pod \"network-node-identity-znqwc\" (UID: \"d1153bb3-30dd-458f-b0a4-c05358a8b3f8\") " pod="openshift-network-node-identity/network-node-identity-znqwc" Mar 13 01:12:27.162854 master-0 kubenswrapper[4055]: I0313 01:12:27.162814 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/d1153bb3-30dd-458f-b0a4-c05358a8b3f8-ovnkube-identity-cm\") pod \"network-node-identity-znqwc\" (UID: \"d1153bb3-30dd-458f-b0a4-c05358a8b3f8\") " pod="openshift-network-node-identity/network-node-identity-znqwc" Mar 13 01:12:27.584512 master-0 kubenswrapper[4055]: I0313 01:12:27.584347 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srlst\" (UniqueName: \"kubernetes.io/projected/d1153bb3-30dd-458f-b0a4-c05358a8b3f8-kube-api-access-srlst\") pod \"network-node-identity-znqwc\" (UID: \"d1153bb3-30dd-458f-b0a4-c05358a8b3f8\") " pod="openshift-network-node-identity/network-node-identity-znqwc" Mar 13 01:12:27.669828 master-0 kubenswrapper[4055]: I0313 01:12:27.669753 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d1153bb3-30dd-458f-b0a4-c05358a8b3f8-webhook-cert\") pod \"network-node-identity-znqwc\" (UID: \"d1153bb3-30dd-458f-b0a4-c05358a8b3f8\") " pod="openshift-network-node-identity/network-node-identity-znqwc" Mar 13 01:12:27.673392 master-0 
kubenswrapper[4055]: I0313 01:12:27.673367 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d1153bb3-30dd-458f-b0a4-c05358a8b3f8-webhook-cert\") pod \"network-node-identity-znqwc\" (UID: \"d1153bb3-30dd-458f-b0a4-c05358a8b3f8\") " pod="openshift-network-node-identity/network-node-identity-znqwc" Mar 13 01:12:27.678911 master-0 kubenswrapper[4055]: I0313 01:12:27.678867 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh5fh" Mar 13 01:12:27.678993 master-0 kubenswrapper[4055]: I0313 01:12:27.678931 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xs8pt" Mar 13 01:12:27.679088 master-0 kubenswrapper[4055]: E0313 01:12:27.679056 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zh5fh" podUID="e68ab3cb-c372-45d9-a758-beaf4c213714" Mar 13 01:12:27.679285 master-0 kubenswrapper[4055]: E0313 01:12:27.679250 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xs8pt" podUID="d5456c8b-3c98-4824-8700-a04e9c12fb2e" Mar 13 01:12:27.814853 master-0 kubenswrapper[4055]: I0313 01:12:27.814784 4055 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-znqwc" Mar 13 01:12:27.829407 master-0 kubenswrapper[4055]: W0313 01:12:27.829298 4055 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1153bb3_30dd_458f_b0a4_c05358a8b3f8.slice/crio-a4456327b2e0b6b9539a6903b6632e2c8ba74468c5ee5a6acc2e786114ebf53d WatchSource:0}: Error finding container a4456327b2e0b6b9539a6903b6632e2c8ba74468c5ee5a6acc2e786114ebf53d: Status 404 returned error can't find the container with id a4456327b2e0b6b9539a6903b6632e2c8ba74468c5ee5a6acc2e786114ebf53d Mar 13 01:12:27.972736 master-0 kubenswrapper[4055]: I0313 01:12:27.972449 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-znqwc" event={"ID":"d1153bb3-30dd-458f-b0a4-c05358a8b3f8","Type":"ContainerStarted","Data":"a4456327b2e0b6b9539a6903b6632e2c8ba74468c5ee5a6acc2e786114ebf53d"} Mar 13 01:12:28.174533 master-0 kubenswrapper[4055]: I0313 01:12:28.174433 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnxgm\" (UniqueName: \"kubernetes.io/projected/d5456c8b-3c98-4824-8700-a04e9c12fb2e-kube-api-access-mnxgm\") pod \"network-check-target-xs8pt\" (UID: \"d5456c8b-3c98-4824-8700-a04e9c12fb2e\") " pod="openshift-network-diagnostics/network-check-target-xs8pt" Mar 13 01:12:28.176964 master-0 kubenswrapper[4055]: E0313 01:12:28.174735 4055 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 01:12:28.176964 master-0 kubenswrapper[4055]: E0313 01:12:28.174785 4055 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 01:12:28.176964 master-0 kubenswrapper[4055]: E0313 01:12:28.174810 
4055 projected.go:194] Error preparing data for projected volume kube-api-access-mnxgm for pod openshift-network-diagnostics/network-check-target-xs8pt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 01:12:28.176964 master-0 kubenswrapper[4055]: E0313 01:12:28.174921 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d5456c8b-3c98-4824-8700-a04e9c12fb2e-kube-api-access-mnxgm podName:d5456c8b-3c98-4824-8700-a04e9c12fb2e nodeName:}" failed. No retries permitted until 2026-03-13 01:12:32.174888097 +0000 UTC m=+90.337947165 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-mnxgm" (UniqueName: "kubernetes.io/projected/d5456c8b-3c98-4824-8700-a04e9c12fb2e-kube-api-access-mnxgm") pod "network-check-target-xs8pt" (UID: "d5456c8b-3c98-4824-8700-a04e9c12fb2e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 01:12:29.679983 master-0 kubenswrapper[4055]: I0313 01:12:29.679033 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xs8pt" Mar 13 01:12:29.679983 master-0 kubenswrapper[4055]: I0313 01:12:29.679112 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh5fh" Mar 13 01:12:29.679983 master-0 kubenswrapper[4055]: E0313 01:12:29.679166 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xs8pt" podUID="d5456c8b-3c98-4824-8700-a04e9c12fb2e" Mar 13 01:12:29.679983 master-0 kubenswrapper[4055]: E0313 01:12:29.679272 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zh5fh" podUID="e68ab3cb-c372-45d9-a758-beaf4c213714" Mar 13 01:12:30.993818 master-0 kubenswrapper[4055]: I0313 01:12:30.993343 4055 generic.go:334] "Generic (PLEG): container finished" podID="4738c93d-62e6-44ce-a289-e646b9302e71" containerID="569d90d03ceda29b5f6ff80b99725d90e6a4f9724473ba5d3146ac49efbbe232" exitCode=0 Mar 13 01:12:30.993818 master-0 kubenswrapper[4055]: I0313 01:12:30.993389 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xn5t5" event={"ID":"4738c93d-62e6-44ce-a289-e646b9302e71","Type":"ContainerDied","Data":"569d90d03ceda29b5f6ff80b99725d90e6a4f9724473ba5d3146ac49efbbe232"} Mar 13 01:12:31.678480 master-0 kubenswrapper[4055]: I0313 01:12:31.678425 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh5fh" Mar 13 01:12:31.678821 master-0 kubenswrapper[4055]: I0313 01:12:31.678477 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xs8pt" Mar 13 01:12:31.678821 master-0 kubenswrapper[4055]: E0313 01:12:31.678580 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zh5fh" podUID="e68ab3cb-c372-45d9-a758-beaf4c213714" Mar 13 01:12:31.678821 master-0 kubenswrapper[4055]: E0313 01:12:31.678700 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xs8pt" podUID="d5456c8b-3c98-4824-8700-a04e9c12fb2e" Mar 13 01:12:32.213559 master-0 kubenswrapper[4055]: I0313 01:12:32.213507 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnxgm\" (UniqueName: \"kubernetes.io/projected/d5456c8b-3c98-4824-8700-a04e9c12fb2e-kube-api-access-mnxgm\") pod \"network-check-target-xs8pt\" (UID: \"d5456c8b-3c98-4824-8700-a04e9c12fb2e\") " pod="openshift-network-diagnostics/network-check-target-xs8pt" Mar 13 01:12:32.215579 master-0 kubenswrapper[4055]: E0313 01:12:32.213673 4055 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 01:12:32.215579 master-0 kubenswrapper[4055]: E0313 01:12:32.213689 4055 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 01:12:32.215579 master-0 kubenswrapper[4055]: E0313 01:12:32.213699 4055 projected.go:194] Error preparing data for projected volume kube-api-access-mnxgm for pod openshift-network-diagnostics/network-check-target-xs8pt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 01:12:32.215579 master-0 kubenswrapper[4055]: E0313 01:12:32.213745 4055 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d5456c8b-3c98-4824-8700-a04e9c12fb2e-kube-api-access-mnxgm podName:d5456c8b-3c98-4824-8700-a04e9c12fb2e nodeName:}" failed. No retries permitted until 2026-03-13 01:12:40.213732007 +0000 UTC m=+98.376791045 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-mnxgm" (UniqueName: "kubernetes.io/projected/d5456c8b-3c98-4824-8700-a04e9c12fb2e-kube-api-access-mnxgm") pod "network-check-target-xs8pt" (UID: "d5456c8b-3c98-4824-8700-a04e9c12fb2e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 01:12:33.002402 master-0 kubenswrapper[4055]: I0313 01:12:33.002335 4055 generic.go:334] "Generic (PLEG): container finished" podID="4738c93d-62e6-44ce-a289-e646b9302e71" containerID="ac0c969b95b64c22e84de07c2976566813a316f1d691a27df3a1f4621768e238" exitCode=0 Mar 13 01:12:33.002695 master-0 kubenswrapper[4055]: I0313 01:12:33.002392 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xn5t5" event={"ID":"4738c93d-62e6-44ce-a289-e646b9302e71","Type":"ContainerDied","Data":"ac0c969b95b64c22e84de07c2976566813a316f1d691a27df3a1f4621768e238"} Mar 13 01:12:33.678505 master-0 kubenswrapper[4055]: I0313 01:12:33.678449 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh5fh" Mar 13 01:12:33.679154 master-0 kubenswrapper[4055]: I0313 01:12:33.678533 4055 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xs8pt" Mar 13 01:12:33.679154 master-0 kubenswrapper[4055]: E0313 01:12:33.678655 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zh5fh" podUID="e68ab3cb-c372-45d9-a758-beaf4c213714" Mar 13 01:12:33.679154 master-0 kubenswrapper[4055]: E0313 01:12:33.678750 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xs8pt" podUID="d5456c8b-3c98-4824-8700-a04e9c12fb2e" Mar 13 01:12:35.679119 master-0 kubenswrapper[4055]: I0313 01:12:35.679071 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xs8pt" Mar 13 01:12:35.679119 master-0 kubenswrapper[4055]: I0313 01:12:35.679106 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh5fh" Mar 13 01:12:35.681100 master-0 kubenswrapper[4055]: E0313 01:12:35.679244 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zh5fh" podUID="e68ab3cb-c372-45d9-a758-beaf4c213714" Mar 13 01:12:35.681100 master-0 kubenswrapper[4055]: E0313 01:12:35.679333 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xs8pt" podUID="d5456c8b-3c98-4824-8700-a04e9c12fb2e" Mar 13 01:12:35.690100 master-0 kubenswrapper[4055]: I0313 01:12:35.690069 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Mar 13 01:12:37.733011 master-0 kubenswrapper[4055]: I0313 01:12:37.732427 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh5fh" Mar 13 01:12:37.734049 master-0 kubenswrapper[4055]: I0313 01:12:37.732589 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xs8pt" Mar 13 01:12:37.734049 master-0 kubenswrapper[4055]: E0313 01:12:37.733171 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zh5fh" podUID="e68ab3cb-c372-45d9-a758-beaf4c213714" Mar 13 01:12:37.734049 master-0 kubenswrapper[4055]: E0313 01:12:37.733716 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xs8pt" podUID="d5456c8b-3c98-4824-8700-a04e9c12fb2e" Mar 13 01:12:37.913457 master-0 kubenswrapper[4055]: W0313 01:12:37.913411 4055 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost") Mar 13 01:12:37.914475 master-0 kubenswrapper[4055]: I0313 01:12:37.914438 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0-master-0"] Mar 13 01:12:39.679498 master-0 kubenswrapper[4055]: I0313 01:12:39.679411 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh5fh" Mar 13 01:12:39.680161 master-0 kubenswrapper[4055]: I0313 01:12:39.679428 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xs8pt" Mar 13 01:12:39.680161 master-0 kubenswrapper[4055]: E0313 01:12:39.679571 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zh5fh" podUID="e68ab3cb-c372-45d9-a758-beaf4c213714" Mar 13 01:12:39.680161 master-0 kubenswrapper[4055]: E0313 01:12:39.679666 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xs8pt" podUID="d5456c8b-3c98-4824-8700-a04e9c12fb2e" Mar 13 01:12:40.256796 master-0 kubenswrapper[4055]: I0313 01:12:40.256664 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnxgm\" (UniqueName: \"kubernetes.io/projected/d5456c8b-3c98-4824-8700-a04e9c12fb2e-kube-api-access-mnxgm\") pod \"network-check-target-xs8pt\" (UID: \"d5456c8b-3c98-4824-8700-a04e9c12fb2e\") " pod="openshift-network-diagnostics/network-check-target-xs8pt" Mar 13 01:12:40.257077 master-0 kubenswrapper[4055]: E0313 01:12:40.256844 4055 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 01:12:40.257077 master-0 kubenswrapper[4055]: E0313 01:12:40.256876 4055 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 01:12:40.257077 master-0 kubenswrapper[4055]: E0313 01:12:40.256893 4055 projected.go:194] Error preparing data for projected volume kube-api-access-mnxgm for pod openshift-network-diagnostics/network-check-target-xs8pt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 01:12:40.257077 master-0 kubenswrapper[4055]: E0313 01:12:40.257005 4055 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d5456c8b-3c98-4824-8700-a04e9c12fb2e-kube-api-access-mnxgm podName:d5456c8b-3c98-4824-8700-a04e9c12fb2e nodeName:}" failed. No retries permitted until 2026-03-13 01:12:56.256982671 +0000 UTC m=+114.420041719 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-mnxgm" (UniqueName: "kubernetes.io/projected/d5456c8b-3c98-4824-8700-a04e9c12fb2e-kube-api-access-mnxgm") pod "network-check-target-xs8pt" (UID: "d5456c8b-3c98-4824-8700-a04e9c12fb2e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 01:12:41.569495 master-0 kubenswrapper[4055]: I0313 01:12:41.569439 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs\") pod \"network-metrics-daemon-zh5fh\" (UID: \"e68ab3cb-c372-45d9-a758-beaf4c213714\") " pod="openshift-multus/network-metrics-daemon-zh5fh" Mar 13 01:12:41.569972 master-0 kubenswrapper[4055]: E0313 01:12:41.569554 4055 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 01:12:41.569972 master-0 kubenswrapper[4055]: E0313 01:12:41.569599 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs podName:e68ab3cb-c372-45d9-a758-beaf4c213714 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:13.569584998 +0000 UTC m=+131.732644036 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs") pod "network-metrics-daemon-zh5fh" (UID: "e68ab3cb-c372-45d9-a758-beaf4c213714") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 01:12:41.678860 master-0 kubenswrapper[4055]: I0313 01:12:41.678817 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xs8pt" Mar 13 01:12:41.679067 master-0 kubenswrapper[4055]: I0313 01:12:41.678879 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh5fh" Mar 13 01:12:41.679067 master-0 kubenswrapper[4055]: E0313 01:12:41.678921 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xs8pt" podUID="d5456c8b-3c98-4824-8700-a04e9c12fb2e" Mar 13 01:12:41.679067 master-0 kubenswrapper[4055]: E0313 01:12:41.679047 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zh5fh" podUID="e68ab3cb-c372-45d9-a758-beaf4c213714" Mar 13 01:12:41.972789 master-0 kubenswrapper[4055]: I0313 01:12:41.972715 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert\") pod \"cluster-version-operator-745944c6b7-zc6gt\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt" Mar 13 01:12:41.973040 master-0 kubenswrapper[4055]: E0313 01:12:41.972931 4055 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 13 01:12:41.973106 master-0 kubenswrapper[4055]: E0313 01:12:41.973051 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert podName:bbf0bd4d-3387-43c3-b9d5-61a044fa2138 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:45.97302213 +0000 UTC m=+164.136081228 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert") pod "cluster-version-operator-745944c6b7-zc6gt" (UID: "bbf0bd4d-3387-43c3-b9d5-61a044fa2138") : secret "cluster-version-operator-serving-cert" not found Mar 13 01:12:43.678622 master-0 kubenswrapper[4055]: I0313 01:12:43.678549 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh5fh" Mar 13 01:12:43.679096 master-0 kubenswrapper[4055]: E0313 01:12:43.678735 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zh5fh" podUID="e68ab3cb-c372-45d9-a758-beaf4c213714" Mar 13 01:12:43.679175 master-0 kubenswrapper[4055]: I0313 01:12:43.679143 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xs8pt" Mar 13 01:12:43.679250 master-0 kubenswrapper[4055]: E0313 01:12:43.679215 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xs8pt" podUID="d5456c8b-3c98-4824-8700-a04e9c12fb2e" Mar 13 01:12:44.751140 master-0 kubenswrapper[4055]: I0313 01:12:44.747266 4055 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/bootstrap-kube-scheduler-master-0" podStartSLOduration=9.747246985 podStartE2EDuration="9.747246985s" podCreationTimestamp="2026-03-13 01:12:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:12:44.74705976 +0000 UTC m=+102.910118798" watchObservedRunningTime="2026-03-13 01:12:44.747246985 +0000 UTC m=+102.910306023" Mar 13 01:12:44.751140 master-0 kubenswrapper[4055]: I0313 01:12:44.747728 4055 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0-master-0" podStartSLOduration=7.747721868 podStartE2EDuration="7.747721868s" podCreationTimestamp="2026-03-13 01:12:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:12:44.122389896 +0000 UTC m=+102.285448964" watchObservedRunningTime="2026-03-13 01:12:44.747721868 +0000 UTC m=+102.910780916" Mar 
13 01:12:45.142341 master-0 kubenswrapper[4055]: I0313 01:12:45.142249 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Mar 13 01:12:45.679404 master-0 kubenswrapper[4055]: I0313 01:12:45.679361 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xs8pt" Mar 13 01:12:45.679565 master-0 kubenswrapper[4055]: I0313 01:12:45.679441 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh5fh" Mar 13 01:12:45.679565 master-0 kubenswrapper[4055]: E0313 01:12:45.679462 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xs8pt" podUID="d5456c8b-3c98-4824-8700-a04e9c12fb2e" Mar 13 01:12:45.679626 master-0 kubenswrapper[4055]: E0313 01:12:45.679560 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zh5fh" podUID="e68ab3cb-c372-45d9-a758-beaf4c213714" Mar 13 01:12:46.736485 master-0 kubenswrapper[4055]: I0313 01:12:46.736328 4055 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5h8l9"] Mar 13 01:12:47.678870 master-0 kubenswrapper[4055]: I0313 01:12:47.678809 4055 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xs8pt" Mar 13 01:12:47.679159 master-0 kubenswrapper[4055]: I0313 01:12:47.678881 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh5fh" Mar 13 01:12:47.679159 master-0 kubenswrapper[4055]: E0313 01:12:47.678928 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xs8pt" podUID="d5456c8b-3c98-4824-8700-a04e9c12fb2e" Mar 13 01:12:47.679159 master-0 kubenswrapper[4055]: E0313 01:12:47.679044 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zh5fh" podUID="e68ab3cb-c372-45d9-a758-beaf4c213714" Mar 13 01:12:49.047443 master-0 kubenswrapper[4055]: I0313 01:12:49.047147 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xn5t5" event={"ID":"4738c93d-62e6-44ce-a289-e646b9302e71","Type":"ContainerStarted","Data":"24a3795ab99401f37571431134fd1c761aa6f3ef1ba4c747faa0a5ee28b9f796"} Mar 13 01:12:49.051677 master-0 kubenswrapper[4055]: I0313 01:12:49.048644 4055 generic.go:334] "Generic (PLEG): container finished" podID="33108f6a-7f8c-46c6-b93d-e5e820035677" containerID="c4adcf3517a76cd830fa2b69a25f434a614eff747270d594b06fbcfee5603947" exitCode=0 Mar 13 01:12:49.051677 master-0 kubenswrapper[4055]: I0313 01:12:49.048705 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" event={"ID":"33108f6a-7f8c-46c6-b93d-e5e820035677","Type":"ContainerDied","Data":"c4adcf3517a76cd830fa2b69a25f434a614eff747270d594b06fbcfee5603947"} Mar 13 01:12:49.051677 master-0 kubenswrapper[4055]: I0313 01:12:49.050183 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd" event={"ID":"1308fba1-a50d-48b3-b272-7bef44727b7f","Type":"ContainerStarted","Data":"0743267ed603ff9ac6856c0cefcbee7aeda537c3eb2e7c92d9ca09c61963bcad"} Mar 13 01:12:49.052061 master-0 kubenswrapper[4055]: I0313 01:12:49.051787 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-znqwc" event={"ID":"d1153bb3-30dd-458f-b0a4-c05358a8b3f8","Type":"ContainerStarted","Data":"8a645f3b183df337e2bb471f99ef18a88ef8e03f78dccc126ecc0415b40abdab"} Mar 13 01:12:49.052061 master-0 kubenswrapper[4055]: I0313 01:12:49.051805 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-znqwc" 
event={"ID":"d1153bb3-30dd-458f-b0a4-c05358a8b3f8","Type":"ContainerStarted","Data":"38fb6af13de86e0d88a936d042a22e81831b9af5722fa376d1a0a0fe523b846b"} Mar 13 01:12:49.089537 master-0 kubenswrapper[4055]: I0313 01:12:49.089394 4055 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podStartSLOduration=5.089372893 podStartE2EDuration="5.089372893s" podCreationTimestamp="2026-03-13 01:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:12:49.088296063 +0000 UTC m=+107.251355131" watchObservedRunningTime="2026-03-13 01:12:49.089372893 +0000 UTC m=+107.252431971" Mar 13 01:12:49.119068 master-0 kubenswrapper[4055]: I0313 01:12:49.118966 4055 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:49.130343 master-0 kubenswrapper[4055]: I0313 01:12:49.130259 4055 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd" podStartSLOduration=5.668244319 podStartE2EDuration="28.130233506s" podCreationTimestamp="2026-03-13 01:12:21 +0000 UTC" firstStartedPulling="2026-03-13 01:12:26.206319681 +0000 UTC m=+84.369378709" lastFinishedPulling="2026-03-13 01:12:48.668308858 +0000 UTC m=+106.831367896" observedRunningTime="2026-03-13 01:12:49.129619049 +0000 UTC m=+107.292678137" watchObservedRunningTime="2026-03-13 01:12:49.130233506 +0000 UTC m=+107.293292584" Mar 13 01:12:49.146108 master-0 kubenswrapper[4055]: I0313 01:12:49.145999 4055 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-node-identity/network-node-identity-znqwc" podStartSLOduration=2.265250177 podStartE2EDuration="23.145973187s" podCreationTimestamp="2026-03-13 01:12:26 +0000 UTC" firstStartedPulling="2026-03-13 
01:12:27.831115716 +0000 UTC m=+85.994174794" lastFinishedPulling="2026-03-13 01:12:48.711838726 +0000 UTC m=+106.874897804" observedRunningTime="2026-03-13 01:12:49.145793902 +0000 UTC m=+107.308853000" watchObservedRunningTime="2026-03-13 01:12:49.145973187 +0000 UTC m=+107.309032235" Mar 13 01:12:49.180907 master-0 kubenswrapper[4055]: I0313 01:12:49.180857 4055 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-var-lib-openvswitch\") pod \"33108f6a-7f8c-46c6-b93d-e5e820035677\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " Mar 13 01:12:49.181244 master-0 kubenswrapper[4055]: I0313 01:12:49.181214 4055 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-run-openvswitch\") pod \"33108f6a-7f8c-46c6-b93d-e5e820035677\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " Mar 13 01:12:49.181490 master-0 kubenswrapper[4055]: I0313 01:12:49.181468 4055 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-run-systemd\") pod \"33108f6a-7f8c-46c6-b93d-e5e820035677\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " Mar 13 01:12:49.181718 master-0 kubenswrapper[4055]: I0313 01:12:49.181030 4055 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "33108f6a-7f8c-46c6-b93d-e5e820035677" (UID: "33108f6a-7f8c-46c6-b93d-e5e820035677"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:12:49.181831 master-0 kubenswrapper[4055]: I0313 01:12:49.181390 4055 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "33108f6a-7f8c-46c6-b93d-e5e820035677" (UID: "33108f6a-7f8c-46c6-b93d-e5e820035677"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:12:49.181831 master-0 kubenswrapper[4055]: I0313 01:12:49.181592 4055 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "33108f6a-7f8c-46c6-b93d-e5e820035677" (UID: "33108f6a-7f8c-46c6-b93d-e5e820035677"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:12:49.181831 master-0 kubenswrapper[4055]: I0313 01:12:49.181789 4055 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-etc-openvswitch\") pod \"33108f6a-7f8c-46c6-b93d-e5e820035677\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " Mar 13 01:12:49.182044 master-0 kubenswrapper[4055]: I0313 01:12:49.181856 4055 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-node-log\") pod \"33108f6a-7f8c-46c6-b93d-e5e820035677\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " Mar 13 01:12:49.182044 master-0 kubenswrapper[4055]: I0313 01:12:49.181888 4055 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-var-lib-cni-networks-ovn-kubernetes\") pod \"33108f6a-7f8c-46c6-b93d-e5e820035677\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " Mar 13 01:12:49.182044 master-0 kubenswrapper[4055]: I0313 01:12:49.181967 4055 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/33108f6a-7f8c-46c6-b93d-e5e820035677-env-overrides\") pod \"33108f6a-7f8c-46c6-b93d-e5e820035677\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " Mar 13 01:12:49.182044 master-0 kubenswrapper[4055]: I0313 01:12:49.181980 4055 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-node-log" (OuterVolumeSpecName: "node-log") pod "33108f6a-7f8c-46c6-b93d-e5e820035677" (UID: "33108f6a-7f8c-46c6-b93d-e5e820035677"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:12:49.182044 master-0 kubenswrapper[4055]: I0313 01:12:49.182008 4055 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/33108f6a-7f8c-46c6-b93d-e5e820035677-ovnkube-script-lib\") pod \"33108f6a-7f8c-46c6-b93d-e5e820035677\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " Mar 13 01:12:49.182311 master-0 kubenswrapper[4055]: I0313 01:12:49.182056 4055 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "33108f6a-7f8c-46c6-b93d-e5e820035677" (UID: "33108f6a-7f8c-46c6-b93d-e5e820035677"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:12:49.182311 master-0 kubenswrapper[4055]: I0313 01:12:49.182248 4055 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nd2x5\" (UniqueName: \"kubernetes.io/projected/33108f6a-7f8c-46c6-b93d-e5e820035677-kube-api-access-nd2x5\") pod \"33108f6a-7f8c-46c6-b93d-e5e820035677\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " Mar 13 01:12:49.182311 master-0 kubenswrapper[4055]: I0313 01:12:49.182306 4055 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33108f6a-7f8c-46c6-b93d-e5e820035677-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "33108f6a-7f8c-46c6-b93d-e5e820035677" (UID: "33108f6a-7f8c-46c6-b93d-e5e820035677"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:12:49.182476 master-0 kubenswrapper[4055]: I0313 01:12:49.182315 4055 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-log-socket\") pod \"33108f6a-7f8c-46c6-b93d-e5e820035677\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " Mar 13 01:12:49.182476 master-0 kubenswrapper[4055]: I0313 01:12:49.182359 4055 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-cni-bin\") pod \"33108f6a-7f8c-46c6-b93d-e5e820035677\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " Mar 13 01:12:49.182476 master-0 kubenswrapper[4055]: I0313 01:12:49.182405 4055 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/33108f6a-7f8c-46c6-b93d-e5e820035677-ovnkube-config\") pod \"33108f6a-7f8c-46c6-b93d-e5e820035677\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " Mar 13 01:12:49.182476 
master-0 kubenswrapper[4055]: I0313 01:12:49.182445 4055 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-systemd-units\") pod \"33108f6a-7f8c-46c6-b93d-e5e820035677\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " Mar 13 01:12:49.182896 master-0 kubenswrapper[4055]: I0313 01:12:49.182484 4055 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-slash\") pod \"33108f6a-7f8c-46c6-b93d-e5e820035677\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " Mar 13 01:12:49.182896 master-0 kubenswrapper[4055]: I0313 01:12:49.182523 4055 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-cni-netd\") pod \"33108f6a-7f8c-46c6-b93d-e5e820035677\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " Mar 13 01:12:49.182896 master-0 kubenswrapper[4055]: I0313 01:12:49.182552 4055 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-run-ovn-kubernetes\") pod \"33108f6a-7f8c-46c6-b93d-e5e820035677\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " Mar 13 01:12:49.182896 master-0 kubenswrapper[4055]: I0313 01:12:49.182583 4055 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-kubelet\") pod \"33108f6a-7f8c-46c6-b93d-e5e820035677\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " Mar 13 01:12:49.182896 master-0 kubenswrapper[4055]: I0313 01:12:49.182606 4055 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/33108f6a-7f8c-46c6-b93d-e5e820035677-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "33108f6a-7f8c-46c6-b93d-e5e820035677" (UID: "33108f6a-7f8c-46c6-b93d-e5e820035677"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:12:49.182896 master-0 kubenswrapper[4055]: I0313 01:12:49.182616 4055 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-run-netns\") pod \"33108f6a-7f8c-46c6-b93d-e5e820035677\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " Mar 13 01:12:49.182896 master-0 kubenswrapper[4055]: I0313 01:12:49.182691 4055 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "33108f6a-7f8c-46c6-b93d-e5e820035677" (UID: "33108f6a-7f8c-46c6-b93d-e5e820035677"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:12:49.182896 master-0 kubenswrapper[4055]: I0313 01:12:49.182713 4055 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/33108f6a-7f8c-46c6-b93d-e5e820035677-ovn-node-metrics-cert\") pod \"33108f6a-7f8c-46c6-b93d-e5e820035677\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " Mar 13 01:12:49.182896 master-0 kubenswrapper[4055]: I0313 01:12:49.182728 4055 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-log-socket" (OuterVolumeSpecName: "log-socket") pod "33108f6a-7f8c-46c6-b93d-e5e820035677" (UID: "33108f6a-7f8c-46c6-b93d-e5e820035677"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:12:49.182896 master-0 kubenswrapper[4055]: I0313 01:12:49.182746 4055 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-run-ovn\") pod \"33108f6a-7f8c-46c6-b93d-e5e820035677\" (UID: \"33108f6a-7f8c-46c6-b93d-e5e820035677\") " Mar 13 01:12:49.182896 master-0 kubenswrapper[4055]: I0313 01:12:49.182761 4055 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "33108f6a-7f8c-46c6-b93d-e5e820035677" (UID: "33108f6a-7f8c-46c6-b93d-e5e820035677"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:12:49.182896 master-0 kubenswrapper[4055]: I0313 01:12:49.182797 4055 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "33108f6a-7f8c-46c6-b93d-e5e820035677" (UID: "33108f6a-7f8c-46c6-b93d-e5e820035677"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:12:49.182896 master-0 kubenswrapper[4055]: I0313 01:12:49.182792 4055 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-slash" (OuterVolumeSpecName: "host-slash") pod "33108f6a-7f8c-46c6-b93d-e5e820035677" (UID: "33108f6a-7f8c-46c6-b93d-e5e820035677"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:12:49.182896 master-0 kubenswrapper[4055]: I0313 01:12:49.182824 4055 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "33108f6a-7f8c-46c6-b93d-e5e820035677" (UID: "33108f6a-7f8c-46c6-b93d-e5e820035677"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:12:49.182896 master-0 kubenswrapper[4055]: I0313 01:12:49.182870 4055 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "33108f6a-7f8c-46c6-b93d-e5e820035677" (UID: "33108f6a-7f8c-46c6-b93d-e5e820035677"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:12:49.182896 master-0 kubenswrapper[4055]: I0313 01:12:49.182886 4055 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "33108f6a-7f8c-46c6-b93d-e5e820035677" (UID: "33108f6a-7f8c-46c6-b93d-e5e820035677"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:12:49.184122 master-0 kubenswrapper[4055]: I0313 01:12:49.182924 4055 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "33108f6a-7f8c-46c6-b93d-e5e820035677" (UID: "33108f6a-7f8c-46c6-b93d-e5e820035677"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:12:49.184122 master-0 kubenswrapper[4055]: I0313 01:12:49.182970 4055 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-run-ovn\") on node \"master-0\" DevicePath \"\"" Mar 13 01:12:49.184122 master-0 kubenswrapper[4055]: I0313 01:12:49.182987 4055 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-var-lib-openvswitch\") on node \"master-0\" DevicePath \"\"" Mar 13 01:12:49.184122 master-0 kubenswrapper[4055]: I0313 01:12:49.182998 4055 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-run-openvswitch\") on node \"master-0\" DevicePath \"\"" Mar 13 01:12:49.184122 master-0 kubenswrapper[4055]: I0313 01:12:49.183008 4055 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-run-systemd\") on node \"master-0\" DevicePath \"\"" Mar 13 01:12:49.184122 master-0 kubenswrapper[4055]: I0313 01:12:49.183016 4055 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-node-log\") on node \"master-0\" DevicePath \"\"" Mar 13 01:12:49.184122 master-0 kubenswrapper[4055]: I0313 01:12:49.183025 4055 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-var-lib-cni-networks-ovn-kubernetes\") on node \"master-0\" DevicePath \"\"" Mar 13 01:12:49.184122 master-0 kubenswrapper[4055]: I0313 01:12:49.183035 4055 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/33108f6a-7f8c-46c6-b93d-e5e820035677-env-overrides\") on node \"master-0\" DevicePath \"\"" Mar 13 01:12:49.184122 master-0 kubenswrapper[4055]: I0313 01:12:49.183045 4055 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/33108f6a-7f8c-46c6-b93d-e5e820035677-ovnkube-script-lib\") on node \"master-0\" DevicePath \"\"" Mar 13 01:12:49.184122 master-0 kubenswrapper[4055]: I0313 01:12:49.183054 4055 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-log-socket\") on node \"master-0\" DevicePath \"\"" Mar 13 01:12:49.184122 master-0 kubenswrapper[4055]: I0313 01:12:49.183063 4055 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-systemd-units\") on node \"master-0\" DevicePath \"\"" Mar 13 01:12:49.184122 master-0 kubenswrapper[4055]: I0313 01:12:49.183070 4055 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-slash\") on node \"master-0\" DevicePath \"\"" Mar 13 01:12:49.184122 master-0 kubenswrapper[4055]: I0313 01:12:49.183079 4055 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-cni-netd\") on node \"master-0\" DevicePath \"\"" Mar 13 01:12:49.184122 master-0 kubenswrapper[4055]: I0313 01:12:49.183087 4055 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-run-ovn-kubernetes\") on node \"master-0\" DevicePath \"\"" Mar 13 01:12:49.184122 master-0 kubenswrapper[4055]: I0313 01:12:49.183097 4055 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-kubelet\") on node \"master-0\" DevicePath \"\"" Mar 13 01:12:49.184122 master-0 kubenswrapper[4055]: I0313 01:12:49.183107 4055 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-run-netns\") on node \"master-0\" DevicePath \"\"" Mar 13 01:12:49.184122 master-0 kubenswrapper[4055]: I0313 01:12:49.183331 4055 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33108f6a-7f8c-46c6-b93d-e5e820035677-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "33108f6a-7f8c-46c6-b93d-e5e820035677" (UID: "33108f6a-7f8c-46c6-b93d-e5e820035677"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:12:49.185365 master-0 kubenswrapper[4055]: I0313 01:12:49.184900 4055 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "33108f6a-7f8c-46c6-b93d-e5e820035677" (UID: "33108f6a-7f8c-46c6-b93d-e5e820035677"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:12:49.187484 master-0 kubenswrapper[4055]: I0313 01:12:49.187427 4055 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33108f6a-7f8c-46c6-b93d-e5e820035677-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "33108f6a-7f8c-46c6-b93d-e5e820035677" (UID: "33108f6a-7f8c-46c6-b93d-e5e820035677"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:12:49.188830 master-0 kubenswrapper[4055]: I0313 01:12:49.188772 4055 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33108f6a-7f8c-46c6-b93d-e5e820035677-kube-api-access-nd2x5" (OuterVolumeSpecName: "kube-api-access-nd2x5") pod "33108f6a-7f8c-46c6-b93d-e5e820035677" (UID: "33108f6a-7f8c-46c6-b93d-e5e820035677"). InnerVolumeSpecName "kube-api-access-nd2x5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:12:49.283467 master-0 kubenswrapper[4055]: I0313 01:12:49.283351 4055 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nd2x5\" (UniqueName: \"kubernetes.io/projected/33108f6a-7f8c-46c6-b93d-e5e820035677-kube-api-access-nd2x5\") on node \"master-0\" DevicePath \"\"" Mar 13 01:12:49.283467 master-0 kubenswrapper[4055]: I0313 01:12:49.283395 4055 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-host-cni-bin\") on node \"master-0\" DevicePath \"\"" Mar 13 01:12:49.283467 master-0 kubenswrapper[4055]: I0313 01:12:49.283410 4055 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/33108f6a-7f8c-46c6-b93d-e5e820035677-ovnkube-config\") on node \"master-0\" DevicePath \"\"" Mar 13 01:12:49.283467 master-0 kubenswrapper[4055]: I0313 01:12:49.283425 4055 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/33108f6a-7f8c-46c6-b93d-e5e820035677-ovn-node-metrics-cert\") on node \"master-0\" DevicePath \"\"" Mar 13 01:12:49.283467 master-0 kubenswrapper[4055]: I0313 01:12:49.283441 4055 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33108f6a-7f8c-46c6-b93d-e5e820035677-etc-openvswitch\") on node \"master-0\" DevicePath \"\"" Mar 13 01:12:49.679523 
master-0 kubenswrapper[4055]: I0313 01:12:49.679459 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh5fh" Mar 13 01:12:49.679804 master-0 kubenswrapper[4055]: E0313 01:12:49.679675 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zh5fh" podUID="e68ab3cb-c372-45d9-a758-beaf4c213714" Mar 13 01:12:49.679965 master-0 kubenswrapper[4055]: I0313 01:12:49.679453 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xs8pt" Mar 13 01:12:49.680403 master-0 kubenswrapper[4055]: E0313 01:12:49.680306 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xs8pt" podUID="d5456c8b-3c98-4824-8700-a04e9c12fb2e" Mar 13 01:12:50.060532 master-0 kubenswrapper[4055]: I0313 01:12:50.060469 4055 generic.go:334] "Generic (PLEG): container finished" podID="4738c93d-62e6-44ce-a289-e646b9302e71" containerID="24a3795ab99401f37571431134fd1c761aa6f3ef1ba4c747faa0a5ee28b9f796" exitCode=0 Mar 13 01:12:50.061466 master-0 kubenswrapper[4055]: I0313 01:12:50.060604 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xn5t5" event={"ID":"4738c93d-62e6-44ce-a289-e646b9302e71","Type":"ContainerDied","Data":"24a3795ab99401f37571431134fd1c761aa6f3ef1ba4c747faa0a5ee28b9f796"} Mar 13 01:12:50.064294 master-0 kubenswrapper[4055]: I0313 01:12:50.064225 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" event={"ID":"33108f6a-7f8c-46c6-b93d-e5e820035677","Type":"ContainerDied","Data":"51c9158b00e418db6c5c785afa10ff3ffa900bc70e33d3ed53306b1cd40de43f"} Mar 13 01:12:50.064433 master-0 kubenswrapper[4055]: I0313 01:12:50.064316 4055 scope.go:117] "RemoveContainer" containerID="c4adcf3517a76cd830fa2b69a25f434a614eff747270d594b06fbcfee5603947" Mar 13 01:12:50.064599 master-0 kubenswrapper[4055]: I0313 01:12:50.064558 4055 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5h8l9" Mar 13 01:12:50.157406 master-0 kubenswrapper[4055]: I0313 01:12:50.157317 4055 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5h8l9"] Mar 13 01:12:50.165088 master-0 kubenswrapper[4055]: I0313 01:12:50.164984 4055 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5h8l9"] Mar 13 01:12:50.171814 master-0 kubenswrapper[4055]: I0313 01:12:50.171752 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-v56ct"] Mar 13 01:12:50.172019 master-0 kubenswrapper[4055]: E0313 01:12:50.171927 4055 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33108f6a-7f8c-46c6-b93d-e5e820035677" containerName="kubecfg-setup" Mar 13 01:12:50.172019 master-0 kubenswrapper[4055]: I0313 01:12:50.171960 4055 state_mem.go:107] "Deleted CPUSet assignment" podUID="33108f6a-7f8c-46c6-b93d-e5e820035677" containerName="kubecfg-setup" Mar 13 01:12:50.172208 master-0 kubenswrapper[4055]: I0313 01:12:50.172040 4055 memory_manager.go:354] "RemoveStaleState removing state" podUID="33108f6a-7f8c-46c6-b93d-e5e820035677" containerName="kubecfg-setup" Mar 13 01:12:50.173323 master-0 kubenswrapper[4055]: I0313 01:12:50.173277 4055 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.175530 master-0 kubenswrapper[4055]: I0313 01:12:50.175477 4055 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 13 01:12:50.177370 master-0 kubenswrapper[4055]: I0313 01:12:50.177291 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 13 01:12:50.194883 master-0 kubenswrapper[4055]: I0313 01:12:50.194787 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-node-log\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.194883 master-0 kubenswrapper[4055]: I0313 01:12:50.194826 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-cni-bin\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.194883 master-0 kubenswrapper[4055]: I0313 01:12:50.194850 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-env-overrides\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.194883 master-0 kubenswrapper[4055]: I0313 01:12:50.194870 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tfdv\" (UniqueName: \"kubernetes.io/projected/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-kube-api-access-6tfdv\") pod 
\"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.195544 master-0 kubenswrapper[4055]: I0313 01:12:50.194909 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-run-netns\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.195544 master-0 kubenswrapper[4055]: I0313 01:12:50.194930 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-run-ovn-kubernetes\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.195544 master-0 kubenswrapper[4055]: I0313 01:12:50.194950 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-var-lib-openvswitch\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.195544 master-0 kubenswrapper[4055]: I0313 01:12:50.194967 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-ovn-node-metrics-cert\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.195544 master-0 kubenswrapper[4055]: I0313 01:12:50.195010 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-slash\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.195544 master-0 kubenswrapper[4055]: I0313 01:12:50.195034 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-run-systemd\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.195544 master-0 kubenswrapper[4055]: I0313 01:12:50.195053 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-run-openvswitch\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.195544 master-0 kubenswrapper[4055]: I0313 01:12:50.195077 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.195544 master-0 kubenswrapper[4055]: I0313 01:12:50.195101 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-run-ovn\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.195544 master-0 kubenswrapper[4055]: I0313 
01:12:50.195122 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-log-socket\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.195544 master-0 kubenswrapper[4055]: I0313 01:12:50.195140 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-ovnkube-config\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.195544 master-0 kubenswrapper[4055]: I0313 01:12:50.195200 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-systemd-units\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.195544 master-0 kubenswrapper[4055]: I0313 01:12:50.195221 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-ovnkube-script-lib\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.195544 master-0 kubenswrapper[4055]: I0313 01:12:50.195240 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-kubelet\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.195544 master-0 kubenswrapper[4055]: I0313 01:12:50.195261 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-etc-openvswitch\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.195544 master-0 kubenswrapper[4055]: I0313 01:12:50.195282 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-cni-netd\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.296619 master-0 kubenswrapper[4055]: I0313 01:12:50.296557 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-slash\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.296723 master-0 kubenswrapper[4055]: I0313 01:12:50.296650 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-run-systemd\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.296842 master-0 kubenswrapper[4055]: I0313 01:12:50.296772 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-slash\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.296910 master-0 kubenswrapper[4055]: I0313 01:12:50.296881 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-run-openvswitch\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.296963 master-0 kubenswrapper[4055]: I0313 01:12:50.296938 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.297115 master-0 kubenswrapper[4055]: I0313 01:12:50.297086 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-run-systemd\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.297198 master-0 kubenswrapper[4055]: I0313 01:12:50.297167 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-ovnkube-config\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.297237 master-0 kubenswrapper[4055]: I0313 01:12:50.297215 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-run-ovn\") pod \"ovnkube-node-v56ct\" (UID: 
\"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.297268 master-0 kubenswrapper[4055]: I0313 01:12:50.297214 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.297298 master-0 kubenswrapper[4055]: I0313 01:12:50.297265 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-run-ovn\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.297343 master-0 kubenswrapper[4055]: I0313 01:12:50.297319 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-run-openvswitch\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.297422 master-0 kubenswrapper[4055]: I0313 01:12:50.297406 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-log-socket\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.297494 master-0 kubenswrapper[4055]: I0313 01:12:50.297482 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-kubelet\") pod \"ovnkube-node-v56ct\" (UID: 
\"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.297574 master-0 kubenswrapper[4055]: I0313 01:12:50.297561 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-systemd-units\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.297660 master-0 kubenswrapper[4055]: I0313 01:12:50.297645 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-ovnkube-script-lib\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.297763 master-0 kubenswrapper[4055]: I0313 01:12:50.297750 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-etc-openvswitch\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.297844 master-0 kubenswrapper[4055]: I0313 01:12:50.297577 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-log-socket\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.297844 master-0 kubenswrapper[4055]: I0313 01:12:50.297658 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-systemd-units\") pod \"ovnkube-node-v56ct\" (UID: 
\"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.297910 master-0 kubenswrapper[4055]: I0313 01:12:50.297829 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-cni-netd\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.297910 master-0 kubenswrapper[4055]: I0313 01:12:50.297563 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-kubelet\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.298013 master-0 kubenswrapper[4055]: I0313 01:12:50.298000 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-cni-netd\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.298110 master-0 kubenswrapper[4055]: I0313 01:12:50.298097 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-etc-openvswitch\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.298220 master-0 kubenswrapper[4055]: I0313 01:12:50.298178 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-cni-bin\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.298272 master-0 kubenswrapper[4055]: I0313 01:12:50.298235 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-env-overrides\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.298302 master-0 kubenswrapper[4055]: I0313 01:12:50.298272 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tfdv\" (UniqueName: \"kubernetes.io/projected/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-kube-api-access-6tfdv\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.298336 master-0 kubenswrapper[4055]: I0313 01:12:50.298295 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-cni-bin\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.298336 master-0 kubenswrapper[4055]: I0313 01:12:50.298305 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-node-log\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.298390 master-0 kubenswrapper[4055]: I0313 01:12:50.298351 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-node-log\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" 
Mar 13 01:12:50.298390 master-0 kubenswrapper[4055]: I0313 01:12:50.298376 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-run-netns\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.298448 master-0 kubenswrapper[4055]: I0313 01:12:50.298429 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-run-ovn-kubernetes\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.298516 master-0 kubenswrapper[4055]: I0313 01:12:50.298481 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-ovn-node-metrics-cert\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.298588 master-0 kubenswrapper[4055]: I0313 01:12:50.298561 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-run-netns\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.298623 master-0 kubenswrapper[4055]: I0313 01:12:50.298575 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-run-ovn-kubernetes\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.298699 master-0 kubenswrapper[4055]: I0313 01:12:50.298671 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-var-lib-openvswitch\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.298737 master-0 kubenswrapper[4055]: I0313 01:12:50.298675 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-ovnkube-script-lib\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.298770 master-0 kubenswrapper[4055]: I0313 01:12:50.298732 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-var-lib-openvswitch\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.298887 master-0 kubenswrapper[4055]: I0313 01:12:50.298860 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-env-overrides\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.299289 master-0 kubenswrapper[4055]: I0313 01:12:50.299248 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-ovnkube-config\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.303320 master-0 kubenswrapper[4055]: I0313 01:12:50.303284 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-ovn-node-metrics-cert\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.328688 master-0 kubenswrapper[4055]: I0313 01:12:50.328563 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tfdv\" (UniqueName: \"kubernetes.io/projected/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-kube-api-access-6tfdv\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.501419 master-0 kubenswrapper[4055]: I0313 01:12:50.500935 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:50.519280 master-0 kubenswrapper[4055]: W0313 01:12:50.519245 4055 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4edb3e1a_9082_4fc2_ae6f_99d49c078a34.slice/crio-4aad5a88dbaf47dc9a34673f221901d51a030ad11786e8c666462ce349290777 WatchSource:0}: Error finding container 4aad5a88dbaf47dc9a34673f221901d51a030ad11786e8c666462ce349290777: Status 404 returned error can't find the container with id 4aad5a88dbaf47dc9a34673f221901d51a030ad11786e8c666462ce349290777 Mar 13 01:12:50.685494 master-0 kubenswrapper[4055]: I0313 01:12:50.685408 4055 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33108f6a-7f8c-46c6-b93d-e5e820035677" path="/var/lib/kubelet/pods/33108f6a-7f8c-46c6-b93d-e5e820035677/volumes" Mar 13 01:12:50.698022 master-0 kubenswrapper[4055]: I0313 01:12:50.697968 4055 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Mar 13 01:12:51.069627 master-0 kubenswrapper[4055]: I0313 01:12:51.069510 4055 generic.go:334] "Generic (PLEG): container finished" podID="4edb3e1a-9082-4fc2-ae6f-99d49c078a34" containerID="d00fb05f88d59786ab92f821f00f790d94c0eeac3280854affdf40137d7e87d0" exitCode=0 Mar 13 01:12:51.070467 master-0 kubenswrapper[4055]: I0313 01:12:51.069620 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" event={"ID":"4edb3e1a-9082-4fc2-ae6f-99d49c078a34","Type":"ContainerDied","Data":"d00fb05f88d59786ab92f821f00f790d94c0eeac3280854affdf40137d7e87d0"} Mar 13 01:12:51.070467 master-0 kubenswrapper[4055]: I0313 01:12:51.069727 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" event={"ID":"4edb3e1a-9082-4fc2-ae6f-99d49c078a34","Type":"ContainerStarted","Data":"4aad5a88dbaf47dc9a34673f221901d51a030ad11786e8c666462ce349290777"} Mar 13 01:12:51.082554 master-0 kubenswrapper[4055]: I0313 01:12:51.082457 4055 generic.go:334] "Generic (PLEG): container finished" podID="4738c93d-62e6-44ce-a289-e646b9302e71" containerID="f64c75ed084248ad496cb98f6981ac7735f162ce7e7121ef5597b4e213d85ac5" exitCode=0 Mar 13 01:12:51.082710 master-0 kubenswrapper[4055]: I0313 01:12:51.082536 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xn5t5" event={"ID":"4738c93d-62e6-44ce-a289-e646b9302e71","Type":"ContainerDied","Data":"f64c75ed084248ad496cb98f6981ac7735f162ce7e7121ef5597b4e213d85ac5"} Mar 13 01:12:51.141295 master-0 kubenswrapper[4055]: I0313 01:12:51.141145 4055 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/bootstrap-kube-controller-manager-master-0" podStartSLOduration=1.141114239 podStartE2EDuration="1.141114239s" podCreationTimestamp="2026-03-13 01:12:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:12:51.100386549 +0000 UTC m=+109.263445627" watchObservedRunningTime="2026-03-13 01:12:51.141114239 +0000 UTC m=+109.304173317" Mar 13 01:12:51.680424 master-0 kubenswrapper[4055]: I0313 01:12:51.679983 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xs8pt" Mar 13 01:12:51.680648 master-0 kubenswrapper[4055]: E0313 01:12:51.680577 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xs8pt" podUID="d5456c8b-3c98-4824-8700-a04e9c12fb2e" Mar 13 01:12:51.681051 master-0 kubenswrapper[4055]: I0313 01:12:51.680990 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh5fh" Mar 13 01:12:51.681234 master-0 kubenswrapper[4055]: E0313 01:12:51.681181 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zh5fh" podUID="e68ab3cb-c372-45d9-a758-beaf4c213714" Mar 13 01:12:52.097945 master-0 kubenswrapper[4055]: I0313 01:12:52.097876 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" event={"ID":"4edb3e1a-9082-4fc2-ae6f-99d49c078a34","Type":"ContainerStarted","Data":"6f3c125406dd049ae146bd63c19d6c7751af1aaba13f654faef7c93feda70502"} Mar 13 01:12:52.097945 master-0 kubenswrapper[4055]: I0313 01:12:52.097940 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" event={"ID":"4edb3e1a-9082-4fc2-ae6f-99d49c078a34","Type":"ContainerStarted","Data":"058d442f8a7b28b68d71c3cb585c141212a638e67e2da68b2c8ad34aca404bce"} Mar 13 01:12:52.097945 master-0 kubenswrapper[4055]: I0313 01:12:52.097960 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" event={"ID":"4edb3e1a-9082-4fc2-ae6f-99d49c078a34","Type":"ContainerStarted","Data":"af9d5d0f91bc0e656e6c91ebb49055a49afe9bcd74078a12229c0c2fefb58c67"} Mar 13 01:12:52.099204 master-0 kubenswrapper[4055]: I0313 01:12:52.097977 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" event={"ID":"4edb3e1a-9082-4fc2-ae6f-99d49c078a34","Type":"ContainerStarted","Data":"2beb02912bdaac5a89bd9c0a35f1a0ecbbb1b7712fbf4f2d4b727635965b2220"} Mar 13 01:12:52.099204 master-0 kubenswrapper[4055]: I0313 01:12:52.097995 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" event={"ID":"4edb3e1a-9082-4fc2-ae6f-99d49c078a34","Type":"ContainerStarted","Data":"304c6fa572f5a2a3d04dd53d579bc265c0916e17447bcb96ea06ac71632cf34e"} Mar 13 01:12:52.099204 master-0 kubenswrapper[4055]: I0313 01:12:52.098013 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" 
event={"ID":"4edb3e1a-9082-4fc2-ae6f-99d49c078a34","Type":"ContainerStarted","Data":"e9580dbc7c8945af7e90dbb4d0ac5dd7e5416bc26b81c8682f1990f08179f549"} Mar 13 01:12:52.103668 master-0 kubenswrapper[4055]: I0313 01:12:52.103598 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xn5t5" event={"ID":"4738c93d-62e6-44ce-a289-e646b9302e71","Type":"ContainerStarted","Data":"8eb67e42af9d810a3c1ea7a782f9e3f142578934d85597656291d9067249b1cf"} Mar 13 01:12:52.133208 master-0 kubenswrapper[4055]: I0313 01:12:52.133060 4055 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-xn5t5" podStartSLOduration=4.790423586 podStartE2EDuration="44.133025621s" podCreationTimestamp="2026-03-13 01:12:08 +0000 UTC" firstStartedPulling="2026-03-13 01:12:09.254347936 +0000 UTC m=+67.417407004" lastFinishedPulling="2026-03-13 01:12:48.596949961 +0000 UTC m=+106.760009039" observedRunningTime="2026-03-13 01:12:52.131059136 +0000 UTC m=+110.294118214" watchObservedRunningTime="2026-03-13 01:12:52.133025621 +0000 UTC m=+110.296084689" Mar 13 01:12:53.678941 master-0 kubenswrapper[4055]: I0313 01:12:53.678448 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh5fh" Mar 13 01:12:53.679876 master-0 kubenswrapper[4055]: I0313 01:12:53.678495 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xs8pt" Mar 13 01:12:53.679876 master-0 kubenswrapper[4055]: E0313 01:12:53.679034 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zh5fh" podUID="e68ab3cb-c372-45d9-a758-beaf4c213714" Mar 13 01:12:53.679876 master-0 kubenswrapper[4055]: E0313 01:12:53.679114 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xs8pt" podUID="d5456c8b-3c98-4824-8700-a04e9c12fb2e" Mar 13 01:12:55.121276 master-0 kubenswrapper[4055]: I0313 01:12:55.121183 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" event={"ID":"4edb3e1a-9082-4fc2-ae6f-99d49c078a34","Type":"ContainerStarted","Data":"30b9d5e645090bb14ab86716310f41fee385833f79403de6a30e627d0d0e329a"} Mar 13 01:12:55.679114 master-0 kubenswrapper[4055]: I0313 01:12:55.679031 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xs8pt" Mar 13 01:12:55.679114 master-0 kubenswrapper[4055]: I0313 01:12:55.679080 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh5fh" Mar 13 01:12:55.679616 master-0 kubenswrapper[4055]: E0313 01:12:55.679204 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xs8pt" podUID="d5456c8b-3c98-4824-8700-a04e9c12fb2e" Mar 13 01:12:55.679616 master-0 kubenswrapper[4055]: E0313 01:12:55.679365 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zh5fh" podUID="e68ab3cb-c372-45d9-a758-beaf4c213714" Mar 13 01:12:56.355850 master-0 kubenswrapper[4055]: I0313 01:12:56.355778 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnxgm\" (UniqueName: \"kubernetes.io/projected/d5456c8b-3c98-4824-8700-a04e9c12fb2e-kube-api-access-mnxgm\") pod \"network-check-target-xs8pt\" (UID: \"d5456c8b-3c98-4824-8700-a04e9c12fb2e\") " pod="openshift-network-diagnostics/network-check-target-xs8pt" Mar 13 01:12:56.356540 master-0 kubenswrapper[4055]: E0313 01:12:56.356030 4055 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 01:12:56.356540 master-0 kubenswrapper[4055]: E0313 01:12:56.356076 4055 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 01:12:56.356540 master-0 kubenswrapper[4055]: E0313 01:12:56.356096 4055 projected.go:194] Error preparing data for projected volume kube-api-access-mnxgm for pod openshift-network-diagnostics/network-check-target-xs8pt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 01:12:56.356540 master-0 kubenswrapper[4055]: E0313 01:12:56.356175 4055 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d5456c8b-3c98-4824-8700-a04e9c12fb2e-kube-api-access-mnxgm podName:d5456c8b-3c98-4824-8700-a04e9c12fb2e nodeName:}" failed. No retries permitted until 2026-03-13 01:13:28.356150396 +0000 UTC m=+146.519209464 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-mnxgm" (UniqueName: "kubernetes.io/projected/d5456c8b-3c98-4824-8700-a04e9c12fb2e-kube-api-access-mnxgm") pod "network-check-target-xs8pt" (UID: "d5456c8b-3c98-4824-8700-a04e9c12fb2e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 01:12:57.132431 master-0 kubenswrapper[4055]: I0313 01:12:57.132078 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" event={"ID":"4edb3e1a-9082-4fc2-ae6f-99d49c078a34","Type":"ContainerStarted","Data":"a6f73296b53d2256d9223dd0adbce2bbf7733e0e109d3be3cdcc7fc586852e7f"} Mar 13 01:12:57.133577 master-0 kubenswrapper[4055]: I0313 01:12:57.132628 4055 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:57.133577 master-0 kubenswrapper[4055]: I0313 01:12:57.132837 4055 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:57.133577 master-0 kubenswrapper[4055]: I0313 01:12:57.132861 4055 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:57.159477 master-0 kubenswrapper[4055]: I0313 01:12:57.159422 4055 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:57.163746 master-0 kubenswrapper[4055]: I0313 01:12:57.163706 4055 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:12:57.173048 master-0 kubenswrapper[4055]: I0313 01:12:57.172765 4055 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" podStartSLOduration=7.17273484 podStartE2EDuration="7.17273484s" podCreationTimestamp="2026-03-13 01:12:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:12:57.172234326 +0000 UTC m=+115.335293434" watchObservedRunningTime="2026-03-13 01:12:57.17273484 +0000 UTC m=+115.335793918" Mar 13 01:12:57.678749 master-0 kubenswrapper[4055]: I0313 01:12:57.678683 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xs8pt" Mar 13 01:12:57.679685 master-0 kubenswrapper[4055]: E0313 01:12:57.678875 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xs8pt" podUID="d5456c8b-3c98-4824-8700-a04e9c12fb2e" Mar 13 01:12:57.679815 master-0 kubenswrapper[4055]: I0313 01:12:57.679775 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh5fh" Mar 13 01:12:57.680132 master-0 kubenswrapper[4055]: E0313 01:12:57.680098 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zh5fh" podUID="e68ab3cb-c372-45d9-a758-beaf4c213714" Mar 13 01:12:58.912660 master-0 kubenswrapper[4055]: I0313 01:12:58.912557 4055 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xs8pt"] Mar 13 01:12:58.913479 master-0 kubenswrapper[4055]: I0313 01:12:58.912773 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xs8pt" Mar 13 01:12:58.913479 master-0 kubenswrapper[4055]: E0313 01:12:58.912914 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xs8pt" podUID="d5456c8b-3c98-4824-8700-a04e9c12fb2e" Mar 13 01:12:58.915403 master-0 kubenswrapper[4055]: I0313 01:12:58.915341 4055 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zh5fh"] Mar 13 01:12:58.915531 master-0 kubenswrapper[4055]: I0313 01:12:58.915459 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh5fh" Mar 13 01:12:58.915677 master-0 kubenswrapper[4055]: E0313 01:12:58.915589 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zh5fh" podUID="e68ab3cb-c372-45d9-a758-beaf4c213714" Mar 13 01:13:00.679409 master-0 kubenswrapper[4055]: I0313 01:13:00.679328 4055 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xs8pt" Mar 13 01:13:00.680132 master-0 kubenswrapper[4055]: E0313 01:13:00.679519 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xs8pt" podUID="d5456c8b-3c98-4824-8700-a04e9c12fb2e" Mar 13 01:13:00.680132 master-0 kubenswrapper[4055]: I0313 01:13:00.679781 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh5fh" Mar 13 01:13:00.680132 master-0 kubenswrapper[4055]: E0313 01:13:00.679923 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zh5fh" podUID="e68ab3cb-c372-45d9-a758-beaf4c213714" Mar 13 01:13:02.557424 master-0 kubenswrapper[4055]: E0313 01:13:02.557342 4055 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 13 01:13:02.679467 master-0 kubenswrapper[4055]: I0313 01:13:02.679347 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xs8pt" Mar 13 01:13:02.679467 master-0 kubenswrapper[4055]: I0313 01:13:02.679410 4055 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh5fh"
Mar 13 01:13:02.680898 master-0 kubenswrapper[4055]: E0313 01:13:02.680809 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xs8pt" podUID="d5456c8b-3c98-4824-8700-a04e9c12fb2e"
Mar 13 01:13:02.681021 master-0 kubenswrapper[4055]: E0313 01:13:02.680990 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zh5fh" podUID="e68ab3cb-c372-45d9-a758-beaf4c213714"
Mar 13 01:13:02.743823 master-0 kubenswrapper[4055]: E0313 01:13:02.743759 4055 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 13 01:13:04.679406 master-0 kubenswrapper[4055]: I0313 01:13:04.679067 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xs8pt"
Mar 13 01:13:04.680284 master-0 kubenswrapper[4055]: E0313 01:13:04.679503 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xs8pt" podUID="d5456c8b-3c98-4824-8700-a04e9c12fb2e"
Mar 13 01:13:04.680284 master-0 kubenswrapper[4055]: I0313 01:13:04.679675 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh5fh"
Mar 13 01:13:04.680284 master-0 kubenswrapper[4055]: E0313 01:13:04.679805 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zh5fh" podUID="e68ab3cb-c372-45d9-a758-beaf4c213714"
Mar 13 01:13:06.679663 master-0 kubenswrapper[4055]: I0313 01:13:06.679553 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh5fh"
Mar 13 01:13:06.680494 master-0 kubenswrapper[4055]: E0313 01:13:06.679782 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zh5fh" podUID="e68ab3cb-c372-45d9-a758-beaf4c213714"
Mar 13 01:13:06.680494 master-0 kubenswrapper[4055]: I0313 01:13:06.680039 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xs8pt"
Mar 13 01:13:06.680494 master-0 kubenswrapper[4055]: E0313 01:13:06.680114 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xs8pt" podUID="d5456c8b-3c98-4824-8700-a04e9c12fb2e"
Mar 13 01:13:08.679327 master-0 kubenswrapper[4055]: I0313 01:13:08.679242 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh5fh"
Mar 13 01:13:08.680174 master-0 kubenswrapper[4055]: I0313 01:13:08.679321 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xs8pt"
Mar 13 01:13:08.681937 master-0 kubenswrapper[4055]: I0313 01:13:08.681876 4055 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 13 01:13:08.682668 master-0 kubenswrapper[4055]: I0313 01:13:08.682573 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 13 01:13:08.682899 master-0 kubenswrapper[4055]: I0313 01:13:08.682837 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 13 01:13:09.854077 master-0 kubenswrapper[4055]: I0313 01:13:09.854012 4055 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeReady"
Mar 13 01:13:10.308948 master-0 kubenswrapper[4055]: I0313 01:13:10.308898 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr"]
Mar 13 01:13:10.309696 master-0 kubenswrapper[4055]: I0313 01:13:10.309666 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-wbmqn"]
Mar 13 01:13:10.310178 master-0 kubenswrapper[4055]: I0313 01:13:10.310149 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-wbmqn"
Mar 13 01:13:10.310725 master-0 kubenswrapper[4055]: I0313 01:13:10.310625 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-sslxh"]
Mar 13 01:13:10.310995 master-0 kubenswrapper[4055]: I0313 01:13:10.310961 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr"
Mar 13 01:13:10.312765 master-0 kubenswrapper[4055]: I0313 01:13:10.311102 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-sslxh"
Mar 13 01:13:10.313805 master-0 kubenswrapper[4055]: I0313 01:13:10.313734 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-677db989d6-kdn2l"]
Mar 13 01:13:10.314475 master-0 kubenswrapper[4055]: I0313 01:13:10.314420 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l"
Mar 13 01:13:10.319153 master-0 kubenswrapper[4055]: I0313 01:13:10.315066 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8"]
Mar 13 01:13:10.319153 master-0 kubenswrapper[4055]: I0313 01:13:10.315545 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8"
Mar 13 01:13:10.320669 master-0 kubenswrapper[4055]: I0313 01:13:10.320577 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 13 01:13:10.320669 master-0 kubenswrapper[4055]: I0313 01:13:10.320597 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images"
Mar 13 01:13:10.320972 master-0 kubenswrapper[4055]: I0313 01:13:10.320938 4055 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert"
Mar 13 01:13:10.322577 master-0 kubenswrapper[4055]: I0313 01:13:10.322541 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 13 01:13:10.323158 master-0 kubenswrapper[4055]: I0313 01:13:10.323126 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 13 01:13:10.323894 master-0 kubenswrapper[4055]: I0313 01:13:10.323861 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 13 01:13:10.324181 master-0 kubenswrapper[4055]: I0313 01:13:10.324138 4055 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 13 01:13:10.324408 master-0 kubenswrapper[4055]: I0313 01:13:10.323977 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 13 01:13:10.324584 master-0 kubenswrapper[4055]: I0313 01:13:10.324542 4055 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 13 01:13:10.324735 master-0 kubenswrapper[4055]: I0313 01:13:10.324582 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-v9pv6"]
Mar 13 01:13:10.324857 master-0 kubenswrapper[4055]: I0313 01:13:10.324817 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 13 01:13:10.325119 master-0 kubenswrapper[4055]: I0313 01:13:10.324072 4055 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls"
Mar 13 01:13:10.325293 master-0 kubenswrapper[4055]: I0313 01:13:10.325243 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 13 01:13:10.325396 master-0 kubenswrapper[4055]: I0313 01:13:10.325133 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-v9nfg"]
Mar 13 01:13:10.325396 master-0 kubenswrapper[4055]: I0313 01:13:10.325326 4055 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 13 01:13:10.325550 master-0 kubenswrapper[4055]: I0313 01:13:10.324481 4055 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 13 01:13:10.325662 master-0 kubenswrapper[4055]: I0313 01:13:10.325255 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-v9pv6"
Mar 13 01:13:10.326230 master-0 kubenswrapper[4055]: I0313 01:13:10.326175 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-v9nfg"
Mar 13 01:13:10.326909 master-0 kubenswrapper[4055]: I0313 01:13:10.326857 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t"]
Mar 13 01:13:10.334233 master-0 kubenswrapper[4055]: I0313 01:13:10.327334 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t"
Mar 13 01:13:10.335174 master-0 kubenswrapper[4055]: I0313 01:13:10.335111 4055 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 13 01:13:10.335595 master-0 kubenswrapper[4055]: I0313 01:13:10.335528 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 13 01:13:10.335760 master-0 kubenswrapper[4055]: I0313 01:13:10.335584 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8"]
Mar 13 01:13:10.337126 master-0 kubenswrapper[4055]: I0313 01:13:10.337066 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69b6fc6b88-2v42g"]
Mar 13 01:13:10.337303 master-0 kubenswrapper[4055]: I0313 01:13:10.337262 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8"
Mar 13 01:13:10.341159 master-0 kubenswrapper[4055]: I0313 01:13:10.338130 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 13 01:13:10.341159 master-0 kubenswrapper[4055]: I0313 01:13:10.338856 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 13 01:13:10.341159 master-0 kubenswrapper[4055]: I0313 01:13:10.340448 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy"
Mar 13 01:13:10.345764 master-0 kubenswrapper[4055]: I0313 01:13:10.342483 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 13 01:13:10.345764 master-0 kubenswrapper[4055]: I0313 01:13:10.342676 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 13 01:13:10.345764 master-0 kubenswrapper[4055]: I0313 01:13:10.342921 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 13 01:13:10.346112 master-0 kubenswrapper[4055]: I0313 01:13:10.345824 4055 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 13 01:13:10.346112 master-0 kubenswrapper[4055]: I0313 01:13:10.346025 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 13 01:13:10.346112 master-0 kubenswrapper[4055]: I0313 01:13:10.346100 4055 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 13 01:13:10.358886 master-0 kubenswrapper[4055]: I0313 01:13:10.357159 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-64bf9778cb-dszg5"]
Mar 13 01:13:10.358886 master-0 kubenswrapper[4055]: I0313 01:13:10.358534 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj"]
Mar 13 01:13:10.358886 master-0 kubenswrapper[4055]: I0313 01:13:10.358776 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-2v42g"
Mar 13 01:13:10.359710 master-0 kubenswrapper[4055]: I0313 01:13:10.359664 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5"
Mar 13 01:13:10.366755 master-0 kubenswrapper[4055]: I0313 01:13:10.364527 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf"]
Mar 13 01:13:10.366755 master-0 kubenswrapper[4055]: I0313 01:13:10.364998 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2"]
Mar 13 01:13:10.366755 master-0 kubenswrapper[4055]: I0313 01:13:10.365378 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2"
Mar 13 01:13:10.366755 master-0 kubenswrapper[4055]: I0313 01:13:10.365788 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj"
Mar 13 01:13:10.366755 master-0 kubenswrapper[4055]: I0313 01:13:10.365861 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf"
Mar 13 01:13:10.367957 master-0 kubenswrapper[4055]: I0313 01:13:10.367915 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-c2xl8"]
Mar 13 01:13:10.368223 master-0 kubenswrapper[4055]: I0313 01:13:10.368187 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 13 01:13:10.368288 master-0 kubenswrapper[4055]: I0313 01:13:10.368276 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-589895fbb7-qvl2k"]
Mar 13 01:13:10.368340 master-0 kubenswrapper[4055]: I0313 01:13:10.368326 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt"
Mar 13 01:13:10.368453 master-0 kubenswrapper[4055]: I0313 01:13:10.368424 4055 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Mar 13 01:13:10.368758 master-0 kubenswrapper[4055]: I0313 01:13:10.368724 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-589895fbb7-qvl2k"
Mar 13 01:13:10.368996 master-0 kubenswrapper[4055]: I0313 01:13:10.368961 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-c2xl8"
Mar 13 01:13:10.370030 master-0 kubenswrapper[4055]: I0313 01:13:10.369115 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 13 01:13:10.370225 master-0 kubenswrapper[4055]: I0313 01:13:10.370071 4055 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 13 01:13:10.370459 master-0 kubenswrapper[4055]: I0313 01:13:10.370409 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt"
Mar 13 01:13:10.370525 master-0 kubenswrapper[4055]: I0313 01:13:10.370501 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config"
Mar 13 01:13:10.370525 master-0 kubenswrapper[4055]: I0313 01:13:10.370510 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq"]
Mar 13 01:13:10.370674 master-0 kubenswrapper[4055]: I0313 01:13:10.370621 4055 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 13 01:13:10.370852 master-0 kubenswrapper[4055]: I0313 01:13:10.370692 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 13 01:13:10.370917 master-0 kubenswrapper[4055]: I0313 01:13:10.370887 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq"
Mar 13 01:13:10.370917 master-0 kubenswrapper[4055]: I0313 01:13:10.370897 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 13 01:13:10.371018 master-0 kubenswrapper[4055]: I0313 01:13:10.370942 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 13 01:13:10.371018 master-0 kubenswrapper[4055]: I0313 01:13:10.370799 4055 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 13 01:13:10.371117 master-0 kubenswrapper[4055]: I0313 01:13:10.370799 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 13 01:13:10.371117 master-0 kubenswrapper[4055]: I0313 01:13:10.371060 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 13 01:13:10.371117 master-0 kubenswrapper[4055]: I0313 01:13:10.370802 4055 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 13 01:13:10.372544 master-0 kubenswrapper[4055]: I0313 01:13:10.372498 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 13 01:13:10.372920 master-0 kubenswrapper[4055]: I0313 01:13:10.372811 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 13 01:13:10.374756 master-0 kubenswrapper[4055]: I0313 01:13:10.374710 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9"]
Mar 13 01:13:10.375278 master-0 kubenswrapper[4055]: I0313 01:13:10.375237 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9"
Mar 13 01:13:10.377259 master-0 kubenswrapper[4055]: I0313 01:13:10.377091 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-ck7rt"]
Mar 13 01:13:10.378042 master-0 kubenswrapper[4055]: I0313 01:13:10.378003 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-ck7rt"
Mar 13 01:13:10.378344 master-0 kubenswrapper[4055]: I0313 01:13:10.378303 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 13 01:13:10.378508 master-0 kubenswrapper[4055]: I0313 01:13:10.378470 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 13 01:13:10.378726 master-0 kubenswrapper[4055]: I0313 01:13:10.378670 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 13 01:13:10.378808 master-0 kubenswrapper[4055]: I0313 01:13:10.378749 4055 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 13 01:13:10.379095 master-0 kubenswrapper[4055]: I0313 01:13:10.379054 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 13 01:13:10.379390 master-0 kubenswrapper[4055]: I0313 01:13:10.379351 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 13 01:13:10.379493 master-0 kubenswrapper[4055]: I0313 01:13:10.379459 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 13 01:13:10.379602 master-0 kubenswrapper[4055]: I0313 01:13:10.379571 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 13 01:13:10.379723 master-0 kubenswrapper[4055]: I0313 01:13:10.379709 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 13 01:13:10.379784 master-0 kubenswrapper[4055]: I0313 01:13:10.379727 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-mvmt2"]
Mar 13 01:13:10.379867 master-0 kubenswrapper[4055]: I0313 01:13:10.379847 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 13 01:13:10.380065 master-0 kubenswrapper[4055]: I0313 01:13:10.380015 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 13 01:13:10.380569 master-0 kubenswrapper[4055]: I0313 01:13:10.380524 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 13 01:13:10.380569 master-0 kubenswrapper[4055]: I0313 01:13:10.380558 4055 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 13 01:13:10.380853 master-0 kubenswrapper[4055]: I0313 01:13:10.380802 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 13 01:13:10.381022 master-0 kubenswrapper[4055]: I0313 01:13:10.380984 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-mvmt2"
Mar 13 01:13:10.381570 master-0 kubenswrapper[4055]: I0313 01:13:10.380995 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt"
Mar 13 01:13:10.381688 master-0 kubenswrapper[4055]: I0313 01:13:10.381043 4055 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert"
Mar 13 01:13:10.381844 master-0 kubenswrapper[4055]: I0313 01:13:10.381804 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz"]
Mar 13 01:13:10.381844 master-0 kubenswrapper[4055]: I0313 01:13:10.381834 4055 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 13 01:13:10.381951 master-0 kubenswrapper[4055]: I0313 01:13:10.381902 4055 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert"
Mar 13 01:13:10.382089 master-0 kubenswrapper[4055]: I0313 01:13:10.382051 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 13 01:13:10.382182 master-0 kubenswrapper[4055]: I0313 01:13:10.382161 4055 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert"
Mar 13 01:13:10.382290 master-0 kubenswrapper[4055]: I0313 01:13:10.382256 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz"
Mar 13 01:13:10.382290 master-0 kubenswrapper[4055]: I0313 01:13:10.382167 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 13 01:13:10.382964 master-0 kubenswrapper[4055]: I0313 01:13:10.382927 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt"
Mar 13 01:13:10.383095 master-0 kubenswrapper[4055]: I0313 01:13:10.383059 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-7nstm"]
Mar 13 01:13:10.383182 master-0 kubenswrapper[4055]: I0313 01:13:10.383162 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt"
Mar 13 01:13:10.383313 master-0 kubenswrapper[4055]: I0313 01:13:10.383271 4055 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls"
Mar 13 01:13:10.383650 master-0 kubenswrapper[4055]: I0313 01:13:10.383548 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt"
Mar 13 01:13:10.383744 master-0 kubenswrapper[4055]: I0313 01:13:10.383565 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt"
Mar 13 01:13:10.386067 master-0 kubenswrapper[4055]: I0313 01:13:10.385887 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mlslx"]
Mar 13 01:13:10.386276 master-0 kubenswrapper[4055]: I0313 01:13:10.386159 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-7nstm"
Mar 13 01:13:10.387515 master-0 kubenswrapper[4055]: I0313 01:13:10.387314 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 13 01:13:10.387607 master-0 kubenswrapper[4055]: I0313 01:13:10.387577 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 13 01:13:10.388131 master-0 kubenswrapper[4055]: I0313 01:13:10.388098 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxf58\" (UniqueName: \"kubernetes.io/projected/8c6bf2d5-1881-4b63-b247-7e7426707fa1-kube-api-access-vxf58\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr"
Mar 13 01:13:10.388867 master-0 kubenswrapper[4055]: I0313 01:13:10.388828 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f97819d0-2840-4352-a435-19ef1a8c22c9-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-jjdk8\" (UID: \"f97819d0-2840-4352-a435-19ef1a8c22c9\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8"
Mar 13 01:13:10.388990 master-0 kubenswrapper[4055]: I0313 01:13:10.388190 4055 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 13 01:13:10.389051 master-0 kubenswrapper[4055]: I0313 01:13:10.389023 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt"
Mar 13 01:13:10.389091 master-0 kubenswrapper[4055]: I0313 01:13:10.388960 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f97819d0-2840-4352-a435-19ef1a8c22c9-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-jjdk8\" (UID: \"f97819d0-2840-4352-a435-19ef1a8c22c9\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8"
Mar 13 01:13:10.389137 master-0 kubenswrapper[4055]: I0313 01:13:10.388225 4055 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 13 01:13:10.389173 master-0 kubenswrapper[4055]: I0313 01:13:10.389126 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz7v9\" (UniqueName: \"kubernetes.io/projected/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-kube-api-access-bz7v9\") pod \"cluster-monitoring-operator-674cbfbd9d-2tr2t\" (UID: \"7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t"
Mar 13 01:13:10.389242 master-0 kubenswrapper[4055]: I0313 01:13:10.389219 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-2tr2t\" (UID: \"7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t"
Mar 13 01:13:10.389292 master-0 kubenswrapper[4055]: I0313 01:13:10.389256 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58035e42-37d8-48f6-9861-9b4ce6014119-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-v9pv6\" (UID: \"58035e42-37d8-48f6-9861-9b4ce6014119\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-v9pv6"
Mar 13 01:13:10.389292 master-0 kubenswrapper[4055]: I0313 01:13:10.389284 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/916d9fc9-388b-4506-a17c-36a7f626356a-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-v9nfg\" (UID: \"916d9fc9-388b-4506-a17c-36a7f626356a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-v9nfg"
Mar 13 01:13:10.389369 master-0 kubenswrapper[4055]: I0313 01:13:10.389309 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/039acb44-a9b3-4ad6-a091-be4d18edc34f-config\") pod \"openshift-controller-manager-operator-8565d84698-sslxh\" (UID: \"039acb44-a9b3-4ad6-a091-be4d18edc34f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-sslxh"
Mar 13 01:13:10.389369 master-0 kubenswrapper[4055]: I0313 01:13:10.389334 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4976e608-07a0-4cef-8fdd-7cec3324b4b5-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-6slg8\" (UID: \"4976e608-07a0-4cef-8fdd-7cec3324b4b5\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8"
Mar 13 01:13:10.389369 master-0 kubenswrapper[4055]: I0313 01:13:10.389357 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22587300-2448-4862-9fd8-68197d17a9f2-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-wbmqn\" (UID: \"22587300-2448-4862-9fd8-68197d17a9f2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-wbmqn"
Mar 13 01:13:10.389477 master-0 kubenswrapper[4055]: I0313 01:13:10.389382 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr"
Mar 13 01:13:10.389477 master-0 kubenswrapper[4055]: I0313 01:13:10.389405 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c6bf2d5-1881-4b63-b247-7e7426707fa1-config\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr"
Mar 13 01:13:10.389477 master-0 kubenswrapper[4055]: I0313 01:13:10.389431 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8c6bf2d5-1881-4b63-b247-7e7426707fa1-images\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr"
Mar 13 01:13:10.389477 master-0 kubenswrapper[4055]: I0313 01:13:10.389452 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/039acb44-a9b3-4ad6-a091-be4d18edc34f-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-sslxh\" (UID: \"039acb44-a9b3-4ad6-a091-be4d18edc34f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-sslxh"
Mar 13 01:13:10.389616 master-0 kubenswrapper[4055]: I0313 01:13:10.389476 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95d4e785-6663-417d-b380-6905773613c8-config\") pod \"service-ca-operator-69b6fc6b88-2v42g\" (UID: \"95d4e785-6663-417d-b380-6905773613c8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-2v42g"
Mar 13 01:13:10.389616 master-0 kubenswrapper[4055]: I0313 01:13:10.389497 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/916d9fc9-388b-4506-a17c-36a7f626356a-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-v9nfg\" (UID: \"916d9fc9-388b-4506-a17c-36a7f626356a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-v9nfg"
Mar 13 01:13:10.389616 master-0 kubenswrapper[4055]: I0313 01:13:10.389520 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkdbm\" (UniqueName: \"kubernetes.io/projected/039acb44-a9b3-4ad6-a091-be4d18edc34f-kube-api-access-kkdbm\") pod \"openshift-controller-manager-operator-8565d84698-sslxh\" (UID: \"039acb44-a9b3-4ad6-a091-be4d18edc34f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-sslxh"
Mar 13 01:13:10.389616 master-0 kubenswrapper[4055]: I0313 01:13:10.389545 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-metrics-tls\") pod \"ingress-operator-677db989d6-kdn2l\" (UID: \"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l"
Mar 13 01:13:10.389616 master-0 kubenswrapper[4055]: I0313 01:13:10.389569 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-bound-sa-token\") pod \"ingress-operator-677db989d6-kdn2l\" (UID: \"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l"
Mar 13 01:13:10.389616 master-0 kubenswrapper[4055]: I0313 01:13:10.389592 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2m48\" (UniqueName: \"kubernetes.io/projected/4976e608-07a0-4cef-8fdd-7cec3324b4b5-kube-api-access-w2m48\") pod \"machine-config-operator-fdb5c78b5-6slg8\" (UID: \"4976e608-07a0-4cef-8fdd-7cec3324b4b5\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8"
Mar 13 01:13:10.389854 master-0 kubenswrapper[4055]: I0313 01:13:10.389709 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhcll\" (UniqueName: \"kubernetes.io/projected/95d4e785-6663-417d-b380-6905773613c8-kube-api-access-nhcll\") pod \"service-ca-operator-69b6fc6b88-2v42g\" (UID: \"95d4e785-6663-417d-b380-6905773613c8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-2v42g"
Mar 13 01:13:10.389854 master-0 kubenswrapper[4055]: I0313 01:13:10.389761 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 13 01:13:10.389854 master-0 kubenswrapper[4055]: I0313 01:13:10.389776 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr"
Mar 13 01:13:10.389854 master-0 kubenswrapper[4055]: I0313 01:13:10.389805 4055
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f97819d0-2840-4352-a435-19ef1a8c22c9-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-jjdk8\" (UID: \"f97819d0-2840-4352-a435-19ef1a8c22c9\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8" Mar 13 01:13:10.389854 master-0 kubenswrapper[4055]: I0313 01:13:10.389833 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-trusted-ca\") pod \"ingress-operator-677db989d6-kdn2l\" (UID: \"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l" Mar 13 01:13:10.390042 master-0 kubenswrapper[4055]: I0313 01:13:10.389869 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-2tr2t\" (UID: \"7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t" Mar 13 01:13:10.390042 master-0 kubenswrapper[4055]: I0313 01:13:10.389903 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22587300-2448-4862-9fd8-68197d17a9f2-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-wbmqn\" (UID: \"22587300-2448-4862-9fd8-68197d17a9f2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-wbmqn" Mar 13 01:13:10.390042 master-0 kubenswrapper[4055]: I0313 01:13:10.389938 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/58035e42-37d8-48f6-9861-9b4ce6014119-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-v9pv6\" (UID: \"58035e42-37d8-48f6-9861-9b4ce6014119\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-v9pv6" Mar 13 01:13:10.390691 master-0 kubenswrapper[4055]: I0313 01:13:10.390595 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4976e608-07a0-4cef-8fdd-7cec3324b4b5-images\") pod \"machine-config-operator-fdb5c78b5-6slg8\" (UID: \"4976e608-07a0-4cef-8fdd-7cec3324b4b5\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8" Mar 13 01:13:10.390691 master-0 kubenswrapper[4055]: I0313 01:13:10.390672 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95d4e785-6663-417d-b380-6905773613c8-serving-cert\") pod \"service-ca-operator-69b6fc6b88-2v42g\" (UID: \"95d4e785-6663-417d-b380-6905773613c8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-2v42g" Mar 13 01:13:10.390824 master-0 kubenswrapper[4055]: I0313 01:13:10.390800 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxctn\" (UniqueName: \"kubernetes.io/projected/78d2cd80-23b9-426d-a7ac-1daa27668a47-kube-api-access-mxctn\") pod \"marketplace-operator-64bf9778cb-dszg5\" (UID: \"78d2cd80-23b9-426d-a7ac-1daa27668a47\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" Mar 13 01:13:10.390874 master-0 kubenswrapper[4055]: I0313 01:13:10.390862 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22587300-2448-4862-9fd8-68197d17a9f2-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-wbmqn\" (UID: 
\"22587300-2448-4862-9fd8-68197d17a9f2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-wbmqn" Mar 13 01:13:10.390916 master-0 kubenswrapper[4055]: I0313 01:13:10.390888 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-dszg5\" (UID: \"78d2cd80-23b9-426d-a7ac-1daa27668a47\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" Mar 13 01:13:10.391860 master-0 kubenswrapper[4055]: I0313 01:13:10.391827 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-dszg5\" (UID: \"78d2cd80-23b9-426d-a7ac-1daa27668a47\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" Mar 13 01:13:10.391925 master-0 kubenswrapper[4055]: I0313 01:13:10.391866 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj"] Mar 13 01:13:10.391925 master-0 kubenswrapper[4055]: I0313 01:13:10.391882 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-nrzpj\" (UID: \"0d4e6150-432c-4a11-b5a6-4d62dd701fc8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj" Mar 13 01:13:10.391996 master-0 kubenswrapper[4055]: I0313 01:13:10.391938 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg7x6\" (UniqueName: 
\"kubernetes.io/projected/916d9fc9-388b-4506-a17c-36a7f626356a-kube-api-access-jg7x6\") pod \"kube-storage-version-migrator-operator-7f65c457f5-v9nfg\" (UID: \"916d9fc9-388b-4506-a17c-36a7f626356a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-v9nfg" Mar 13 01:13:10.392503 master-0 kubenswrapper[4055]: I0313 01:13:10.392294 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-6slg8\" (UID: \"4976e608-07a0-4cef-8fdd-7cec3324b4b5\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8" Mar 13 01:13:10.392669 master-0 kubenswrapper[4055]: I0313 01:13:10.392523 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn6w7\" (UniqueName: \"kubernetes.io/projected/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-kube-api-access-gn6w7\") pod \"package-server-manager-854648ff6d-nrzpj\" (UID: \"0d4e6150-432c-4a11-b5a6-4d62dd701fc8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj" Mar 13 01:13:10.392737 master-0 kubenswrapper[4055]: I0313 01:13:10.392559 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzklz\" (UniqueName: \"kubernetes.io/projected/f97819d0-2840-4352-a435-19ef1a8c22c9-kube-api-access-fzklz\") pod \"cluster-image-registry-operator-86d6d77c7c-jjdk8\" (UID: \"f97819d0-2840-4352-a435-19ef1a8c22c9\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8" Mar 13 01:13:10.393256 master-0 kubenswrapper[4055]: I0313 01:13:10.393219 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/58035e42-37d8-48f6-9861-9b4ce6014119-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-v9pv6\" (UID: \"58035e42-37d8-48f6-9861-9b4ce6014119\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-v9pv6" Mar 13 01:13:10.393309 master-0 kubenswrapper[4055]: I0313 01:13:10.393287 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29w76\" (UniqueName: \"kubernetes.io/projected/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-kube-api-access-29w76\") pod \"ingress-operator-677db989d6-kdn2l\" (UID: \"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l" Mar 13 01:13:10.393507 master-0 kubenswrapper[4055]: I0313 01:13:10.393486 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mlslx" Mar 13 01:13:10.393697 master-0 kubenswrapper[4055]: I0313 01:13:10.393674 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-tq7n6"] Mar 13 01:13:10.394702 master-0 kubenswrapper[4055]: I0313 01:13:10.394680 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj" Mar 13 01:13:10.396088 master-0 kubenswrapper[4055]: I0313 01:13:10.396041 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 13 01:13:10.396159 master-0 kubenswrapper[4055]: I0313 01:13:10.396106 4055 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 13 01:13:10.396202 master-0 kubenswrapper[4055]: I0313 01:13:10.396162 4055 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-8d675b596-tq7n6" Mar 13 01:13:10.396899 master-0 kubenswrapper[4055]: I0313 01:13:10.396881 4055 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 13 01:13:10.397046 master-0 kubenswrapper[4055]: I0313 01:13:10.396889 4055 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 13 01:13:10.397100 master-0 kubenswrapper[4055]: I0313 01:13:10.397063 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 13 01:13:10.397667 master-0 kubenswrapper[4055]: I0313 01:13:10.397625 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 13 01:13:10.398506 master-0 kubenswrapper[4055]: I0313 01:13:10.398469 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 13 01:13:10.399828 master-0 kubenswrapper[4055]: I0313 01:13:10.399792 4055 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 13 01:13:10.399890 master-0 kubenswrapper[4055]: I0313 01:13:10.399848 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 13 01:13:10.400048 master-0 kubenswrapper[4055]: I0313 01:13:10.399998 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 13 01:13:10.400768 master-0 kubenswrapper[4055]: I0313 01:13:10.400133 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 13 01:13:10.401445 master-0 kubenswrapper[4055]: I0313 01:13:10.401413 4055 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 13 01:13:10.406696 master-0 kubenswrapper[4055]: I0313 01:13:10.406317 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Mar 13 01:13:10.410356 master-0 kubenswrapper[4055]: I0313 01:13:10.410300 4055 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-wbmqn"] Mar 13 01:13:10.415218 master-0 kubenswrapper[4055]: I0313 01:13:10.415174 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 13 01:13:10.422521 master-0 kubenswrapper[4055]: I0313 01:13:10.422488 4055 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-677db989d6-kdn2l"] Mar 13 01:13:10.432458 master-0 kubenswrapper[4055]: I0313 01:13:10.423154 4055 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr"] Mar 13 01:13:10.432458 master-0 kubenswrapper[4055]: I0313 01:13:10.426800 4055 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-v9pv6"] Mar 13 01:13:10.432458 master-0 kubenswrapper[4055]: I0313 01:13:10.431076 4055 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-v9nfg"] Mar 13 01:13:10.434416 master-0 kubenswrapper[4055]: I0313 01:13:10.432707 4055 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8"] Mar 13 01:13:10.445263 master-0 kubenswrapper[4055]: I0313 01:13:10.445210 4055 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8"] Mar 13 01:13:10.445263 
master-0 kubenswrapper[4055]: I0313 01:13:10.445257 4055 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-64bf9778cb-dszg5"] Mar 13 01:13:10.446509 master-0 kubenswrapper[4055]: I0313 01:13:10.445483 4055 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2"] Mar 13 01:13:10.449689 master-0 kubenswrapper[4055]: I0313 01:13:10.448467 4055 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-7nstm"] Mar 13 01:13:10.449689 master-0 kubenswrapper[4055]: I0313 01:13:10.449342 4055 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-ck7rt"] Mar 13 01:13:10.451315 master-0 kubenswrapper[4055]: I0313 01:13:10.451290 4055 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq"] Mar 13 01:13:10.456809 master-0 kubenswrapper[4055]: I0313 01:13:10.456752 4055 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mlslx"] Mar 13 01:13:10.457383 master-0 kubenswrapper[4055]: I0313 01:13:10.457334 4055 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-tq7n6"] Mar 13 01:13:10.459020 master-0 kubenswrapper[4055]: I0313 01:13:10.458982 4055 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-qclwv"] Mar 13 01:13:10.459551 master-0 kubenswrapper[4055]: I0313 01:13:10.459511 4055 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-qclwv" Mar 13 01:13:10.461881 master-0 kubenswrapper[4055]: I0313 01:13:10.461855 4055 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 13 01:13:10.462813 master-0 kubenswrapper[4055]: I0313 01:13:10.462768 4055 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf"] Mar 13 01:13:10.465552 master-0 kubenswrapper[4055]: I0313 01:13:10.465518 4055 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-sslxh"] Mar 13 01:13:10.475016 master-0 kubenswrapper[4055]: I0313 01:13:10.474972 4055 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj"] Mar 13 01:13:10.475939 master-0 kubenswrapper[4055]: I0313 01:13:10.475885 4055 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz"] Mar 13 01:13:10.476714 master-0 kubenswrapper[4055]: I0313 01:13:10.476582 4055 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-c2xl8"] Mar 13 01:13:10.477345 master-0 kubenswrapper[4055]: I0313 01:13:10.477306 4055 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69b6fc6b88-2v42g"] Mar 13 01:13:10.478113 master-0 kubenswrapper[4055]: I0313 01:13:10.478059 4055 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-589895fbb7-qvl2k"] Mar 13 01:13:10.480176 master-0 kubenswrapper[4055]: I0313 01:13:10.480141 4055 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-mvmt2"] Mar 13 01:13:10.480176 master-0 kubenswrapper[4055]: I0313 
01:13:10.480178 4055 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj"] Mar 13 01:13:10.482000 master-0 kubenswrapper[4055]: I0313 01:13:10.481504 4055 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t"] Mar 13 01:13:10.483474 master-0 kubenswrapper[4055]: I0313 01:13:10.483065 4055 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9"] Mar 13 01:13:10.494206 master-0 kubenswrapper[4055]: I0313 01:13:10.494058 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-dszg5\" (UID: \"78d2cd80-23b9-426d-a7ac-1daa27668a47\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" Mar 13 01:13:10.494206 master-0 kubenswrapper[4055]: I0313 01:13:10.494111 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-nrzpj\" (UID: \"0d4e6150-432c-4a11-b5a6-4d62dd701fc8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj" Mar 13 01:13:10.494316 master-0 kubenswrapper[4055]: E0313 01:13:10.494239 4055 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 13 01:13:10.494316 master-0 kubenswrapper[4055]: E0313 01:13:10.494302 4055 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert podName:0d4e6150-432c-4a11-b5a6-4d62dd701fc8 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:10.99428019 +0000 UTC m=+129.157339228 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-nrzpj" (UID: "0d4e6150-432c-4a11-b5a6-4d62dd701fc8") : secret "package-server-manager-serving-cert" not found Mar 13 01:13:10.494516 master-0 kubenswrapper[4055]: E0313 01:13:10.494497 4055 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 13 01:13:10.494627 master-0 kubenswrapper[4055]: E0313 01:13:10.494616 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics podName:78d2cd80-23b9-426d-a7ac-1daa27668a47 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:10.994596149 +0000 UTC m=+129.157655187 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-dszg5" (UID: "78d2cd80-23b9-426d-a7ac-1daa27668a47") : secret "marketplace-operator-metrics" not found Mar 13 01:13:10.494709 master-0 kubenswrapper[4055]: I0313 01:13:10.494523 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/46662e51-44af-4732-83a1-9509a579b373-host-slash\") pod \"iptables-alerter-qclwv\" (UID: \"46662e51-44af-4732-83a1-9509a579b373\") " pod="openshift-network-operator/iptables-alerter-qclwv" Mar 13 01:13:10.494814 master-0 kubenswrapper[4055]: I0313 01:13:10.494796 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg7x6\" (UniqueName: \"kubernetes.io/projected/916d9fc9-388b-4506-a17c-36a7f626356a-kube-api-access-jg7x6\") pod \"kube-storage-version-migrator-operator-7f65c457f5-v9nfg\" (UID: \"916d9fc9-388b-4506-a17c-36a7f626356a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-v9nfg" Mar 13 01:13:10.496032 master-0 kubenswrapper[4055]: E0313 01:13:10.495185 4055 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: secret "mco-proxy-tls" not found Mar 13 01:13:10.496032 master-0 kubenswrapper[4055]: E0313 01:13:10.495232 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls podName:4976e608-07a0-4cef-8fdd-7cec3324b4b5 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:10.995219886 +0000 UTC m=+129.158278934 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls") pod "machine-config-operator-fdb5c78b5-6slg8" (UID: "4976e608-07a0-4cef-8fdd-7cec3324b4b5") : secret "mco-proxy-tls" not found Mar 13 01:13:10.496175 master-0 kubenswrapper[4055]: I0313 01:13:10.495055 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-6slg8\" (UID: \"4976e608-07a0-4cef-8fdd-7cec3324b4b5\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8" Mar 13 01:13:10.497049 master-0 kubenswrapper[4055]: I0313 01:13:10.496410 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn6w7\" (UniqueName: \"kubernetes.io/projected/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-kube-api-access-gn6w7\") pod \"package-server-manager-854648ff6d-nrzpj\" (UID: \"0d4e6150-432c-4a11-b5a6-4d62dd701fc8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj" Mar 13 01:13:10.497143 master-0 kubenswrapper[4055]: I0313 01:13:10.497130 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mnf5\" (UniqueName: \"kubernetes.io/projected/6c88187c-d011-4043-a6d3-4a8a7ec4e204-kube-api-access-7mnf5\") pod \"olm-operator-d64cfc9db-8l7kq\" (UID: \"6c88187c-d011-4043-a6d3-4a8a7ec4e204\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq" Mar 13 01:13:10.497230 master-0 kubenswrapper[4055]: I0313 01:13:10.497216 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74tvv\" (UniqueName: \"kubernetes.io/projected/486c7e33-3dd8-4a98-87e3-8216ee2e05ef-kube-api-access-74tvv\") pod \"openshift-apiserver-operator-799b6db4d7-mvmt2\" (UID: 
\"486c7e33-3dd8-4a98-87e3-8216ee2e05ef\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-mvmt2" Mar 13 01:13:10.497309 master-0 kubenswrapper[4055]: I0313 01:13:10.497297 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9b713fb-64ce-4a01-951c-1f31df62e1ae-serving-cert\") pod \"authentication-operator-7c6989d6c4-bxqp2\" (UID: \"f9b713fb-64ce-4a01-951c-1f31df62e1ae\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" Mar 13 01:13:10.497398 master-0 kubenswrapper[4055]: I0313 01:13:10.497386 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzklz\" (UniqueName: \"kubernetes.io/projected/f97819d0-2840-4352-a435-19ef1a8c22c9-kube-api-access-fzklz\") pod \"cluster-image-registry-operator-86d6d77c7c-jjdk8\" (UID: \"f97819d0-2840-4352-a435-19ef1a8c22c9\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8" Mar 13 01:13:10.497492 master-0 kubenswrapper[4055]: I0313 01:13:10.497476 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/58035e42-37d8-48f6-9861-9b4ce6014119-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-v9pv6\" (UID: \"58035e42-37d8-48f6-9861-9b4ce6014119\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-v9pv6" Mar 13 01:13:10.497567 master-0 kubenswrapper[4055]: I0313 01:13:10.497554 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29w76\" (UniqueName: \"kubernetes.io/projected/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-kube-api-access-29w76\") pod \"ingress-operator-677db989d6-kdn2l\" (UID: \"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l" Mar 13 01:13:10.497652 
master-0 kubenswrapper[4055]: I0313 01:13:10.497623 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/486c7e33-3dd8-4a98-87e3-8216ee2e05ef-config\") pod \"openshift-apiserver-operator-799b6db4d7-mvmt2\" (UID: \"486c7e33-3dd8-4a98-87e3-8216ee2e05ef\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-mvmt2"
Mar 13 01:13:10.497731 master-0 kubenswrapper[4055]: I0313 01:13:10.497719 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert\") pod \"olm-operator-d64cfc9db-8l7kq\" (UID: \"6c88187c-d011-4043-a6d3-4a8a7ec4e204\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq"
Mar 13 01:13:10.497818 master-0 kubenswrapper[4055]: I0313 01:13:10.497806 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs\") pod \"multus-admission-controller-8d675b596-tq7n6\" (UID: \"2bd94289-7109-4419-9a51-bd289082b9f5\") " pod="openshift-multus/multus-admission-controller-8d675b596-tq7n6"
Mar 13 01:13:10.497902 master-0 kubenswrapper[4055]: I0313 01:13:10.497889 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca2fa86b-a966-49dc-8577-d2b54b111d14-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-6fbfc8dc8f-c2xl8\" (UID: \"ca2fa86b-a966-49dc-8577-d2b54b111d14\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-c2xl8"
Mar 13 01:13:10.497972 master-0 kubenswrapper[4055]: I0313 01:13:10.497958 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9b713fb-64ce-4a01-951c-1f31df62e1ae-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-bxqp2\" (UID: \"f9b713fb-64ce-4a01-951c-1f31df62e1ae\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2"
Mar 13 01:13:10.498064 master-0 kubenswrapper[4055]: I0313 01:13:10.498051 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxf58\" (UniqueName: \"kubernetes.io/projected/8c6bf2d5-1881-4b63-b247-7e7426707fa1-kube-api-access-vxf58\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr"
Mar 13 01:13:10.498208 master-0 kubenswrapper[4055]: I0313 01:13:10.498165 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f97819d0-2840-4352-a435-19ef1a8c22c9-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-jjdk8\" (UID: \"f97819d0-2840-4352-a435-19ef1a8c22c9\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8"
Mar 13 01:13:10.498861 master-0 kubenswrapper[4055]: I0313 01:13:10.498822 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9"
Mar 13 01:13:10.498928 master-0 kubenswrapper[4055]: I0313 01:13:10.498902 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz7v9\" (UniqueName: \"kubernetes.io/projected/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-kube-api-access-bz7v9\") pod \"cluster-monitoring-operator-674cbfbd9d-2tr2t\" (UID: \"7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t"
Mar 13 01:13:10.498975 master-0 kubenswrapper[4055]: I0313 01:13:10.498954 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5n7m\" (UniqueName: \"kubernetes.io/projected/46662e51-44af-4732-83a1-9509a579b373-kube-api-access-m5n7m\") pod \"iptables-alerter-qclwv\" (UID: \"46662e51-44af-4732-83a1-9509a579b373\") " pod="openshift-network-operator/iptables-alerter-qclwv"
Mar 13 01:13:10.499019 master-0 kubenswrapper[4055]: I0313 01:13:10.498998 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f97819d0-2840-4352-a435-19ef1a8c22c9-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-jjdk8\" (UID: \"f97819d0-2840-4352-a435-19ef1a8c22c9\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8"
Mar 13 01:13:10.499077 master-0 kubenswrapper[4055]: I0313 01:13:10.499057 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert\") pod \"catalog-operator-7d9c49f57b-h46pz\" (UID: \"7f35cc1e-3376-4dbd-b215-2a32bf62cc71\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz"
Mar 13 01:13:10.499112 master-0 kubenswrapper[4055]: I0313 01:13:10.499099 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-2tr2t\" (UID: \"7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t"
Mar 13 01:13:10.499153 master-0 kubenswrapper[4055]: I0313 01:13:10.499133 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9"
Mar 13 01:13:10.499188 master-0 kubenswrapper[4055]: I0313 01:13:10.499167 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/61fb4b86-f978-4ae1-80bc-18d2f386cbc2-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-ck7rt\" (UID: \"61fb4b86-f978-4ae1-80bc-18d2f386cbc2\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-ck7rt"
Mar 13 01:13:10.499253 master-0 kubenswrapper[4055]: I0313 01:13:10.499222 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/61fb4b86-f978-4ae1-80bc-18d2f386cbc2-operand-assets\") pod \"cluster-olm-operator-77899cf6d-ck7rt\" (UID: \"61fb4b86-f978-4ae1-80bc-18d2f386cbc2\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-ck7rt"
Mar 13 01:13:10.499408 master-0 kubenswrapper[4055]: I0313 01:13:10.499367 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz8jz\" (UniqueName: \"kubernetes.io/projected/ae44526f-5858-42a0-ba77-3a22f171456f-kube-api-access-mz8jz\") pod \"csi-snapshot-controller-operator-5685fbc7d-7nstm\" (UID: \"ae44526f-5858-42a0-ba77-3a22f171456f\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-7nstm"
Mar 13 01:13:10.499496 master-0 kubenswrapper[4055]: I0313 01:13:10.499473 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx5mx\" (UniqueName: \"kubernetes.io/projected/2bd94289-7109-4419-9a51-bd289082b9f5-kube-api-access-fx5mx\") pod \"multus-admission-controller-8d675b596-tq7n6\" (UID: \"2bd94289-7109-4419-9a51-bd289082b9f5\") " pod="openshift-multus/multus-admission-controller-8d675b596-tq7n6"
Mar 13 01:13:10.499588 master-0 kubenswrapper[4055]: E0313 01:13:10.499562 4055 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 13 01:13:10.499691 master-0 kubenswrapper[4055]: I0313 01:13:10.499663 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-metrics-tls\") pod \"dns-operator-589895fbb7-qvl2k\" (UID: \"13fac7b0-ce55-467d-9d0c-6a122d87cb3c\") " pod="openshift-dns-operator/dns-operator-589895fbb7-qvl2k"
Mar 13 01:13:10.499735 master-0 kubenswrapper[4055]: I0313 01:13:10.499715 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21cbea73-f779-43e4-b5ba-d6fa06275d34-serving-cert\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: \"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj"
Mar 13 01:13:10.499823 master-0 kubenswrapper[4055]: I0313 01:13:10.499802 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58035e42-37d8-48f6-9861-9b4ce6014119-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-v9pv6\" (UID: \"58035e42-37d8-48f6-9861-9b4ce6014119\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-v9pv6"
Mar 13 01:13:10.499861 master-0 kubenswrapper[4055]: I0313 01:13:10.499840 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/46662e51-44af-4732-83a1-9509a579b373-iptables-alerter-script\") pod \"iptables-alerter-qclwv\" (UID: \"46662e51-44af-4732-83a1-9509a579b373\") " pod="openshift-network-operator/iptables-alerter-qclwv"
Mar 13 01:13:10.499893 master-0 kubenswrapper[4055]: I0313 01:13:10.499875 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/916d9fc9-388b-4506-a17c-36a7f626356a-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-v9nfg\" (UID: \"916d9fc9-388b-4506-a17c-36a7f626356a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-v9nfg"
Mar 13 01:13:10.499926 master-0 kubenswrapper[4055]: I0313 01:13:10.499906 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4976e608-07a0-4cef-8fdd-7cec3324b4b5-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-6slg8\" (UID: \"4976e608-07a0-4cef-8fdd-7cec3324b4b5\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8"
Mar 13 01:13:10.499954 master-0 kubenswrapper[4055]: I0313 01:13:10.499930 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh2bv\" (UniqueName: \"kubernetes.io/projected/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-kube-api-access-wh2bv\") pod \"dns-operator-589895fbb7-qvl2k\" (UID: \"13fac7b0-ce55-467d-9d0c-6a122d87cb3c\") " pod="openshift-dns-operator/dns-operator-589895fbb7-qvl2k"
Mar 13 01:13:10.499984 master-0 kubenswrapper[4055]: I0313 01:13:10.499958 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/039acb44-a9b3-4ad6-a091-be4d18edc34f-config\") pod \"openshift-controller-manager-operator-8565d84698-sslxh\" (UID: \"039acb44-a9b3-4ad6-a091-be4d18edc34f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-sslxh"
Mar 13 01:13:10.500013 master-0 kubenswrapper[4055]: I0313 01:13:10.499983 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35-serving-cert\") pod \"kube-apiserver-operator-68bd585b-mlslx\" (UID: \"b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mlslx"
Mar 13 01:13:10.500042 master-0 kubenswrapper[4055]: I0313 01:13:10.500011 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-mlslx\" (UID: \"b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mlslx"
Mar 13 01:13:10.500042 master-0 kubenswrapper[4055]: I0313 01:13:10.500038 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwm5w\" (UniqueName: \"kubernetes.io/projected/ca2fa86b-a966-49dc-8577-d2b54b111d14-kube-api-access-gwm5w\") pod \"cluster-storage-operator-6fbfc8dc8f-c2xl8\" (UID: \"ca2fa86b-a966-49dc-8577-d2b54b111d14\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-c2xl8"
Mar 13 01:13:10.500101 master-0 kubenswrapper[4055]: I0313 01:13:10.500068 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22587300-2448-4862-9fd8-68197d17a9f2-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-wbmqn\" (UID: \"22587300-2448-4862-9fd8-68197d17a9f2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-wbmqn"
Mar 13 01:13:10.500101 master-0 kubenswrapper[4055]: I0313 01:13:10.500093 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr"
Mar 13 01:13:10.500179 master-0 kubenswrapper[4055]: I0313 01:13:10.500136 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c6bf2d5-1881-4b63-b247-7e7426707fa1-config\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr"
Mar 13 01:13:10.500179 master-0 kubenswrapper[4055]: I0313 01:13:10.500165 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35-config\") pod \"kube-apiserver-operator-68bd585b-mlslx\" (UID: \"b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mlslx"
Mar 13 01:13:10.500239 master-0 kubenswrapper[4055]: I0313 01:13:10.500197 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9b713fb-64ce-4a01-951c-1f31df62e1ae-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-bxqp2\" (UID: \"f9b713fb-64ce-4a01-951c-1f31df62e1ae\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2"
Mar 13 01:13:10.500239 master-0 kubenswrapper[4055]: I0313 01:13:10.500223 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4chtg\" (UniqueName: \"kubernetes.io/projected/f9b713fb-64ce-4a01-951c-1f31df62e1ae-kube-api-access-4chtg\") pod \"authentication-operator-7c6989d6c4-bxqp2\" (UID: \"f9b713fb-64ce-4a01-951c-1f31df62e1ae\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2"
Mar 13 01:13:10.500291 master-0 kubenswrapper[4055]: I0313 01:13:10.500246 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/21cbea73-f779-43e4-b5ba-d6fa06275d34-etcd-client\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: \"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj"
Mar 13 01:13:10.500321 master-0 kubenswrapper[4055]: I0313 01:13:10.500290 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d56480e0-0885-41e5-a1fc-931a068fbadb-serving-cert\") pod \"openshift-config-operator-64488f9d78-bqmmf\" (UID: \"d56480e0-0885-41e5-a1fc-931a068fbadb\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf"
Mar 13 01:13:10.500321 master-0 kubenswrapper[4055]: I0313 01:13:10.500309 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21cbea73-f779-43e4-b5ba-d6fa06275d34-config\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: \"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj"
Mar 13 01:13:10.500378 master-0 kubenswrapper[4055]: I0313 01:13:10.500339 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/039acb44-a9b3-4ad6-a091-be4d18edc34f-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-sslxh\" (UID: \"039acb44-a9b3-4ad6-a091-be4d18edc34f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-sslxh"
Mar 13 01:13:10.500378 master-0 kubenswrapper[4055]: I0313 01:13:10.500365 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8c6bf2d5-1881-4b63-b247-7e7426707fa1-images\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr"
Mar 13 01:13:10.500436 master-0 kubenswrapper[4055]: I0313 01:13:10.500373 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-2tr2t\" (UID: \"7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t"
Mar 13 01:13:10.500436 master-0 kubenswrapper[4055]: I0313 01:13:10.500394 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95d4e785-6663-417d-b380-6905773613c8-config\") pod \"service-ca-operator-69b6fc6b88-2v42g\" (UID: \"95d4e785-6663-417d-b380-6905773613c8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-2v42g"
Mar 13 01:13:10.500511 master-0 kubenswrapper[4055]: E0313 01:13:10.500473 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f97819d0-2840-4352-a435-19ef1a8c22c9-image-registry-operator-tls podName:f97819d0-2840-4352-a435-19ef1a8c22c9 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:11.000457123 +0000 UTC m=+129.163516161 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/f97819d0-2840-4352-a435-19ef1a8c22c9-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-jjdk8" (UID: "f97819d0-2840-4352-a435-19ef1a8c22c9") : secret "image-registry-operator-tls" not found
Mar 13 01:13:10.501085 master-0 kubenswrapper[4055]: I0313 01:13:10.501058 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkdbm\" (UniqueName: \"kubernetes.io/projected/039acb44-a9b3-4ad6-a091-be4d18edc34f-kube-api-access-kkdbm\") pod \"openshift-controller-manager-operator-8565d84698-sslxh\" (UID: \"039acb44-a9b3-4ad6-a091-be4d18edc34f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-sslxh"
Mar 13 01:13:10.501125 master-0 kubenswrapper[4055]: I0313 01:13:10.501106 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-metrics-tls\") pod \"ingress-operator-677db989d6-kdn2l\" (UID: \"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l"
Mar 13 01:13:10.501161 master-0 kubenswrapper[4055]: I0313 01:13:10.501139 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-bound-sa-token\") pod \"ingress-operator-677db989d6-kdn2l\" (UID: \"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l"
Mar 13 01:13:10.501250 master-0 kubenswrapper[4055]: I0313 01:13:10.501219 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2m48\" (UniqueName: \"kubernetes.io/projected/4976e608-07a0-4cef-8fdd-7cec3324b4b5-kube-api-access-w2m48\") pod \"machine-config-operator-fdb5c78b5-6slg8\" (UID: \"4976e608-07a0-4cef-8fdd-7cec3324b4b5\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8"
Mar 13 01:13:10.501287 master-0 kubenswrapper[4055]: I0313 01:13:10.501271 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/486c7e33-3dd8-4a98-87e3-8216ee2e05ef-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-mvmt2\" (UID: \"486c7e33-3dd8-4a98-87e3-8216ee2e05ef\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-mvmt2"
Mar 13 01:13:10.501337 master-0 kubenswrapper[4055]: I0313 01:13:10.501316 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/916d9fc9-388b-4506-a17c-36a7f626356a-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-v9nfg\" (UID: \"916d9fc9-388b-4506-a17c-36a7f626356a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-v9nfg"
Mar 13 01:13:10.501389 master-0 kubenswrapper[4055]: I0313 01:13:10.501368 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhcll\" (UniqueName: \"kubernetes.io/projected/95d4e785-6663-417d-b380-6905773613c8-kube-api-access-nhcll\") pod \"service-ca-operator-69b6fc6b88-2v42g\" (UID: \"95d4e785-6663-417d-b380-6905773613c8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-2v42g"
Mar 13 01:13:10.501695 master-0 kubenswrapper[4055]: I0313 01:13:10.501665 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8c6bf2d5-1881-4b63-b247-7e7426707fa1-images\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr"
Mar 13 01:13:10.502046 master-0 kubenswrapper[4055]: I0313 01:13:10.502019 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95d4e785-6663-417d-b380-6905773613c8-config\") pod \"service-ca-operator-69b6fc6b88-2v42g\" (UID: \"95d4e785-6663-417d-b380-6905773613c8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-2v42g"
Mar 13 01:13:10.502765 master-0 kubenswrapper[4055]: I0313 01:13:10.502739 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4976e608-07a0-4cef-8fdd-7cec3324b4b5-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-6slg8\" (UID: \"4976e608-07a0-4cef-8fdd-7cec3324b4b5\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8"
Mar 13 01:13:10.504212 master-0 kubenswrapper[4055]: I0313 01:13:10.504184 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22587300-2448-4862-9fd8-68197d17a9f2-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-wbmqn\" (UID: \"22587300-2448-4862-9fd8-68197d17a9f2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-wbmqn"
Mar 13 01:13:10.504306 master-0 kubenswrapper[4055]: E0313 01:13:10.504286 4055 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found
Mar 13 01:13:10.504367 master-0 kubenswrapper[4055]: E0313 01:13:10.504352 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cert podName:8c6bf2d5-1881-4b63-b247-7e7426707fa1 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:11.004325491 +0000 UTC m=+129.167384529 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cert") pod "cluster-baremetal-operator-5cdb4c5598-47sjr" (UID: "8c6bf2d5-1881-4b63-b247-7e7426707fa1") : secret "cluster-baremetal-webhook-server-cert" not found
Mar 13 01:13:10.504799 master-0 kubenswrapper[4055]: I0313 01:13:10.504773 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr"
Mar 13 01:13:10.504910 master-0 kubenswrapper[4055]: I0313 01:13:10.504892 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmvs5\" (UniqueName: \"kubernetes.io/projected/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-kube-api-access-mmvs5\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9"
Mar 13 01:13:10.505017 master-0 kubenswrapper[4055]: I0313 01:13:10.505003 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-trusted-ca\") pod \"ingress-operator-677db989d6-kdn2l\" (UID: \"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l"
Mar 13 01:13:10.505108 master-0 kubenswrapper[4055]: I0313 01:13:10.505095 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9"
Mar 13 01:13:10.505206 master-0 kubenswrapper[4055]: I0313 01:13:10.505188 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f97819d0-2840-4352-a435-19ef1a8c22c9-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-jjdk8\" (UID: \"f97819d0-2840-4352-a435-19ef1a8c22c9\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8"
Mar 13 01:13:10.505397 master-0 kubenswrapper[4055]: I0313 01:13:10.505363 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-2tr2t\" (UID: \"7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t"
Mar 13 01:13:10.505444 master-0 kubenswrapper[4055]: I0313 01:13:10.505411 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/21cbea73-f779-43e4-b5ba-d6fa06275d34-etcd-ca\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: \"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj"
Mar 13 01:13:10.505444 master-0 kubenswrapper[4055]: I0313 01:13:10.505439 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg54c\" (UniqueName: \"kubernetes.io/projected/21cbea73-f779-43e4-b5ba-d6fa06275d34-kube-api-access-wg54c\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: \"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj"
Mar 13 01:13:10.505498 master-0 kubenswrapper[4055]: I0313 01:13:10.505480 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22587300-2448-4862-9fd8-68197d17a9f2-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-wbmqn\" (UID: \"22587300-2448-4862-9fd8-68197d17a9f2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-wbmqn"
Mar 13 01:13:10.505818 master-0 kubenswrapper[4055]: I0313 01:13:10.505504 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/21cbea73-f779-43e4-b5ba-d6fa06275d34-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: \"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj"
Mar 13 01:13:10.505818 master-0 kubenswrapper[4055]: I0313 01:13:10.505532 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58035e42-37d8-48f6-9861-9b4ce6014119-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-v9pv6\" (UID: \"58035e42-37d8-48f6-9861-9b4ce6014119\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-v9pv6"
Mar 13 01:13:10.505818 master-0 kubenswrapper[4055]: I0313 01:13:10.505569 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9b713fb-64ce-4a01-951c-1f31df62e1ae-config\") pod \"authentication-operator-7c6989d6c4-bxqp2\" (UID: \"f9b713fb-64ce-4a01-951c-1f31df62e1ae\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2"
Mar 13 01:13:10.505818 master-0 kubenswrapper[4055]: I0313 01:13:10.505639 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fppkf\" (UniqueName: \"kubernetes.io/projected/d56480e0-0885-41e5-a1fc-931a068fbadb-kube-api-access-fppkf\") pod \"openshift-config-operator-64488f9d78-bqmmf\" (UID: \"d56480e0-0885-41e5-a1fc-931a068fbadb\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf"
Mar 13 01:13:10.506021 master-0 kubenswrapper[4055]: E0313 01:13:10.505999 4055 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 13 01:13:10.506069 master-0 kubenswrapper[4055]: E0313 01:13:10.506056 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-metrics-tls podName:70c097a1-90d9-4344-b0ae-5a59ec2ad8ad nodeName:}" failed. No retries permitted until 2026-03-13 01:13:11.006037999 +0000 UTC m=+129.169097037 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-metrics-tls") pod "ingress-operator-677db989d6-kdn2l" (UID: "70c097a1-90d9-4344-b0ae-5a59ec2ad8ad") : secret "metrics-tls" not found
Mar 13 01:13:10.506176 master-0 kubenswrapper[4055]: E0313 01:13:10.506153 4055 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found
Mar 13 01:13:10.506217 master-0 kubenswrapper[4055]: I0313 01:13:10.506190 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/039acb44-a9b3-4ad6-a091-be4d18edc34f-config\") pod \"openshift-controller-manager-operator-8565d84698-sslxh\" (UID: \"039acb44-a9b3-4ad6-a091-be4d18edc34f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-sslxh"
Mar 13 01:13:10.506217 master-0 kubenswrapper[4055]: I0313 01:13:10.504906 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c6bf2d5-1881-4b63-b247-7e7426707fa1-config\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr"
Mar 13 01:13:10.506355 master-0 kubenswrapper[4055]: E0313 01:13:10.506205 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cluster-baremetal-operator-tls podName:8c6bf2d5-1881-4b63-b247-7e7426707fa1 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:11.006187193 +0000 UTC m=+129.169246231 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-47sjr" (UID: "8c6bf2d5-1881-4b63-b247-7e7426707fa1") : secret "cluster-baremetal-operator-tls" not found
Mar 13 01:13:10.507291 master-0 kubenswrapper[4055]: I0313 01:13:10.507260 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-trusted-ca\") pod \"ingress-operator-677db989d6-kdn2l\" (UID: \"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l"
Mar 13 01:13:10.508366 master-0 kubenswrapper[4055]: I0313 01:13:10.508321 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4976e608-07a0-4cef-8fdd-7cec3324b4b5-images\") pod \"machine-config-operator-fdb5c78b5-6slg8\" (UID: \"4976e608-07a0-4cef-8fdd-7cec3324b4b5\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8"
Mar 13 01:13:10.508417 master-0 kubenswrapper[4055]: I0313 01:13:10.508386 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95d4e785-6663-417d-b380-6905773613c8-serving-cert\") pod \"service-ca-operator-69b6fc6b88-2v42g\" (UID: \"95d4e785-6663-417d-b380-6905773613c8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-2v42g"
Mar 13 01:13:10.508447 master-0 kubenswrapper[4055]: I0313 01:13:10.508423 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxctn\" (UniqueName: \"kubernetes.io/projected/78d2cd80-23b9-426d-a7ac-1daa27668a47-kube-api-access-mxctn\") pod \"marketplace-operator-64bf9778cb-dszg5\" (UID: \"78d2cd80-23b9-426d-a7ac-1daa27668a47\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5"
Mar 13 01:13:10.508491 master-0 kubenswrapper[4055]: I0313 01:13:10.508457 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zd92\" (UniqueName: \"kubernetes.io/projected/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-kube-api-access-5zd92\") pod \"catalog-operator-7d9c49f57b-h46pz\" (UID: \"7f35cc1e-3376-4dbd-b215-2a32bf62cc71\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz"
Mar 13 01:13:10.508523 master-0 kubenswrapper[4055]: I0313 01:13:10.508503 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22587300-2448-4862-9fd8-68197d17a9f2-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-wbmqn\" (UID: \"22587300-2448-4862-9fd8-68197d17a9f2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-wbmqn"
Mar 13 01:13:10.508553 master-0 kubenswrapper[4055]: I0313 01:13:10.508542 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-dszg5\" (UID: \"78d2cd80-23b9-426d-a7ac-1daa27668a47\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5"
Mar 13 01:13:10.508815 master-0 kubenswrapper[4055]: I0313 01:13:10.508800 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d56480e0-0885-41e5-a1fc-931a068fbadb-available-featuregates\") pod \"openshift-config-operator-64488f9d78-bqmmf\" (UID: \"d56480e0-0885-41e5-a1fc-931a068fbadb\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf"
Mar 13 01:13:10.508922 master-0 kubenswrapper[4055]: I0313 01:13:10.508887 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f97819d0-2840-4352-a435-19ef1a8c22c9-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-jjdk8\" (UID: \"f97819d0-2840-4352-a435-19ef1a8c22c9\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8"
Mar 13 01:13:10.508966 master-0 kubenswrapper[4055]: I0313 01:13:10.508902 4055 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrf2s\" (UniqueName: \"kubernetes.io/projected/61fb4b86-f978-4ae1-80bc-18d2f386cbc2-kube-api-access-lrf2s\") pod \"cluster-olm-operator-77899cf6d-ck7rt\" (UID: \"61fb4b86-f978-4ae1-80bc-18d2f386cbc2\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-ck7rt"
Mar 13 01:13:10.508998 master-0 kubenswrapper[4055]: I0313 01:13:10.508967 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/916d9fc9-388b-4506-a17c-36a7f626356a-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-v9nfg\" (UID: \"916d9fc9-388b-4506-a17c-36a7f626356a\")
" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-v9nfg" Mar 13 01:13:10.509728 master-0 kubenswrapper[4055]: I0313 01:13:10.509711 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4976e608-07a0-4cef-8fdd-7cec3324b4b5-images\") pod \"machine-config-operator-fdb5c78b5-6slg8\" (UID: \"4976e608-07a0-4cef-8fdd-7cec3324b4b5\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8" Mar 13 01:13:10.510552 master-0 kubenswrapper[4055]: I0313 01:13:10.510525 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58035e42-37d8-48f6-9861-9b4ce6014119-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-v9pv6\" (UID: \"58035e42-37d8-48f6-9861-9b4ce6014119\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-v9pv6" Mar 13 01:13:10.511142 master-0 kubenswrapper[4055]: E0313 01:13:10.510986 4055 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 13 01:13:10.511142 master-0 kubenswrapper[4055]: E0313 01:13:10.511084 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls podName:7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:11.011048969 +0000 UTC m=+129.174108027 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-2tr2t" (UID: "7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71") : secret "cluster-monitoring-operator-tls" not found Mar 13 01:13:10.530677 master-0 kubenswrapper[4055]: I0313 01:13:10.514343 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95d4e785-6663-417d-b380-6905773613c8-serving-cert\") pod \"service-ca-operator-69b6fc6b88-2v42g\" (UID: \"95d4e785-6663-417d-b380-6905773613c8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-2v42g" Mar 13 01:13:10.530677 master-0 kubenswrapper[4055]: I0313 01:13:10.518205 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/916d9fc9-388b-4506-a17c-36a7f626356a-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-v9nfg\" (UID: \"916d9fc9-388b-4506-a17c-36a7f626356a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-v9nfg" Mar 13 01:13:10.530677 master-0 kubenswrapper[4055]: I0313 01:13:10.520300 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/039acb44-a9b3-4ad6-a091-be4d18edc34f-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-sslxh\" (UID: \"039acb44-a9b3-4ad6-a091-be4d18edc34f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-sslxh" Mar 13 01:13:10.530677 master-0 kubenswrapper[4055]: I0313 01:13:10.520738 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-bound-sa-token\") pod 
\"ingress-operator-677db989d6-kdn2l\" (UID: \"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l" Mar 13 01:13:10.530677 master-0 kubenswrapper[4055]: I0313 01:13:10.522933 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58035e42-37d8-48f6-9861-9b4ce6014119-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-v9pv6\" (UID: \"58035e42-37d8-48f6-9861-9b4ce6014119\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-v9pv6" Mar 13 01:13:10.530677 master-0 kubenswrapper[4055]: I0313 01:13:10.524047 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29w76\" (UniqueName: \"kubernetes.io/projected/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-kube-api-access-29w76\") pod \"ingress-operator-677db989d6-kdn2l\" (UID: \"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l" Mar 13 01:13:10.530677 master-0 kubenswrapper[4055]: I0313 01:13:10.525159 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22587300-2448-4862-9fd8-68197d17a9f2-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-wbmqn\" (UID: \"22587300-2448-4862-9fd8-68197d17a9f2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-wbmqn" Mar 13 01:13:10.531696 master-0 kubenswrapper[4055]: I0313 01:13:10.531676 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-dszg5\" (UID: \"78d2cd80-23b9-426d-a7ac-1daa27668a47\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" Mar 13 01:13:10.539680 master-0 kubenswrapper[4055]: 
I0313 01:13:10.532451 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg7x6\" (UniqueName: \"kubernetes.io/projected/916d9fc9-388b-4506-a17c-36a7f626356a-kube-api-access-jg7x6\") pod \"kube-storage-version-migrator-operator-7f65c457f5-v9nfg\" (UID: \"916d9fc9-388b-4506-a17c-36a7f626356a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-v9nfg" Mar 13 01:13:10.544094 master-0 kubenswrapper[4055]: I0313 01:13:10.536274 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzklz\" (UniqueName: \"kubernetes.io/projected/f97819d0-2840-4352-a435-19ef1a8c22c9-kube-api-access-fzklz\") pod \"cluster-image-registry-operator-86d6d77c7c-jjdk8\" (UID: \"f97819d0-2840-4352-a435-19ef1a8c22c9\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8" Mar 13 01:13:10.544207 master-0 kubenswrapper[4055]: I0313 01:13:10.536504 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn6w7\" (UniqueName: \"kubernetes.io/projected/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-kube-api-access-gn6w7\") pod \"package-server-manager-854648ff6d-nrzpj\" (UID: \"0d4e6150-432c-4a11-b5a6-4d62dd701fc8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj" Mar 13 01:13:10.544207 master-0 kubenswrapper[4055]: I0313 01:13:10.538055 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxf58\" (UniqueName: \"kubernetes.io/projected/8c6bf2d5-1881-4b63-b247-7e7426707fa1-kube-api-access-vxf58\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" Mar 13 01:13:10.544207 master-0 kubenswrapper[4055]: I0313 01:13:10.538790 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/f97819d0-2840-4352-a435-19ef1a8c22c9-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-jjdk8\" (UID: \"f97819d0-2840-4352-a435-19ef1a8c22c9\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8" Mar 13 01:13:10.544207 master-0 kubenswrapper[4055]: I0313 01:13:10.538984 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz7v9\" (UniqueName: \"kubernetes.io/projected/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-kube-api-access-bz7v9\") pod \"cluster-monitoring-operator-674cbfbd9d-2tr2t\" (UID: \"7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t" Mar 13 01:13:10.544720 master-0 kubenswrapper[4055]: I0313 01:13:10.544690 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhcll\" (UniqueName: \"kubernetes.io/projected/95d4e785-6663-417d-b380-6905773613c8-kube-api-access-nhcll\") pod \"service-ca-operator-69b6fc6b88-2v42g\" (UID: \"95d4e785-6663-417d-b380-6905773613c8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-2v42g" Mar 13 01:13:10.546360 master-0 kubenswrapper[4055]: I0313 01:13:10.545986 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/58035e42-37d8-48f6-9861-9b4ce6014119-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-v9pv6\" (UID: \"58035e42-37d8-48f6-9861-9b4ce6014119\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-v9pv6" Mar 13 01:13:10.551173 master-0 kubenswrapper[4055]: I0313 01:13:10.551140 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkdbm\" (UniqueName: \"kubernetes.io/projected/039acb44-a9b3-4ad6-a091-be4d18edc34f-kube-api-access-kkdbm\") pod \"openshift-controller-manager-operator-8565d84698-sslxh\" (UID: 
\"039acb44-a9b3-4ad6-a091-be4d18edc34f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-sslxh" Mar 13 01:13:10.551219 master-0 kubenswrapper[4055]: I0313 01:13:10.551180 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2m48\" (UniqueName: \"kubernetes.io/projected/4976e608-07a0-4cef-8fdd-7cec3324b4b5-kube-api-access-w2m48\") pod \"machine-config-operator-fdb5c78b5-6slg8\" (UID: \"4976e608-07a0-4cef-8fdd-7cec3324b4b5\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8" Mar 13 01:13:10.575817 master-0 kubenswrapper[4055]: I0313 01:13:10.575762 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22587300-2448-4862-9fd8-68197d17a9f2-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-wbmqn\" (UID: \"22587300-2448-4862-9fd8-68197d17a9f2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-wbmqn" Mar 13 01:13:10.586662 master-0 kubenswrapper[4055]: I0313 01:13:10.586040 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxctn\" (UniqueName: \"kubernetes.io/projected/78d2cd80-23b9-426d-a7ac-1daa27668a47-kube-api-access-mxctn\") pod \"marketplace-operator-64bf9778cb-dszg5\" (UID: \"78d2cd80-23b9-426d-a7ac-1daa27668a47\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" Mar 13 01:13:10.609728 master-0 kubenswrapper[4055]: I0313 01:13:10.609666 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx5mx\" (UniqueName: \"kubernetes.io/projected/2bd94289-7109-4419-9a51-bd289082b9f5-kube-api-access-fx5mx\") pod \"multus-admission-controller-8d675b596-tq7n6\" (UID: \"2bd94289-7109-4419-9a51-bd289082b9f5\") " pod="openshift-multus/multus-admission-controller-8d675b596-tq7n6" Mar 13 01:13:10.609808 master-0 
kubenswrapper[4055]: I0313 01:13:10.609728 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz8jz\" (UniqueName: \"kubernetes.io/projected/ae44526f-5858-42a0-ba77-3a22f171456f-kube-api-access-mz8jz\") pod \"csi-snapshot-controller-operator-5685fbc7d-7nstm\" (UID: \"ae44526f-5858-42a0-ba77-3a22f171456f\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-7nstm" Mar 13 01:13:10.609808 master-0 kubenswrapper[4055]: I0313 01:13:10.609761 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21cbea73-f779-43e4-b5ba-d6fa06275d34-serving-cert\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: \"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj" Mar 13 01:13:10.609808 master-0 kubenswrapper[4055]: I0313 01:13:10.609787 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-metrics-tls\") pod \"dns-operator-589895fbb7-qvl2k\" (UID: \"13fac7b0-ce55-467d-9d0c-6a122d87cb3c\") " pod="openshift-dns-operator/dns-operator-589895fbb7-qvl2k" Mar 13 01:13:10.609901 master-0 kubenswrapper[4055]: I0313 01:13:10.609810 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/46662e51-44af-4732-83a1-9509a579b373-iptables-alerter-script\") pod \"iptables-alerter-qclwv\" (UID: \"46662e51-44af-4732-83a1-9509a579b373\") " pod="openshift-network-operator/iptables-alerter-qclwv" Mar 13 01:13:10.609901 master-0 kubenswrapper[4055]: I0313 01:13:10.609835 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh2bv\" (UniqueName: \"kubernetes.io/projected/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-kube-api-access-wh2bv\") pod 
\"dns-operator-589895fbb7-qvl2k\" (UID: \"13fac7b0-ce55-467d-9d0c-6a122d87cb3c\") " pod="openshift-dns-operator/dns-operator-589895fbb7-qvl2k" Mar 13 01:13:10.610496 master-0 kubenswrapper[4055]: E0313 01:13:10.609972 4055 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 13 01:13:10.610496 master-0 kubenswrapper[4055]: E0313 01:13:10.610054 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-metrics-tls podName:13fac7b0-ce55-467d-9d0c-6a122d87cb3c nodeName:}" failed. No retries permitted until 2026-03-13 01:13:11.11003405 +0000 UTC m=+129.273093088 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-metrics-tls") pod "dns-operator-589895fbb7-qvl2k" (UID: "13fac7b0-ce55-467d-9d0c-6a122d87cb3c") : secret "metrics-tls" not found Mar 13 01:13:10.610496 master-0 kubenswrapper[4055]: I0313 01:13:10.610382 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-mlslx\" (UID: \"b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mlslx" Mar 13 01:13:10.610496 master-0 kubenswrapper[4055]: I0313 01:13:10.610434 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwm5w\" (UniqueName: \"kubernetes.io/projected/ca2fa86b-a966-49dc-8577-d2b54b111d14-kube-api-access-gwm5w\") pod \"cluster-storage-operator-6fbfc8dc8f-c2xl8\" (UID: \"ca2fa86b-a966-49dc-8577-d2b54b111d14\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-c2xl8" Mar 13 01:13:10.610496 master-0 kubenswrapper[4055]: I0313 01:13:10.610469 4055 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35-serving-cert\") pod \"kube-apiserver-operator-68bd585b-mlslx\" (UID: \"b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mlslx" Mar 13 01:13:10.610496 master-0 kubenswrapper[4055]: I0313 01:13:10.610492 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35-config\") pod \"kube-apiserver-operator-68bd585b-mlslx\" (UID: \"b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mlslx" Mar 13 01:13:10.610690 master-0 kubenswrapper[4055]: I0313 01:13:10.610533 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4chtg\" (UniqueName: \"kubernetes.io/projected/f9b713fb-64ce-4a01-951c-1f31df62e1ae-kube-api-access-4chtg\") pod \"authentication-operator-7c6989d6c4-bxqp2\" (UID: \"f9b713fb-64ce-4a01-951c-1f31df62e1ae\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" Mar 13 01:13:10.610690 master-0 kubenswrapper[4055]: I0313 01:13:10.610565 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/21cbea73-f779-43e4-b5ba-d6fa06275d34-etcd-client\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: \"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj" Mar 13 01:13:10.610690 master-0 kubenswrapper[4055]: I0313 01:13:10.610590 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9b713fb-64ce-4a01-951c-1f31df62e1ae-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-bxqp2\" (UID: 
\"f9b713fb-64ce-4a01-951c-1f31df62e1ae\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" Mar 13 01:13:10.610690 master-0 kubenswrapper[4055]: I0313 01:13:10.610609 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21cbea73-f779-43e4-b5ba-d6fa06275d34-config\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: \"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj" Mar 13 01:13:10.610690 master-0 kubenswrapper[4055]: I0313 01:13:10.610662 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d56480e0-0885-41e5-a1fc-931a068fbadb-serving-cert\") pod \"openshift-config-operator-64488f9d78-bqmmf\" (UID: \"d56480e0-0885-41e5-a1fc-931a068fbadb\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" Mar 13 01:13:10.610826 master-0 kubenswrapper[4055]: I0313 01:13:10.610755 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/486c7e33-3dd8-4a98-87e3-8216ee2e05ef-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-mvmt2\" (UID: \"486c7e33-3dd8-4a98-87e3-8216ee2e05ef\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-mvmt2" Mar 13 01:13:10.610857 master-0 kubenswrapper[4055]: I0313 01:13:10.610829 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmvs5\" (UniqueName: \"kubernetes.io/projected/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-kube-api-access-mmvs5\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" Mar 13 01:13:10.610888 master-0 kubenswrapper[4055]: I0313 01:13:10.610863 4055 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" Mar 13 01:13:10.610915 master-0 kubenswrapper[4055]: I0313 01:13:10.610895 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/21cbea73-f779-43e4-b5ba-d6fa06275d34-etcd-ca\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: \"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj" Mar 13 01:13:10.610944 master-0 kubenswrapper[4055]: I0313 01:13:10.610924 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg54c\" (UniqueName: \"kubernetes.io/projected/21cbea73-f779-43e4-b5ba-d6fa06275d34-kube-api-access-wg54c\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: \"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj" Mar 13 01:13:10.611025 master-0 kubenswrapper[4055]: I0313 01:13:10.610974 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/21cbea73-f779-43e4-b5ba-d6fa06275d34-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: \"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj" Mar 13 01:13:10.611066 master-0 kubenswrapper[4055]: I0313 01:13:10.611025 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9b713fb-64ce-4a01-951c-1f31df62e1ae-config\") pod \"authentication-operator-7c6989d6c4-bxqp2\" (UID: \"f9b713fb-64ce-4a01-951c-1f31df62e1ae\") " 
pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" Mar 13 01:13:10.611098 master-0 kubenswrapper[4055]: I0313 01:13:10.611064 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fppkf\" (UniqueName: \"kubernetes.io/projected/d56480e0-0885-41e5-a1fc-931a068fbadb-kube-api-access-fppkf\") pod \"openshift-config-operator-64488f9d78-bqmmf\" (UID: \"d56480e0-0885-41e5-a1fc-931a068fbadb\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" Mar 13 01:13:10.611299 master-0 kubenswrapper[4055]: I0313 01:13:10.611128 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zd92\" (UniqueName: \"kubernetes.io/projected/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-kube-api-access-5zd92\") pod \"catalog-operator-7d9c49f57b-h46pz\" (UID: \"7f35cc1e-3376-4dbd-b215-2a32bf62cc71\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz" Mar 13 01:13:10.611299 master-0 kubenswrapper[4055]: I0313 01:13:10.611178 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d56480e0-0885-41e5-a1fc-931a068fbadb-available-featuregates\") pod \"openshift-config-operator-64488f9d78-bqmmf\" (UID: \"d56480e0-0885-41e5-a1fc-931a068fbadb\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" Mar 13 01:13:10.611299 master-0 kubenswrapper[4055]: I0313 01:13:10.611198 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrf2s\" (UniqueName: \"kubernetes.io/projected/61fb4b86-f978-4ae1-80bc-18d2f386cbc2-kube-api-access-lrf2s\") pod \"cluster-olm-operator-77899cf6d-ck7rt\" (UID: \"61fb4b86-f978-4ae1-80bc-18d2f386cbc2\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-ck7rt" Mar 13 01:13:10.611299 master-0 kubenswrapper[4055]: I0313 01:13:10.611271 4055 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/46662e51-44af-4732-83a1-9509a579b373-host-slash\") pod \"iptables-alerter-qclwv\" (UID: \"46662e51-44af-4732-83a1-9509a579b373\") " pod="openshift-network-operator/iptables-alerter-qclwv" Mar 13 01:13:10.611419 master-0 kubenswrapper[4055]: I0313 01:13:10.611306 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mnf5\" (UniqueName: \"kubernetes.io/projected/6c88187c-d011-4043-a6d3-4a8a7ec4e204-kube-api-access-7mnf5\") pod \"olm-operator-d64cfc9db-8l7kq\" (UID: \"6c88187c-d011-4043-a6d3-4a8a7ec4e204\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq" Mar 13 01:13:10.611419 master-0 kubenswrapper[4055]: I0313 01:13:10.611329 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9b713fb-64ce-4a01-951c-1f31df62e1ae-serving-cert\") pod \"authentication-operator-7c6989d6c4-bxqp2\" (UID: \"f9b713fb-64ce-4a01-951c-1f31df62e1ae\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" Mar 13 01:13:10.611419 master-0 kubenswrapper[4055]: I0313 01:13:10.611335 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35-config\") pod \"kube-apiserver-operator-68bd585b-mlslx\" (UID: \"b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mlslx" Mar 13 01:13:10.611419 master-0 kubenswrapper[4055]: I0313 01:13:10.611347 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74tvv\" (UniqueName: \"kubernetes.io/projected/486c7e33-3dd8-4a98-87e3-8216ee2e05ef-kube-api-access-74tvv\") pod \"openshift-apiserver-operator-799b6db4d7-mvmt2\" (UID: 
\"486c7e33-3dd8-4a98-87e3-8216ee2e05ef\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-mvmt2" Mar 13 01:13:10.611419 master-0 kubenswrapper[4055]: I0313 01:13:10.611367 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/486c7e33-3dd8-4a98-87e3-8216ee2e05ef-config\") pod \"openshift-apiserver-operator-799b6db4d7-mvmt2\" (UID: \"486c7e33-3dd8-4a98-87e3-8216ee2e05ef\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-mvmt2" Mar 13 01:13:10.611419 master-0 kubenswrapper[4055]: I0313 01:13:10.611385 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert\") pod \"olm-operator-d64cfc9db-8l7kq\" (UID: \"6c88187c-d011-4043-a6d3-4a8a7ec4e204\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq" Mar 13 01:13:10.611419 master-0 kubenswrapper[4055]: I0313 01:13:10.611411 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs\") pod \"multus-admission-controller-8d675b596-tq7n6\" (UID: \"2bd94289-7109-4419-9a51-bd289082b9f5\") " pod="openshift-multus/multus-admission-controller-8d675b596-tq7n6" Mar 13 01:13:10.611596 master-0 kubenswrapper[4055]: I0313 01:13:10.611433 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9b713fb-64ce-4a01-951c-1f31df62e1ae-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-bxqp2\" (UID: \"f9b713fb-64ce-4a01-951c-1f31df62e1ae\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" Mar 13 01:13:10.611596 master-0 kubenswrapper[4055]: I0313 01:13:10.611467 4055 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca2fa86b-a966-49dc-8577-d2b54b111d14-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-6fbfc8dc8f-c2xl8\" (UID: \"ca2fa86b-a966-49dc-8577-d2b54b111d14\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-c2xl8" Mar 13 01:13:10.611596 master-0 kubenswrapper[4055]: I0313 01:13:10.611494 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" Mar 13 01:13:10.611596 master-0 kubenswrapper[4055]: I0313 01:13:10.611520 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5n7m\" (UniqueName: \"kubernetes.io/projected/46662e51-44af-4732-83a1-9509a579b373-kube-api-access-m5n7m\") pod \"iptables-alerter-qclwv\" (UID: \"46662e51-44af-4732-83a1-9509a579b373\") " pod="openshift-network-operator/iptables-alerter-qclwv" Mar 13 01:13:10.611596 master-0 kubenswrapper[4055]: I0313 01:13:10.611576 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert\") pod \"catalog-operator-7d9c49f57b-h46pz\" (UID: \"7f35cc1e-3376-4dbd-b215-2a32bf62cc71\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz" Mar 13 01:13:10.611735 master-0 kubenswrapper[4055]: I0313 01:13:10.611618 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-apiservice-cert\") pod 
\"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" Mar 13 01:13:10.611735 master-0 kubenswrapper[4055]: I0313 01:13:10.611667 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/61fb4b86-f978-4ae1-80bc-18d2f386cbc2-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-ck7rt\" (UID: \"61fb4b86-f978-4ae1-80bc-18d2f386cbc2\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-ck7rt" Mar 13 01:13:10.612776 master-0 kubenswrapper[4055]: I0313 01:13:10.611951 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/46662e51-44af-4732-83a1-9509a579b373-iptables-alerter-script\") pod \"iptables-alerter-qclwv\" (UID: \"46662e51-44af-4732-83a1-9509a579b373\") " pod="openshift-network-operator/iptables-alerter-qclwv" Mar 13 01:13:10.612776 master-0 kubenswrapper[4055]: I0313 01:13:10.611982 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/61fb4b86-f978-4ae1-80bc-18d2f386cbc2-operand-assets\") pod \"cluster-olm-operator-77899cf6d-ck7rt\" (UID: \"61fb4b86-f978-4ae1-80bc-18d2f386cbc2\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-ck7rt" Mar 13 01:13:10.612776 master-0 kubenswrapper[4055]: E0313 01:13:10.612000 4055 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 13 01:13:10.612776 master-0 kubenswrapper[4055]: E0313 01:13:10.612049 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert podName:6c88187c-d011-4043-a6d3-4a8a7ec4e204 
nodeName:}" failed. No retries permitted until 2026-03-13 01:13:11.112031655 +0000 UTC m=+129.275090813 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert") pod "olm-operator-d64cfc9db-8l7kq" (UID: "6c88187c-d011-4043-a6d3-4a8a7ec4e204") : secret "olm-operator-serving-cert" not found Mar 13 01:13:10.612776 master-0 kubenswrapper[4055]: I0313 01:13:10.612437 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/21cbea73-f779-43e4-b5ba-d6fa06275d34-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: \"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj" Mar 13 01:13:10.612776 master-0 kubenswrapper[4055]: I0313 01:13:10.612657 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d56480e0-0885-41e5-a1fc-931a068fbadb-available-featuregates\") pod \"openshift-config-operator-64488f9d78-bqmmf\" (UID: \"d56480e0-0885-41e5-a1fc-931a068fbadb\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" Mar 13 01:13:10.612776 master-0 kubenswrapper[4055]: I0313 01:13:10.612673 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9b713fb-64ce-4a01-951c-1f31df62e1ae-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-bxqp2\" (UID: \"f9b713fb-64ce-4a01-951c-1f31df62e1ae\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" Mar 13 01:13:10.612776 master-0 kubenswrapper[4055]: I0313 01:13:10.612709 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9b713fb-64ce-4a01-951c-1f31df62e1ae-config\") pod 
\"authentication-operator-7c6989d6c4-bxqp2\" (UID: \"f9b713fb-64ce-4a01-951c-1f31df62e1ae\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" Mar 13 01:13:10.613193 master-0 kubenswrapper[4055]: I0313 01:13:10.613057 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" Mar 13 01:13:10.613193 master-0 kubenswrapper[4055]: I0313 01:13:10.613129 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/61fb4b86-f978-4ae1-80bc-18d2f386cbc2-operand-assets\") pod \"cluster-olm-operator-77899cf6d-ck7rt\" (UID: \"61fb4b86-f978-4ae1-80bc-18d2f386cbc2\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-ck7rt" Mar 13 01:13:10.613498 master-0 kubenswrapper[4055]: I0313 01:13:10.613459 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/46662e51-44af-4732-83a1-9509a579b373-host-slash\") pod \"iptables-alerter-qclwv\" (UID: \"46662e51-44af-4732-83a1-9509a579b373\") " pod="openshift-network-operator/iptables-alerter-qclwv" Mar 13 01:13:10.613553 master-0 kubenswrapper[4055]: E0313 01:13:10.613535 4055 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 13 01:13:10.613602 master-0 kubenswrapper[4055]: E0313 01:13:10.613586 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs podName:2bd94289-7109-4419-9a51-bd289082b9f5 nodeName:}" failed. 
No retries permitted until 2026-03-13 01:13:11.113571459 +0000 UTC m=+129.276630547 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs") pod "multus-admission-controller-8d675b596-tq7n6" (UID: "2bd94289-7109-4419-9a51-bd289082b9f5") : secret "multus-admission-controller-secret" not found Mar 13 01:13:10.613855 master-0 kubenswrapper[4055]: I0313 01:13:10.613818 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9b713fb-64ce-4a01-951c-1f31df62e1ae-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-bxqp2\" (UID: \"f9b713fb-64ce-4a01-951c-1f31df62e1ae\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" Mar 13 01:13:10.613943 master-0 kubenswrapper[4055]: E0313 01:13:10.613925 4055 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 13 01:13:10.613987 master-0 kubenswrapper[4055]: E0313 01:13:10.613973 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert podName:7f35cc1e-3376-4dbd-b215-2a32bf62cc71 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:11.113953519 +0000 UTC m=+129.277012677 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert") pod "catalog-operator-7d9c49f57b-h46pz" (UID: "7f35cc1e-3376-4dbd-b215-2a32bf62cc71") : secret "catalog-operator-serving-cert" not found Mar 13 01:13:10.614059 master-0 kubenswrapper[4055]: I0313 01:13:10.614037 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/21cbea73-f779-43e4-b5ba-d6fa06275d34-etcd-ca\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: \"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj" Mar 13 01:13:10.614099 master-0 kubenswrapper[4055]: E0313 01:13:10.614075 4055 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 13 01:13:10.614366 master-0 kubenswrapper[4055]: I0313 01:13:10.614338 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21cbea73-f779-43e4-b5ba-d6fa06275d34-config\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: \"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj" Mar 13 01:13:10.614415 master-0 kubenswrapper[4055]: E0313 01:13:10.614405 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-node-tuning-operator-tls podName:4c5174b9-ca9e-4917-ab3a-ca403ce4f017 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:11.114392402 +0000 UTC m=+129.277451560 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-4m9c9" (UID: "4c5174b9-ca9e-4917-ab3a-ca403ce4f017") : secret "node-tuning-operator-tls" not found Mar 13 01:13:10.614448 master-0 kubenswrapper[4055]: E0313 01:13:10.614415 4055 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 13 01:13:10.614477 master-0 kubenswrapper[4055]: E0313 01:13:10.614456 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-apiservice-cert podName:4c5174b9-ca9e-4917-ab3a-ca403ce4f017 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:11.114445593 +0000 UTC m=+129.277504751 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-4m9c9" (UID: "4c5174b9-ca9e-4917-ab3a-ca403ce4f017") : secret "performance-addon-operator-webhook-cert" not found Mar 13 01:13:10.614763 master-0 kubenswrapper[4055]: I0313 01:13:10.614715 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/486c7e33-3dd8-4a98-87e3-8216ee2e05ef-config\") pod \"openshift-apiserver-operator-799b6db4d7-mvmt2\" (UID: \"486c7e33-3dd8-4a98-87e3-8216ee2e05ef\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-mvmt2" Mar 13 01:13:10.615283 master-0 kubenswrapper[4055]: I0313 01:13:10.615197 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21cbea73-f779-43e4-b5ba-d6fa06275d34-serving-cert\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: 
\"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj" Mar 13 01:13:10.616163 master-0 kubenswrapper[4055]: I0313 01:13:10.615275 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/486c7e33-3dd8-4a98-87e3-8216ee2e05ef-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-mvmt2\" (UID: \"486c7e33-3dd8-4a98-87e3-8216ee2e05ef\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-mvmt2" Mar 13 01:13:10.616163 master-0 kubenswrapper[4055]: I0313 01:13:10.615952 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/21cbea73-f779-43e4-b5ba-d6fa06275d34-etcd-client\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: \"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj" Mar 13 01:13:10.616163 master-0 kubenswrapper[4055]: I0313 01:13:10.616025 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35-serving-cert\") pod \"kube-apiserver-operator-68bd585b-mlslx\" (UID: \"b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mlslx" Mar 13 01:13:10.618361 master-0 kubenswrapper[4055]: I0313 01:13:10.616820 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca2fa86b-a966-49dc-8577-d2b54b111d14-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-6fbfc8dc8f-c2xl8\" (UID: \"ca2fa86b-a966-49dc-8577-d2b54b111d14\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-c2xl8" Mar 13 01:13:10.618361 master-0 kubenswrapper[4055]: I0313 01:13:10.616823 4055 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/61fb4b86-f978-4ae1-80bc-18d2f386cbc2-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-ck7rt\" (UID: \"61fb4b86-f978-4ae1-80bc-18d2f386cbc2\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-ck7rt" Mar 13 01:13:10.618361 master-0 kubenswrapper[4055]: I0313 01:13:10.617533 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9b713fb-64ce-4a01-951c-1f31df62e1ae-serving-cert\") pod \"authentication-operator-7c6989d6c4-bxqp2\" (UID: \"f9b713fb-64ce-4a01-951c-1f31df62e1ae\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" Mar 13 01:13:10.618361 master-0 kubenswrapper[4055]: I0313 01:13:10.618210 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d56480e0-0885-41e5-a1fc-931a068fbadb-serving-cert\") pod \"openshift-config-operator-64488f9d78-bqmmf\" (UID: \"d56480e0-0885-41e5-a1fc-931a068fbadb\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" Mar 13 01:13:10.650196 master-0 kubenswrapper[4055]: I0313 01:13:10.650157 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx5mx\" (UniqueName: \"kubernetes.io/projected/2bd94289-7109-4419-9a51-bd289082b9f5-kube-api-access-fx5mx\") pod \"multus-admission-controller-8d675b596-tq7n6\" (UID: \"2bd94289-7109-4419-9a51-bd289082b9f5\") " pod="openshift-multus/multus-admission-controller-8d675b596-tq7n6" Mar 13 01:13:10.663533 master-0 kubenswrapper[4055]: I0313 01:13:10.663506 4055 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-wbmqn" Mar 13 01:13:10.667208 master-0 kubenswrapper[4055]: I0313 01:13:10.667181 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zd92\" (UniqueName: \"kubernetes.io/projected/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-kube-api-access-5zd92\") pod \"catalog-operator-7d9c49f57b-h46pz\" (UID: \"7f35cc1e-3376-4dbd-b215-2a32bf62cc71\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz" Mar 13 01:13:10.686390 master-0 kubenswrapper[4055]: I0313 01:13:10.686360 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwm5w\" (UniqueName: \"kubernetes.io/projected/ca2fa86b-a966-49dc-8577-d2b54b111d14-kube-api-access-gwm5w\") pod \"cluster-storage-operator-6fbfc8dc8f-c2xl8\" (UID: \"ca2fa86b-a966-49dc-8577-d2b54b111d14\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-c2xl8" Mar 13 01:13:10.692259 master-0 kubenswrapper[4055]: I0313 01:13:10.692234 4055 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-sslxh" Mar 13 01:13:10.707877 master-0 kubenswrapper[4055]: I0313 01:13:10.707836 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5n7m\" (UniqueName: \"kubernetes.io/projected/46662e51-44af-4732-83a1-9509a579b373-kube-api-access-m5n7m\") pod \"iptables-alerter-qclwv\" (UID: \"46662e51-44af-4732-83a1-9509a579b373\") " pod="openshift-network-operator/iptables-alerter-qclwv" Mar 13 01:13:10.733674 master-0 kubenswrapper[4055]: I0313 01:13:10.728843 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-mlslx\" (UID: \"b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mlslx" Mar 13 01:13:10.733674 master-0 kubenswrapper[4055]: I0313 01:13:10.732272 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-v9pv6" Mar 13 01:13:10.747751 master-0 kubenswrapper[4055]: I0313 01:13:10.747712 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmvs5\" (UniqueName: \"kubernetes.io/projected/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-kube-api-access-mmvs5\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" Mar 13 01:13:10.747970 master-0 kubenswrapper[4055]: I0313 01:13:10.747938 4055 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-v9nfg" Mar 13 01:13:10.768597 master-0 kubenswrapper[4055]: I0313 01:13:10.768538 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4chtg\" (UniqueName: \"kubernetes.io/projected/f9b713fb-64ce-4a01-951c-1f31df62e1ae-kube-api-access-4chtg\") pod \"authentication-operator-7c6989d6c4-bxqp2\" (UID: \"f9b713fb-64ce-4a01-951c-1f31df62e1ae\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" Mar 13 01:13:10.796473 master-0 kubenswrapper[4055]: I0313 01:13:10.789008 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-2v42g" Mar 13 01:13:10.814218 master-0 kubenswrapper[4055]: I0313 01:13:10.802491 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg54c\" (UniqueName: \"kubernetes.io/projected/21cbea73-f779-43e4-b5ba-d6fa06275d34-kube-api-access-wg54c\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: \"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj" Mar 13 01:13:10.830715 master-0 kubenswrapper[4055]: I0313 01:13:10.827784 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mnf5\" (UniqueName: \"kubernetes.io/projected/6c88187c-d011-4043-a6d3-4a8a7ec4e204-kube-api-access-7mnf5\") pod \"olm-operator-d64cfc9db-8l7kq\" (UID: \"6c88187c-d011-4043-a6d3-4a8a7ec4e204\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq" Mar 13 01:13:10.842853 master-0 kubenswrapper[4055]: I0313 01:13:10.842412 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrf2s\" (UniqueName: \"kubernetes.io/projected/61fb4b86-f978-4ae1-80bc-18d2f386cbc2-kube-api-access-lrf2s\") pod \"cluster-olm-operator-77899cf6d-ck7rt\" (UID: 
\"61fb4b86-f978-4ae1-80bc-18d2f386cbc2\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-ck7rt" Mar 13 01:13:10.854083 master-0 kubenswrapper[4055]: I0313 01:13:10.853707 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" Mar 13 01:13:10.867438 master-0 kubenswrapper[4055]: I0313 01:13:10.867401 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74tvv\" (UniqueName: \"kubernetes.io/projected/486c7e33-3dd8-4a98-87e3-8216ee2e05ef-kube-api-access-74tvv\") pod \"openshift-apiserver-operator-799b6db4d7-mvmt2\" (UID: \"486c7e33-3dd8-4a98-87e3-8216ee2e05ef\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-mvmt2" Mar 13 01:13:10.873422 master-0 kubenswrapper[4055]: I0313 01:13:10.872002 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh2bv\" (UniqueName: \"kubernetes.io/projected/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-kube-api-access-wh2bv\") pod \"dns-operator-589895fbb7-qvl2k\" (UID: \"13fac7b0-ce55-467d-9d0c-6a122d87cb3c\") " pod="openshift-dns-operator/dns-operator-589895fbb7-qvl2k" Mar 13 01:13:10.884915 master-0 kubenswrapper[4055]: I0313 01:13:10.884887 4055 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-c2xl8" Mar 13 01:13:10.887416 master-0 kubenswrapper[4055]: I0313 01:13:10.887391 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fppkf\" (UniqueName: \"kubernetes.io/projected/d56480e0-0885-41e5-a1fc-931a068fbadb-kube-api-access-fppkf\") pod \"openshift-config-operator-64488f9d78-bqmmf\" (UID: \"d56480e0-0885-41e5-a1fc-931a068fbadb\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" Mar 13 01:13:10.906933 master-0 kubenswrapper[4055]: I0313 01:13:10.906890 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-ck7rt" Mar 13 01:13:10.916695 master-0 kubenswrapper[4055]: I0313 01:13:10.915075 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-mvmt2" Mar 13 01:13:10.935914 master-0 kubenswrapper[4055]: I0313 01:13:10.935381 4055 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz8jz\" (UniqueName: \"kubernetes.io/projected/ae44526f-5858-42a0-ba77-3a22f171456f-kube-api-access-mz8jz\") pod \"csi-snapshot-controller-operator-5685fbc7d-7nstm\" (UID: \"ae44526f-5858-42a0-ba77-3a22f171456f\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-7nstm" Mar 13 01:13:10.938699 master-0 kubenswrapper[4055]: I0313 01:13:10.937719 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mlslx" Mar 13 01:13:10.945011 master-0 kubenswrapper[4055]: I0313 01:13:10.944282 4055 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj" Mar 13 01:13:10.957465 master-0 kubenswrapper[4055]: I0313 01:13:10.957428 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qclwv" Mar 13 01:13:10.975231 master-0 kubenswrapper[4055]: I0313 01:13:10.975193 4055 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-sslxh"] Mar 13 01:13:10.989746 master-0 kubenswrapper[4055]: I0313 01:13:10.989051 4055 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-wbmqn"] Mar 13 01:13:11.003212 master-0 kubenswrapper[4055]: W0313 01:13:11.003171 4055 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22587300_2448_4862_9fd8_68197d17a9f2.slice/crio-7d8cfecb961af50ca95aeb8f7e1e1b3b55dbc52640073a438412ecf24f225a00 WatchSource:0}: Error finding container 7d8cfecb961af50ca95aeb8f7e1e1b3b55dbc52640073a438412ecf24f225a00: Status 404 returned error can't find the container with id 7d8cfecb961af50ca95aeb8f7e1e1b3b55dbc52640073a438412ecf24f225a00 Mar 13 01:13:11.021534 master-0 kubenswrapper[4055]: E0313 01:13:11.021165 4055 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 13 01:13:11.021534 master-0 kubenswrapper[4055]: E0313 01:13:11.021246 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f97819d0-2840-4352-a435-19ef1a8c22c9-image-registry-operator-tls podName:f97819d0-2840-4352-a435-19ef1a8c22c9 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:12.021225868 +0000 UTC m=+130.184284906 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/f97819d0-2840-4352-a435-19ef1a8c22c9-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-jjdk8" (UID: "f97819d0-2840-4352-a435-19ef1a8c22c9") : secret "image-registry-operator-tls" not found Mar 13 01:13:11.027686 master-0 kubenswrapper[4055]: I0313 01:13:11.020968 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f97819d0-2840-4352-a435-19ef1a8c22c9-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-jjdk8\" (UID: \"f97819d0-2840-4352-a435-19ef1a8c22c9\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8" Mar 13 01:13:11.027686 master-0 kubenswrapper[4055]: I0313 01:13:11.023764 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" Mar 13 01:13:11.027686 master-0 kubenswrapper[4055]: I0313 01:13:11.023821 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-metrics-tls\") pod \"ingress-operator-677db989d6-kdn2l\" (UID: \"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l" Mar 13 01:13:11.027686 master-0 kubenswrapper[4055]: I0313 01:13:11.023853 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: 
\"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" Mar 13 01:13:11.027686 master-0 kubenswrapper[4055]: I0313 01:13:11.023875 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-2tr2t\" (UID: \"7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t" Mar 13 01:13:11.027686 master-0 kubenswrapper[4055]: I0313 01:13:11.023911 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-dszg5\" (UID: \"78d2cd80-23b9-426d-a7ac-1daa27668a47\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" Mar 13 01:13:11.027686 master-0 kubenswrapper[4055]: I0313 01:13:11.023930 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-nrzpj\" (UID: \"0d4e6150-432c-4a11-b5a6-4d62dd701fc8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj" Mar 13 01:13:11.027686 master-0 kubenswrapper[4055]: I0313 01:13:11.023953 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-6slg8\" (UID: \"4976e608-07a0-4cef-8fdd-7cec3324b4b5\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8" Mar 13 01:13:11.027686 
master-0 kubenswrapper[4055]: E0313 01:13:11.024084 4055 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: secret "mco-proxy-tls" not found Mar 13 01:13:11.027686 master-0 kubenswrapper[4055]: E0313 01:13:11.024128 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls podName:4976e608-07a0-4cef-8fdd-7cec3324b4b5 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:12.024114319 +0000 UTC m=+130.187173357 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls") pod "machine-config-operator-fdb5c78b5-6slg8" (UID: "4976e608-07a0-4cef-8fdd-7cec3324b4b5") : secret "mco-proxy-tls" not found Mar 13 01:13:11.027686 master-0 kubenswrapper[4055]: E0313 01:13:11.024166 4055 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 13 01:13:11.027686 master-0 kubenswrapper[4055]: E0313 01:13:11.024183 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cert podName:8c6bf2d5-1881-4b63-b247-7e7426707fa1 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:12.024177861 +0000 UTC m=+130.187236899 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cert") pod "cluster-baremetal-operator-5cdb4c5598-47sjr" (UID: "8c6bf2d5-1881-4b63-b247-7e7426707fa1") : secret "cluster-baremetal-webhook-server-cert" not found Mar 13 01:13:11.027686 master-0 kubenswrapper[4055]: E0313 01:13:11.026247 4055 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 13 01:13:11.027686 master-0 kubenswrapper[4055]: E0313 01:13:11.026306 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-metrics-tls podName:70c097a1-90d9-4344-b0ae-5a59ec2ad8ad nodeName:}" failed. No retries permitted until 2026-03-13 01:13:12.02629024 +0000 UTC m=+130.189349268 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-metrics-tls") pod "ingress-operator-677db989d6-kdn2l" (UID: "70c097a1-90d9-4344-b0ae-5a59ec2ad8ad") : secret "metrics-tls" not found Mar 13 01:13:11.027686 master-0 kubenswrapper[4055]: E0313 01:13:11.026368 4055 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found Mar 13 01:13:11.028175 master-0 kubenswrapper[4055]: E0313 01:13:11.026394 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cluster-baremetal-operator-tls podName:8c6bf2d5-1881-4b63-b247-7e7426707fa1 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:12.026384443 +0000 UTC m=+130.189443481 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-47sjr" (UID: "8c6bf2d5-1881-4b63-b247-7e7426707fa1") : secret "cluster-baremetal-operator-tls" not found Mar 13 01:13:11.028175 master-0 kubenswrapper[4055]: E0313 01:13:11.026431 4055 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 13 01:13:11.028175 master-0 kubenswrapper[4055]: E0313 01:13:11.026449 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls podName:7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:12.026442954 +0000 UTC m=+130.189501992 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-2tr2t" (UID: "7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71") : secret "cluster-monitoring-operator-tls" not found Mar 13 01:13:11.028175 master-0 kubenswrapper[4055]: E0313 01:13:11.026483 4055 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 13 01:13:11.028175 master-0 kubenswrapper[4055]: E0313 01:13:11.026501 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics podName:78d2cd80-23b9-426d-a7ac-1daa27668a47 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:12.026495736 +0000 UTC m=+130.189554774 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-dszg5" (UID: "78d2cd80-23b9-426d-a7ac-1daa27668a47") : secret "marketplace-operator-metrics" not found Mar 13 01:13:11.028175 master-0 kubenswrapper[4055]: E0313 01:13:11.026534 4055 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 13 01:13:11.028175 master-0 kubenswrapper[4055]: E0313 01:13:11.026567 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert podName:0d4e6150-432c-4a11-b5a6-4d62dd701fc8 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:12.026556067 +0000 UTC m=+130.189615105 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-nrzpj" (UID: "0d4e6150-432c-4a11-b5a6-4d62dd701fc8") : secret "package-server-manager-serving-cert" not found Mar 13 01:13:11.033522 master-0 kubenswrapper[4055]: I0313 01:13:11.030038 4055 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-v9nfg"] Mar 13 01:13:11.044080 master-0 kubenswrapper[4055]: I0313 01:13:11.043727 4055 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-v9pv6"] Mar 13 01:13:11.074069 master-0 kubenswrapper[4055]: I0313 01:13:11.073399 4055 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69b6fc6b88-2v42g"] Mar 13 
01:13:11.108886 master-0 kubenswrapper[4055]: W0313 01:13:11.108853 4055 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95d4e785_6663_417d_b380_6905773613c8.slice/crio-2caee3a1af6093e4a6a8dc6eb703af3a181ab48dbf5acd65ef8e61a08e467534 WatchSource:0}: Error finding container 2caee3a1af6093e4a6a8dc6eb703af3a181ab48dbf5acd65ef8e61a08e467534: Status 404 returned error can't find the container with id 2caee3a1af6093e4a6a8dc6eb703af3a181ab48dbf5acd65ef8e61a08e467534 Mar 13 01:13:11.125165 master-0 kubenswrapper[4055]: I0313 01:13:11.124939 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert\") pod \"olm-operator-d64cfc9db-8l7kq\" (UID: \"6c88187c-d011-4043-a6d3-4a8a7ec4e204\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq" Mar 13 01:13:11.125165 master-0 kubenswrapper[4055]: I0313 01:13:11.124985 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs\") pod \"multus-admission-controller-8d675b596-tq7n6\" (UID: \"2bd94289-7109-4419-9a51-bd289082b9f5\") " pod="openshift-multus/multus-admission-controller-8d675b596-tq7n6" Mar 13 01:13:11.125165 master-0 kubenswrapper[4055]: I0313 01:13:11.125013 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" Mar 13 01:13:11.125165 master-0 kubenswrapper[4055]: I0313 01:13:11.125037 4055 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert\") pod \"catalog-operator-7d9c49f57b-h46pz\" (UID: \"7f35cc1e-3376-4dbd-b215-2a32bf62cc71\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz" Mar 13 01:13:11.125165 master-0 kubenswrapper[4055]: I0313 01:13:11.125059 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" Mar 13 01:13:11.125165 master-0 kubenswrapper[4055]: I0313 01:13:11.125082 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-metrics-tls\") pod \"dns-operator-589895fbb7-qvl2k\" (UID: \"13fac7b0-ce55-467d-9d0c-6a122d87cb3c\") " pod="openshift-dns-operator/dns-operator-589895fbb7-qvl2k" Mar 13 01:13:11.125385 master-0 kubenswrapper[4055]: E0313 01:13:11.125254 4055 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 13 01:13:11.125690 master-0 kubenswrapper[4055]: E0313 01:13:11.125671 4055 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 13 01:13:11.125741 master-0 kubenswrapper[4055]: E0313 01:13:11.125728 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-metrics-tls podName:13fac7b0-ce55-467d-9d0c-6a122d87cb3c nodeName:}" failed. No retries permitted until 2026-03-13 01:13:12.125681562 +0000 UTC m=+130.288740670 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-metrics-tls") pod "dns-operator-589895fbb7-qvl2k" (UID: "13fac7b0-ce55-467d-9d0c-6a122d87cb3c") : secret "metrics-tls" not found Mar 13 01:13:11.125776 master-0 kubenswrapper[4055]: E0313 01:13:11.125748 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert podName:6c88187c-d011-4043-a6d3-4a8a7ec4e204 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:12.125741113 +0000 UTC m=+130.288800151 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert") pod "olm-operator-d64cfc9db-8l7kq" (UID: "6c88187c-d011-4043-a6d3-4a8a7ec4e204") : secret "olm-operator-serving-cert" not found Mar 13 01:13:11.125879 master-0 kubenswrapper[4055]: E0313 01:13:11.125859 4055 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 13 01:13:11.125913 master-0 kubenswrapper[4055]: E0313 01:13:11.125889 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs podName:2bd94289-7109-4419-9a51-bd289082b9f5 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:12.125880837 +0000 UTC m=+130.288939875 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs") pod "multus-admission-controller-8d675b596-tq7n6" (UID: "2bd94289-7109-4419-9a51-bd289082b9f5") : secret "multus-admission-controller-secret" not found Mar 13 01:13:11.125971 master-0 kubenswrapper[4055]: E0313 01:13:11.125953 4055 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 13 01:13:11.126014 master-0 kubenswrapper[4055]: E0313 01:13:11.125980 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-node-tuning-operator-tls podName:4c5174b9-ca9e-4917-ab3a-ca403ce4f017 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:12.1259719 +0000 UTC m=+130.289030938 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-4m9c9" (UID: "4c5174b9-ca9e-4917-ab3a-ca403ce4f017") : secret "node-tuning-operator-tls" not found Mar 13 01:13:11.126059 master-0 kubenswrapper[4055]: E0313 01:13:11.126044 4055 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 13 01:13:11.126092 master-0 kubenswrapper[4055]: E0313 01:13:11.126064 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert podName:7f35cc1e-3376-4dbd-b215-2a32bf62cc71 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:12.126058012 +0000 UTC m=+130.289117050 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert") pod "catalog-operator-7d9c49f57b-h46pz" (UID: "7f35cc1e-3376-4dbd-b215-2a32bf62cc71") : secret "catalog-operator-serving-cert" not found Mar 13 01:13:11.127213 master-0 kubenswrapper[4055]: E0313 01:13:11.126127 4055 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 13 01:13:11.127213 master-0 kubenswrapper[4055]: E0313 01:13:11.126154 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-apiservice-cert podName:4c5174b9-ca9e-4917-ab3a-ca403ce4f017 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:12.126147465 +0000 UTC m=+130.289206503 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-4m9c9" (UID: "4c5174b9-ca9e-4917-ab3a-ca403ce4f017") : secret "performance-addon-operator-webhook-cert" not found Mar 13 01:13:11.169465 master-0 kubenswrapper[4055]: I0313 01:13:11.169426 4055 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" Mar 13 01:13:11.180860 master-0 kubenswrapper[4055]: I0313 01:13:11.180809 4055 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-c2xl8"] Mar 13 01:13:11.188103 master-0 kubenswrapper[4055]: I0313 01:13:11.186942 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-sslxh" event={"ID":"039acb44-a9b3-4ad6-a091-be4d18edc34f","Type":"ContainerStarted","Data":"7d309f2fa26be03ebd9a5013c3e9be2f5c2e833ce9081aa3f25580dc684568db"} Mar 13 01:13:11.188103 master-0 kubenswrapper[4055]: W0313 01:13:11.188049 4055 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca2fa86b_a966_49dc_8577_d2b54b111d14.slice/crio-d50b6a32815a2be8ff70be8b06ebf59a9e49fcf3be49561e0d64e6a0e5b76848 WatchSource:0}: Error finding container d50b6a32815a2be8ff70be8b06ebf59a9e49fcf3be49561e0d64e6a0e5b76848: Status 404 returned error can't find the container with id d50b6a32815a2be8ff70be8b06ebf59a9e49fcf3be49561e0d64e6a0e5b76848 Mar 13 01:13:11.188726 master-0 kubenswrapper[4055]: I0313 01:13:11.188264 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-v9nfg" event={"ID":"916d9fc9-388b-4506-a17c-36a7f626356a","Type":"ContainerStarted","Data":"6cd94cfc20909e4d75ca589ee9ae860bb4dc06f0ab09921539ad1f3f2923f207"} Mar 13 01:13:11.190718 master-0 kubenswrapper[4055]: I0313 01:13:11.190687 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-v9pv6" 
event={"ID":"58035e42-37d8-48f6-9861-9b4ce6014119","Type":"ContainerStarted","Data":"ab94900114a9122f23168251a131a59bc4f99b487faa4afb3e3bd743e5f9a00c"} Mar 13 01:13:11.192028 master-0 kubenswrapper[4055]: I0313 01:13:11.191997 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-wbmqn" event={"ID":"22587300-2448-4862-9fd8-68197d17a9f2","Type":"ContainerStarted","Data":"7d8cfecb961af50ca95aeb8f7e1e1b3b55dbc52640073a438412ecf24f225a00"} Mar 13 01:13:11.193614 master-0 kubenswrapper[4055]: I0313 01:13:11.193586 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-2v42g" event={"ID":"95d4e785-6663-417d-b380-6905773613c8","Type":"ContainerStarted","Data":"2caee3a1af6093e4a6a8dc6eb703af3a181ab48dbf5acd65ef8e61a08e467534"} Mar 13 01:13:11.194528 master-0 kubenswrapper[4055]: I0313 01:13:11.194504 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qclwv" event={"ID":"46662e51-44af-4732-83a1-9509a579b373","Type":"ContainerStarted","Data":"7ad0a5d9a8c967edc56620551d2d496053e4222b70097c6bc30b98a2ef86a101"} Mar 13 01:13:11.212594 master-0 kubenswrapper[4055]: I0313 01:13:11.212556 4055 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-ck7rt"] Mar 13 01:13:11.214335 master-0 kubenswrapper[4055]: I0313 01:13:11.214312 4055 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-mvmt2"] Mar 13 01:13:11.223575 master-0 kubenswrapper[4055]: W0313 01:13:11.223523 4055 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61fb4b86_f978_4ae1_80bc_18d2f386cbc2.slice/crio-26154ef4eac655efbc62eb7de132a97eb19bcf76904f56cfb67a890cba3eae81 WatchSource:0}: Error finding container 
26154ef4eac655efbc62eb7de132a97eb19bcf76904f56cfb67a890cba3eae81: Status 404 returned error can't find the container with id 26154ef4eac655efbc62eb7de132a97eb19bcf76904f56cfb67a890cba3eae81 Mar 13 01:13:11.226170 master-0 kubenswrapper[4055]: I0313 01:13:11.226130 4055 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mlslx"] Mar 13 01:13:11.230121 master-0 kubenswrapper[4055]: I0313 01:13:11.230099 4055 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-7nstm" Mar 13 01:13:11.240456 master-0 kubenswrapper[4055]: W0313 01:13:11.240409 4055 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb61ae6f3_d8eb_4803_a0bf_8aab29c8bd35.slice/crio-e4f5650e90a0b9cd7d74ca56ce88b46d396150848fd499e2751500a89710ed92 WatchSource:0}: Error finding container e4f5650e90a0b9cd7d74ca56ce88b46d396150848fd499e2751500a89710ed92: Status 404 returned error can't find the container with id e4f5650e90a0b9cd7d74ca56ce88b46d396150848fd499e2751500a89710ed92 Mar 13 01:13:11.243457 master-0 kubenswrapper[4055]: I0313 01:13:11.243418 4055 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj"] Mar 13 01:13:11.319657 master-0 kubenswrapper[4055]: I0313 01:13:11.319272 4055 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2"] Mar 13 01:13:11.349754 master-0 kubenswrapper[4055]: I0313 01:13:11.349708 4055 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf"] Mar 13 01:13:11.356275 master-0 kubenswrapper[4055]: W0313 01:13:11.355618 4055 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd56480e0_0885_41e5_a1fc_931a068fbadb.slice/crio-6c434288b6cb191d119d6c19e3587bb5a67d5e3c45645324bb8a04e648bb9b70 WatchSource:0}: Error finding container 6c434288b6cb191d119d6c19e3587bb5a67d5e3c45645324bb8a04e648bb9b70: Status 404 returned error can't find the container with id 6c434288b6cb191d119d6c19e3587bb5a67d5e3c45645324bb8a04e648bb9b70 Mar 13 01:13:11.357912 master-0 kubenswrapper[4055]: E0313 01:13:11.357856 4055 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:openshift-api,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8ceca1efee55b9fd5089428476bbc401fe73db7c0b0f5e16d4ad28ed0f0f9d43,Command:[write-available-featuresets --asset-output-dir=/available-featuregates --payload-version=$(OPERATOR_IMAGE_VERSION)],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:available-featuregates,ReadOnly:false,MountPath:/available-featuregates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fppkf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openshift-config-operator-64488f9d78-bqmmf_openshift-config-operator(d56480e0-0885-41e5-a1fc-931a068fbadb): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 01:13:11.359024 master-0 kubenswrapper[4055]: E0313 01:13:11.358982 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-api\" with ErrImagePull: \"pull QPS exceeded\"" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" Mar 13 01:13:11.421731 master-0 kubenswrapper[4055]: I0313 01:13:11.421177 4055 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-7nstm"] Mar 13 01:13:11.428777 master-0 kubenswrapper[4055]: W0313 01:13:11.428733 4055 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae44526f_5858_42a0_ba77_3a22f171456f.slice/crio-b53f7152f43c94f2398d1b740ffecfeaa1c0a37491b92964a127cf3cd4f8b71f WatchSource:0}: Error finding container b53f7152f43c94f2398d1b740ffecfeaa1c0a37491b92964a127cf3cd4f8b71f: Status 404 returned error can't find the container with id b53f7152f43c94f2398d1b740ffecfeaa1c0a37491b92964a127cf3cd4f8b71f Mar 13 01:13:12.037418 master-0 kubenswrapper[4055]: I0313 01:13:12.037359 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-2tr2t\" (UID: \"7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t" Mar 13 01:13:12.038234 master-0 kubenswrapper[4055]: E0313 01:13:12.037527 4055 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 13 01:13:12.038234 master-0 kubenswrapper[4055]: I0313 01:13:12.037598 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-dszg5\" (UID: \"78d2cd80-23b9-426d-a7ac-1daa27668a47\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" Mar 13 01:13:12.038234 master-0 kubenswrapper[4055]: E0313 01:13:12.037625 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls podName:7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:14.037602724 +0000 UTC m=+132.200661862 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-2tr2t" (UID: "7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71") : secret "cluster-monitoring-operator-tls" not found Mar 13 01:13:12.038234 master-0 kubenswrapper[4055]: I0313 01:13:12.037667 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-nrzpj\" (UID: \"0d4e6150-432c-4a11-b5a6-4d62dd701fc8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj" Mar 13 01:13:12.038234 master-0 kubenswrapper[4055]: E0313 01:13:12.037781 4055 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 13 01:13:12.038234 master-0 kubenswrapper[4055]: I0313 01:13:12.037806 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-6slg8\" (UID: \"4976e608-07a0-4cef-8fdd-7cec3324b4b5\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8" Mar 13 01:13:12.038234 master-0 kubenswrapper[4055]: E0313 01:13:12.037834 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics podName:78d2cd80-23b9-426d-a7ac-1daa27668a47 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:14.03781834 +0000 UTC m=+132.200877378 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-dszg5" (UID: "78d2cd80-23b9-426d-a7ac-1daa27668a47") : secret "marketplace-operator-metrics" not found Mar 13 01:13:12.038234 master-0 kubenswrapper[4055]: I0313 01:13:12.037896 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f97819d0-2840-4352-a435-19ef1a8c22c9-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-jjdk8\" (UID: \"f97819d0-2840-4352-a435-19ef1a8c22c9\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8" Mar 13 01:13:12.038234 master-0 kubenswrapper[4055]: E0313 01:13:12.037953 4055 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: secret "mco-proxy-tls" not found Mar 13 01:13:12.038234 master-0 kubenswrapper[4055]: E0313 01:13:12.038006 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls podName:4976e608-07a0-4cef-8fdd-7cec3324b4b5 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:14.037991665 +0000 UTC m=+132.201050783 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls") pod "machine-config-operator-fdb5c78b5-6slg8" (UID: "4976e608-07a0-4cef-8fdd-7cec3324b4b5") : secret "mco-proxy-tls" not found Mar 13 01:13:12.038234 master-0 kubenswrapper[4055]: E0313 01:13:12.038007 4055 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 13 01:13:12.038234 master-0 kubenswrapper[4055]: I0313 01:13:12.037961 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" Mar 13 01:13:12.038234 master-0 kubenswrapper[4055]: E0313 01:13:12.038034 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cert podName:8c6bf2d5-1881-4b63-b247-7e7426707fa1 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:14.038027376 +0000 UTC m=+132.201086414 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cert") pod "cluster-baremetal-operator-5cdb4c5598-47sjr" (UID: "8c6bf2d5-1881-4b63-b247-7e7426707fa1") : secret "cluster-baremetal-webhook-server-cert" not found Mar 13 01:13:12.038234 master-0 kubenswrapper[4055]: E0313 01:13:12.038056 4055 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 13 01:13:12.038234 master-0 kubenswrapper[4055]: E0313 01:13:12.038071 4055 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 13 01:13:12.038234 master-0 kubenswrapper[4055]: I0313 01:13:12.038099 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-metrics-tls\") pod \"ingress-operator-677db989d6-kdn2l\" (UID: \"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l" Mar 13 01:13:12.038648 master-0 kubenswrapper[4055]: E0313 01:13:12.038105 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert podName:0d4e6150-432c-4a11-b5a6-4d62dd701fc8 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:14.038095608 +0000 UTC m=+132.201154746 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-nrzpj" (UID: "0d4e6150-432c-4a11-b5a6-4d62dd701fc8") : secret "package-server-manager-serving-cert" not found Mar 13 01:13:12.038648 master-0 kubenswrapper[4055]: E0313 01:13:12.038151 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f97819d0-2840-4352-a435-19ef1a8c22c9-image-registry-operator-tls podName:f97819d0-2840-4352-a435-19ef1a8c22c9 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:14.038140729 +0000 UTC m=+132.201199877 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/f97819d0-2840-4352-a435-19ef1a8c22c9-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-jjdk8" (UID: "f97819d0-2840-4352-a435-19ef1a8c22c9") : secret "image-registry-operator-tls" not found Mar 13 01:13:12.038648 master-0 kubenswrapper[4055]: I0313 01:13:12.038167 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" Mar 13 01:13:12.038648 master-0 kubenswrapper[4055]: E0313 01:13:12.038220 4055 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 13 01:13:12.038648 master-0 kubenswrapper[4055]: E0313 01:13:12.038262 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-metrics-tls podName:70c097a1-90d9-4344-b0ae-5a59ec2ad8ad nodeName:}" failed. 
No retries permitted until 2026-03-13 01:13:14.038242082 +0000 UTC m=+132.201301130 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-metrics-tls") pod "ingress-operator-677db989d6-kdn2l" (UID: "70c097a1-90d9-4344-b0ae-5a59ec2ad8ad") : secret "metrics-tls" not found Mar 13 01:13:12.038648 master-0 kubenswrapper[4055]: E0313 01:13:12.038273 4055 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found Mar 13 01:13:12.038648 master-0 kubenswrapper[4055]: E0313 01:13:12.038302 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cluster-baremetal-operator-tls podName:8c6bf2d5-1881-4b63-b247-7e7426707fa1 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:14.038294913 +0000 UTC m=+132.201354061 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-47sjr" (UID: "8c6bf2d5-1881-4b63-b247-7e7426707fa1") : secret "cluster-baremetal-operator-tls" not found Mar 13 01:13:12.138783 master-0 kubenswrapper[4055]: I0313 01:13:12.138686 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" Mar 13 01:13:12.138783 master-0 kubenswrapper[4055]: I0313 01:13:12.138736 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert\") pod \"catalog-operator-7d9c49f57b-h46pz\" (UID: \"7f35cc1e-3376-4dbd-b215-2a32bf62cc71\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz" Mar 13 01:13:12.138783 master-0 kubenswrapper[4055]: I0313 01:13:12.138767 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" Mar 13 01:13:12.139066 master-0 kubenswrapper[4055]: I0313 01:13:12.138836 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-metrics-tls\") pod \"dns-operator-589895fbb7-qvl2k\" (UID: \"13fac7b0-ce55-467d-9d0c-6a122d87cb3c\") " pod="openshift-dns-operator/dns-operator-589895fbb7-qvl2k" Mar 13 01:13:12.139066 master-0 kubenswrapper[4055]: E0313 01:13:12.138856 4055 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 13 01:13:12.139066 master-0 kubenswrapper[4055]: E0313 01:13:12.138894 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-apiservice-cert podName:4c5174b9-ca9e-4917-ab3a-ca403ce4f017 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:14.138882089 +0000 UTC m=+132.301941117 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-4m9c9" (UID: "4c5174b9-ca9e-4917-ab3a-ca403ce4f017") : secret "performance-addon-operator-webhook-cert" not found Mar 13 01:13:12.139066 master-0 kubenswrapper[4055]: E0313 01:13:12.138948 4055 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 13 01:13:12.139066 master-0 kubenswrapper[4055]: I0313 01:13:12.138979 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert\") pod \"olm-operator-d64cfc9db-8l7kq\" (UID: \"6c88187c-d011-4043-a6d3-4a8a7ec4e204\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq" Mar 13 01:13:12.139066 master-0 kubenswrapper[4055]: E0313 01:13:12.138994 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-metrics-tls podName:13fac7b0-ce55-467d-9d0c-6a122d87cb3c nodeName:}" failed. No retries permitted until 2026-03-13 01:13:14.138977391 +0000 UTC m=+132.302036429 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-metrics-tls") pod "dns-operator-589895fbb7-qvl2k" (UID: "13fac7b0-ce55-467d-9d0c-6a122d87cb3c") : secret "metrics-tls" not found Mar 13 01:13:12.139066 master-0 kubenswrapper[4055]: I0313 01:13:12.139015 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs\") pod \"multus-admission-controller-8d675b596-tq7n6\" (UID: \"2bd94289-7109-4419-9a51-bd289082b9f5\") " pod="openshift-multus/multus-admission-controller-8d675b596-tq7n6" Mar 13 01:13:12.139066 master-0 kubenswrapper[4055]: E0313 01:13:12.139020 4055 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 13 01:13:12.139066 master-0 kubenswrapper[4055]: E0313 01:13:12.139067 4055 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 13 01:13:12.139066 master-0 kubenswrapper[4055]: E0313 01:13:12.139079 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert podName:6c88187c-d011-4043-a6d3-4a8a7ec4e204 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:14.139072394 +0000 UTC m=+132.302131432 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert") pod "olm-operator-d64cfc9db-8l7kq" (UID: "6c88187c-d011-4043-a6d3-4a8a7ec4e204") : secret "olm-operator-serving-cert" not found Mar 13 01:13:12.139328 master-0 kubenswrapper[4055]: E0313 01:13:12.139093 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-node-tuning-operator-tls podName:4c5174b9-ca9e-4917-ab3a-ca403ce4f017 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:14.139086914 +0000 UTC m=+132.302145952 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-4m9c9" (UID: "4c5174b9-ca9e-4917-ab3a-ca403ce4f017") : secret "node-tuning-operator-tls" not found Mar 13 01:13:12.139328 master-0 kubenswrapper[4055]: E0313 01:13:12.139135 4055 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 13 01:13:12.139328 master-0 kubenswrapper[4055]: E0313 01:13:12.139155 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs podName:2bd94289-7109-4419-9a51-bd289082b9f5 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:14.139149996 +0000 UTC m=+132.302209034 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs") pod "multus-admission-controller-8d675b596-tq7n6" (UID: "2bd94289-7109-4419-9a51-bd289082b9f5") : secret "multus-admission-controller-secret" not found Mar 13 01:13:12.139328 master-0 kubenswrapper[4055]: E0313 01:13:12.139046 4055 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 13 01:13:12.139328 master-0 kubenswrapper[4055]: E0313 01:13:12.139178 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert podName:7f35cc1e-3376-4dbd-b215-2a32bf62cc71 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:14.139174277 +0000 UTC m=+132.302233305 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert") pod "catalog-operator-7d9c49f57b-h46pz" (UID: "7f35cc1e-3376-4dbd-b215-2a32bf62cc71") : secret "catalog-operator-serving-cert" not found Mar 13 01:13:12.198552 master-0 kubenswrapper[4055]: I0313 01:13:12.198487 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-7nstm" event={"ID":"ae44526f-5858-42a0-ba77-3a22f171456f","Type":"ContainerStarted","Data":"b53f7152f43c94f2398d1b740ffecfeaa1c0a37491b92964a127cf3cd4f8b71f"} Mar 13 01:13:12.199494 master-0 kubenswrapper[4055]: I0313 01:13:12.199442 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-ck7rt" event={"ID":"61fb4b86-f978-4ae1-80bc-18d2f386cbc2","Type":"ContainerStarted","Data":"26154ef4eac655efbc62eb7de132a97eb19bcf76904f56cfb67a890cba3eae81"} Mar 13 01:13:12.200344 master-0 kubenswrapper[4055]: I0313 01:13:12.200294 4055 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-mvmt2" event={"ID":"486c7e33-3dd8-4a98-87e3-8216ee2e05ef","Type":"ContainerStarted","Data":"4a78398e61786f95c561c9e0a3fad101b528e41e0057edc052306118db4ece14"} Mar 13 01:13:12.202683 master-0 kubenswrapper[4055]: I0313 01:13:12.202005 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" event={"ID":"d56480e0-0885-41e5-a1fc-931a068fbadb","Type":"ContainerStarted","Data":"6c434288b6cb191d119d6c19e3587bb5a67d5e3c45645324bb8a04e648bb9b70"} Mar 13 01:13:12.204366 master-0 kubenswrapper[4055]: E0313 01:13:12.204340 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-api\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8ceca1efee55b9fd5089428476bbc401fe73db7c0b0f5e16d4ad28ed0f0f9d43\\\"\"" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" Mar 13 01:13:12.205515 master-0 kubenswrapper[4055]: I0313 01:13:12.205416 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mlslx" event={"ID":"b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35","Type":"ContainerStarted","Data":"2703e40a08051a608961078f9b2c331b07b8ffa237b00eb643f4e928fb008663"} Mar 13 01:13:12.205722 master-0 kubenswrapper[4055]: I0313 01:13:12.205578 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mlslx" event={"ID":"b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35","Type":"ContainerStarted","Data":"e4f5650e90a0b9cd7d74ca56ce88b46d396150848fd499e2751500a89710ed92"} Mar 13 01:13:12.206328 master-0 kubenswrapper[4055]: I0313 01:13:12.206296 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj" event={"ID":"21cbea73-f779-43e4-b5ba-d6fa06275d34","Type":"ContainerStarted","Data":"cf319252fb389a321939570359e80da282cec58b5fa4e03fa5a5ea1c1c063fe4"} Mar 13 01:13:12.207547 master-0 kubenswrapper[4055]: I0313 01:13:12.207514 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-c2xl8" event={"ID":"ca2fa86b-a966-49dc-8577-d2b54b111d14","Type":"ContainerStarted","Data":"d50b6a32815a2be8ff70be8b06ebf59a9e49fcf3be49561e0d64e6a0e5b76848"} Mar 13 01:13:12.209048 master-0 kubenswrapper[4055]: I0313 01:13:12.209009 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" event={"ID":"f9b713fb-64ce-4a01-951c-1f31df62e1ae","Type":"ContainerStarted","Data":"8f2613fc06a65ee0e558a2b7e31f18cf27ca9c4c8e8d8a194c2a8dcc4466dc64"} Mar 13 01:13:12.242323 master-0 kubenswrapper[4055]: I0313 01:13:12.241396 4055 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mlslx" podStartSLOduration=94.241381307 podStartE2EDuration="1m34.241381307s" podCreationTimestamp="2026-03-13 01:11:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:13:12.240373649 +0000 UTC m=+130.403432687" watchObservedRunningTime="2026-03-13 01:13:12.241381307 +0000 UTC m=+130.404440345" Mar 13 01:13:13.219763 master-0 kubenswrapper[4055]: E0313 01:13:13.218862 4055 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-api\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8ceca1efee55b9fd5089428476bbc401fe73db7c0b0f5e16d4ad28ed0f0f9d43\\\"\"" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" 
podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" Mar 13 01:13:13.622455 master-0 kubenswrapper[4055]: I0313 01:13:13.622332 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs\") pod \"network-metrics-daemon-zh5fh\" (UID: \"e68ab3cb-c372-45d9-a758-beaf4c213714\") " pod="openshift-multus/network-metrics-daemon-zh5fh" Mar 13 01:13:13.622617 master-0 kubenswrapper[4055]: E0313 01:13:13.622482 4055 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 13 01:13:13.622617 master-0 kubenswrapper[4055]: E0313 01:13:13.622574 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs podName:e68ab3cb-c372-45d9-a758-beaf4c213714 nodeName:}" failed. No retries permitted until 2026-03-13 01:14:17.622550794 +0000 UTC m=+195.785609862 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs") pod "network-metrics-daemon-zh5fh" (UID: "e68ab3cb-c372-45d9-a758-beaf4c213714") : secret "metrics-daemon-secret" not found Mar 13 01:13:14.127101 master-0 kubenswrapper[4055]: I0313 01:13:14.126896 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-dszg5\" (UID: \"78d2cd80-23b9-426d-a7ac-1daa27668a47\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" Mar 13 01:13:14.127101 master-0 kubenswrapper[4055]: I0313 01:13:14.126957 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-nrzpj\" (UID: \"0d4e6150-432c-4a11-b5a6-4d62dd701fc8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj" Mar 13 01:13:14.127101 master-0 kubenswrapper[4055]: I0313 01:13:14.126985 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-6slg8\" (UID: \"4976e608-07a0-4cef-8fdd-7cec3324b4b5\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8" Mar 13 01:13:14.127101 master-0 kubenswrapper[4055]: I0313 01:13:14.127047 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f97819d0-2840-4352-a435-19ef1a8c22c9-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-jjdk8\" 
(UID: \"f97819d0-2840-4352-a435-19ef1a8c22c9\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8" Mar 13 01:13:14.127682 master-0 kubenswrapper[4055]: I0313 01:13:14.127656 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" Mar 13 01:13:14.127771 master-0 kubenswrapper[4055]: I0313 01:13:14.127692 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-metrics-tls\") pod \"ingress-operator-677db989d6-kdn2l\" (UID: \"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l" Mar 13 01:13:14.127771 master-0 kubenswrapper[4055]: I0313 01:13:14.127719 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" Mar 13 01:13:14.127771 master-0 kubenswrapper[4055]: I0313 01:13:14.127746 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-2tr2t\" (UID: \"7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t" Mar 13 01:13:14.127924 master-0 kubenswrapper[4055]: E0313 01:13:14.127870 4055 
secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 13 01:13:14.127960 master-0 kubenswrapper[4055]: E0313 01:13:14.127926 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls podName:7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:18.127909179 +0000 UTC m=+136.290968217 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-2tr2t" (UID: "7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71") : secret "cluster-monitoring-operator-tls" not found Mar 13 01:13:14.130336 master-0 kubenswrapper[4055]: E0313 01:13:14.130268 4055 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 13 01:13:14.130428 master-0 kubenswrapper[4055]: E0313 01:13:14.130380 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f97819d0-2840-4352-a435-19ef1a8c22c9-image-registry-operator-tls podName:f97819d0-2840-4352-a435-19ef1a8c22c9 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:18.130356607 +0000 UTC m=+136.293415675 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/f97819d0-2840-4352-a435-19ef1a8c22c9-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-jjdk8" (UID: "f97819d0-2840-4352-a435-19ef1a8c22c9") : secret "image-registry-operator-tls" not found Mar 13 01:13:14.130467 master-0 kubenswrapper[4055]: E0313 01:13:14.130455 4055 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 13 01:13:14.130532 master-0 kubenswrapper[4055]: E0313 01:13:14.130484 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cert podName:8c6bf2d5-1881-4b63-b247-7e7426707fa1 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:18.13047507 +0000 UTC m=+136.293534188 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cert") pod "cluster-baremetal-operator-5cdb4c5598-47sjr" (UID: "8c6bf2d5-1881-4b63-b247-7e7426707fa1") : secret "cluster-baremetal-webhook-server-cert" not found Mar 13 01:13:14.130588 master-0 kubenswrapper[4055]: E0313 01:13:14.130534 4055 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 13 01:13:14.130588 master-0 kubenswrapper[4055]: E0313 01:13:14.130561 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-metrics-tls podName:70c097a1-90d9-4344-b0ae-5a59ec2ad8ad nodeName:}" failed. No retries permitted until 2026-03-13 01:13:18.130553973 +0000 UTC m=+136.293613101 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-metrics-tls") pod "ingress-operator-677db989d6-kdn2l" (UID: "70c097a1-90d9-4344-b0ae-5a59ec2ad8ad") : secret "metrics-tls" not found Mar 13 01:13:14.130694 master-0 kubenswrapper[4055]: E0313 01:13:14.130606 4055 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found Mar 13 01:13:14.130694 master-0 kubenswrapper[4055]: E0313 01:13:14.130649 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cluster-baremetal-operator-tls podName:8c6bf2d5-1881-4b63-b247-7e7426707fa1 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:18.130624175 +0000 UTC m=+136.293683213 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-47sjr" (UID: "8c6bf2d5-1881-4b63-b247-7e7426707fa1") : secret "cluster-baremetal-operator-tls" not found Mar 13 01:13:14.134300 master-0 kubenswrapper[4055]: E0313 01:13:14.134270 4055 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 13 01:13:14.134419 master-0 kubenswrapper[4055]: E0313 01:13:14.134336 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert podName:0d4e6150-432c-4a11-b5a6-4d62dd701fc8 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:18.134321758 +0000 UTC m=+136.297380856 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-nrzpj" (UID: "0d4e6150-432c-4a11-b5a6-4d62dd701fc8") : secret "package-server-manager-serving-cert" not found Mar 13 01:13:14.134419 master-0 kubenswrapper[4055]: E0313 01:13:14.134403 4055 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 13 01:13:14.134513 master-0 kubenswrapper[4055]: E0313 01:13:14.134433 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics podName:78d2cd80-23b9-426d-a7ac-1daa27668a47 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:18.134422641 +0000 UTC m=+136.297481759 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-dszg5" (UID: "78d2cd80-23b9-426d-a7ac-1daa27668a47") : secret "marketplace-operator-metrics" not found Mar 13 01:13:14.134513 master-0 kubenswrapper[4055]: E0313 01:13:14.134486 4055 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: secret "mco-proxy-tls" not found Mar 13 01:13:14.134513 master-0 kubenswrapper[4055]: E0313 01:13:14.134512 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls podName:4976e608-07a0-4cef-8fdd-7cec3324b4b5 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:18.134502623 +0000 UTC m=+136.297561731 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls") pod "machine-config-operator-fdb5c78b5-6slg8" (UID: "4976e608-07a0-4cef-8fdd-7cec3324b4b5") : secret "mco-proxy-tls" not found Mar 13 01:13:14.229022 master-0 kubenswrapper[4055]: I0313 01:13:14.228974 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert\") pod \"olm-operator-d64cfc9db-8l7kq\" (UID: \"6c88187c-d011-4043-a6d3-4a8a7ec4e204\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq" Mar 13 01:13:14.229022 master-0 kubenswrapper[4055]: I0313 01:13:14.229013 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs\") pod \"multus-admission-controller-8d675b596-tq7n6\" (UID: \"2bd94289-7109-4419-9a51-bd289082b9f5\") " pod="openshift-multus/multus-admission-controller-8d675b596-tq7n6" Mar 13 01:13:14.230111 master-0 kubenswrapper[4055]: I0313 01:13:14.229046 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" Mar 13 01:13:14.230111 master-0 kubenswrapper[4055]: I0313 01:13:14.229082 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert\") pod \"catalog-operator-7d9c49f57b-h46pz\" (UID: \"7f35cc1e-3376-4dbd-b215-2a32bf62cc71\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz" Mar 13 01:13:14.230111 master-0 kubenswrapper[4055]: I0313 01:13:14.229105 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" Mar 13 01:13:14.230111 master-0 kubenswrapper[4055]: I0313 01:13:14.229122 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-metrics-tls\") pod \"dns-operator-589895fbb7-qvl2k\" (UID: \"13fac7b0-ce55-467d-9d0c-6a122d87cb3c\") " pod="openshift-dns-operator/dns-operator-589895fbb7-qvl2k" Mar 13 01:13:14.230111 master-0 kubenswrapper[4055]: E0313 01:13:14.229236 4055 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 13 01:13:14.230111 master-0 kubenswrapper[4055]: E0313 01:13:14.229278 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-metrics-tls podName:13fac7b0-ce55-467d-9d0c-6a122d87cb3c nodeName:}" failed. No retries permitted until 2026-03-13 01:13:18.229266035 +0000 UTC m=+136.392325073 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-metrics-tls") pod "dns-operator-589895fbb7-qvl2k" (UID: "13fac7b0-ce55-467d-9d0c-6a122d87cb3c") : secret "metrics-tls" not found Mar 13 01:13:14.230111 master-0 kubenswrapper[4055]: E0313 01:13:14.229576 4055 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 13 01:13:14.230111 master-0 kubenswrapper[4055]: E0313 01:13:14.229598 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert podName:6c88187c-d011-4043-a6d3-4a8a7ec4e204 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:18.229590995 +0000 UTC m=+136.392650033 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert") pod "olm-operator-d64cfc9db-8l7kq" (UID: "6c88187c-d011-4043-a6d3-4a8a7ec4e204") : secret "olm-operator-serving-cert" not found Mar 13 01:13:14.230111 master-0 kubenswrapper[4055]: E0313 01:13:14.229643 4055 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 13 01:13:14.230111 master-0 kubenswrapper[4055]: E0313 01:13:14.229674 4055 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 13 01:13:14.230111 master-0 kubenswrapper[4055]: E0313 01:13:14.229678 4055 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 13 01:13:14.230111 master-0 kubenswrapper[4055]: E0313 01:13:14.229729 4055 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert podName:7f35cc1e-3376-4dbd-b215-2a32bf62cc71 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:18.229704528 +0000 UTC m=+136.392763636 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert") pod "catalog-operator-7d9c49f57b-h46pz" (UID: "7f35cc1e-3376-4dbd-b215-2a32bf62cc71") : secret "catalog-operator-serving-cert" not found Mar 13 01:13:14.230111 master-0 kubenswrapper[4055]: E0313 01:13:14.229749 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs podName:2bd94289-7109-4419-9a51-bd289082b9f5 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:18.229741199 +0000 UTC m=+136.392800327 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs") pod "multus-admission-controller-8d675b596-tq7n6" (UID: "2bd94289-7109-4419-9a51-bd289082b9f5") : secret "multus-admission-controller-secret" not found Mar 13 01:13:14.230111 master-0 kubenswrapper[4055]: E0313 01:13:14.229786 4055 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 13 01:13:14.230111 master-0 kubenswrapper[4055]: E0313 01:13:14.229851 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-apiservice-cert podName:4c5174b9-ca9e-4917-ab3a-ca403ce4f017 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:18.229838091 +0000 UTC m=+136.392897199 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-4m9c9" (UID: "4c5174b9-ca9e-4917-ab3a-ca403ce4f017") : secret "performance-addon-operator-webhook-cert" not found Mar 13 01:13:14.230971 master-0 kubenswrapper[4055]: E0313 01:13:14.229932 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-node-tuning-operator-tls podName:4c5174b9-ca9e-4917-ab3a-ca403ce4f017 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:18.229918264 +0000 UTC m=+136.392977392 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-4m9c9" (UID: "4c5174b9-ca9e-4917-ab3a-ca403ce4f017") : secret "node-tuning-operator-tls" not found Mar 13 01:13:16.229973 master-0 kubenswrapper[4055]: I0313 01:13:16.229925 4055 generic.go:334] "Generic (PLEG): container finished" podID="61fb4b86-f978-4ae1-80bc-18d2f386cbc2" containerID="e25bc60853f66a5d6c7e1021efdd8403103d53c529624ce5e308b8d3dfb44aaf" exitCode=0 Mar 13 01:13:16.229973 master-0 kubenswrapper[4055]: I0313 01:13:16.229973 4055 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-ck7rt" event={"ID":"61fb4b86-f978-4ae1-80bc-18d2f386cbc2","Type":"ContainerDied","Data":"e25bc60853f66a5d6c7e1021efdd8403103d53c529624ce5e308b8d3dfb44aaf"} Mar 13 01:13:18.144234 master-0 kubenswrapper[4055]: I0313 01:13:18.143767 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " 
pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" Mar 13 01:13:18.145376 master-0 kubenswrapper[4055]: I0313 01:13:18.144245 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-metrics-tls\") pod \"ingress-operator-677db989d6-kdn2l\" (UID: \"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l" Mar 13 01:13:18.145376 master-0 kubenswrapper[4055]: I0313 01:13:18.144294 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" Mar 13 01:13:18.145376 master-0 kubenswrapper[4055]: I0313 01:13:18.144339 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-2tr2t\" (UID: \"7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t" Mar 13 01:13:18.145376 master-0 kubenswrapper[4055]: I0313 01:13:18.144391 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-dszg5\" (UID: \"78d2cd80-23b9-426d-a7ac-1daa27668a47\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" Mar 13 01:13:18.145376 master-0 kubenswrapper[4055]: I0313 01:13:18.144428 4055 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-nrzpj\" (UID: \"0d4e6150-432c-4a11-b5a6-4d62dd701fc8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj" Mar 13 01:13:18.145376 master-0 kubenswrapper[4055]: I0313 01:13:18.144464 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-6slg8\" (UID: \"4976e608-07a0-4cef-8fdd-7cec3324b4b5\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8" Mar 13 01:13:18.145376 master-0 kubenswrapper[4055]: I0313 01:13:18.144560 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f97819d0-2840-4352-a435-19ef1a8c22c9-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-jjdk8\" (UID: \"f97819d0-2840-4352-a435-19ef1a8c22c9\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8" Mar 13 01:13:18.145376 master-0 kubenswrapper[4055]: E0313 01:13:18.144024 4055 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 13 01:13:18.145376 master-0 kubenswrapper[4055]: E0313 01:13:18.144849 4055 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 13 01:13:18.145376 master-0 kubenswrapper[4055]: E0313 01:13:18.144897 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cert podName:8c6bf2d5-1881-4b63-b247-7e7426707fa1 
nodeName:}" failed. No retries permitted until 2026-03-13 01:13:26.144866624 +0000 UTC m=+144.307925702 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cert") pod "cluster-baremetal-operator-5cdb4c5598-47sjr" (UID: "8c6bf2d5-1881-4b63-b247-7e7426707fa1") : secret "cluster-baremetal-webhook-server-cert" not found Mar 13 01:13:18.145376 master-0 kubenswrapper[4055]: E0313 01:13:18.144927 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f97819d0-2840-4352-a435-19ef1a8c22c9-image-registry-operator-tls podName:f97819d0-2840-4352-a435-19ef1a8c22c9 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:26.144913506 +0000 UTC m=+144.307972584 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/f97819d0-2840-4352-a435-19ef1a8c22c9-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-jjdk8" (UID: "f97819d0-2840-4352-a435-19ef1a8c22c9") : secret "image-registry-operator-tls" not found Mar 13 01:13:18.145376 master-0 kubenswrapper[4055]: E0313 01:13:18.144965 4055 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 13 01:13:18.145376 master-0 kubenswrapper[4055]: E0313 01:13:18.145016 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics podName:78d2cd80-23b9-426d-a7ac-1daa27668a47 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:26.144994058 +0000 UTC m=+144.308053126 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-dszg5" (UID: "78d2cd80-23b9-426d-a7ac-1daa27668a47") : secret "marketplace-operator-metrics" not found Mar 13 01:13:18.145376 master-0 kubenswrapper[4055]: E0313 01:13:18.145026 4055 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 13 01:13:18.145376 master-0 kubenswrapper[4055]: E0313 01:13:18.145090 4055 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 13 01:13:18.146200 master-0 kubenswrapper[4055]: E0313 01:13:18.145131 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-metrics-tls podName:70c097a1-90d9-4344-b0ae-5a59ec2ad8ad nodeName:}" failed. No retries permitted until 2026-03-13 01:13:26.145101371 +0000 UTC m=+144.308160449 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-metrics-tls") pod "ingress-operator-677db989d6-kdn2l" (UID: "70c097a1-90d9-4344-b0ae-5a59ec2ad8ad") : secret "metrics-tls" not found Mar 13 01:13:18.146200 master-0 kubenswrapper[4055]: E0313 01:13:18.145162 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert podName:0d4e6150-432c-4a11-b5a6-4d62dd701fc8 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:26.145147532 +0000 UTC m=+144.308206610 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-nrzpj" (UID: "0d4e6150-432c-4a11-b5a6-4d62dd701fc8") : secret "package-server-manager-serving-cert" not found Mar 13 01:13:18.146200 master-0 kubenswrapper[4055]: E0313 01:13:18.145172 4055 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: secret "mco-proxy-tls" not found Mar 13 01:13:18.146200 master-0 kubenswrapper[4055]: E0313 01:13:18.145209 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls podName:4976e608-07a0-4cef-8fdd-7cec3324b4b5 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:26.145195483 +0000 UTC m=+144.308254561 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls") pod "machine-config-operator-fdb5c78b5-6slg8" (UID: "4976e608-07a0-4cef-8fdd-7cec3324b4b5") : secret "mco-proxy-tls" not found Mar 13 01:13:18.146200 master-0 kubenswrapper[4055]: E0313 01:13:18.144955 4055 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 13 01:13:18.146200 master-0 kubenswrapper[4055]: E0313 01:13:18.145233 4055 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found Mar 13 01:13:18.146200 master-0 kubenswrapper[4055]: E0313 01:13:18.145253 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls podName:7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71 nodeName:}" failed. 
No retries permitted until 2026-03-13 01:13:26.145242015 +0000 UTC m=+144.308301093 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-2tr2t" (UID: "7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71") : secret "cluster-monitoring-operator-tls" not found Mar 13 01:13:18.146200 master-0 kubenswrapper[4055]: E0313 01:13:18.145277 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cluster-baremetal-operator-tls podName:8c6bf2d5-1881-4b63-b247-7e7426707fa1 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:26.145264015 +0000 UTC m=+144.308323083 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-47sjr" (UID: "8c6bf2d5-1881-4b63-b247-7e7426707fa1") : secret "cluster-baremetal-operator-tls" not found Mar 13 01:13:18.245655 master-0 kubenswrapper[4055]: I0313 01:13:18.245570 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert\") pod \"olm-operator-d64cfc9db-8l7kq\" (UID: \"6c88187c-d011-4043-a6d3-4a8a7ec4e204\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq" Mar 13 01:13:18.245820 master-0 kubenswrapper[4055]: I0313 01:13:18.245659 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs\") pod \"multus-admission-controller-8d675b596-tq7n6\" (UID: \"2bd94289-7109-4419-9a51-bd289082b9f5\") " 
pod="openshift-multus/multus-admission-controller-8d675b596-tq7n6" Mar 13 01:13:18.245820 master-0 kubenswrapper[4055]: I0313 01:13:18.245719 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" Mar 13 01:13:18.245930 master-0 kubenswrapper[4055]: E0313 01:13:18.245868 4055 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 13 01:13:18.245930 master-0 kubenswrapper[4055]: E0313 01:13:18.245873 4055 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 13 01:13:18.245930 master-0 kubenswrapper[4055]: E0313 01:13:18.245932 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs podName:2bd94289-7109-4419-9a51-bd289082b9f5 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:26.245909915 +0000 UTC m=+144.408968983 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs") pod "multus-admission-controller-8d675b596-tq7n6" (UID: "2bd94289-7109-4419-9a51-bd289082b9f5") : secret "multus-admission-controller-secret" not found Mar 13 01:13:18.246097 master-0 kubenswrapper[4055]: E0313 01:13:18.245975 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert podName:6c88187c-d011-4043-a6d3-4a8a7ec4e204 nodeName:}" failed. 
No retries permitted until 2026-03-13 01:13:26.245950036 +0000 UTC m=+144.409009104 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert") pod "olm-operator-d64cfc9db-8l7kq" (UID: "6c88187c-d011-4043-a6d3-4a8a7ec4e204") : secret "olm-operator-serving-cert" not found Mar 13 01:13:18.246097 master-0 kubenswrapper[4055]: I0313 01:13:18.246068 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert\") pod \"catalog-operator-7d9c49f57b-h46pz\" (UID: \"7f35cc1e-3376-4dbd-b215-2a32bf62cc71\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz" Mar 13 01:13:18.246208 master-0 kubenswrapper[4055]: I0313 01:13:18.246128 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" Mar 13 01:13:18.246208 master-0 kubenswrapper[4055]: I0313 01:13:18.246177 4055 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-metrics-tls\") pod \"dns-operator-589895fbb7-qvl2k\" (UID: \"13fac7b0-ce55-467d-9d0c-6a122d87cb3c\") " pod="openshift-dns-operator/dns-operator-589895fbb7-qvl2k" Mar 13 01:13:18.246396 master-0 kubenswrapper[4055]: E0313 01:13:18.246349 4055 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 13 01:13:18.246463 master-0 kubenswrapper[4055]: E0313 01:13:18.246405 
4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-apiservice-cert podName:4c5174b9-ca9e-4917-ab3a-ca403ce4f017 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:26.246391328 +0000 UTC m=+144.409450396 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-4m9c9" (UID: "4c5174b9-ca9e-4917-ab3a-ca403ce4f017") : secret "performance-addon-operator-webhook-cert" not found Mar 13 01:13:18.246525 master-0 kubenswrapper[4055]: E0313 01:13:18.246477 4055 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 13 01:13:18.246579 master-0 kubenswrapper[4055]: E0313 01:13:18.246530 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-metrics-tls podName:13fac7b0-ce55-467d-9d0c-6a122d87cb3c nodeName:}" failed. No retries permitted until 2026-03-13 01:13:26.246513842 +0000 UTC m=+144.409572920 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-metrics-tls") pod "dns-operator-589895fbb7-qvl2k" (UID: "13fac7b0-ce55-467d-9d0c-6a122d87cb3c") : secret "metrics-tls" not found Mar 13 01:13:18.246684 master-0 kubenswrapper[4055]: E0313 01:13:18.246576 4055 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 13 01:13:18.246684 master-0 kubenswrapper[4055]: E0313 01:13:18.246618 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert podName:7f35cc1e-3376-4dbd-b215-2a32bf62cc71 nodeName:}" failed. 
No retries permitted until 2026-03-13 01:13:26.246605564 +0000 UTC m=+144.409664642 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert") pod "catalog-operator-7d9c49f57b-h46pz" (UID: "7f35cc1e-3376-4dbd-b215-2a32bf62cc71") : secret "catalog-operator-serving-cert" not found Mar 13 01:13:18.246794 master-0 kubenswrapper[4055]: E0313 01:13:18.246707 4055 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 13 01:13:18.246794 master-0 kubenswrapper[4055]: E0313 01:13:18.246766 4055 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-node-tuning-operator-tls podName:4c5174b9-ca9e-4917-ab3a-ca403ce4f017 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:26.246751668 +0000 UTC m=+144.409810746 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-4m9c9" (UID: "4c5174b9-ca9e-4917-ab3a-ca403ce4f017") : secret "node-tuning-operator-tls" not found Mar 13 01:13:18.599421 master-0 systemd[1]: Stopping Kubernetes Kubelet... Mar 13 01:13:18.635484 master-0 systemd[1]: kubelet.service: Deactivated successfully. Mar 13 01:13:18.635801 master-0 systemd[1]: Stopped Kubernetes Kubelet. Mar 13 01:13:18.638236 master-0 systemd[1]: kubelet.service: Consumed 10.880s CPU time. Mar 13 01:13:18.653026 master-0 systemd[1]: Starting Kubernetes Kubelet... Mar 13 01:13:18.765737 master-0 kubenswrapper[7110]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 13 01:13:18.765737 master-0 kubenswrapper[7110]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 13 01:13:18.765737 master-0 kubenswrapper[7110]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 13 01:13:18.765737 master-0 kubenswrapper[7110]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 13 01:13:18.765737 master-0 kubenswrapper[7110]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 13 01:13:18.765737 master-0 kubenswrapper[7110]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 13 01:13:18.766959 master-0 kubenswrapper[7110]: I0313 01:13:18.765865 7110 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 13 01:13:18.769731 master-0 kubenswrapper[7110]: W0313 01:13:18.769697 7110 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 13 01:13:18.769731 master-0 kubenswrapper[7110]: W0313 01:13:18.769719 7110 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 13 01:13:18.769731 master-0 kubenswrapper[7110]: W0313 01:13:18.769727 7110 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 13 01:13:18.769731 master-0 kubenswrapper[7110]: W0313 01:13:18.769736 7110 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 13 01:13:18.769969 master-0 kubenswrapper[7110]: W0313 01:13:18.769743 7110 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 13 01:13:18.769969 master-0 kubenswrapper[7110]: W0313 01:13:18.769751 7110 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 13 01:13:18.769969 master-0 kubenswrapper[7110]: W0313 01:13:18.769758 7110 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 13 01:13:18.769969 master-0 kubenswrapper[7110]: W0313 01:13:18.769766 7110 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 13 01:13:18.769969 master-0 kubenswrapper[7110]: W0313 01:13:18.769774 7110 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 13 01:13:18.769969 master-0 kubenswrapper[7110]: W0313 01:13:18.769781 7110 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 13 01:13:18.769969 master-0 kubenswrapper[7110]: W0313 01:13:18.769794 7110 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 13 01:13:18.769969 master-0 kubenswrapper[7110]: W0313 01:13:18.769801 7110 
feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 13 01:13:18.769969 master-0 kubenswrapper[7110]: W0313 01:13:18.769809 7110 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 13 01:13:18.769969 master-0 kubenswrapper[7110]: W0313 01:13:18.769816 7110 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 13 01:13:18.769969 master-0 kubenswrapper[7110]: W0313 01:13:18.769823 7110 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 13 01:13:18.769969 master-0 kubenswrapper[7110]: W0313 01:13:18.769829 7110 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 13 01:13:18.769969 master-0 kubenswrapper[7110]: W0313 01:13:18.769835 7110 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 13 01:13:18.769969 master-0 kubenswrapper[7110]: W0313 01:13:18.769842 7110 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 13 01:13:18.769969 master-0 kubenswrapper[7110]: W0313 01:13:18.769849 7110 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 13 01:13:18.769969 master-0 kubenswrapper[7110]: W0313 01:13:18.769856 7110 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 13 01:13:18.769969 master-0 kubenswrapper[7110]: W0313 01:13:18.769862 7110 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 13 01:13:18.769969 master-0 kubenswrapper[7110]: W0313 01:13:18.769869 7110 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 13 01:13:18.769969 master-0 kubenswrapper[7110]: W0313 01:13:18.769875 7110 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 13 01:13:18.769969 master-0 kubenswrapper[7110]: W0313 01:13:18.769882 7110 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 13 01:13:18.770919 master-0 kubenswrapper[7110]: W0313 01:13:18.769889 7110 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 13 
01:13:18.770919 master-0 kubenswrapper[7110]: W0313 01:13:18.769896 7110 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 13 01:13:18.770919 master-0 kubenswrapper[7110]: W0313 01:13:18.769903 7110 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 13 01:13:18.770919 master-0 kubenswrapper[7110]: W0313 01:13:18.769909 7110 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 13 01:13:18.770919 master-0 kubenswrapper[7110]: W0313 01:13:18.769916 7110 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 13 01:13:18.770919 master-0 kubenswrapper[7110]: W0313 01:13:18.769923 7110 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 13 01:13:18.770919 master-0 kubenswrapper[7110]: W0313 01:13:18.769929 7110 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 13 01:13:18.770919 master-0 kubenswrapper[7110]: W0313 01:13:18.769936 7110 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 13 01:13:18.770919 master-0 kubenswrapper[7110]: W0313 01:13:18.769943 7110 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 13 01:13:18.770919 master-0 kubenswrapper[7110]: W0313 01:13:18.769951 7110 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 13 01:13:18.770919 master-0 kubenswrapper[7110]: W0313 01:13:18.769958 7110 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 13 01:13:18.770919 master-0 kubenswrapper[7110]: W0313 01:13:18.769965 7110 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 13 01:13:18.770919 master-0 kubenswrapper[7110]: W0313 01:13:18.769972 7110 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 13 01:13:18.770919 master-0 kubenswrapper[7110]: W0313 01:13:18.769979 7110 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 13 01:13:18.770919 master-0 kubenswrapper[7110]: W0313 
01:13:18.769986 7110 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 13 01:13:18.770919 master-0 kubenswrapper[7110]: W0313 01:13:18.769992 7110 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 13 01:13:18.770919 master-0 kubenswrapper[7110]: W0313 01:13:18.769999 7110 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 13 01:13:18.770919 master-0 kubenswrapper[7110]: W0313 01:13:18.770005 7110 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 13 01:13:18.770919 master-0 kubenswrapper[7110]: W0313 01:13:18.770012 7110 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 13 01:13:18.770919 master-0 kubenswrapper[7110]: W0313 01:13:18.770020 7110 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 13 01:13:18.771874 master-0 kubenswrapper[7110]: W0313 01:13:18.770026 7110 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 13 01:13:18.771874 master-0 kubenswrapper[7110]: W0313 01:13:18.770034 7110 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 13 01:13:18.771874 master-0 kubenswrapper[7110]: W0313 01:13:18.770040 7110 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 13 01:13:18.771874 master-0 kubenswrapper[7110]: W0313 01:13:18.770047 7110 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 13 01:13:18.771874 master-0 kubenswrapper[7110]: W0313 01:13:18.770054 7110 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 13 01:13:18.771874 master-0 kubenswrapper[7110]: W0313 01:13:18.770061 7110 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 13 01:13:18.771874 master-0 kubenswrapper[7110]: W0313 01:13:18.770068 7110 feature_gate.go:330] unrecognized feature gate: Example Mar 13 01:13:18.771874 master-0 kubenswrapper[7110]: W0313 01:13:18.770075 7110 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 
13 01:13:18.771874 master-0 kubenswrapper[7110]: W0313 01:13:18.770081 7110 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 13 01:13:18.771874 master-0 kubenswrapper[7110]: W0313 01:13:18.770091 7110 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 13 01:13:18.771874 master-0 kubenswrapper[7110]: W0313 01:13:18.770100 7110 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 13 01:13:18.771874 master-0 kubenswrapper[7110]: W0313 01:13:18.770108 7110 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 13 01:13:18.771874 master-0 kubenswrapper[7110]: W0313 01:13:18.770115 7110 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 13 01:13:18.771874 master-0 kubenswrapper[7110]: W0313 01:13:18.770126 7110 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 13 01:13:18.771874 master-0 kubenswrapper[7110]: W0313 01:13:18.770133 7110 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 13 01:13:18.771874 master-0 kubenswrapper[7110]: W0313 01:13:18.770140 7110 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 13 01:13:18.771874 master-0 kubenswrapper[7110]: W0313 01:13:18.770152 7110 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 13 01:13:18.771874 master-0 kubenswrapper[7110]: W0313 01:13:18.770161 7110 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 13 01:13:18.771874 master-0 kubenswrapper[7110]: W0313 01:13:18.770169 7110 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 13 01:13:18.772819 master-0 kubenswrapper[7110]: W0313 01:13:18.770176 7110 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 13 01:13:18.772819 master-0 kubenswrapper[7110]: W0313 01:13:18.770185 7110 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 13 01:13:18.772819 master-0 kubenswrapper[7110]: W0313 01:13:18.770194 7110 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 13 01:13:18.772819 master-0 kubenswrapper[7110]: W0313 01:13:18.770203 7110 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 13 01:13:18.772819 master-0 kubenswrapper[7110]: W0313 01:13:18.770210 7110 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 13 01:13:18.772819 master-0 kubenswrapper[7110]: W0313 01:13:18.770218 7110 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 13 01:13:18.772819 master-0 kubenswrapper[7110]: W0313 01:13:18.770225 7110 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 13 01:13:18.772819 master-0 kubenswrapper[7110]: W0313 01:13:18.770233 7110 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 13 01:13:18.772819 master-0 kubenswrapper[7110]: W0313 01:13:18.770241 7110 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 13 01:13:18.772819 master-0 kubenswrapper[7110]: I0313 01:13:18.770411 7110 flags.go:64] FLAG: --address="0.0.0.0"
Mar 13 01:13:18.772819 master-0 kubenswrapper[7110]: I0313 01:13:18.770429 7110 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 13 01:13:18.772819 master-0 kubenswrapper[7110]: I0313 01:13:18.770443 7110 flags.go:64] FLAG: --anonymous-auth="true"
Mar 13 01:13:18.772819 master-0 kubenswrapper[7110]: I0313 01:13:18.770454 7110 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 13 01:13:18.772819 master-0 kubenswrapper[7110]: I0313 01:13:18.770463 7110 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 13 01:13:18.772819 master-0 kubenswrapper[7110]: I0313 01:13:18.770471 7110 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 13 01:13:18.772819 master-0 kubenswrapper[7110]: I0313 01:13:18.770482 7110 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 13 01:13:18.772819 master-0 kubenswrapper[7110]: I0313 01:13:18.770497 7110 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 13 01:13:18.772819 master-0 kubenswrapper[7110]: I0313 01:13:18.770506 7110 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 13 01:13:18.772819 master-0 kubenswrapper[7110]: I0313 01:13:18.770515 7110 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 13 01:13:18.772819 master-0 kubenswrapper[7110]: I0313 01:13:18.770523 7110 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 13 01:13:18.773595 master-0 kubenswrapper[7110]: I0313 01:13:18.770531 7110 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 13 01:13:18.773595 master-0 kubenswrapper[7110]: I0313 01:13:18.770539 7110 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 13 01:13:18.773595 master-0 kubenswrapper[7110]: I0313 01:13:18.770547 7110 flags.go:64] FLAG: --cgroup-root=""
Mar 13 01:13:18.773595 master-0 kubenswrapper[7110]: I0313 01:13:18.770554 7110 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 13 01:13:18.773595 master-0 kubenswrapper[7110]: I0313 01:13:18.770563 7110 flags.go:64] FLAG: --client-ca-file=""
Mar 13 01:13:18.773595 master-0 kubenswrapper[7110]: I0313 01:13:18.770570 7110 flags.go:64] FLAG: --cloud-config=""
Mar 13 01:13:18.773595 master-0 kubenswrapper[7110]: I0313 01:13:18.770579 7110 flags.go:64] FLAG: --cloud-provider=""
Mar 13 01:13:18.773595 master-0 kubenswrapper[7110]: I0313 01:13:18.770586 7110 flags.go:64] FLAG: --cluster-dns="[]"
Mar 13 01:13:18.773595 master-0 kubenswrapper[7110]: I0313 01:13:18.770596 7110 flags.go:64] FLAG: --cluster-domain=""
Mar 13 01:13:18.773595 master-0 kubenswrapper[7110]: I0313 01:13:18.770604 7110 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 13 01:13:18.773595 master-0 kubenswrapper[7110]: I0313 01:13:18.770612 7110 flags.go:64] FLAG: --config-dir=""
Mar 13 01:13:18.773595 master-0 kubenswrapper[7110]: I0313 01:13:18.770619 7110 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 13 01:13:18.773595 master-0 kubenswrapper[7110]: I0313 01:13:18.770628 7110 flags.go:64] FLAG: --container-log-max-files="5"
Mar 13 01:13:18.773595 master-0 kubenswrapper[7110]: I0313 01:13:18.770666 7110 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 13 01:13:18.773595 master-0 kubenswrapper[7110]: I0313 01:13:18.770675 7110 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 13 01:13:18.773595 master-0 kubenswrapper[7110]: I0313 01:13:18.770683 7110 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 13 01:13:18.773595 master-0 kubenswrapper[7110]: I0313 01:13:18.770692 7110 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 13 01:13:18.773595 master-0 kubenswrapper[7110]: I0313 01:13:18.770700 7110 flags.go:64] FLAG: --contention-profiling="false"
Mar 13 01:13:18.773595 master-0 kubenswrapper[7110]: I0313 01:13:18.770708 7110 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 13 01:13:18.773595 master-0 kubenswrapper[7110]: I0313 01:13:18.770717 7110 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 13 01:13:18.773595 master-0 kubenswrapper[7110]: I0313 01:13:18.770725 7110 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 13 01:13:18.773595 master-0 kubenswrapper[7110]: I0313 01:13:18.770733 7110 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 13 01:13:18.773595 master-0 kubenswrapper[7110]: I0313 01:13:18.770742 7110 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 13 01:13:18.773595 master-0 kubenswrapper[7110]: I0313 01:13:18.770751 7110 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 13 01:13:18.773595 master-0 kubenswrapper[7110]: I0313 01:13:18.770760 7110 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 13 01:13:18.774787 master-0 kubenswrapper[7110]: I0313 01:13:18.770768 7110 flags.go:64] FLAG: --enable-load-reader="false"
Mar 13 01:13:18.774787 master-0 kubenswrapper[7110]: I0313 01:13:18.770776 7110 flags.go:64] FLAG: --enable-server="true"
Mar 13 01:13:18.774787 master-0 kubenswrapper[7110]: I0313 01:13:18.770783 7110 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 13 01:13:18.774787 master-0 kubenswrapper[7110]: I0313 01:13:18.770804 7110 flags.go:64] FLAG: --event-burst="100"
Mar 13 01:13:18.774787 master-0 kubenswrapper[7110]: I0313 01:13:18.770812 7110 flags.go:64] FLAG: --event-qps="50"
Mar 13 01:13:18.774787 master-0 kubenswrapper[7110]: I0313 01:13:18.770822 7110 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 13 01:13:18.774787 master-0 kubenswrapper[7110]: I0313 01:13:18.770830 7110 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 13 01:13:18.774787 master-0 kubenswrapper[7110]: I0313 01:13:18.770839 7110 flags.go:64] FLAG: --eviction-hard=""
Mar 13 01:13:18.774787 master-0 kubenswrapper[7110]: I0313 01:13:18.770855 7110 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 13 01:13:18.774787 master-0 kubenswrapper[7110]: I0313 01:13:18.770863 7110 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 13 01:13:18.774787 master-0 kubenswrapper[7110]: I0313 01:13:18.770872 7110 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 13 01:13:18.774787 master-0 kubenswrapper[7110]: I0313 01:13:18.770880 7110 flags.go:64] FLAG: --eviction-soft=""
Mar 13 01:13:18.774787 master-0 kubenswrapper[7110]: I0313 01:13:18.770887 7110 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 13 01:13:18.774787 master-0 kubenswrapper[7110]: I0313 01:13:18.770895 7110 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 13 01:13:18.774787 master-0 kubenswrapper[7110]: I0313 01:13:18.770903 7110 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 13 01:13:18.774787 master-0 kubenswrapper[7110]: I0313 01:13:18.770911 7110 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 13 01:13:18.774787 master-0 kubenswrapper[7110]: I0313 01:13:18.770919 7110 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 13 01:13:18.774787 master-0 kubenswrapper[7110]: I0313 01:13:18.770926 7110 flags.go:64] FLAG: --fail-swap-on="true"
Mar 13 01:13:18.774787 master-0 kubenswrapper[7110]: I0313 01:13:18.770934 7110 flags.go:64] FLAG: --feature-gates=""
Mar 13 01:13:18.774787 master-0 kubenswrapper[7110]: I0313 01:13:18.770943 7110 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 13 01:13:18.774787 master-0 kubenswrapper[7110]: I0313 01:13:18.770951 7110 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 13 01:13:18.774787 master-0 kubenswrapper[7110]: I0313 01:13:18.770960 7110 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 13 01:13:18.774787 master-0 kubenswrapper[7110]: I0313 01:13:18.770967 7110 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 13 01:13:18.774787 master-0 kubenswrapper[7110]: I0313 01:13:18.770975 7110 flags.go:64] FLAG: --healthz-port="10248"
Mar 13 01:13:18.774787 master-0 kubenswrapper[7110]: I0313 01:13:18.770983 7110 flags.go:64] FLAG: --help="false"
Mar 13 01:13:18.774787 master-0 kubenswrapper[7110]: I0313 01:13:18.770991 7110 flags.go:64] FLAG: --hostname-override=""
Mar 13 01:13:18.775589 master-0 kubenswrapper[7110]: I0313 01:13:18.771000 7110 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 13 01:13:18.775589 master-0 kubenswrapper[7110]: I0313 01:13:18.771008 7110 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 13 01:13:18.775589 master-0 kubenswrapper[7110]: I0313 01:13:18.771016 7110 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 13 01:13:18.775589 master-0 kubenswrapper[7110]: I0313 01:13:18.771024 7110 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 13 01:13:18.775589 master-0 kubenswrapper[7110]: I0313 01:13:18.771032 7110 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 13 01:13:18.775589 master-0 kubenswrapper[7110]: I0313 01:13:18.771040 7110 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 13 01:13:18.775589 master-0 kubenswrapper[7110]: I0313 01:13:18.771048 7110 flags.go:64] FLAG: --image-service-endpoint=""
Mar 13 01:13:18.775589 master-0 kubenswrapper[7110]: I0313 01:13:18.771056 7110 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 13 01:13:18.775589 master-0 kubenswrapper[7110]: I0313 01:13:18.771063 7110 flags.go:64] FLAG: --kube-api-burst="100"
Mar 13 01:13:18.775589 master-0 kubenswrapper[7110]: I0313 01:13:18.771071 7110 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 13 01:13:18.775589 master-0 kubenswrapper[7110]: I0313 01:13:18.771080 7110 flags.go:64] FLAG: --kube-api-qps="50"
Mar 13 01:13:18.775589 master-0 kubenswrapper[7110]: I0313 01:13:18.771087 7110 flags.go:64] FLAG: --kube-reserved=""
Mar 13 01:13:18.775589 master-0 kubenswrapper[7110]: I0313 01:13:18.771095 7110 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 13 01:13:18.775589 master-0 kubenswrapper[7110]: I0313 01:13:18.771103 7110 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 13 01:13:18.775589 master-0 kubenswrapper[7110]: I0313 01:13:18.771111 7110 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 13 01:13:18.775589 master-0 kubenswrapper[7110]: I0313 01:13:18.771120 7110 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 13 01:13:18.775589 master-0 kubenswrapper[7110]: I0313 01:13:18.771127 7110 flags.go:64] FLAG: --lock-file=""
Mar 13 01:13:18.775589 master-0 kubenswrapper[7110]: I0313 01:13:18.771136 7110 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 13 01:13:18.775589 master-0 kubenswrapper[7110]: I0313 01:13:18.771144 7110 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 13 01:13:18.775589 master-0 kubenswrapper[7110]: I0313 01:13:18.771153 7110 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 13 01:13:18.775589 master-0 kubenswrapper[7110]: I0313 01:13:18.771165 7110 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 13 01:13:18.775589 master-0 kubenswrapper[7110]: I0313 01:13:18.771174 7110 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 13 01:13:18.775589 master-0 kubenswrapper[7110]: I0313 01:13:18.771182 7110 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 13 01:13:18.775589 master-0 kubenswrapper[7110]: I0313 01:13:18.771189 7110 flags.go:64] FLAG: --logging-format="text"
Mar 13 01:13:18.775589 master-0 kubenswrapper[7110]: I0313 01:13:18.771197 7110 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 13 01:13:18.776477 master-0 kubenswrapper[7110]: I0313 01:13:18.771205 7110 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 13 01:13:18.776477 master-0 kubenswrapper[7110]: I0313 01:13:18.771213 7110 flags.go:64] FLAG: --manifest-url=""
Mar 13 01:13:18.776477 master-0 kubenswrapper[7110]: I0313 01:13:18.771221 7110 flags.go:64] FLAG: --manifest-url-header=""
Mar 13 01:13:18.776477 master-0 kubenswrapper[7110]: I0313 01:13:18.771231 7110 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 13 01:13:18.776477 master-0 kubenswrapper[7110]: I0313 01:13:18.771238 7110 flags.go:64] FLAG: --max-open-files="1000000"
Mar 13 01:13:18.776477 master-0 kubenswrapper[7110]: I0313 01:13:18.771248 7110 flags.go:64] FLAG: --max-pods="110"
Mar 13 01:13:18.776477 master-0 kubenswrapper[7110]: I0313 01:13:18.771256 7110 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 13 01:13:18.776477 master-0 kubenswrapper[7110]: I0313 01:13:18.771265 7110 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 13 01:13:18.776477 master-0 kubenswrapper[7110]: I0313 01:13:18.771274 7110 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 13 01:13:18.776477 master-0 kubenswrapper[7110]: I0313 01:13:18.771282 7110 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 13 01:13:18.776477 master-0 kubenswrapper[7110]: I0313 01:13:18.771290 7110 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 13 01:13:18.776477 master-0 kubenswrapper[7110]: I0313 01:13:18.771298 7110 flags.go:64] FLAG: --node-ip="192.168.32.10"
Mar 13 01:13:18.776477 master-0 kubenswrapper[7110]: I0313 01:13:18.771306 7110 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 13 01:13:18.776477 master-0 kubenswrapper[7110]: I0313 01:13:18.771324 7110 flags.go:64] FLAG: --node-status-max-images="50"
Mar 13 01:13:18.776477 master-0 kubenswrapper[7110]: I0313 01:13:18.771331 7110 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 13 01:13:18.776477 master-0 kubenswrapper[7110]: I0313 01:13:18.771340 7110 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 13 01:13:18.776477 master-0 kubenswrapper[7110]: I0313 01:13:18.771348 7110 flags.go:64] FLAG: --pod-cidr=""
Mar 13 01:13:18.776477 master-0 kubenswrapper[7110]: I0313 01:13:18.771355 7110 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1d605384f31a8085f78a96145c2c3dc51afe22721144196140a2699b7c07ebe3"
Mar 13 01:13:18.776477 master-0 kubenswrapper[7110]: I0313 01:13:18.771367 7110 flags.go:64] FLAG: --pod-manifest-path=""
Mar 13 01:13:18.776477 master-0 kubenswrapper[7110]: I0313 01:13:18.771374 7110 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 13 01:13:18.776477 master-0 kubenswrapper[7110]: I0313 01:13:18.771382 7110 flags.go:64] FLAG: --pods-per-core="0"
Mar 13 01:13:18.776477 master-0 kubenswrapper[7110]: I0313 01:13:18.771390 7110 flags.go:64] FLAG: --port="10250"
Mar 13 01:13:18.776477 master-0 kubenswrapper[7110]: I0313 01:13:18.771398 7110 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 13 01:13:18.776477 master-0 kubenswrapper[7110]: I0313 01:13:18.771407 7110 flags.go:64] FLAG: --provider-id=""
Mar 13 01:13:18.777304 master-0 kubenswrapper[7110]: I0313 01:13:18.771414 7110 flags.go:64] FLAG: --qos-reserved=""
Mar 13 01:13:18.777304 master-0 kubenswrapper[7110]: I0313 01:13:18.771422 7110 flags.go:64] FLAG: --read-only-port="10255"
Mar 13 01:13:18.777304 master-0 kubenswrapper[7110]: I0313 01:13:18.771431 7110 flags.go:64] FLAG: --register-node="true"
Mar 13 01:13:18.777304 master-0 kubenswrapper[7110]: I0313 01:13:18.771438 7110 flags.go:64] FLAG: --register-schedulable="true"
Mar 13 01:13:18.777304 master-0 kubenswrapper[7110]: I0313 01:13:18.771453 7110 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 13 01:13:18.777304 master-0 kubenswrapper[7110]: I0313 01:13:18.771467 7110 flags.go:64] FLAG: --registry-burst="10"
Mar 13 01:13:18.777304 master-0 kubenswrapper[7110]: I0313 01:13:18.771475 7110 flags.go:64] FLAG: --registry-qps="5"
Mar 13 01:13:18.777304 master-0 kubenswrapper[7110]: I0313 01:13:18.771483 7110 flags.go:64] FLAG: --reserved-cpus=""
Mar 13 01:13:18.777304 master-0 kubenswrapper[7110]: I0313 01:13:18.771490 7110 flags.go:64] FLAG: --reserved-memory=""
Mar 13 01:13:18.777304 master-0 kubenswrapper[7110]: I0313 01:13:18.771499 7110 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 13 01:13:18.777304 master-0 kubenswrapper[7110]: I0313 01:13:18.771508 7110 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 13 01:13:18.777304 master-0 kubenswrapper[7110]: I0313 01:13:18.771516 7110 flags.go:64] FLAG: --rotate-certificates="false"
Mar 13 01:13:18.777304 master-0 kubenswrapper[7110]: I0313 01:13:18.771524 7110 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 13 01:13:18.777304 master-0 kubenswrapper[7110]: I0313 01:13:18.771532 7110 flags.go:64] FLAG: --runonce="false"
Mar 13 01:13:18.777304 master-0 kubenswrapper[7110]: I0313 01:13:18.771539 7110 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 13 01:13:18.777304 master-0 kubenswrapper[7110]: I0313 01:13:18.771547 7110 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 13 01:13:18.777304 master-0 kubenswrapper[7110]: I0313 01:13:18.771555 7110 flags.go:64] FLAG: --seccomp-default="false"
Mar 13 01:13:18.777304 master-0 kubenswrapper[7110]: I0313 01:13:18.771568 7110 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 13 01:13:18.777304 master-0 kubenswrapper[7110]: I0313 01:13:18.771576 7110 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 13 01:13:18.777304 master-0 kubenswrapper[7110]: I0313 01:13:18.771584 7110 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 13 01:13:18.777304 master-0 kubenswrapper[7110]: I0313 01:13:18.771592 7110 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 13 01:13:18.777304 master-0 kubenswrapper[7110]: I0313 01:13:18.771600 7110 flags.go:64] FLAG: --storage-driver-password="root"
Mar 13 01:13:18.777304 master-0 kubenswrapper[7110]: I0313 01:13:18.771608 7110 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 13 01:13:18.777304 master-0 kubenswrapper[7110]: I0313 01:13:18.771616 7110 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 13 01:13:18.777304 master-0 kubenswrapper[7110]: I0313 01:13:18.771624 7110 flags.go:64] FLAG: --storage-driver-user="root"
Mar 13 01:13:18.778277 master-0 kubenswrapper[7110]: I0313 01:13:18.771657 7110 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 13 01:13:18.778277 master-0 kubenswrapper[7110]: I0313 01:13:18.771667 7110 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 13 01:13:18.778277 master-0 kubenswrapper[7110]: I0313 01:13:18.771675 7110 flags.go:64] FLAG: --system-cgroups=""
Mar 13 01:13:18.778277 master-0 kubenswrapper[7110]: I0313 01:13:18.771685 7110 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Mar 13 01:13:18.778277 master-0 kubenswrapper[7110]: I0313 01:13:18.771699 7110 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 13 01:13:18.778277 master-0 kubenswrapper[7110]: I0313 01:13:18.771706 7110 flags.go:64] FLAG: --tls-cert-file=""
Mar 13 01:13:18.778277 master-0 kubenswrapper[7110]: I0313 01:13:18.771714 7110 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 13 01:13:18.778277 master-0 kubenswrapper[7110]: I0313 01:13:18.771724 7110 flags.go:64] FLAG: --tls-min-version=""
Mar 13 01:13:18.778277 master-0 kubenswrapper[7110]: I0313 01:13:18.771732 7110 flags.go:64] FLAG: --tls-private-key-file=""
Mar 13 01:13:18.778277 master-0 kubenswrapper[7110]: I0313 01:13:18.771740 7110 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 13 01:13:18.778277 master-0 kubenswrapper[7110]: I0313 01:13:18.771748 7110 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 13 01:13:18.778277 master-0 kubenswrapper[7110]: I0313 01:13:18.771756 7110 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 13 01:13:18.778277 master-0 kubenswrapper[7110]: I0313 01:13:18.771764 7110 flags.go:64] FLAG: --v="2"
Mar 13 01:13:18.778277 master-0 kubenswrapper[7110]: I0313 01:13:18.771774 7110 flags.go:64] FLAG: --version="false"
Mar 13 01:13:18.778277 master-0 kubenswrapper[7110]: I0313 01:13:18.771783 7110 flags.go:64] FLAG: --vmodule=""
Mar 13 01:13:18.778277 master-0 kubenswrapper[7110]: I0313 01:13:18.771794 7110 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 13 01:13:18.778277 master-0 kubenswrapper[7110]: I0313 01:13:18.771803 7110 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 13 01:13:18.778277 master-0 kubenswrapper[7110]: W0313 01:13:18.771983 7110 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 13 01:13:18.778277 master-0 kubenswrapper[7110]: W0313 01:13:18.771994 7110 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 13 01:13:18.778277 master-0 kubenswrapper[7110]: W0313 01:13:18.772001 7110 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 13 01:13:18.778277 master-0 kubenswrapper[7110]: W0313 01:13:18.772009 7110 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 13 01:13:18.778277 master-0 kubenswrapper[7110]: W0313 01:13:18.772019 7110 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 13 01:13:18.778277 master-0 kubenswrapper[7110]: W0313 01:13:18.772028 7110 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 13 01:13:18.779040 master-0 kubenswrapper[7110]: W0313 01:13:18.772035 7110 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 13 01:13:18.779040 master-0 kubenswrapper[7110]: W0313 01:13:18.772044 7110 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 13 01:13:18.779040 master-0 kubenswrapper[7110]: W0313 01:13:18.772052 7110 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 13 01:13:18.779040 master-0 kubenswrapper[7110]: W0313 01:13:18.772059 7110 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 13 01:13:18.779040 master-0 kubenswrapper[7110]: W0313 01:13:18.772066 7110 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 13 01:13:18.779040 master-0 kubenswrapper[7110]: W0313 01:13:18.772075 7110 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 13 01:13:18.779040 master-0 kubenswrapper[7110]: W0313 01:13:18.772083 7110 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 13 01:13:18.779040 master-0 kubenswrapper[7110]: W0313 01:13:18.772090 7110 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 13 01:13:18.779040 master-0 kubenswrapper[7110]: W0313 01:13:18.772098 7110 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 13 01:13:18.779040 master-0 kubenswrapper[7110]: W0313 01:13:18.772106 7110 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 13 01:13:18.779040 master-0 kubenswrapper[7110]: W0313 01:13:18.772114 7110 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 13 01:13:18.779040 master-0 kubenswrapper[7110]: W0313 01:13:18.772120 7110 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 13 01:13:18.779040 master-0 kubenswrapper[7110]: W0313 01:13:18.772127 7110 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 13 01:13:18.779040 master-0 kubenswrapper[7110]: W0313 01:13:18.772134 7110 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 13 01:13:18.779040 master-0 kubenswrapper[7110]: W0313 01:13:18.772141 7110 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 13 01:13:18.779040 master-0 kubenswrapper[7110]: W0313 01:13:18.772147 7110 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 13 01:13:18.779040 master-0 kubenswrapper[7110]: W0313 01:13:18.772154 7110 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 13 01:13:18.779040 master-0 kubenswrapper[7110]: W0313 01:13:18.772161 7110 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 13 01:13:18.779040 master-0 kubenswrapper[7110]: W0313 01:13:18.772168 7110 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 13 01:13:18.779040 master-0 kubenswrapper[7110]: W0313 01:13:18.772176 
7110 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 13 01:13:18.779692 master-0 kubenswrapper[7110]: W0313 01:13:18.772184 7110 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 13 01:13:18.779692 master-0 kubenswrapper[7110]: W0313 01:13:18.772191 7110 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 13 01:13:18.779692 master-0 kubenswrapper[7110]: W0313 01:13:18.772201 7110 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 13 01:13:18.779692 master-0 kubenswrapper[7110]: W0313 01:13:18.772208 7110 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 13 01:13:18.779692 master-0 kubenswrapper[7110]: W0313 01:13:18.772215 7110 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 13 01:13:18.779692 master-0 kubenswrapper[7110]: W0313 01:13:18.772222 7110 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 13 01:13:18.779692 master-0 kubenswrapper[7110]: W0313 01:13:18.772229 7110 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 13 01:13:18.779692 master-0 kubenswrapper[7110]: W0313 01:13:18.772236 7110 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 13 01:13:18.779692 master-0 kubenswrapper[7110]: W0313 01:13:18.772244 7110 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 13 01:13:18.779692 master-0 kubenswrapper[7110]: W0313 01:13:18.772251 7110 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 13 01:13:18.779692 master-0 kubenswrapper[7110]: W0313 01:13:18.772258 7110 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 13 01:13:18.779692 master-0 kubenswrapper[7110]: W0313 01:13:18.772264 7110 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 13 01:13:18.779692 master-0 
kubenswrapper[7110]: W0313 01:13:18.772271 7110 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 13 01:13:18.779692 master-0 kubenswrapper[7110]: W0313 01:13:18.772279 7110 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 13 01:13:18.779692 master-0 kubenswrapper[7110]: W0313 01:13:18.772285 7110 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 13 01:13:18.779692 master-0 kubenswrapper[7110]: W0313 01:13:18.772292 7110 feature_gate.go:330] unrecognized feature gate: Example Mar 13 01:13:18.779692 master-0 kubenswrapper[7110]: W0313 01:13:18.772298 7110 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 13 01:13:18.779692 master-0 kubenswrapper[7110]: W0313 01:13:18.772305 7110 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 13 01:13:18.779692 master-0 kubenswrapper[7110]: W0313 01:13:18.772312 7110 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 13 01:13:18.780292 master-0 kubenswrapper[7110]: W0313 01:13:18.772318 7110 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 13 01:13:18.780292 master-0 kubenswrapper[7110]: W0313 01:13:18.772325 7110 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 13 01:13:18.780292 master-0 kubenswrapper[7110]: W0313 01:13:18.772332 7110 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 13 01:13:18.780292 master-0 kubenswrapper[7110]: W0313 01:13:18.772339 7110 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 13 01:13:18.780292 master-0 kubenswrapper[7110]: W0313 01:13:18.772346 7110 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 13 01:13:18.780292 master-0 kubenswrapper[7110]: W0313 01:13:18.772353 7110 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 13 01:13:18.780292 master-0 kubenswrapper[7110]: W0313 01:13:18.772359 7110 
feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 13 01:13:18.780292 master-0 kubenswrapper[7110]: W0313 01:13:18.772366 7110 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 13 01:13:18.780292 master-0 kubenswrapper[7110]: W0313 01:13:18.772373 7110 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 13 01:13:18.780292 master-0 kubenswrapper[7110]: W0313 01:13:18.772382 7110 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 13 01:13:18.780292 master-0 kubenswrapper[7110]: W0313 01:13:18.772391 7110 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 13 01:13:18.780292 master-0 kubenswrapper[7110]: W0313 01:13:18.772399 7110 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 13 01:13:18.780292 master-0 kubenswrapper[7110]: W0313 01:13:18.772407 7110 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 13 01:13:18.780292 master-0 kubenswrapper[7110]: W0313 01:13:18.772415 7110 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 13 01:13:18.780292 master-0 kubenswrapper[7110]: W0313 01:13:18.772422 7110 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 13 01:13:18.780292 master-0 kubenswrapper[7110]: W0313 01:13:18.772430 7110 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 13 01:13:18.780292 master-0 kubenswrapper[7110]: W0313 01:13:18.772436 7110 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 13 01:13:18.780292 master-0 kubenswrapper[7110]: W0313 01:13:18.772443 7110 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 13 01:13:18.780292 master-0 kubenswrapper[7110]: W0313 01:13:18.772452 7110 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 13 01:13:18.780292 master-0 kubenswrapper[7110]: W0313 01:13:18.772459 7110 
feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 13 01:13:18.780978 master-0 kubenswrapper[7110]: W0313 01:13:18.772467 7110 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 13 01:13:18.780978 master-0 kubenswrapper[7110]: W0313 01:13:18.772475 7110 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 13 01:13:18.780978 master-0 kubenswrapper[7110]: W0313 01:13:18.772504 7110 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 13 01:13:18.780978 master-0 kubenswrapper[7110]: W0313 01:13:18.772510 7110 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 13 01:13:18.780978 master-0 kubenswrapper[7110]: W0313 01:13:18.772518 7110 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 13 01:13:18.780978 master-0 kubenswrapper[7110]: W0313 01:13:18.772528 7110 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 13 01:13:18.780978 master-0 kubenswrapper[7110]: W0313 01:13:18.772535 7110 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 13 01:13:18.780978 master-0 kubenswrapper[7110]: I0313 01:13:18.772559 7110 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 13 01:13:18.782499 master-0 kubenswrapper[7110]: I0313 01:13:18.782452 7110 server.go:491] "Kubelet version" kubeletVersion="v1.31.14" Mar 13 01:13:18.782499 master-0 kubenswrapper[7110]: I0313 01:13:18.782488 7110 
server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 13 01:13:18.782707 master-0 kubenswrapper[7110]: W0313 01:13:18.782566 7110 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 13 01:13:18.782707 master-0 kubenswrapper[7110]: W0313 01:13:18.782574 7110 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 13 01:13:18.782707 master-0 kubenswrapper[7110]: W0313 01:13:18.782578 7110 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 13 01:13:18.782707 master-0 kubenswrapper[7110]: W0313 01:13:18.782583 7110 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 13 01:13:18.782707 master-0 kubenswrapper[7110]: W0313 01:13:18.782587 7110 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 13 01:13:18.782707 master-0 kubenswrapper[7110]: W0313 01:13:18.782591 7110 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 13 01:13:18.782707 master-0 kubenswrapper[7110]: W0313 01:13:18.782596 7110 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 13 01:13:18.782707 master-0 kubenswrapper[7110]: W0313 01:13:18.782602 7110 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 13 01:13:18.782707 master-0 kubenswrapper[7110]: W0313 01:13:18.782606 7110 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 13 01:13:18.782707 master-0 kubenswrapper[7110]: W0313 01:13:18.782610 7110 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 13 01:13:18.782707 master-0 kubenswrapper[7110]: W0313 01:13:18.782613 7110 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 13 01:13:18.782707 master-0 kubenswrapper[7110]: W0313 01:13:18.782617 7110 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 13 01:13:18.782707 master-0 kubenswrapper[7110]: W0313 01:13:18.782621 7110 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 13 01:13:18.782707 master-0 kubenswrapper[7110]: W0313 01:13:18.782624 7110 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 13 01:13:18.782707 master-0 kubenswrapper[7110]: W0313 01:13:18.782643 7110 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 13 01:13:18.782707 master-0 kubenswrapper[7110]: W0313 01:13:18.782648 7110 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 13 01:13:18.782707 master-0 kubenswrapper[7110]: W0313 01:13:18.782652 7110 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 13 01:13:18.782707 master-0 kubenswrapper[7110]: W0313 01:13:18.782658 7110 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 13 01:13:18.782707 master-0 kubenswrapper[7110]: W0313 01:13:18.782663 7110 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 13 01:13:18.782707 master-0 kubenswrapper[7110]: W0313 01:13:18.782668 7110 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 13 01:13:18.783366 master-0 kubenswrapper[7110]: W0313 01:13:18.782673 7110 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 13 01:13:18.783366 master-0 kubenswrapper[7110]: W0313 01:13:18.782677 7110 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 13 01:13:18.783366 master-0 kubenswrapper[7110]: W0313 01:13:18.782681 7110 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 13 01:13:18.783366 master-0 kubenswrapper[7110]: W0313 01:13:18.782685 7110 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 13 01:13:18.783366 master-0 kubenswrapper[7110]: W0313 01:13:18.782690 7110 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 13 01:13:18.783366 master-0 kubenswrapper[7110]: W0313 01:13:18.782699 7110 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 13 01:13:18.783366 master-0 kubenswrapper[7110]: W0313 01:13:18.782704 7110 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 13 01:13:18.783366 master-0 kubenswrapper[7110]: W0313 01:13:18.782708 7110 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 13 01:13:18.783366 master-0 kubenswrapper[7110]: W0313 01:13:18.782714 7110 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 13 01:13:18.783366 master-0 kubenswrapper[7110]: W0313 01:13:18.782723 7110 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 13 01:13:18.783366 master-0 kubenswrapper[7110]: W0313 01:13:18.782729 7110 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 13 01:13:18.783366 master-0 kubenswrapper[7110]: W0313 01:13:18.782735 7110 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 13 01:13:18.783366 master-0 kubenswrapper[7110]: W0313 01:13:18.782740 7110 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 13 01:13:18.783366 master-0 kubenswrapper[7110]: W0313 01:13:18.782746 7110 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 13 01:13:18.783366 master-0 kubenswrapper[7110]: W0313 01:13:18.782752 7110 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 13 01:13:18.783366 master-0 kubenswrapper[7110]: W0313 01:13:18.782756 7110 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 13 01:13:18.783366 master-0 kubenswrapper[7110]: W0313 01:13:18.782760 7110 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 13 01:13:18.783366 master-0 kubenswrapper[7110]: W0313 01:13:18.782764 7110 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 13 01:13:18.783366 master-0 kubenswrapper[7110]: W0313 01:13:18.782768 7110 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 13 01:13:18.784077 master-0 kubenswrapper[7110]: W0313 01:13:18.782771 7110 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 13 01:13:18.784077 master-0 kubenswrapper[7110]: W0313 01:13:18.782775 7110 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 13 01:13:18.784077 master-0 kubenswrapper[7110]: W0313 01:13:18.782778 7110 feature_gate.go:330] unrecognized feature gate: Example
Mar 13 01:13:18.784077 master-0 kubenswrapper[7110]: W0313 01:13:18.782782 7110 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 13 01:13:18.784077 master-0 kubenswrapper[7110]: W0313 01:13:18.782786 7110 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 13 01:13:18.784077 master-0 kubenswrapper[7110]: W0313 01:13:18.782790 7110 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 13 01:13:18.784077 master-0 kubenswrapper[7110]: W0313 01:13:18.782794 7110 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 13 01:13:18.784077 master-0 kubenswrapper[7110]: W0313 01:13:18.782797 7110 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 13 01:13:18.784077 master-0 kubenswrapper[7110]: W0313 01:13:18.782801 7110 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 13 01:13:18.784077 master-0 kubenswrapper[7110]: W0313 01:13:18.782806 7110 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 13 01:13:18.784077 master-0 kubenswrapper[7110]: W0313 01:13:18.782814 7110 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 13 01:13:18.784077 master-0 kubenswrapper[7110]: W0313 01:13:18.782819 7110 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 13 01:13:18.784077 master-0 kubenswrapper[7110]: W0313 01:13:18.782823 7110 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 13 01:13:18.784077 master-0 kubenswrapper[7110]: W0313 01:13:18.782827 7110 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 13 01:13:18.784077 master-0 kubenswrapper[7110]: W0313 01:13:18.782832 7110 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 13 01:13:18.784077 master-0 kubenswrapper[7110]: W0313 01:13:18.782836 7110 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 13 01:13:18.784077 master-0 kubenswrapper[7110]: W0313 01:13:18.782840 7110 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 13 01:13:18.784077 master-0 kubenswrapper[7110]: W0313 01:13:18.782844 7110 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 13 01:13:18.784077 master-0 kubenswrapper[7110]: W0313 01:13:18.782848 7110 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 13 01:13:18.784077 master-0 kubenswrapper[7110]: W0313 01:13:18.782852 7110 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 13 01:13:18.784769 master-0 kubenswrapper[7110]: W0313 01:13:18.782856 7110 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 13 01:13:18.784769 master-0 kubenswrapper[7110]: W0313 01:13:18.782860 7110 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 13 01:13:18.784769 master-0 kubenswrapper[7110]: W0313 01:13:18.782864 7110 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 13 01:13:18.784769 master-0 kubenswrapper[7110]: W0313 01:13:18.782867 7110 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 13 01:13:18.784769 master-0 kubenswrapper[7110]: W0313 01:13:18.782872 7110 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 13 01:13:18.784769 master-0 kubenswrapper[7110]: W0313 01:13:18.782876 7110 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 13 01:13:18.784769 master-0 kubenswrapper[7110]: W0313 01:13:18.782880 7110 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 13 01:13:18.784769 master-0 kubenswrapper[7110]: W0313 01:13:18.782884 7110 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 13 01:13:18.784769 master-0 kubenswrapper[7110]: W0313 01:13:18.782888 7110 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 13 01:13:18.784769 master-0 kubenswrapper[7110]: W0313 01:13:18.782892 7110 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 13 01:13:18.784769 master-0 kubenswrapper[7110]: W0313 01:13:18.782895 7110 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 13 01:13:18.784769 master-0 kubenswrapper[7110]: W0313 01:13:18.782899 7110 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 13 01:13:18.784769 master-0 kubenswrapper[7110]: W0313 01:13:18.782903 7110 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 13 01:13:18.784769 master-0 kubenswrapper[7110]: I0313 01:13:18.782909 7110 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 13 01:13:18.784769 master-0 kubenswrapper[7110]: W0313 01:13:18.783038 7110 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 13 01:13:18.785248 master-0 kubenswrapper[7110]: W0313 01:13:18.783045 7110 feature_gate.go:330] unrecognized feature gate: Example
Mar 13 01:13:18.785248 master-0 kubenswrapper[7110]: W0313 01:13:18.783050 7110 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 13 01:13:18.785248 master-0 kubenswrapper[7110]: W0313 01:13:18.783054 7110 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 13 01:13:18.785248 master-0 kubenswrapper[7110]: W0313 01:13:18.783057 7110 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 13 01:13:18.785248 master-0 kubenswrapper[7110]: W0313 01:13:18.783063 7110 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 13 01:13:18.785248 master-0 kubenswrapper[7110]: W0313 01:13:18.783068 7110 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 13 01:13:18.785248 master-0 kubenswrapper[7110]: W0313 01:13:18.783073 7110 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 13 01:13:18.785248 master-0 kubenswrapper[7110]: W0313 01:13:18.783076 7110 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 13 01:13:18.785248 master-0 kubenswrapper[7110]: W0313 01:13:18.783081 7110 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 13 01:13:18.785248 master-0 kubenswrapper[7110]: W0313 01:13:18.783086 7110 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 13 01:13:18.785248 master-0 kubenswrapper[7110]: W0313 01:13:18.783091 7110 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 13 01:13:18.785248 master-0 kubenswrapper[7110]: W0313 01:13:18.783095 7110 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 13 01:13:18.785248 master-0 kubenswrapper[7110]: W0313 01:13:18.783099 7110 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 13 01:13:18.785248 master-0 kubenswrapper[7110]: W0313 01:13:18.783104 7110 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 13 01:13:18.785248 master-0 kubenswrapper[7110]: W0313 01:13:18.783109 7110 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 13 01:13:18.785248 master-0 kubenswrapper[7110]: W0313 01:13:18.783113 7110 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 13 01:13:18.785248 master-0 kubenswrapper[7110]: W0313 01:13:18.783117 7110 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 13 01:13:18.785248 master-0 kubenswrapper[7110]: W0313 01:13:18.783121 7110 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 13 01:13:18.785248 master-0 kubenswrapper[7110]: W0313 01:13:18.783125 7110 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 13 01:13:18.786292 master-0 kubenswrapper[7110]: W0313 01:13:18.783129 7110 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 13 01:13:18.786292 master-0 kubenswrapper[7110]: W0313 01:13:18.783133 7110 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 13 01:13:18.786292 master-0 kubenswrapper[7110]: W0313 01:13:18.783137 7110 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 13 01:13:18.786292 master-0 kubenswrapper[7110]: W0313 01:13:18.783142 7110 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 13 01:13:18.786292 master-0 kubenswrapper[7110]: W0313 01:13:18.783147 7110 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 13 01:13:18.786292 master-0 kubenswrapper[7110]: W0313 01:13:18.783151 7110 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 13 01:13:18.786292 master-0 kubenswrapper[7110]: W0313 01:13:18.783155 7110 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 13 01:13:18.786292 master-0 kubenswrapper[7110]: W0313 01:13:18.783159 7110 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 13 01:13:18.786292 master-0 kubenswrapper[7110]: W0313 01:13:18.783163 7110 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 13 01:13:18.786292 master-0 kubenswrapper[7110]: W0313 01:13:18.783166 7110 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 13 01:13:18.786292 master-0 kubenswrapper[7110]: W0313 01:13:18.783170 7110 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 13 01:13:18.786292 master-0 kubenswrapper[7110]: W0313 01:13:18.783174 7110 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 13 01:13:18.786292 master-0 kubenswrapper[7110]: W0313 01:13:18.783177 7110 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 13 01:13:18.786292 master-0 kubenswrapper[7110]: W0313 01:13:18.783181 7110 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 13 01:13:18.786292 master-0 kubenswrapper[7110]: W0313 01:13:18.783184 7110 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 13 01:13:18.786292 master-0 kubenswrapper[7110]: W0313 01:13:18.783188 7110 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 13 01:13:18.786292 master-0 kubenswrapper[7110]: W0313 01:13:18.783191 7110 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 13 01:13:18.786292 master-0 kubenswrapper[7110]: W0313 01:13:18.783195 7110 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 13 01:13:18.786292 master-0 kubenswrapper[7110]: W0313 01:13:18.783206 7110 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 13 01:13:18.787100 master-0 kubenswrapper[7110]: W0313 01:13:18.783210 7110 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 13 01:13:18.787100 master-0 kubenswrapper[7110]: W0313 01:13:18.783214 7110 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 13 01:13:18.787100 master-0 kubenswrapper[7110]: W0313 01:13:18.783218 7110 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 13 01:13:18.787100 master-0 kubenswrapper[7110]: W0313 01:13:18.783222 7110 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 13 01:13:18.787100 master-0 kubenswrapper[7110]: W0313 01:13:18.783225 7110 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 13 01:13:18.787100 master-0 kubenswrapper[7110]: W0313 01:13:18.783229 7110 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 13 01:13:18.787100 master-0 kubenswrapper[7110]: W0313 01:13:18.783233 7110 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 13 01:13:18.787100 master-0 kubenswrapper[7110]: W0313 01:13:18.783236 7110 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 13 01:13:18.787100 master-0 kubenswrapper[7110]: W0313 01:13:18.783240 7110 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 13 01:13:18.787100 master-0 kubenswrapper[7110]: W0313 01:13:18.783243 7110 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 13 01:13:18.787100 master-0 kubenswrapper[7110]: W0313 01:13:18.783247 7110 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 13 01:13:18.787100 master-0 kubenswrapper[7110]: W0313 01:13:18.783250 7110 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 13 01:13:18.787100 master-0 kubenswrapper[7110]: W0313 01:13:18.783254 7110 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 13 01:13:18.787100 master-0 kubenswrapper[7110]: W0313 01:13:18.783258 7110 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 13 01:13:18.787100 master-0 kubenswrapper[7110]: W0313 01:13:18.783261 7110 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 13 01:13:18.787100 master-0 kubenswrapper[7110]: W0313 01:13:18.783265 7110 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 13 01:13:18.787100 master-0 kubenswrapper[7110]: W0313 01:13:18.783268 7110 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 13 01:13:18.787100 master-0 kubenswrapper[7110]: W0313 01:13:18.783272 7110 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 13 01:13:18.787100 master-0 kubenswrapper[7110]: W0313 01:13:18.783276 7110 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 13 01:13:18.787100 master-0 kubenswrapper[7110]: W0313 01:13:18.783279 7110 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 13 01:13:18.787860 master-0 kubenswrapper[7110]: W0313 01:13:18.783283 7110 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 13 01:13:18.787860 master-0 kubenswrapper[7110]: W0313 01:13:18.783287 7110 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 13 01:13:18.787860 master-0 kubenswrapper[7110]: W0313 01:13:18.783291 7110 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 13 01:13:18.787860 master-0 kubenswrapper[7110]: W0313 01:13:18.783294 7110 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 13 01:13:18.787860 master-0 kubenswrapper[7110]: W0313 01:13:18.783299 7110 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 13 01:13:18.787860 master-0 kubenswrapper[7110]: W0313 01:13:18.783304 7110 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 13 01:13:18.787860 master-0 kubenswrapper[7110]: W0313 01:13:18.783308 7110 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 13 01:13:18.787860 master-0 kubenswrapper[7110]: W0313 01:13:18.783312 7110 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 13 01:13:18.787860 master-0 kubenswrapper[7110]: W0313 01:13:18.783316 7110 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 13 01:13:18.787860 master-0 kubenswrapper[7110]: W0313 01:13:18.783320 7110 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 13 01:13:18.787860 master-0 kubenswrapper[7110]: W0313 01:13:18.783323 7110 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 13 01:13:18.787860 master-0 kubenswrapper[7110]: W0313 01:13:18.783328 7110 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 13 01:13:18.787860 master-0 kubenswrapper[7110]: W0313 01:13:18.783331 7110 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 13 01:13:18.787860 master-0 kubenswrapper[7110]: I0313 01:13:18.783338 7110 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 13 01:13:18.787860 master-0 kubenswrapper[7110]: I0313 01:13:18.783508 7110 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 13 01:13:18.788386 master-0 kubenswrapper[7110]: I0313 01:13:18.785071 7110 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Mar 13 01:13:18.788386 master-0 kubenswrapper[7110]: I0313 01:13:18.785150 7110 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 13 01:13:18.788386 master-0 kubenswrapper[7110]: I0313 01:13:18.785337 7110 server.go:997] "Starting client certificate rotation"
Mar 13 01:13:18.788386 master-0 kubenswrapper[7110]: I0313 01:13:18.785346 7110 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 13 01:13:18.788386 master-0 kubenswrapper[7110]: I0313 01:13:18.785575 7110 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-03-14 01:03:12 +0000 UTC, rotation deadline is 2026-03-13 21:17:55.318829601 +0000 UTC
Mar 13 01:13:18.788386 master-0 kubenswrapper[7110]: I0313 01:13:18.785712 7110 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 20h4m36.533122525s for next certificate rotation
Mar 13 01:13:18.788386 master-0 kubenswrapper[7110]: I0313 01:13:18.785976 7110 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 13 01:13:18.788386 master-0 kubenswrapper[7110]: I0313 01:13:18.787180 7110 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 13 01:13:18.790500 master-0 kubenswrapper[7110]: I0313 01:13:18.790438 7110 log.go:25] "Validated CRI v1 runtime API"
Mar 13 01:13:18.792606 master-0 kubenswrapper[7110]: I0313 01:13:18.792563 7110 log.go:25] "Validated CRI v1 image API"
Mar 13 01:13:18.793671 master-0 kubenswrapper[7110]: I0313 01:13:18.793563 7110 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 13 01:13:18.797928 master-0 kubenswrapper[7110]: I0313 01:13:18.797852 7110 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4 93b4eca9-1357-4499-ad5f-ae90bf0d6f4a:/dev/vda3]
Mar 13 01:13:18.798345 master-0 kubenswrapper[7110]: I0313 01:13:18.797902 7110 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/26154ef4eac655efbc62eb7de132a97eb19bcf76904f56cfb67a890cba3eae81/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/26154ef4eac655efbc62eb7de132a97eb19bcf76904f56cfb67a890cba3eae81/userdata/shm major:0 minor:266 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2caee3a1af6093e4a6a8dc6eb703af3a181ab48dbf5acd65ef8e61a08e467534/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2caee3a1af6093e4a6a8dc6eb703af3a181ab48dbf5acd65ef8e61a08e467534/userdata/shm major:0 minor:254 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/331ce433a4bf4a47427394a7b370083a4bac521b9ca9a23033c6c4de736ca40b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/331ce433a4bf4a47427394a7b370083a4bac521b9ca9a23033c6c4de736ca40b/userdata/shm major:0 minor:46 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4a78398e61786f95c561c9e0a3fad101b528e41e0057edc052306118db4ece14/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4a78398e61786f95c561c9e0a3fad101b528e41e0057edc052306118db4ece14/userdata/shm major:0 minor:268 fsType:tmpfs blockSize:0}
/run/containers/storage/overlay-containers/4aad5a88dbaf47dc9a34673f221901d51a030ad11786e8c666462ce349290777/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4aad5a88dbaf47dc9a34673f221901d51a030ad11786e8c666462ce349290777/userdata/shm major:0 minor:126 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5ea6cbe1e49665391bd7e8ee21e44a11b44e5195668056f83fc89f2c9fdb94e3/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5ea6cbe1e49665391bd7e8ee21e44a11b44e5195668056f83fc89f2c9fdb94e3/userdata/shm major:0 minor:41 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/609f306bf915c2fd338574a4571df8d99042d408bdae6c3187fb345e69136829/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/609f306bf915c2fd338574a4571df8d99042d408bdae6c3187fb345e69136829/userdata/shm major:0 minor:114 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/6c434288b6cb191d119d6c19e3587bb5a67d5e3c45645324bb8a04e648bb9b70/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/6c434288b6cb191d119d6c19e3587bb5a67d5e3c45645324bb8a04e648bb9b70/userdata/shm major:0 minor:289 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/6cd94cfc20909e4d75ca589ee9ae860bb4dc06f0ab09921539ad1f3f2923f207/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/6cd94cfc20909e4d75ca589ee9ae860bb4dc06f0ab09921539ad1f3f2923f207/userdata/shm major:0 minor:251 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/71ba48ea25a9442b7ddaf6a81ce73c503ec65916615ad4703b66f198c2ddd8c0/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/71ba48ea25a9442b7ddaf6a81ce73c503ec65916615ad4703b66f198c2ddd8c0/userdata/shm major:0 minor:119 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/7ad0a5d9a8c967edc56620551d2d496053e4222b70097c6bc30b98a2ef86a101/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7ad0a5d9a8c967edc56620551d2d496053e4222b70097c6bc30b98a2ef86a101/userdata/shm major:0 minor:275 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7d309f2fa26be03ebd9a5013c3e9be2f5c2e833ce9081aa3f25580dc684568db/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7d309f2fa26be03ebd9a5013c3e9be2f5c2e833ce9081aa3f25580dc684568db/userdata/shm major:0 minor:244 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7d8cfecb961af50ca95aeb8f7e1e1b3b55dbc52640073a438412ecf24f225a00/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7d8cfecb961af50ca95aeb8f7e1e1b3b55dbc52640073a438412ecf24f225a00/userdata/shm major:0 minor:241 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/89451c54a480759544b3c9bb8a4009d254601dcef79b111432b9e309f8ee32df/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/89451c54a480759544b3c9bb8a4009d254601dcef79b111432b9e309f8ee32df/userdata/shm major:0 minor:58 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/89ed96ec0c29ed024929723b7fe3a674507469f0ef8e3cce0e32599fd800079d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/89ed96ec0c29ed024929723b7fe3a674507469f0ef8e3cce0e32599fd800079d/userdata/shm major:0 minor:99 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8f2613fc06a65ee0e558a2b7e31f18cf27ca9c4c8e8d8a194c2a8dcc4466dc64/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8f2613fc06a65ee0e558a2b7e31f18cf27ca9c4c8e8d8a194c2a8dcc4466dc64/userdata/shm major:0 minor:259 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/9984727ecfd751906a7163096c1bf7f2f5c369a40b443c586dd90122b84df23e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9984727ecfd751906a7163096c1bf7f2f5c369a40b443c586dd90122b84df23e/userdata/shm major:0 minor:54 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a4456327b2e0b6b9539a6903b6632e2c8ba74468c5ee5a6acc2e786114ebf53d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a4456327b2e0b6b9539a6903b6632e2c8ba74468c5ee5a6acc2e786114ebf53d/userdata/shm major:0 minor:142 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ab94900114a9122f23168251a131a59bc4f99b487faa4afb3e3bd743e5f9a00c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ab94900114a9122f23168251a131a59bc4f99b487faa4afb3e3bd743e5f9a00c/userdata/shm major:0 minor:248 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b53f7152f43c94f2398d1b740ffecfeaa1c0a37491b92964a127cf3cd4f8b71f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b53f7152f43c94f2398d1b740ffecfeaa1c0a37491b92964a127cf3cd4f8b71f/userdata/shm major:0 minor:301 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/cf319252fb389a321939570359e80da282cec58b5fa4e03fa5a5ea1c1c063fe4/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/cf319252fb389a321939570359e80da282cec58b5fa4e03fa5a5ea1c1c063fe4/userdata/shm major:0 minor:272 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/cfc87a91992c83de0be19f6bf8022aaa5b7c2b27eb42c5753a91971d10b3622e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/cfc87a91992c83de0be19f6bf8022aaa5b7c2b27eb42c5753a91971d10b3622e/userdata/shm major:0 minor:50 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/d50b6a32815a2be8ff70be8b06ebf59a9e49fcf3be49561e0d64e6a0e5b76848/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d50b6a32815a2be8ff70be8b06ebf59a9e49fcf3be49561e0d64e6a0e5b76848/userdata/shm major:0 minor:264 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e4f5650e90a0b9cd7d74ca56ce88b46d396150848fd499e2751500a89710ed92/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e4f5650e90a0b9cd7d74ca56ce88b46d396150848fd499e2751500a89710ed92/userdata/shm major:0 minor:271 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f7ec163ebceef5cfb5ab7329a8164e22ce5f45f24019f15e580acb9f1392f8cb/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f7ec163ebceef5cfb5ab7329a8164e22ce5f45f24019f15e580acb9f1392f8cb/userdata/shm major:0 minor:130 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/039acb44-a9b3-4ad6-a091-be4d18edc34f/volumes/kubernetes.io~projected/kube-api-access-kkdbm:{mountpoint:/var/lib/kubelet/pods/039acb44-a9b3-4ad6-a091-be4d18edc34f/volumes/kubernetes.io~projected/kube-api-access-kkdbm major:0 minor:228 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/039acb44-a9b3-4ad6-a091-be4d18edc34f/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/039acb44-a9b3-4ad6-a091-be4d18edc34f/volumes/kubernetes.io~secret/serving-cert major:0 minor:216 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0d4e6150-432c-4a11-b5a6-4d62dd701fc8/volumes/kubernetes.io~projected/kube-api-access-gn6w7:{mountpoint:/var/lib/kubelet/pods/0d4e6150-432c-4a11-b5a6-4d62dd701fc8/volumes/kubernetes.io~projected/kube-api-access-gn6w7 major:0 minor:224 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/1308fba1-a50d-48b3-b272-7bef44727b7f/volumes/kubernetes.io~projected/kube-api-access-zv2rb:{mountpoint:/var/lib/kubelet/pods/1308fba1-a50d-48b3-b272-7bef44727b7f/volumes/kubernetes.io~projected/kube-api-access-zv2rb major:0 minor:125 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1308fba1-a50d-48b3-b272-7bef44727b7f/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert:{mountpoint:/var/lib/kubelet/pods/1308fba1-a50d-48b3-b272-7bef44727b7f/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert major:0 minor:124 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/13fac7b0-ce55-467d-9d0c-6a122d87cb3c/volumes/kubernetes.io~projected/kube-api-access-wh2bv:{mountpoint:/var/lib/kubelet/pods/13fac7b0-ce55-467d-9d0c-6a122d87cb3c/volumes/kubernetes.io~projected/kube-api-access-wh2bv major:0 minor:262 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/21cbea73-f779-43e4-b5ba-d6fa06275d34/volumes/kubernetes.io~projected/kube-api-access-wg54c:{mountpoint:/var/lib/kubelet/pods/21cbea73-f779-43e4-b5ba-d6fa06275d34/volumes/kubernetes.io~projected/kube-api-access-wg54c major:0 minor:256 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/21cbea73-f779-43e4-b5ba-d6fa06275d34/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/21cbea73-f779-43e4-b5ba-d6fa06275d34/volumes/kubernetes.io~secret/etcd-client major:0 minor:233 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/21cbea73-f779-43e4-b5ba-d6fa06275d34/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/21cbea73-f779-43e4-b5ba-d6fa06275d34/volumes/kubernetes.io~secret/serving-cert major:0 minor:231 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/22587300-2448-4862-9fd8-68197d17a9f2/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/22587300-2448-4862-9fd8-68197d17a9f2/volumes/kubernetes.io~projected/kube-api-access major:0 minor:229 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/22587300-2448-4862-9fd8-68197d17a9f2/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/22587300-2448-4862-9fd8-68197d17a9f2/volumes/kubernetes.io~secret/serving-cert major:0 minor:214 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2937cbe2-3125-4c3f-96f8-2febeb5942cc/volumes/kubernetes.io~projected/kube-api-access-spxfj:{mountpoint:/var/lib/kubelet/pods/2937cbe2-3125-4c3f-96f8-2febeb5942cc/volumes/kubernetes.io~projected/kube-api-access-spxfj major:0 minor:110 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2bd94289-7109-4419-9a51-bd289082b9f5/volumes/kubernetes.io~projected/kube-api-access-fx5mx:{mountpoint:/var/lib/kubelet/pods/2bd94289-7109-4419-9a51-bd289082b9f5/volumes/kubernetes.io~projected/kube-api-access-fx5mx major:0 minor:239 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/46662e51-44af-4732-83a1-9509a579b373/volumes/kubernetes.io~projected/kube-api-access-m5n7m:{mountpoint:/var/lib/kubelet/pods/46662e51-44af-4732-83a1-9509a579b373/volumes/kubernetes.io~projected/kube-api-access-m5n7m major:0 minor:246 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4738c93d-62e6-44ce-a289-e646b9302e71/volumes/kubernetes.io~projected/kube-api-access-9gt69:{mountpoint:/var/lib/kubelet/pods/4738c93d-62e6-44ce-a289-e646b9302e71/volumes/kubernetes.io~projected/kube-api-access-9gt69 major:0 minor:118 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/486c7e33-3dd8-4a98-87e3-8216ee2e05ef/volumes/kubernetes.io~projected/kube-api-access-74tvv:{mountpoint:/var/lib/kubelet/pods/486c7e33-3dd8-4a98-87e3-8216ee2e05ef/volumes/kubernetes.io~projected/kube-api-access-74tvv major:0 minor:260 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/486c7e33-3dd8-4a98-87e3-8216ee2e05ef/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/486c7e33-3dd8-4a98-87e3-8216ee2e05ef/volumes/kubernetes.io~secret/serving-cert major:0 minor:232 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/4976e608-07a0-4cef-8fdd-7cec3324b4b5/volumes/kubernetes.io~projected/kube-api-access-w2m48:{mountpoint:/var/lib/kubelet/pods/4976e608-07a0-4cef-8fdd-7cec3324b4b5/volumes/kubernetes.io~projected/kube-api-access-w2m48 major:0 minor:227 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4c5174b9-ca9e-4917-ab3a-ca403ce4f017/volumes/kubernetes.io~projected/kube-api-access-mmvs5:{mountpoint:/var/lib/kubelet/pods/4c5174b9-ca9e-4917-ab3a-ca403ce4f017/volumes/kubernetes.io~projected/kube-api-access-mmvs5 major:0 minor:250 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4edb3e1a-9082-4fc2-ae6f-99d49c078a34/volume-subpaths/run-systemd/ovnkube-controller/6:{mountpoint:/var/lib/kubelet/pods/4edb3e1a-9082-4fc2-ae6f-99d49c078a34/volume-subpaths/run-systemd/ovnkube-controller/6 major:0 minor:24 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4edb3e1a-9082-4fc2-ae6f-99d49c078a34/volumes/kubernetes.io~projected/kube-api-access-6tfdv:{mountpoint:/var/lib/kubelet/pods/4edb3e1a-9082-4fc2-ae6f-99d49c078a34/volumes/kubernetes.io~projected/kube-api-access-6tfdv major:0 minor:131 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4edb3e1a-9082-4fc2-ae6f-99d49c078a34/volumes/kubernetes.io~secret/ovn-node-metrics-cert:{mountpoint:/var/lib/kubelet/pods/4edb3e1a-9082-4fc2-ae6f-99d49c078a34/volumes/kubernetes.io~secret/ovn-node-metrics-cert major:0 minor:128 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/58035e42-37d8-48f6-9861-9b4ce6014119/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/58035e42-37d8-48f6-9861-9b4ce6014119/volumes/kubernetes.io~projected/kube-api-access major:0 minor:226 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/58035e42-37d8-48f6-9861-9b4ce6014119/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/58035e42-37d8-48f6-9861-9b4ce6014119/volumes/kubernetes.io~secret/serving-cert major:0 minor:213 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/61fb4b86-f978-4ae1-80bc-18d2f386cbc2/volumes/kubernetes.io~projected/kube-api-access-lrf2s:{mountpoint:/var/lib/kubelet/pods/61fb4b86-f978-4ae1-80bc-18d2f386cbc2/volumes/kubernetes.io~projected/kube-api-access-lrf2s major:0 minor:258 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/61fb4b86-f978-4ae1-80bc-18d2f386cbc2/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/61fb4b86-f978-4ae1-80bc-18d2f386cbc2/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert major:0 minor:236 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6c88187c-d011-4043-a6d3-4a8a7ec4e204/volumes/kubernetes.io~projected/kube-api-access-7mnf5:{mountpoint:/var/lib/kubelet/pods/6c88187c-d011-4043-a6d3-4a8a7ec4e204/volumes/kubernetes.io~projected/kube-api-access-7mnf5 major:0 minor:257 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:217 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad/volumes/kubernetes.io~projected/kube-api-access-29w76:{mountpoint:/var/lib/kubelet/pods/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad/volumes/kubernetes.io~projected/kube-api-access-29w76 major:0 minor:219 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/78d2cd80-23b9-426d-a7ac-1daa27668a47/volumes/kubernetes.io~projected/kube-api-access-mxctn:{mountpoint:/var/lib/kubelet/pods/78d2cd80-23b9-426d-a7ac-1daa27668a47/volumes/kubernetes.io~projected/kube-api-access-mxctn major:0 minor:230 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71/volumes/kubernetes.io~projected/kube-api-access-bz7v9:{mountpoint:/var/lib/kubelet/pods/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71/volumes/kubernetes.io~projected/kube-api-access-bz7v9 major:0 minor:220 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/7f35cc1e-3376-4dbd-b215-2a32bf62cc71/volumes/kubernetes.io~projected/kube-api-access-5zd92:{mountpoint:/var/lib/kubelet/pods/7f35cc1e-3376-4dbd-b215-2a32bf62cc71/volumes/kubernetes.io~projected/kube-api-access-5zd92 major:0 minor:240 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8c6bf2d5-1881-4b63-b247-7e7426707fa1/volumes/kubernetes.io~projected/kube-api-access-vxf58:{mountpoint:/var/lib/kubelet/pods/8c6bf2d5-1881-4b63-b247-7e7426707fa1/volumes/kubernetes.io~projected/kube-api-access-vxf58 major:0 minor:222 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/916d9fc9-388b-4506-a17c-36a7f626356a/volumes/kubernetes.io~projected/kube-api-access-jg7x6:{mountpoint:/var/lib/kubelet/pods/916d9fc9-388b-4506-a17c-36a7f626356a/volumes/kubernetes.io~projected/kube-api-access-jg7x6 major:0 minor:221 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/916d9fc9-388b-4506-a17c-36a7f626356a/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/916d9fc9-388b-4506-a17c-36a7f626356a/volumes/kubernetes.io~secret/serving-cert major:0 minor:209 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/95d4e785-6663-417d-b380-6905773613c8/volumes/kubernetes.io~projected/kube-api-access-nhcll:{mountpoint:/var/lib/kubelet/pods/95d4e785-6663-417d-b380-6905773613c8/volumes/kubernetes.io~projected/kube-api-access-nhcll major:0 minor:225 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/95d4e785-6663-417d-b380-6905773613c8/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/95d4e785-6663-417d-b380-6905773613c8/volumes/kubernetes.io~secret/serving-cert major:0 minor:215 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ae44526f-5858-42a0-ba77-3a22f171456f/volumes/kubernetes.io~projected/kube-api-access-mz8jz:{mountpoint:/var/lib/kubelet/pods/ae44526f-5858-42a0-ba77-3a22f171456f/volumes/kubernetes.io~projected/kube-api-access-mz8jz major:0 minor:270 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35/volumes/kubernetes.io~projected/kube-api-access major:0 minor:247 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35/volumes/kubernetes.io~secret/serving-cert major:0 minor:234 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bbf0bd4d-3387-43c3-b9d5-61a044fa2138/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/bbf0bd4d-3387-43c3-b9d5-61a044fa2138/volumes/kubernetes.io~projected/kube-api-access major:0 minor:43 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bfc49699-9428-4bff-804d-da0e60551759/volumes/kubernetes.io~projected/kube-api-access-zdzjn:{mountpoint:/var/lib/kubelet/pods/bfc49699-9428-4bff-804d-da0e60551759/volumes/kubernetes.io~projected/kube-api-access-zdzjn major:0 minor:98 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bfc49699-9428-4bff-804d-da0e60551759/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/bfc49699-9428-4bff-804d-da0e60551759/volumes/kubernetes.io~secret/metrics-tls major:0 minor:94 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ca2fa86b-a966-49dc-8577-d2b54b111d14/volumes/kubernetes.io~projected/kube-api-access-gwm5w:{mountpoint:/var/lib/kubelet/pods/ca2fa86b-a966-49dc-8577-d2b54b111d14/volumes/kubernetes.io~projected/kube-api-access-gwm5w major:0 minor:243 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ca2fa86b-a966-49dc-8577-d2b54b111d14/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/ca2fa86b-a966-49dc-8577-d2b54b111d14/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert major:0 minor:235 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/d1153bb3-30dd-458f-b0a4-c05358a8b3f8/volumes/kubernetes.io~projected/kube-api-access-srlst:{mountpoint:/var/lib/kubelet/pods/d1153bb3-30dd-458f-b0a4-c05358a8b3f8/volumes/kubernetes.io~projected/kube-api-access-srlst major:0 minor:138 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d1153bb3-30dd-458f-b0a4-c05358a8b3f8/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/d1153bb3-30dd-458f-b0a4-c05358a8b3f8/volumes/kubernetes.io~secret/webhook-cert major:0 minor:139 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d56480e0-0885-41e5-a1fc-931a068fbadb/volumes/kubernetes.io~projected/kube-api-access-fppkf:{mountpoint:/var/lib/kubelet/pods/d56480e0-0885-41e5-a1fc-931a068fbadb/volumes/kubernetes.io~projected/kube-api-access-fppkf major:0 minor:263 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d56480e0-0885-41e5-a1fc-931a068fbadb/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/d56480e0-0885-41e5-a1fc-931a068fbadb/volumes/kubernetes.io~secret/serving-cert major:0 minor:238 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e68ab3cb-c372-45d9-a758-beaf4c213714/volumes/kubernetes.io~projected/kube-api-access-zjhjj:{mountpoint:/var/lib/kubelet/pods/e68ab3cb-c372-45d9-a758-beaf4c213714/volumes/kubernetes.io~projected/kube-api-access-zjhjj major:0 minor:123 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f97819d0-2840-4352-a435-19ef1a8c22c9/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/f97819d0-2840-4352-a435-19ef1a8c22c9/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:218 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f97819d0-2840-4352-a435-19ef1a8c22c9/volumes/kubernetes.io~projected/kube-api-access-fzklz:{mountpoint:/var/lib/kubelet/pods/f97819d0-2840-4352-a435-19ef1a8c22c9/volumes/kubernetes.io~projected/kube-api-access-fzklz major:0 minor:223 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/f9b713fb-64ce-4a01-951c-1f31df62e1ae/volumes/kubernetes.io~projected/kube-api-access-4chtg:{mountpoint:/var/lib/kubelet/pods/f9b713fb-64ce-4a01-951c-1f31df62e1ae/volumes/kubernetes.io~projected/kube-api-access-4chtg major:0 minor:253 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f9b713fb-64ce-4a01-951c-1f31df62e1ae/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/f9b713fb-64ce-4a01-951c-1f31df62e1ae/volumes/kubernetes.io~secret/serving-cert major:0 minor:237 fsType:tmpfs blockSize:0} overlay_0-101:{mountpoint:/var/lib/containers/storage/overlay/1068a8316a13c7cdeaaef3545bd8cc14495d985d39d355ee01c171279cd771c1/merged major:0 minor:101 fsType:overlay blockSize:0} overlay_0-103:{mountpoint:/var/lib/containers/storage/overlay/499bfb3a268ed2ae8c38719a6a8a04a9d10a53a80540940251845c54aadba54e/merged major:0 minor:103 fsType:overlay blockSize:0} overlay_0-105:{mountpoint:/var/lib/containers/storage/overlay/782ca9370a5f71c16e4f960a97f1c370b3ad1c37ab963bf8033fa7874b11028b/merged major:0 minor:105 fsType:overlay blockSize:0} overlay_0-116:{mountpoint:/var/lib/containers/storage/overlay/9f53ad407aa6750466461f31dfd38703cc3e2e3edf47253264cf6c9225d8de57/merged major:0 minor:116 fsType:overlay blockSize:0} overlay_0-121:{mountpoint:/var/lib/containers/storage/overlay/bca064206bf5f192337d981eea506766cbc8e42e2303c4baee2649a45d5b3198/merged major:0 minor:121 fsType:overlay blockSize:0} overlay_0-129:{mountpoint:/var/lib/containers/storage/overlay/ae0170fa71b1afaee67f9931020f7edb95659a89d9f6f5e5f2dbcaa26d36f41e/merged major:0 minor:129 fsType:overlay blockSize:0} overlay_0-134:{mountpoint:/var/lib/containers/storage/overlay/b84d62c1501ec0c901404e96a0934bb10d4703eb81b8872aa964453fdfe38e99/merged major:0 minor:134 fsType:overlay blockSize:0} overlay_0-136:{mountpoint:/var/lib/containers/storage/overlay/b6868d91e2f510d0d7d429fc1b6414c2c1cae3f86f79724b39556629095e1b5d/merged major:0 minor:136 fsType:overlay blockSize:0} 
overlay_0-140:{mountpoint:/var/lib/containers/storage/overlay/68091b7f6a577c845f3d91becae021d32a8c8f85f79e1176a475b8d47520288c/merged major:0 minor:140 fsType:overlay blockSize:0} overlay_0-150:{mountpoint:/var/lib/containers/storage/overlay/7073f091a6c5657ca4314c5a03ae9f229bc11c7e795ba62bbd1029d7f0177318/merged major:0 minor:150 fsType:overlay blockSize:0} overlay_0-152:{mountpoint:/var/lib/containers/storage/overlay/0e6ee2560fdcb0f0c13f465566a4b3c7ed7242b68ea3e9c79cf4eac8f3012cf6/merged major:0 minor:152 fsType:overlay blockSize:0} overlay_0-154:{mountpoint:/var/lib/containers/storage/overlay/58322e6a9597761f1313b9832cb44436e027961135394e245f38a971638e7dff/merged major:0 minor:154 fsType:overlay blockSize:0} overlay_0-156:{mountpoint:/var/lib/containers/storage/overlay/04e134e284ed12dbdbac1541c9a668e3962d3e35a84a5cb2cec4998abd73e774/merged major:0 minor:156 fsType:overlay blockSize:0} overlay_0-161:{mountpoint:/var/lib/containers/storage/overlay/da0ca95ccc186dd2a05024ffa191c14fdb94a60e0125708f1b7fedf07554362c/merged major:0 minor:161 fsType:overlay blockSize:0} overlay_0-163:{mountpoint:/var/lib/containers/storage/overlay/995a36f63b99af5a3873445846c48b423f305b302119aa292503c2e23d7a0928/merged major:0 minor:163 fsType:overlay blockSize:0} overlay_0-168:{mountpoint:/var/lib/containers/storage/overlay/5f2e43e92011b2a5e570ac5378931476d882c3768ba6a6de4b37a8c66412a27e/merged major:0 minor:168 fsType:overlay blockSize:0} overlay_0-176:{mountpoint:/var/lib/containers/storage/overlay/bc0dd8cc88ef64ec8cfe903077dd53528f8aefd18ef9f0a5fcdf15b5c0bcb6d9/merged major:0 minor:176 fsType:overlay blockSize:0} overlay_0-184:{mountpoint:/var/lib/containers/storage/overlay/0771ceed00491afc2390c5cc2d51dd174a9c72143076da9244eadcbe13cfdef3/merged major:0 minor:184 fsType:overlay blockSize:0} overlay_0-189:{mountpoint:/var/lib/containers/storage/overlay/aa4e9169af8e6742d6ba9252cffe67bc4547f68f3573a4d4895973fa6665b1bb/merged major:0 minor:189 fsType:overlay blockSize:0} 
overlay_0-194:{mountpoint:/var/lib/containers/storage/overlay/b35e881b8ddf864e9c22f1576096b331ebd4d5629c8f323c2675493161365fb6/merged major:0 minor:194 fsType:overlay blockSize:0} overlay_0-199:{mountpoint:/var/lib/containers/storage/overlay/98e53109cbc139608a9c5a6f9a06bff83dc90ac813ca303ed457573bf7e1adda/merged major:0 minor:199 fsType:overlay blockSize:0} overlay_0-204:{mountpoint:/var/lib/containers/storage/overlay/79f5cb6a92a2567d5bf4923e2617ed95203a1bad1b7a88a5289f5a395492492b/merged major:0 minor:204 fsType:overlay blockSize:0} overlay_0-277:{mountpoint:/var/lib/containers/storage/overlay/09d324d7e2c2d3c295ce1871fe4bba404be1551ef1b747b5b529f9f99d911d38/merged major:0 minor:277 fsType:overlay blockSize:0} overlay_0-279:{mountpoint:/var/lib/containers/storage/overlay/4b499bbd05372b43c9eefa7ed027c0793a5d47f8d5b5758276f1d458c7e36123/merged major:0 minor:279 fsType:overlay blockSize:0} overlay_0-281:{mountpoint:/var/lib/containers/storage/overlay/1963df254dacf9fa35ac3229d45db1f915610070a1bffa2757f8f2d97fa00cc0/merged major:0 minor:281 fsType:overlay blockSize:0} overlay_0-283:{mountpoint:/var/lib/containers/storage/overlay/af8da6b97b7c552ae861c8798865c65e6c96376f046e10f9cb574b9c89ff5ae2/merged major:0 minor:283 fsType:overlay blockSize:0} overlay_0-285:{mountpoint:/var/lib/containers/storage/overlay/a15b83df0e9b3f2e68d3868e07dac6db1518deeec3eb99eb50654b620ac07613/merged major:0 minor:285 fsType:overlay blockSize:0} overlay_0-287:{mountpoint:/var/lib/containers/storage/overlay/fe8ed84c5f02692408d6cb49a53386fe55c7b6f10dfb11e4b0b338aa8af48643/merged major:0 minor:287 fsType:overlay blockSize:0} overlay_0-291:{mountpoint:/var/lib/containers/storage/overlay/b0c1b0bc33127a174250fc6d0a25646906422aa82d78711a4ab8451216d9f1ce/merged major:0 minor:291 fsType:overlay blockSize:0} overlay_0-293:{mountpoint:/var/lib/containers/storage/overlay/5558f118d9b3a97727c4938ef2b1cb3d39672616b2980a71c559226d6236cffe/merged major:0 minor:293 fsType:overlay blockSize:0} 
overlay_0-295:{mountpoint:/var/lib/containers/storage/overlay/3c7c218429efc1a92d2c3854fddfe625b7f654cb9c05a59fab4ca5a2861b9eb7/merged major:0 minor:295 fsType:overlay blockSize:0} overlay_0-297:{mountpoint:/var/lib/containers/storage/overlay/de2e5801a25dafb20b1a9e08d6bc38d94edba2715f174cc379074d4edfffa536/merged major:0 minor:297 fsType:overlay blockSize:0} overlay_0-299:{mountpoint:/var/lib/containers/storage/overlay/bd586057a42d2d6ab3eb10e60ec74387467f74d16578afd7d4a72e2048c7b616/merged major:0 minor:299 fsType:overlay blockSize:0} overlay_0-303:{mountpoint:/var/lib/containers/storage/overlay/e8bbf9f20637f1283dbf78f5cc3cd6f978438c5cc973ecc174ac2569387affdd/merged major:0 minor:303 fsType:overlay blockSize:0} overlay_0-305:{mountpoint:/var/lib/containers/storage/overlay/bdccdfa165a11588a066b17865836563c68838eb6ce5d93123c4118446ace6b1/merged major:0 minor:305 fsType:overlay blockSize:0} overlay_0-307:{mountpoint:/var/lib/containers/storage/overlay/2cf6e6d258541867cac0e52bbadc776d8af7a6b46cb93e15049c6d4885fa5bd0/merged major:0 minor:307 fsType:overlay blockSize:0} overlay_0-310:{mountpoint:/var/lib/containers/storage/overlay/41eb529aaa94aa0d57132514dd41c12787d769ed21c20891552e34402f7663a9/merged major:0 minor:310 fsType:overlay blockSize:0} overlay_0-44:{mountpoint:/var/lib/containers/storage/overlay/031c0a3ed3de2cc591d3584046ecba108369d723483d07d7a7ec8ac77d537e05/merged major:0 minor:44 fsType:overlay blockSize:0} overlay_0-48:{mountpoint:/var/lib/containers/storage/overlay/9fda4e172b1b596523bc6e67a7405be93d7c9033f0cd501d8a68de1a944c7784/merged major:0 minor:48 fsType:overlay blockSize:0} overlay_0-52:{mountpoint:/var/lib/containers/storage/overlay/d0948fc2747bb960ddf3049edc4514597c3a3d74676769833992ae46120c88d6/merged major:0 minor:52 fsType:overlay blockSize:0} overlay_0-56:{mountpoint:/var/lib/containers/storage/overlay/f78f163f1c5f692c7f7cbba02bf5e68bc71718f847e8b3de5b69a635c4662941/merged major:0 minor:56 fsType:overlay blockSize:0} 
overlay_0-60:{mountpoint:/var/lib/containers/storage/overlay/fb8a48c89262396896d0d1babf7970f8868af7c4799b3c8cbe5e6e8060fbfb76/merged major:0 minor:60 fsType:overlay blockSize:0} overlay_0-62:{mountpoint:/var/lib/containers/storage/overlay/c559c501af202129d56fd89d3508ad5588052e4c6a1bd0350cd5440785466002/merged major:0 minor:62 fsType:overlay blockSize:0} overlay_0-64:{mountpoint:/var/lib/containers/storage/overlay/e555550bdb95c469d2916d8f6625ed06e362d7600d40b3088c8e44f31a6a13cb/merged major:0 minor:64 fsType:overlay blockSize:0} overlay_0-74:{mountpoint:/var/lib/containers/storage/overlay/ea03356ca843e6e502cee78cebc1f0bfd7f6ed43e587887dc717a15fb9a5286c/merged major:0 minor:74 fsType:overlay blockSize:0} overlay_0-76:{mountpoint:/var/lib/containers/storage/overlay/cda92cb4bd7941e2133696d19fdf849d4706b06be458e16ae3fe275ca770ae00/merged major:0 minor:76 fsType:overlay blockSize:0} overlay_0-78:{mountpoint:/var/lib/containers/storage/overlay/ef032f919f026b475546c33b26129bf946b5ea411747afe8aad15587a9b84844/merged major:0 minor:78 fsType:overlay blockSize:0} overlay_0-84:{mountpoint:/var/lib/containers/storage/overlay/39b1dee5bb044ec22ade071e30601662d5a10880bdd25130595567b9f990b3af/merged major:0 minor:84 fsType:overlay blockSize:0} overlay_0-89:{mountpoint:/var/lib/containers/storage/overlay/718e2319fc127a680ba5e9ef27679aa218b3f5610125a2afde58b47741cebf59/merged major:0 minor:89 fsType:overlay blockSize:0}] Mar 13 01:13:18.843708 master-0 kubenswrapper[7110]: I0313 01:13:18.839552 7110 manager.go:217] Machine: {Timestamp:2026-03-13 01:13:18.838451874 +0000 UTC m=+0.123478380 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:d267e942efb840478924173af659d9c8 SystemUUID:d267e942-efb8-4047-8924-173af659d9c8 
BootID:beebd46b-80cb-4497-a098-674e9838eb1c Filesystems:[{Device:/var/lib/kubelet/pods/486c7e33-3dd8-4a98-87e3-8216ee2e05ef/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:232 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ca2fa86b-a966-49dc-8577-d2b54b111d14/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert DeviceMajor:0 DeviceMinor:235 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-52 DeviceMajor:0 DeviceMinor:52 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/609f306bf915c2fd338574a4571df8d99042d408bdae6c3187fb345e69136829/userdata/shm DeviceMajor:0 DeviceMinor:114 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-189 DeviceMajor:0 DeviceMinor:189 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/58035e42-37d8-48f6-9861-9b4ce6014119/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:213 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/f9b713fb-64ce-4a01-951c-1f31df62e1ae/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:237 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-279 DeviceMajor:0 DeviceMinor:279 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-305 DeviceMajor:0 DeviceMinor:305 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/cfc87a91992c83de0be19f6bf8022aaa5b7c2b27eb42c5753a91971d10b3622e/userdata/shm DeviceMajor:0 DeviceMinor:50 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/1308fba1-a50d-48b3-b272-7bef44727b7f/volumes/kubernetes.io~projected/kube-api-access-zv2rb DeviceMajor:0 DeviceMinor:125 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/039acb44-a9b3-4ad6-a091-be4d18edc34f/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:216 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7d309f2fa26be03ebd9a5013c3e9be2f5c2e833ce9081aa3f25580dc684568db/userdata/shm DeviceMajor:0 DeviceMinor:244 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/89451c54a480759544b3c9bb8a4009d254601dcef79b111432b9e309f8ee32df/userdata/shm DeviceMajor:0 DeviceMinor:58 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-129 DeviceMajor:0 DeviceMinor:129 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:247 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-283 DeviceMajor:0 DeviceMinor:283 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-303 DeviceMajor:0 DeviceMinor:303 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-44 DeviceMajor:0 DeviceMinor:44 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-150 DeviceMajor:0 DeviceMinor:150 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-168 DeviceMajor:0 DeviceMinor:168 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/916d9fc9-388b-4506-a17c-36a7f626356a/volumes/kubernetes.io~projected/kube-api-access-jg7x6 DeviceMajor:0 DeviceMinor:221 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/95d4e785-6663-417d-b380-6905773613c8/volumes/kubernetes.io~projected/kube-api-access-nhcll DeviceMajor:0 DeviceMinor:225 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 
Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/var/lib/kubelet/pods/4738c93d-62e6-44ce-a289-e646b9302e71/volumes/kubernetes.io~projected/kube-api-access-9gt69 DeviceMajor:0 DeviceMinor:118 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-140 DeviceMajor:0 DeviceMinor:140 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/21cbea73-f779-43e4-b5ba-d6fa06275d34/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:233 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/331ce433a4bf4a47427394a7b370083a4bac521b9ca9a23033c6c4de736ca40b/userdata/shm DeviceMajor:0 DeviceMinor:46 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8f2613fc06a65ee0e558a2b7e31f18cf27ca9c4c8e8d8a194c2a8dcc4466dc64/userdata/shm DeviceMajor:0 DeviceMinor:259 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a4456327b2e0b6b9539a6903b6632e2c8ba74468c5ee5a6acc2e786114ebf53d/userdata/shm DeviceMajor:0 DeviceMinor:142 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4aad5a88dbaf47dc9a34673f221901d51a030ad11786e8c666462ce349290777/userdata/shm DeviceMajor:0 DeviceMinor:126 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/58035e42-37d8-48f6-9861-9b4ce6014119/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:226 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/6c88187c-d011-4043-a6d3-4a8a7ec4e204/volumes/kubernetes.io~projected/kube-api-access-7mnf5 DeviceMajor:0 DeviceMinor:257 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/e68ab3cb-c372-45d9-a758-beaf4c213714/volumes/kubernetes.io~projected/kube-api-access-zjhjj 
DeviceMajor:0 DeviceMinor:123 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-134 DeviceMajor:0 DeviceMinor:134 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/22587300-2448-4862-9fd8-68197d17a9f2/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:214 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-310 DeviceMajor:0 DeviceMinor:310 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-64 DeviceMajor:0 DeviceMinor:64 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/bfc49699-9428-4bff-804d-da0e60551759/volumes/kubernetes.io~projected/kube-api-access-zdzjn DeviceMajor:0 DeviceMinor:98 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-163 DeviceMajor:0 DeviceMinor:163 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/6cd94cfc20909e4d75ca589ee9ae860bb4dc06f0ab09921539ad1f3f2923f207/userdata/shm DeviceMajor:0 DeviceMinor:251 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5ea6cbe1e49665391bd7e8ee21e44a11b44e5195668056f83fc89f2c9fdb94e3/userdata/shm DeviceMajor:0 DeviceMinor:41 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-56 DeviceMajor:0 DeviceMinor:56 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-121 DeviceMajor:0 DeviceMinor:121 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-176 DeviceMajor:0 DeviceMinor:176 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0d4e6150-432c-4a11-b5a6-4d62dd701fc8/volumes/kubernetes.io~projected/kube-api-access-gn6w7 DeviceMajor:0 
DeviceMinor:224 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/486c7e33-3dd8-4a98-87e3-8216ee2e05ef/volumes/kubernetes.io~projected/kube-api-access-74tvv DeviceMajor:0 DeviceMinor:260 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-277 DeviceMajor:0 DeviceMinor:277 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-156 DeviceMajor:0 DeviceMinor:156 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/4edb3e1a-9082-4fc2-ae6f-99d49c078a34/volumes/kubernetes.io~secret/ovn-node-metrics-cert DeviceMajor:0 DeviceMinor:128 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/4976e608-07a0-4cef-8fdd-7cec3324b4b5/volumes/kubernetes.io~projected/kube-api-access-w2m48 DeviceMajor:0 DeviceMinor:227 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ca2fa86b-a966-49dc-8577-d2b54b111d14/volumes/kubernetes.io~projected/kube-api-access-gwm5w DeviceMajor:0 DeviceMinor:243 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d50b6a32815a2be8ff70be8b06ebf59a9e49fcf3be49561e0d64e6a0e5b76848/userdata/shm DeviceMajor:0 DeviceMinor:264 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-291 DeviceMajor:0 DeviceMinor:291 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-293 DeviceMajor:0 DeviceMinor:293 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/bfc49699-9428-4bff-804d-da0e60551759/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:94 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-299 DeviceMajor:0 DeviceMinor:299 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs 
Inodes:1048576 HasInodes:true} {Device:overlay_0-101 DeviceMajor:0 DeviceMinor:101 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-136 DeviceMajor:0 DeviceMinor:136 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7d8cfecb961af50ca95aeb8f7e1e1b3b55dbc52640073a438412ecf24f225a00/userdata/shm DeviceMajor:0 DeviceMinor:241 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2caee3a1af6093e4a6a8dc6eb703af3a181ab48dbf5acd65ef8e61a08e467534/userdata/shm DeviceMajor:0 DeviceMinor:254 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-76 DeviceMajor:0 DeviceMinor:76 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d1153bb3-30dd-458f-b0a4-c05358a8b3f8/volumes/kubernetes.io~projected/kube-api-access-srlst DeviceMajor:0 DeviceMinor:138 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/cf319252fb389a321939570359e80da282cec58b5fa4e03fa5a5ea1c1c063fe4/userdata/shm DeviceMajor:0 DeviceMinor:272 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-285 DeviceMajor:0 DeviceMinor:285 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-89 DeviceMajor:0 DeviceMinor:89 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d1153bb3-30dd-458f-b0a4-c05358a8b3f8/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:139 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/039acb44-a9b3-4ad6-a091-be4d18edc34f/volumes/kubernetes.io~projected/kube-api-access-kkdbm DeviceMajor:0 DeviceMinor:228 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/e4f5650e90a0b9cd7d74ca56ce88b46d396150848fd499e2751500a89710ed92/userdata/shm DeviceMajor:0 DeviceMinor:271 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-295 DeviceMajor:0 DeviceMinor:295 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/2937cbe2-3125-4c3f-96f8-2febeb5942cc/volumes/kubernetes.io~projected/kube-api-access-spxfj DeviceMajor:0 DeviceMinor:110 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:234 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/2bd94289-7109-4419-9a51-bd289082b9f5/volumes/kubernetes.io~projected/kube-api-access-fx5mx DeviceMajor:0 DeviceMinor:239 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/61fb4b86-f978-4ae1-80bc-18d2f386cbc2/volumes/kubernetes.io~projected/kube-api-access-lrf2s DeviceMajor:0 DeviceMinor:258 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-74 DeviceMajor:0 DeviceMinor:74 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/7f35cc1e-3376-4dbd-b215-2a32bf62cc71/volumes/kubernetes.io~projected/kube-api-access-5zd92 DeviceMajor:0 DeviceMinor:240 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/8c6bf2d5-1881-4b63-b247-7e7426707fa1/volumes/kubernetes.io~projected/kube-api-access-vxf58 DeviceMajor:0 DeviceMinor:222 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/4c5174b9-ca9e-4917-ab3a-ca403ce4f017/volumes/kubernetes.io~projected/kube-api-access-mmvs5 DeviceMajor:0 DeviceMinor:250 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-116 DeviceMajor:0 DeviceMinor:116 Capacity:214143315968 
Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-199 DeviceMajor:0 DeviceMinor:199 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/95d4e785-6663-417d-b380-6905773613c8/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:215 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-184 DeviceMajor:0 DeviceMinor:184 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f97819d0-2840-4352-a435-19ef1a8c22c9/volumes/kubernetes.io~projected/kube-api-access-fzklz DeviceMajor:0 DeviceMinor:223 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/d56480e0-0885-41e5-a1fc-931a068fbadb/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:238 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/21cbea73-f779-43e4-b5ba-d6fa06275d34/volumes/kubernetes.io~projected/kube-api-access-wg54c DeviceMajor:0 DeviceMinor:256 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-60 DeviceMajor:0 DeviceMinor:60 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-161 DeviceMajor:0 DeviceMinor:161 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d56480e0-0885-41e5-a1fc-931a068fbadb/volumes/kubernetes.io~projected/kube-api-access-fppkf DeviceMajor:0 DeviceMinor:263 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-307 DeviceMajor:0 DeviceMinor:307 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/bbf0bd4d-3387-43c3-b9d5-61a044fa2138/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:43 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ab94900114a9122f23168251a131a59bc4f99b487faa4afb3e3bd743e5f9a00c/userdata/shm 
DeviceMajor:0 DeviceMinor:248 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/f9b713fb-64ce-4a01-951c-1f31df62e1ae/volumes/kubernetes.io~projected/kube-api-access-4chtg DeviceMajor:0 DeviceMinor:253 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ae44526f-5858-42a0-ba77-3a22f171456f/volumes/kubernetes.io~projected/kube-api-access-mz8jz DeviceMajor:0 DeviceMinor:270 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7ad0a5d9a8c967edc56620551d2d496053e4222b70097c6bc30b98a2ef86a101/userdata/shm DeviceMajor:0 DeviceMinor:275 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-287 DeviceMajor:0 DeviceMinor:287 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/4edb3e1a-9082-4fc2-ae6f-99d49c078a34/volumes/kubernetes.io~projected/kube-api-access-6tfdv DeviceMajor:0 DeviceMinor:131 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/6c434288b6cb191d119d6c19e3587bb5a67d5e3c45645324bb8a04e648bb9b70/userdata/shm DeviceMajor:0 DeviceMinor:289 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:overlay_0-78 DeviceMajor:0 DeviceMinor:78 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-105 DeviceMajor:0 DeviceMinor:105 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-154 DeviceMajor:0 DeviceMinor:154 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-194 DeviceMajor:0 DeviceMinor:194 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/4edb3e1a-9082-4fc2-ae6f-99d49c078a34/volume-subpaths/run-systemd/ovnkube-controller/6 DeviceMajor:0 DeviceMinor:24 
Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/var/lib/kubelet/pods/46662e51-44af-4732-83a1-9509a579b373/volumes/kubernetes.io~projected/kube-api-access-m5n7m DeviceMajor:0 DeviceMinor:246 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-62 DeviceMajor:0 DeviceMinor:62 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f7ec163ebceef5cfb5ab7329a8164e22ce5f45f24019f15e580acb9f1392f8cb/userdata/shm DeviceMajor:0 DeviceMinor:130 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/916d9fc9-388b-4506-a17c-36a7f626356a/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:209 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-281 DeviceMajor:0 DeviceMinor:281 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-48 DeviceMajor:0 DeviceMinor:48 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/26154ef4eac655efbc62eb7de132a97eb19bcf76904f56cfb67a890cba3eae81/userdata/shm DeviceMajor:0 DeviceMinor:266 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/89ed96ec0c29ed024929723b7fe3a674507469f0ef8e3cce0e32599fd800079d/userdata/shm DeviceMajor:0 DeviceMinor:99 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-204 DeviceMajor:0 DeviceMinor:204 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f97819d0-2840-4352-a435-19ef1a8c22c9/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:218 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/21cbea73-f779-43e4-b5ba-d6fa06275d34/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:231 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/4a78398e61786f95c561c9e0a3fad101b528e41e0057edc052306118db4ece14/userdata/shm DeviceMajor:0 DeviceMinor:268 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b53f7152f43c94f2398d1b740ffecfeaa1c0a37491b92964a127cf3cd4f8b71f/userdata/shm DeviceMajor:0 DeviceMinor:301 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/71ba48ea25a9442b7ddaf6a81ce73c503ec65916615ad4703b66f198c2ddd8c0/userdata/shm DeviceMajor:0 DeviceMinor:119 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/1308fba1-a50d-48b3-b272-7bef44727b7f/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert DeviceMajor:0 DeviceMinor:124 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/78d2cd80-23b9-426d-a7ac-1daa27668a47/volumes/kubernetes.io~projected/kube-api-access-mxctn DeviceMajor:0 DeviceMinor:230 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/61fb4b86-f978-4ae1-80bc-18d2f386cbc2/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert DeviceMajor:0 DeviceMinor:236 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/13fac7b0-ce55-467d-9d0c-6a122d87cb3c/volumes/kubernetes.io~projected/kube-api-access-wh2bv DeviceMajor:0 DeviceMinor:262 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-297 DeviceMajor:0 DeviceMinor:297 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9984727ecfd751906a7163096c1bf7f2f5c369a40b443c586dd90122b84df23e/userdata/shm DeviceMajor:0 DeviceMinor:54 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-84 DeviceMajor:0 
DeviceMinor:84 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-152 DeviceMajor:0 DeviceMinor:152 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad/volumes/kubernetes.io~projected/kube-api-access-29w76 DeviceMajor:0 DeviceMinor:219 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71/volumes/kubernetes.io~projected/kube-api-access-bz7v9 DeviceMajor:0 DeviceMinor:220 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/22587300-2448-4862-9fd8-68197d17a9f2/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:229 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-103 DeviceMajor:0 DeviceMinor:103 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:217 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:26154ef4eac655e MacAddress:42:9b:05:82:7b:ca Speed:10000 Mtu:8900} {Name:2caee3a1af6093e MacAddress:16:86:a1:25:11:70 Speed:10000 Mtu:8900} {Name:4a78398e61786f9 MacAddress:8a:20:ae:fe:3e:31 Speed:10000 Mtu:8900} {Name:6c434288b6cb191 MacAddress:a2:18:27:87:e3:af Speed:10000 Mtu:8900} {Name:6cd94cfc20909e4 MacAddress:4a:8f:26:15:21:a5 Speed:10000 Mtu:8900} {Name:7d309f2fa26be03 MacAddress:9e:10:0d:8c:a4:95 Speed:10000 Mtu:8900} {Name:7d8cfecb961af50 
MacAddress:0e:8a:4a:5d:43:c8 Speed:10000 Mtu:8900} {Name:8f2613fc06a65ee MacAddress:be:16:84:bb:f4:93 Speed:10000 Mtu:8900} {Name:ab94900114a9122 MacAddress:56:61:fc:c4:3e:2c Speed:10000 Mtu:8900} {Name:b53f7152f43c94f MacAddress:da:c1:9e:10:56:38 Speed:10000 Mtu:8900} {Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:br-int MacAddress:ee:3d:4b:66:b5:5f Speed:0 Mtu:8900} {Name:cf319252fb389a3 MacAddress:a6:f9:b8:c0:f0:85 Speed:10000 Mtu:8900} {Name:d50b6a32815a2be MacAddress:32:7b:ff:ce:42:98 Speed:10000 Mtu:8900} {Name:e4f5650e90a0b9c MacAddress:d6:bb:53:5c:4d:76 Speed:10000 Mtu:8900} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:7e:ba:68 Speed:-1 Mtu:9000} {Name:eth2 MacAddress:fa:16:3e:e0:00:01 Speed:-1 Mtu:9000} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:80:00:02 Speed:0 Mtu:8900} {Name:ovs-system MacAddress:26:df:a4:ac:f6:fc Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] 
Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 13 01:13:18.843708 master-0 kubenswrapper[7110]: I0313 
01:13:18.840330 7110 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 13 01:13:18.843708 master-0 kubenswrapper[7110]: I0313 01:13:18.840548 7110 manager.go:233] Version: {KernelVersion:5.14.0-427.111.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202602172219-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 13 01:13:18.843708 master-0 kubenswrapper[7110]: I0313 01:13:18.840886 7110 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 13 01:13:18.843708 master-0 kubenswrapper[7110]: I0313 01:13:18.841096 7110 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 13 01:13:18.843708 master-0 kubenswrapper[7110]: I0313 01:13:18.841134 7110 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi
","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 13 01:13:18.843708 master-0 kubenswrapper[7110]: I0313 01:13:18.841380 7110 topology_manager.go:138] "Creating topology manager with none policy" Mar 13 01:13:18.843708 master-0 kubenswrapper[7110]: I0313 01:13:18.841393 7110 container_manager_linux.go:303] "Creating device plugin manager" Mar 13 01:13:18.843708 master-0 kubenswrapper[7110]: I0313 01:13:18.841404 7110 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 13 01:13:18.843708 master-0 kubenswrapper[7110]: I0313 01:13:18.841432 7110 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 13 01:13:18.843708 master-0 kubenswrapper[7110]: I0313 01:13:18.841585 7110 state_mem.go:36] "Initialized new in-memory state store" Mar 13 01:13:18.843708 master-0 kubenswrapper[7110]: I0313 01:13:18.841696 7110 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 13 01:13:18.843708 master-0 kubenswrapper[7110]: I0313 01:13:18.841774 7110 kubelet.go:418] "Attempting to sync node with API server" Mar 13 01:13:18.843708 master-0 kubenswrapper[7110]: I0313 01:13:18.841789 7110 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 13 01:13:18.843708 master-0 kubenswrapper[7110]: I0313 01:13:18.841805 7110 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 13 
01:13:18.843708 master-0 kubenswrapper[7110]: I0313 01:13:18.841821 7110 kubelet.go:324] "Adding apiserver pod source" Mar 13 01:13:18.843708 master-0 kubenswrapper[7110]: I0313 01:13:18.841834 7110 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 13 01:13:18.848905 master-0 kubenswrapper[7110]: I0313 01:13:18.848787 7110 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-8.rhaos4.18.gitd78977c.el9" apiVersion="v1" Mar 13 01:13:18.849447 master-0 kubenswrapper[7110]: I0313 01:13:18.848999 7110 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Mar 13 01:13:18.849717 master-0 kubenswrapper[7110]: I0313 01:13:18.849615 7110 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 13 01:13:18.850680 master-0 kubenswrapper[7110]: I0313 01:13:18.850465 7110 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 13 01:13:18.850979 master-0 kubenswrapper[7110]: I0313 01:13:18.850714 7110 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 13 01:13:18.850979 master-0 kubenswrapper[7110]: I0313 01:13:18.850895 7110 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 13 01:13:18.850979 master-0 kubenswrapper[7110]: I0313 01:13:18.850905 7110 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 13 01:13:18.850979 master-0 kubenswrapper[7110]: I0313 01:13:18.850918 7110 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 13 01:13:18.850979 master-0 kubenswrapper[7110]: I0313 01:13:18.850926 7110 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 13 01:13:18.850979 master-0 kubenswrapper[7110]: I0313 01:13:18.850938 7110 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 13 01:13:18.850979 master-0 
kubenswrapper[7110]: I0313 01:13:18.850946 7110 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 13 01:13:18.850979 master-0 kubenswrapper[7110]: I0313 01:13:18.850959 7110 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 13 01:13:18.851330 master-0 kubenswrapper[7110]: I0313 01:13:18.851021 7110 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 13 01:13:18.851330 master-0 kubenswrapper[7110]: I0313 01:13:18.851045 7110 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 13 01:13:18.851330 master-0 kubenswrapper[7110]: I0313 01:13:18.851058 7110 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 13 01:13:18.851330 master-0 kubenswrapper[7110]: I0313 01:13:18.851080 7110 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 13 01:13:18.851330 master-0 kubenswrapper[7110]: I0313 01:13:18.851115 7110 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 13 01:13:18.851330 master-0 kubenswrapper[7110]: I0313 01:13:18.851152 7110 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 13 01:13:18.852979 master-0 kubenswrapper[7110]: I0313 01:13:18.852353 7110 server.go:1280] "Started kubelet" Mar 13 01:13:18.852979 master-0 kubenswrapper[7110]: I0313 01:13:18.852516 7110 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 13 01:13:18.852979 master-0 kubenswrapper[7110]: I0313 01:13:18.852812 7110 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 13 01:13:18.852979 master-0 kubenswrapper[7110]: I0313 01:13:18.852984 7110 server_v1.go:47] "podresources" method="list" useActivePods=true Mar 13 01:13:18.853577 master-0 systemd[1]: Started Kubernetes Kubelet. 
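The `container_manager_linux.go` entries above serialize the entire node config as one JSON blob, including the `HardEvictionThresholds` list (`memory.available < 100Mi`, `nodefs.available < 10%`, and so on). When triaging eviction behavior it can help to pull that blob out of the journal programmatically. A minimal sketch, assuming only the `nodeConfig={...}` key/value layout visible in the entry above; the sample line is a trimmed, hypothetical reconstruction, not a verbatim log line:

```python
import json
import re

def eviction_thresholds(journal_line: str):
    """Extract HardEvictionThresholds from a kubelet nodeConfig log entry.

    Assumes the entry embeds `nodeConfig={...}` as valid JSON at the end of
    the line, as in the "Creating Container Manager object based on Node
    Config" message above.
    """
    m = re.search(r'nodeConfig=(\{.*\})', journal_line)
    if not m:
        return []
    cfg = json.loads(m.group(1))
    # Prefer the absolute Quantity when Percentage is 0/absent.
    return [
        (t["Signal"], t["Operator"],
         t["Value"].get("Percentage") or t["Value"].get("Quantity"))
        for t in cfg.get("HardEvictionThresholds", [])
    ]

# Hypothetical, trimmed sample modeled on the entry above:
line = ('kubelet: "Creating Container Manager object based on Node Config" '
        'nodeConfig={"NodeName":"master-0","HardEvictionThresholds":'
        '[{"Signal":"memory.available","Operator":"LessThan",'
        '"Value":{"Quantity":"100Mi","Percentage":0},'
        '"GracePeriod":0,"MinReclaim":null}]}')
print(eviction_thresholds(line))
# [('memory.available', 'LessThan', '100Mi')]
```

The same pattern applies to the other embedded structs in these entries (`SystemReserved`, `PodPidsLimit`), since the kubelet logs them as JSON-compatible key/value payloads.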
Mar 13 01:13:18.855903 master-0 kubenswrapper[7110]: I0313 01:13:18.855800 7110 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 13 01:13:18.857683 master-0 kubenswrapper[7110]: I0313 01:13:18.856846 7110 server.go:449] "Adding debug handlers to kubelet server" Mar 13 01:13:18.860512 master-0 kubenswrapper[7110]: I0313 01:13:18.858801 7110 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 13 01:13:18.860512 master-0 kubenswrapper[7110]: I0313 01:13:18.858844 7110 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 13 01:13:18.860512 master-0 kubenswrapper[7110]: I0313 01:13:18.859046 7110 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-14 01:03:12 +0000 UTC, rotation deadline is 2026-03-13 19:11:52.737870649 +0000 UTC Mar 13 01:13:18.860512 master-0 kubenswrapper[7110]: I0313 01:13:18.859095 7110 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 17h58m33.878779477s for next certificate rotation Mar 13 01:13:18.860512 master-0 kubenswrapper[7110]: I0313 01:13:18.860298 7110 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 13 01:13:18.860512 master-0 kubenswrapper[7110]: I0313 01:13:18.860313 7110 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 13 01:13:18.860512 master-0 kubenswrapper[7110]: I0313 01:13:18.860391 7110 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Mar 13 01:13:18.861018 master-0 kubenswrapper[7110]: I0313 01:13:18.860952 7110 factory.go:55] Registering systemd factory Mar 13 01:13:18.861018 master-0 kubenswrapper[7110]: I0313 01:13:18.860969 7110 factory.go:221] Registration of the systemd container factory successfully Mar 13 01:13:18.862754 master-0 kubenswrapper[7110]: I0313 01:13:18.861352 7110 factory.go:153] Registering CRI-O factory Mar 13 01:13:18.862754 master-0 kubenswrapper[7110]: 
I0313 01:13:18.861386 7110 factory.go:221] Registration of the crio container factory successfully Mar 13 01:13:18.862754 master-0 kubenswrapper[7110]: I0313 01:13:18.861486 7110 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 13 01:13:18.862754 master-0 kubenswrapper[7110]: I0313 01:13:18.861520 7110 factory.go:103] Registering Raw factory Mar 13 01:13:18.862754 master-0 kubenswrapper[7110]: I0313 01:13:18.861545 7110 manager.go:1196] Started watching for new ooms in manager Mar 13 01:13:18.862754 master-0 kubenswrapper[7110]: I0313 01:13:18.862375 7110 manager.go:319] Starting recovery of all containers Mar 13 01:13:18.864184 master-0 kubenswrapper[7110]: I0313 01:13:18.863353 7110 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.865750 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4c5174b9-ca9e-4917-ab3a-ca403ce4f017" volumeName="kubernetes.io/projected/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-kube-api-access-mmvs5" seLinuxMountContext="" Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.865793 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="61fb4b86-f978-4ae1-80bc-18d2f386cbc2" volumeName="kubernetes.io/secret/61fb4b86-f978-4ae1-80bc-18d2f386cbc2-cluster-olm-operator-serving-cert" seLinuxMountContext="" Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.865803 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="78d2cd80-23b9-426d-a7ac-1daa27668a47" volumeName="kubernetes.io/configmap/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-trusted-ca" 
seLinuxMountContext="" Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.865812 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4738c93d-62e6-44ce-a289-e646b9302e71" volumeName="kubernetes.io/configmap/4738c93d-62e6-44ce-a289-e646b9302e71-cni-sysctl-allowlist" seLinuxMountContext="" Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.865823 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="486c7e33-3dd8-4a98-87e3-8216ee2e05ef" volumeName="kubernetes.io/projected/486c7e33-3dd8-4a98-87e3-8216ee2e05ef-kube-api-access-74tvv" seLinuxMountContext="" Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.865832 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="46662e51-44af-4732-83a1-9509a579b373" volumeName="kubernetes.io/configmap/46662e51-44af-4732-83a1-9509a579b373-iptables-alerter-script" seLinuxMountContext="" Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.865841 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4738c93d-62e6-44ce-a289-e646b9302e71" volumeName="kubernetes.io/projected/4738c93d-62e6-44ce-a289-e646b9302e71-kube-api-access-9gt69" seLinuxMountContext="" Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.865852 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4976e608-07a0-4cef-8fdd-7cec3324b4b5" volumeName="kubernetes.io/configmap/4976e608-07a0-4cef-8fdd-7cec3324b4b5-auth-proxy-config" seLinuxMountContext="" Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.865867 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4edb3e1a-9082-4fc2-ae6f-99d49c078a34" 
volumeName="kubernetes.io/configmap/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-ovnkube-script-lib" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.865876 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="95d4e785-6663-417d-b380-6905773613c8" volumeName="kubernetes.io/projected/95d4e785-6663-417d-b380-6905773613c8-kube-api-access-nhcll" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.865888 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f9b713fb-64ce-4a01-951c-1f31df62e1ae" volumeName="kubernetes.io/projected/f9b713fb-64ce-4a01-951c-1f31df62e1ae-kube-api-access-4chtg" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.865897 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="039acb44-a9b3-4ad6-a091-be4d18edc34f" volumeName="kubernetes.io/secret/039acb44-a9b3-4ad6-a091-be4d18edc34f-serving-cert" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.865907 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1308fba1-a50d-48b3-b272-7bef44727b7f" volumeName="kubernetes.io/configmap/1308fba1-a50d-48b3-b272-7bef44727b7f-ovnkube-config" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.865916 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="70c097a1-90d9-4344-b0ae-5a59ec2ad8ad" volumeName="kubernetes.io/projected/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-kube-api-access-29w76" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.865928 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="78d2cd80-23b9-426d-a7ac-1daa27668a47" volumeName="kubernetes.io/projected/78d2cd80-23b9-426d-a7ac-1daa27668a47-kube-api-access-mxctn" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.865936 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="58035e42-37d8-48f6-9861-9b4ce6014119" volumeName="kubernetes.io/projected/58035e42-37d8-48f6-9861-9b4ce6014119-kube-api-access" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.865943 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6c88187c-d011-4043-a6d3-4a8a7ec4e204" volumeName="kubernetes.io/projected/6c88187c-d011-4043-a6d3-4a8a7ec4e204-kube-api-access-7mnf5" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.865951 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e68ab3cb-c372-45d9-a758-beaf4c213714" volumeName="kubernetes.io/projected/e68ab3cb-c372-45d9-a758-beaf4c213714-kube-api-access-zjhjj" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.865959 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f9b713fb-64ce-4a01-951c-1f31df62e1ae" volumeName="kubernetes.io/configmap/f9b713fb-64ce-4a01-951c-1f31df62e1ae-trusted-ca-bundle" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.865972 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="486c7e33-3dd8-4a98-87e3-8216ee2e05ef" volumeName="kubernetes.io/configmap/486c7e33-3dd8-4a98-87e3-8216ee2e05ef-config" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.865979 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="916d9fc9-388b-4506-a17c-36a7f626356a" volumeName="kubernetes.io/configmap/916d9fc9-388b-4506-a17c-36a7f626356a-config" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.865992 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f9b713fb-64ce-4a01-951c-1f31df62e1ae" volumeName="kubernetes.io/configmap/f9b713fb-64ce-4a01-951c-1f31df62e1ae-config" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866001 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f9b713fb-64ce-4a01-951c-1f31df62e1ae" volumeName="kubernetes.io/configmap/f9b713fb-64ce-4a01-951c-1f31df62e1ae-service-ca-bundle" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866013 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="21cbea73-f779-43e4-b5ba-d6fa06275d34" volumeName="kubernetes.io/projected/21cbea73-f779-43e4-b5ba-d6fa06275d34-kube-api-access-wg54c" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866049 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7f35cc1e-3376-4dbd-b215-2a32bf62cc71" volumeName="kubernetes.io/projected/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-kube-api-access-5zd92" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866059 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4edb3e1a-9082-4fc2-ae6f-99d49c078a34" volumeName="kubernetes.io/secret/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-ovn-node-metrics-cert" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866070 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="61fb4b86-f978-4ae1-80bc-18d2f386cbc2" volumeName="kubernetes.io/projected/61fb4b86-f978-4ae1-80bc-18d2f386cbc2-kube-api-access-lrf2s" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866080 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="95d4e785-6663-417d-b380-6905773613c8" volumeName="kubernetes.io/secret/95d4e785-6663-417d-b380-6905773613c8-serving-cert" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866094 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35" volumeName="kubernetes.io/secret/b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35-serving-cert" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866107 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bfc49699-9428-4bff-804d-da0e60551759" volumeName="kubernetes.io/projected/bfc49699-9428-4bff-804d-da0e60551759-kube-api-access-zdzjn" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866117 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f97819d0-2840-4352-a435-19ef1a8c22c9" volumeName="kubernetes.io/projected/f97819d0-2840-4352-a435-19ef1a8c22c9-kube-api-access-fzklz" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866127 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2937cbe2-3125-4c3f-96f8-2febeb5942cc" volumeName="kubernetes.io/configmap/2937cbe2-3125-4c3f-96f8-2febeb5942cc-multus-daemon-config" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866137 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4edb3e1a-9082-4fc2-ae6f-99d49c078a34" volumeName="kubernetes.io/projected/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-kube-api-access-6tfdv" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866146 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4edb3e1a-9082-4fc2-ae6f-99d49c078a34" volumeName="kubernetes.io/configmap/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-ovnkube-config" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866155 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bfc49699-9428-4bff-804d-da0e60551759" volumeName="kubernetes.io/secret/bfc49699-9428-4bff-804d-da0e60551759-metrics-tls" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866164 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f97819d0-2840-4352-a435-19ef1a8c22c9" volumeName="kubernetes.io/configmap/f97819d0-2840-4352-a435-19ef1a8c22c9-trusted-ca" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866175 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22587300-2448-4862-9fd8-68197d17a9f2" volumeName="kubernetes.io/configmap/22587300-2448-4862-9fd8-68197d17a9f2-config" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866187 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4976e608-07a0-4cef-8fdd-7cec3324b4b5" volumeName="kubernetes.io/configmap/4976e608-07a0-4cef-8fdd-7cec3324b4b5-images" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866196 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71" volumeName="kubernetes.io/projected/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-kube-api-access-bz7v9" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866205 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8c6bf2d5-1881-4b63-b247-7e7426707fa1" volumeName="kubernetes.io/configmap/8c6bf2d5-1881-4b63-b247-7e7426707fa1-config" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866214 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="916d9fc9-388b-4506-a17c-36a7f626356a" volumeName="kubernetes.io/projected/916d9fc9-388b-4506-a17c-36a7f626356a-kube-api-access-jg7x6" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866223 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="95d4e785-6663-417d-b380-6905773613c8" volumeName="kubernetes.io/configmap/95d4e785-6663-417d-b380-6905773613c8-config" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866235 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bbf0bd4d-3387-43c3-b9d5-61a044fa2138" volumeName="kubernetes.io/projected/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-kube-api-access" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866243 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="21cbea73-f779-43e4-b5ba-d6fa06275d34" volumeName="kubernetes.io/secret/21cbea73-f779-43e4-b5ba-d6fa06275d34-serving-cert" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866278 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="21cbea73-f779-43e4-b5ba-d6fa06275d34" volumeName="kubernetes.io/secret/21cbea73-f779-43e4-b5ba-d6fa06275d34-etcd-client" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866286 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="61fb4b86-f978-4ae1-80bc-18d2f386cbc2" volumeName="kubernetes.io/empty-dir/61fb4b86-f978-4ae1-80bc-18d2f386cbc2-operand-assets" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866294 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22587300-2448-4862-9fd8-68197d17a9f2" volumeName="kubernetes.io/projected/22587300-2448-4862-9fd8-68197d17a9f2-kube-api-access" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866307 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22587300-2448-4862-9fd8-68197d17a9f2" volumeName="kubernetes.io/secret/22587300-2448-4862-9fd8-68197d17a9f2-serving-cert" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866315 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4edb3e1a-9082-4fc2-ae6f-99d49c078a34" volumeName="kubernetes.io/configmap/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-env-overrides" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866345 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="58035e42-37d8-48f6-9861-9b4ce6014119" volumeName="kubernetes.io/configmap/58035e42-37d8-48f6-9861-9b4ce6014119-config" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866354 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="58035e42-37d8-48f6-9861-9b4ce6014119" volumeName="kubernetes.io/secret/58035e42-37d8-48f6-9861-9b4ce6014119-serving-cert" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866363 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35" volumeName="kubernetes.io/configmap/b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35-config" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866416 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1308fba1-a50d-48b3-b272-7bef44727b7f" volumeName="kubernetes.io/configmap/1308fba1-a50d-48b3-b272-7bef44727b7f-env-overrides" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866452 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="21cbea73-f779-43e4-b5ba-d6fa06275d34" volumeName="kubernetes.io/configmap/21cbea73-f779-43e4-b5ba-d6fa06275d34-etcd-ca" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866461 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f97819d0-2840-4352-a435-19ef1a8c22c9" volumeName="kubernetes.io/projected/f97819d0-2840-4352-a435-19ef1a8c22c9-bound-sa-token" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866471 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ca2fa86b-a966-49dc-8577-d2b54b111d14" volumeName="kubernetes.io/secret/ca2fa86b-a966-49dc-8577-d2b54b111d14-cluster-storage-operator-serving-cert" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866517 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d56480e0-0885-41e5-a1fc-931a068fbadb" volumeName="kubernetes.io/secret/d56480e0-0885-41e5-a1fc-931a068fbadb-serving-cert" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866539 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2937cbe2-3125-4c3f-96f8-2febeb5942cc" volumeName="kubernetes.io/projected/2937cbe2-3125-4c3f-96f8-2febeb5942cc-kube-api-access-spxfj" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866547 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2bd94289-7109-4419-9a51-bd289082b9f5" volumeName="kubernetes.io/projected/2bd94289-7109-4419-9a51-bd289082b9f5-kube-api-access-fx5mx" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866555 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4c5174b9-ca9e-4917-ab3a-ca403ce4f017" volumeName="kubernetes.io/configmap/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-trusted-ca" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866611 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="70c097a1-90d9-4344-b0ae-5a59ec2ad8ad" volumeName="kubernetes.io/projected/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-bound-sa-token" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866620 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="916d9fc9-388b-4506-a17c-36a7f626356a" volumeName="kubernetes.io/secret/916d9fc9-388b-4506-a17c-36a7f626356a-serving-cert" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866731 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35" volumeName="kubernetes.io/projected/b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35-kube-api-access" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866743 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1308fba1-a50d-48b3-b272-7bef44727b7f" volumeName="kubernetes.io/projected/1308fba1-a50d-48b3-b272-7bef44727b7f-kube-api-access-zv2rb" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866751 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="21cbea73-f779-43e4-b5ba-d6fa06275d34" volumeName="kubernetes.io/configmap/21cbea73-f779-43e4-b5ba-d6fa06275d34-config" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866766 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d1153bb3-30dd-458f-b0a4-c05358a8b3f8" volumeName="kubernetes.io/configmap/d1153bb3-30dd-458f-b0a4-c05358a8b3f8-ovnkube-identity-cm" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866774 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2937cbe2-3125-4c3f-96f8-2febeb5942cc" volumeName="kubernetes.io/configmap/2937cbe2-3125-4c3f-96f8-2febeb5942cc-cni-binary-copy" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866783 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4738c93d-62e6-44ce-a289-e646b9302e71" volumeName="kubernetes.io/configmap/4738c93d-62e6-44ce-a289-e646b9302e71-cni-binary-copy" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866812 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="486c7e33-3dd8-4a98-87e3-8216ee2e05ef" volumeName="kubernetes.io/secret/486c7e33-3dd8-4a98-87e3-8216ee2e05ef-serving-cert" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866820 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4976e608-07a0-4cef-8fdd-7cec3324b4b5" volumeName="kubernetes.io/projected/4976e608-07a0-4cef-8fdd-7cec3324b4b5-kube-api-access-w2m48" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866828 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71" volumeName="kubernetes.io/configmap/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-telemetry-config" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866837 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bbf0bd4d-3387-43c3-b9d5-61a044fa2138" volumeName="kubernetes.io/configmap/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-service-ca" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866850 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="039acb44-a9b3-4ad6-a091-be4d18edc34f" volumeName="kubernetes.io/configmap/039acb44-a9b3-4ad6-a091-be4d18edc34f-config" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866914 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0d4e6150-432c-4a11-b5a6-4d62dd701fc8" volumeName="kubernetes.io/projected/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-kube-api-access-gn6w7" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866925 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d56480e0-0885-41e5-a1fc-931a068fbadb" volumeName="kubernetes.io/empty-dir/d56480e0-0885-41e5-a1fc-931a068fbadb-available-featuregates" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866934 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ca2fa86b-a966-49dc-8577-d2b54b111d14" volumeName="kubernetes.io/projected/ca2fa86b-a966-49dc-8577-d2b54b111d14-kube-api-access-gwm5w" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866957 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d1153bb3-30dd-458f-b0a4-c05358a8b3f8" volumeName="kubernetes.io/secret/d1153bb3-30dd-458f-b0a4-c05358a8b3f8-webhook-cert" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866966 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ae44526f-5858-42a0-ba77-3a22f171456f" volumeName="kubernetes.io/projected/ae44526f-5858-42a0-ba77-3a22f171456f-kube-api-access-mz8jz" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866976 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d1153bb3-30dd-458f-b0a4-c05358a8b3f8" volumeName="kubernetes.io/configmap/d1153bb3-30dd-458f-b0a4-c05358a8b3f8-env-overrides" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866984 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="039acb44-a9b3-4ad6-a091-be4d18edc34f" volumeName="kubernetes.io/projected/039acb44-a9b3-4ad6-a091-be4d18edc34f-kube-api-access-kkdbm" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.866997 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4738c93d-62e6-44ce-a289-e646b9302e71" volumeName="kubernetes.io/configmap/4738c93d-62e6-44ce-a289-e646b9302e71-whereabouts-configmap" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.867027 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="46662e51-44af-4732-83a1-9509a579b373" volumeName="kubernetes.io/projected/46662e51-44af-4732-83a1-9509a579b373-kube-api-access-m5n7m" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.867039 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d56480e0-0885-41e5-a1fc-931a068fbadb" volumeName="kubernetes.io/projected/d56480e0-0885-41e5-a1fc-931a068fbadb-kube-api-access-fppkf" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.867049 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f9b713fb-64ce-4a01-951c-1f31df62e1ae" volumeName="kubernetes.io/secret/f9b713fb-64ce-4a01-951c-1f31df62e1ae-serving-cert" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.867079 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1308fba1-a50d-48b3-b272-7bef44727b7f" volumeName="kubernetes.io/secret/1308fba1-a50d-48b3-b272-7bef44727b7f-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.867089 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="21cbea73-f779-43e4-b5ba-d6fa06275d34" volumeName="kubernetes.io/configmap/21cbea73-f779-43e4-b5ba-d6fa06275d34-etcd-service-ca" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.867097 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8c6bf2d5-1881-4b63-b247-7e7426707fa1" volumeName="kubernetes.io/projected/8c6bf2d5-1881-4b63-b247-7e7426707fa1-kube-api-access-vxf58" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.867105 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8c6bf2d5-1881-4b63-b247-7e7426707fa1" volumeName="kubernetes.io/configmap/8c6bf2d5-1881-4b63-b247-7e7426707fa1-images" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.867140 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d1153bb3-30dd-458f-b0a4-c05358a8b3f8" volumeName="kubernetes.io/projected/d1153bb3-30dd-458f-b0a4-c05358a8b3f8-kube-api-access-srlst" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.867161 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="13fac7b0-ce55-467d-9d0c-6a122d87cb3c" volumeName="kubernetes.io/projected/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-kube-api-access-wh2bv" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.867170 7110 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="70c097a1-90d9-4344-b0ae-5a59ec2ad8ad" volumeName="kubernetes.io/configmap/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-trusted-ca" seLinuxMountContext=""
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.867181 7110 reconstruct.go:97] "Volume reconstruction finished"
Mar 13 01:13:18.867890 master-0 kubenswrapper[7110]: I0313 01:13:18.867190 7110 reconciler.go:26] "Reconciler: start to sync state"
Mar 13 01:13:18.877169 master-0 kubenswrapper[7110]: I0313 01:13:18.877123 7110 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Mar 13 01:13:18.903744 master-0 kubenswrapper[7110]: I0313 01:13:18.903679 7110 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 13 01:13:18.905598 master-0 kubenswrapper[7110]: I0313 01:13:18.905568 7110 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 13 01:13:18.905666 master-0 kubenswrapper[7110]: I0313 01:13:18.905623 7110 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 13 01:13:18.905723 master-0 kubenswrapper[7110]: I0313 01:13:18.905673 7110 kubelet.go:2335] "Starting kubelet main sync loop"
Mar 13 01:13:18.905858 master-0 kubenswrapper[7110]: E0313 01:13:18.905760 7110 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 13 01:13:18.907508 master-0 kubenswrapper[7110]: I0313 01:13:18.907476 7110 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 13 01:13:18.914997 master-0 kubenswrapper[7110]: I0313 01:13:18.914845 7110 generic.go:334] "Generic (PLEG): container finished" podID="61fb4b86-f978-4ae1-80bc-18d2f386cbc2" containerID="e25bc60853f66a5d6c7e1021efdd8403103d53c529624ce5e308b8d3dfb44aaf" exitCode=0
Mar 13 01:13:18.927328 master-0 kubenswrapper[7110]: I0313 01:13:18.927194 7110 generic.go:334] "Generic (PLEG): container finished" podID="abf2ead5-b97d-4160-8120-28cb8a3d843e" containerID="7c07bc771c953fa9d34f82960a8b9fd12b63e9a86c930f999ffe77b37e0a74ef" exitCode=0
Mar 13 01:13:18.935481 master-0 kubenswrapper[7110]: I0313 01:13:18.935440 7110 generic.go:334] "Generic (PLEG): container finished" podID="4edb3e1a-9082-4fc2-ae6f-99d49c078a34" containerID="d00fb05f88d59786ab92f821f00f790d94c0eeac3280854affdf40137d7e87d0" exitCode=0
Mar 13 01:13:18.938446 master-0 kubenswrapper[7110]: I0313 01:13:18.938251 7110 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="3b9e2f7b212305e97c078f4756a44dc73d31954cbfa09820c5689a8f4a927568" exitCode=1
Mar 13 01:13:18.941083 master-0 kubenswrapper[7110]: I0313 01:13:18.941049 7110 generic.go:334] "Generic (PLEG): container finished" podID="5f77c8e18b751d90bc0dfe2d4e304050" containerID="7af43aa64c145f54e06604757091c85cae4a917cad1277500eafd26947523186" exitCode=0
Mar 13 01:13:18.944542 master-0 kubenswrapper[7110]: I0313 01:13:18.944505 7110 generic.go:334] "Generic (PLEG): container finished" podID="d0725849-af6c-4399-9beb-8df68d80963f" containerID="2d1ba7ec4846defd3b04a175ca5a3b9796ffce1a2ede0d1ea47e737fb6974a90" exitCode=0
Mar 13 01:13:18.957351 master-0 kubenswrapper[7110]: I0313 01:13:18.957309 7110 generic.go:334] "Generic (PLEG): container finished" podID="4738c93d-62e6-44ce-a289-e646b9302e71" containerID="f64c75ed084248ad496cb98f6981ac7735f162ce7e7121ef5597b4e213d85ac5" exitCode=0
Mar 13 01:13:18.957596 master-0 kubenswrapper[7110]: I0313 01:13:18.957582 7110 generic.go:334] "Generic (PLEG): container finished" podID="4738c93d-62e6-44ce-a289-e646b9302e71" containerID="24a3795ab99401f37571431134fd1c761aa6f3ef1ba4c747faa0a5ee28b9f796" exitCode=0
Mar 13 01:13:18.957799 master-0 kubenswrapper[7110]: I0313 01:13:18.957786 7110 generic.go:334] "Generic (PLEG): container finished" podID="4738c93d-62e6-44ce-a289-e646b9302e71" containerID="ac0c969b95b64c22e84de07c2976566813a316f1d691a27df3a1f4621768e238" exitCode=0
Mar 13 01:13:18.957855 master-0 kubenswrapper[7110]: I0313 01:13:18.957845 7110 generic.go:334] "Generic (PLEG): container finished" podID="4738c93d-62e6-44ce-a289-e646b9302e71" containerID="569d90d03ceda29b5f6ff80b99725d90e6a4f9724473ba5d3146ac49efbbe232" exitCode=0
Mar 13 01:13:18.957914 master-0 kubenswrapper[7110]: I0313 01:13:18.957904 7110 generic.go:334] "Generic (PLEG): container finished" podID="4738c93d-62e6-44ce-a289-e646b9302e71" containerID="03b1433799f1c9507de93fbd689d37d0b962300c0b8274b036071bcf3cc09941" exitCode=0
Mar 13 01:13:18.957967 master-0 kubenswrapper[7110]: I0313 01:13:18.957957 7110 generic.go:334] "Generic (PLEG): container finished" podID="4738c93d-62e6-44ce-a289-e646b9302e71" containerID="f1844314bd4c14c44c294275c228ee201df2f8be5daa877db9d32b69fb506d82" exitCode=0
Mar 13 01:13:18.965858 master-0 kubenswrapper[7110]: I0313 01:13:18.965815 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/2.log"
Mar 13 01:13:18.968671 master-0 kubenswrapper[7110]: I0313 01:13:18.968605 7110 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="66d7d7db2b542e86a40857789246d4abcbc4fca4d74d92d6366f9e8e1aa401b5" exitCode=1
Mar 13 01:13:18.968755 master-0 kubenswrapper[7110]: I0313 01:13:18.968670 7110 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="ef509e1db76d33faa725783e475ec527d0db77ed62b28bc3717960d043c585e4" exitCode=0
Mar 13 01:13:18.998803 master-0 kubenswrapper[7110]: I0313 01:13:18.998759 7110 manager.go:324] Recovery completed
Mar 13 01:13:19.005932 master-0 kubenswrapper[7110]: E0313 01:13:19.005893 7110 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Mar 13 01:13:19.033469 master-0 kubenswrapper[7110]: I0313 01:13:19.033436 7110 cpu_manager.go:225] "Starting CPU manager" policy="none"
Mar 13 01:13:19.033469 master-0 kubenswrapper[7110]: I0313 01:13:19.033456 7110 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Mar 13 01:13:19.033469 master-0 kubenswrapper[7110]: I0313 01:13:19.033472 7110 state_mem.go:36] "Initialized new in-memory state store"
Mar 13 01:13:19.033848 master-0 kubenswrapper[7110]: I0313 01:13:19.033793 7110 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Mar 13 01:13:19.033848 master-0 kubenswrapper[7110]: I0313 01:13:19.033803 7110 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Mar 13 01:13:19.033848 master-0 kubenswrapper[7110]: I0313 01:13:19.033821 7110 state_checkpoint.go:136] "State checkpoint: restored state from checkpoint"
Mar 13 01:13:19.033848 master-0 kubenswrapper[7110]: I0313 01:13:19.033828 7110 state_checkpoint.go:137] "State checkpoint: defaultCPUSet" defaultCpuSet=""
Mar 13 01:13:19.033848 master-0 kubenswrapper[7110]: I0313 01:13:19.033834 7110 policy_none.go:49] "None policy: Start"
Mar 13 01:13:19.035395 master-0 kubenswrapper[7110]: I0313 01:13:19.035290 7110 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 13 01:13:19.035395 master-0 kubenswrapper[7110]: I0313 01:13:19.035397 7110 state_mem.go:35] "Initializing new in-memory state store"
Mar 13 01:13:19.035930 master-0 kubenswrapper[7110]: I0313 01:13:19.035801 7110 state_mem.go:75] "Updated machine memory state"
Mar 13 01:13:19.035930 master-0 kubenswrapper[7110]: I0313 01:13:19.035815 7110 state_checkpoint.go:82] "State checkpoint: restored state from checkpoint"
Mar 13 01:13:19.044090 master-0 kubenswrapper[7110]: I0313 01:13:19.043857 7110 manager.go:334] "Starting Device Plugin manager"
Mar 13 01:13:19.044090 master-0 kubenswrapper[7110]: I0313 01:13:19.043892 7110 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 13 01:13:19.044090 master-0 kubenswrapper[7110]: I0313 01:13:19.043904 7110 server.go:79] "Starting device plugin registration server"
Mar 13 01:13:19.044304 master-0 kubenswrapper[7110]: I0313 01:13:19.044199 7110 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 13 01:13:19.044304 master-0 kubenswrapper[7110]: I0313 01:13:19.044210 7110 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 13 01:13:19.044536 master-0 kubenswrapper[7110]: I0313 01:13:19.044510 7110 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 13 01:13:19.044690 master-0 kubenswrapper[7110]: I0313 01:13:19.044609 7110 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 13 01:13:19.044690 master-0 kubenswrapper[7110]: I0313 01:13:19.044673 7110 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 13 01:13:19.144821 master-0 kubenswrapper[7110]: I0313 01:13:19.144655 7110 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 01:13:19.146413 master-0 kubenswrapper[7110]: I0313 01:13:19.146386 7110 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 01:13:19.146463 master-0 kubenswrapper[7110]: I0313 01:13:19.146428 7110 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 01:13:19.146463 master-0 kubenswrapper[7110]: I0313 01:13:19.146441 7110 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 01:13:19.146537 master-0 kubenswrapper[7110]: I0313 01:13:19.146502 7110 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 13 01:13:19.156467 master-0 kubenswrapper[7110]: I0313 01:13:19.156406 7110 kubelet_node_status.go:115] "Node was previously registered" node="master-0"
Mar 13 01:13:19.156626 master-0 kubenswrapper[7110]: I0313 01:13:19.156496 7110 kubelet_node_status.go:79] "Successfully registered node" node="master-0"
Mar 13 01:13:19.206647 master-0 kubenswrapper[7110]: I0313 01:13:19.206546 7110 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0-master-0","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","kube-system/bootstrap-kube-controller-manager-master-0","kube-system/bootstrap-kube-scheduler-master-0"]
Mar 13 01:13:19.207419 master-0 kubenswrapper[7110]: I0313 01:13:19.206856 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"d17962d23b6425584e2488789adc4521ea90d5d79405cef63024d09b7df17fd9"}
Mar 13 01:13:19.207419 master-0 kubenswrapper[7110]: I0313 01:13:19.206916 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"331ce433a4bf4a47427394a7b370083a4bac521b9ca9a23033c6c4de736ca40b"}
Mar 13 01:13:19.207419 master-0 kubenswrapper[7110]: I0313 01:13:19.207130 7110 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f9a8a98efb949846e16cd93dedb9e2d6c0bd9dc4b4fccd2e67d0c286bfd9dcb"
Mar 13 01:13:19.207419 master-0 kubenswrapper[7110]: I0313 01:13:19.207176 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"68874048256ef5174e01b3364fffcb240e7ca45d30a09a97ed7d81f508369059"}
Mar 13 01:13:19.207419 master-0 kubenswrapper[7110]: I0313 01:13:19.207205 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"a9c29d8936371344a1ccbc38b9ff3375977741408746ec28a0f2c19b38ce561a"}
Mar 13 01:13:19.207419 master-0 kubenswrapper[7110]: I0313 01:13:19.207218 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"5ea6cbe1e49665391bd7e8ee21e44a11b44e5195668056f83fc89f2c9fdb94e3"}
Mar 13 01:13:19.207419 master-0 kubenswrapper[7110]: I0313 01:13:19.207248 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0"
event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"578253e69f6e09bf433efeaf51fe4123dcb489a1bcfe88e292f1fbad219b25ee"} Mar 13 01:13:19.207419 master-0 kubenswrapper[7110]: I0313 01:13:19.207261 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"be52d87237e2c88231046564bd2dfcdbd780faa45f3647245e1d0a9837eb7182"} Mar 13 01:13:19.207419 master-0 kubenswrapper[7110]: I0313 01:13:19.207273 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerDied","Data":"3b9e2f7b212305e97c078f4756a44dc73d31954cbfa09820c5689a8f4a927568"} Mar 13 01:13:19.207419 master-0 kubenswrapper[7110]: I0313 01:13:19.207289 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"89451c54a480759544b3c9bb8a4009d254601dcef79b111432b9e309f8ee32df"} Mar 13 01:13:19.207419 master-0 kubenswrapper[7110]: I0313 01:13:19.207301 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerStarted","Data":"2d7298385249f13020b6f4aba996517ed9ff1e2913b22038f141e521d9ed6c60"} Mar 13 01:13:19.207419 master-0 kubenswrapper[7110]: I0313 01:13:19.207316 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerStarted","Data":"0445b88e121211b2f5fee8d8974ce092dbbbfd137fd485f9bdd95d5e9a6bd172"} Mar 13 01:13:19.207419 master-0 kubenswrapper[7110]: I0313 01:13:19.207327 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerDied","Data":"7af43aa64c145f54e06604757091c85cae4a917cad1277500eafd26947523186"} Mar 13 01:13:19.207419 master-0 kubenswrapper[7110]: I0313 01:13:19.207341 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerStarted","Data":"cfc87a91992c83de0be19f6bf8022aaa5b7c2b27eb42c5753a91971d10b3622e"} Mar 13 01:13:19.207419 master-0 kubenswrapper[7110]: I0313 01:13:19.207356 7110 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38505749c8c0e6dcdac2e9baa8166fa2449cdebc9769720c9163d7ae1a7ea23b" Mar 13 01:13:19.207419 master-0 kubenswrapper[7110]: I0313 01:13:19.207399 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"c03513c285f73dedbe67dedcdcf65f9b9e4e6c146f0a64e7433f278ca1844469"} Mar 13 01:13:19.207419 master-0 kubenswrapper[7110]: I0313 01:13:19.207418 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"66d7d7db2b542e86a40857789246d4abcbc4fca4d74d92d6366f9e8e1aa401b5"} Mar 13 01:13:19.207419 master-0 kubenswrapper[7110]: I0313 01:13:19.207431 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"ef509e1db76d33faa725783e475ec527d0db77ed62b28bc3717960d043c585e4"} Mar 13 01:13:19.207419 master-0 kubenswrapper[7110]: I0313 01:13:19.207445 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"9984727ecfd751906a7163096c1bf7f2f5c369a40b443c586dd90122b84df23e"} Mar 13 01:13:19.221734 master-0 kubenswrapper[7110]: E0313 01:13:19.221673 7110 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-controller-manager-master-0\" already exists" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:13:19.221838 master-0 kubenswrapper[7110]: E0313 01:13:19.221686 7110 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-scheduler-master-0\" already exists" pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 13 01:13:19.221960 master-0 kubenswrapper[7110]: E0313 01:13:19.221831 7110 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-apiserver-master-0\" already exists" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 01:13:19.221960 master-0 kubenswrapper[7110]: W0313 01:13:19.221910 7110 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost") Mar 13 01:13:19.221960 master-0 kubenswrapper[7110]: E0313 01:13:19.221830 7110 kubelet.go:1929] "Failed creating a mirror 
pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 13 01:13:19.222060 master-0 kubenswrapper[7110]: E0313 01:13:19.221969 7110 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0-master-0\" already exists" pod="openshift-etcd/etcd-master-0-master-0" Mar 13 01:13:19.279464 master-0 kubenswrapper[7110]: I0313 01:13:19.279292 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 01:13:19.279464 master-0 kubenswrapper[7110]: I0313 01:13:19.279335 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 01:13:19.279464 master-0 kubenswrapper[7110]: I0313 01:13:19.279360 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:13:19.279464 master-0 kubenswrapper[7110]: I0313 01:13:19.279376 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: 
\"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:13:19.279464 master-0 kubenswrapper[7110]: I0313 01:13:19.279394 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 01:13:19.279464 master-0 kubenswrapper[7110]: I0313 01:13:19.279409 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 01:13:19.279464 master-0 kubenswrapper[7110]: I0313 01:13:19.279423 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:13:19.279464 master-0 kubenswrapper[7110]: I0313 01:13:19.279439 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 13 01:13:19.279753 master-0 kubenswrapper[7110]: I0313 01:13:19.279481 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 13 01:13:19.279753 master-0 kubenswrapper[7110]: I0313 01:13:19.279520 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 13 01:13:19.279753 master-0 kubenswrapper[7110]: I0313 01:13:19.279544 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 13 01:13:19.279753 master-0 kubenswrapper[7110]: I0313 01:13:19.279611 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 01:13:19.279753 master-0 kubenswrapper[7110]: I0313 01:13:19.279688 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 13 01:13:19.279753 master-0 kubenswrapper[7110]: I0313 01:13:19.279724 7110 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 01:13:19.279753 master-0 kubenswrapper[7110]: I0313 01:13:19.279750 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:13:19.280003 master-0 kubenswrapper[7110]: I0313 01:13:19.279770 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:13:19.280003 master-0 kubenswrapper[7110]: I0313 01:13:19.279793 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 13 01:13:19.379968 master-0 kubenswrapper[7110]: I0313 01:13:19.379929 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " 
pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 01:13:19.380098 master-0 kubenswrapper[7110]: I0313 01:13:19.379978 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 01:13:19.380098 master-0 kubenswrapper[7110]: I0313 01:13:19.380007 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:13:19.380098 master-0 kubenswrapper[7110]: I0313 01:13:19.380028 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:13:19.380194 master-0 kubenswrapper[7110]: I0313 01:13:19.380110 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 01:13:19.380299 master-0 kubenswrapper[7110]: I0313 01:13:19.380266 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: 
\"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 01:13:19.380333 master-0 kubenswrapper[7110]: I0313 01:13:19.380319 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 13 01:13:19.380370 master-0 kubenswrapper[7110]: I0313 01:13:19.380340 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 13 01:13:19.380370 master-0 kubenswrapper[7110]: I0313 01:13:19.380355 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 01:13:19.380370 master-0 kubenswrapper[7110]: I0313 01:13:19.380359 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:13:19.380446 master-0 kubenswrapper[7110]: I0313 01:13:19.380386 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: 
\"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:13:19.380446 master-0 kubenswrapper[7110]: I0313 01:13:19.380400 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 13 01:13:19.380446 master-0 kubenswrapper[7110]: I0313 01:13:19.380408 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 01:13:19.380446 master-0 kubenswrapper[7110]: I0313 01:13:19.380436 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 01:13:19.380547 master-0 kubenswrapper[7110]: I0313 01:13:19.380425 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 01:13:19.380547 master-0 kubenswrapper[7110]: I0313 01:13:19.380411 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 13 01:13:19.380547 master-0 kubenswrapper[7110]: I0313 01:13:19.380477 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:13:19.380547 master-0 kubenswrapper[7110]: I0313 01:13:19.380514 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:13:19.380547 master-0 kubenswrapper[7110]: I0313 01:13:19.380528 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:13:19.380692 master-0 kubenswrapper[7110]: I0313 01:13:19.380555 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:13:19.380692 master-0 kubenswrapper[7110]: I0313 01:13:19.380575 7110 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 13 01:13:19.380692 master-0 kubenswrapper[7110]: I0313 01:13:19.380591 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 13 01:13:19.380692 master-0 kubenswrapper[7110]: I0313 01:13:19.380594 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:13:19.380692 master-0 kubenswrapper[7110]: I0313 01:13:19.380606 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 13 01:13:19.380692 master-0 kubenswrapper[7110]: I0313 01:13:19.380625 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 13 01:13:19.380692 master-0 kubenswrapper[7110]: I0313 01:13:19.380668 7110 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 13 01:13:19.380692 master-0 kubenswrapper[7110]: I0313 01:13:19.380675 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 13 01:13:19.380910 master-0 kubenswrapper[7110]: I0313 01:13:19.380706 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:13:19.380910 master-0 kubenswrapper[7110]: I0313 01:13:19.380787 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 13 01:13:19.380910 master-0 kubenswrapper[7110]: I0313 01:13:19.380840 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 01:13:19.380910 master-0 kubenswrapper[7110]: I0313 01:13:19.380863 7110 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 01:13:19.380910 master-0 kubenswrapper[7110]: I0313 01:13:19.380865 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 13 01:13:19.380910 master-0 kubenswrapper[7110]: I0313 01:13:19.380891 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 01:13:19.381059 master-0 kubenswrapper[7110]: I0313 01:13:19.380921 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 01:13:19.843119 master-0 kubenswrapper[7110]: I0313 01:13:19.843064 7110 apiserver.go:52] "Watching apiserver" Mar 13 01:13:19.855922 master-0 kubenswrapper[7110]: I0313 01:13:19.855887 7110 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 13 01:13:19.856990 master-0 kubenswrapper[7110]: I0313 01:13:19.856944 7110 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-multus/multus-rvt5h","openshift-ovn-kubernetes/ovnkube-node-v56ct","openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-c2xl8","openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr","openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8","openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-sslxh","openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-wbmqn","kube-system/bootstrap-kube-controller-manager-master-0","openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2","openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-7nstm","openshift-network-operator/iptables-alerter-qclwv","openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz","openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9","openshift-etcd/etcd-master-0-master-0","openshift-network-diagnostics/network-check-target-xs8pt","openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-v9pv6","openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq","openshift-service-ca-operator/service-ca-operator-69b6fc6b88-2v42g","openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-mvmt2","openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t","kube-system/bootstrap-kube-scheduler-master-0","openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-ck7rt","openshift-ingress-operator/ingress-operator-677db989d6-kdn2l","openshift-multus/network-metrics-daemon-zh5fh","openshift-network-operator/network-operator-7c649bf6d4-bdc4j","openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd","assisted-installer/assisted-installer-controller-qpxft","openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt","openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8","openshift-multus/multus-admission-controller-8d675b596-tq7n6","openshift-network-node-identity/network-node-identity-znqwc","openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mlslx","openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-v9nfg","openshift-marketplace/marketplace-operator-64bf9778cb-dszg5","openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj","openshift-dns-operator/dns-operator-589895fbb7-qvl2k","openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj","openshift-multus/multus-additional-cni-plugins-xn5t5"]
Mar 13 01:13:19.857378 master-0 kubenswrapper[7110]: I0313 01:13:19.857340 7110 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-qpxft"
Mar 13 01:13:19.857418 master-0 kubenswrapper[7110]: I0313 01:13:19.857376 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8"
Mar 13 01:13:19.858008 master-0 kubenswrapper[7110]: I0313 01:13:19.857988 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt"
Mar 13 01:13:19.858308 master-0 kubenswrapper[7110]: I0313 01:13:19.858264 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l"
Mar 13 01:13:19.859691 master-0 kubenswrapper[7110]: I0313 01:13:19.859670 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5"
Mar 13 01:13:19.859871 master-0 kubenswrapper[7110]: I0313 01:13:19.859853 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj"
Mar 13 01:13:19.859958 master-0 kubenswrapper[7110]: I0313 01:13:19.859940 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t"
Mar 13 01:13:19.860171 master-0 kubenswrapper[7110]: I0313 01:13:19.860145 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz"
Mar 13 01:13:19.860451 master-0 kubenswrapper[7110]: I0313 01:13:19.860399 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8"
Mar 13 01:13:19.860625 master-0 kubenswrapper[7110]: I0313 01:13:19.860587 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-589895fbb7-qvl2k"
Mar 13 01:13:19.860801 master-0 kubenswrapper[7110]: I0313 01:13:19.860767 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr"
Mar 13 01:13:19.861875 master-0 kubenswrapper[7110]: I0313 01:13:19.861840 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq"
Mar 13 01:13:19.862371 master-0 kubenswrapper[7110]: I0313 01:13:19.862345 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh5fh"
Mar 13 01:13:19.862494 master-0 kubenswrapper[7110]: I0313 01:13:19.862464 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-8d675b596-tq7n6"
Mar 13 01:13:19.862612 master-0 kubenswrapper[7110]: I0313 01:13:19.862580 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9"
Mar 13 01:13:19.863217 master-0 kubenswrapper[7110]: I0313 01:13:19.863191 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xs8pt"
Mar 13 01:13:19.863316 master-0 kubenswrapper[7110]: I0313 01:13:19.863284 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 13 01:13:19.863607 master-0 kubenswrapper[7110]: I0313 01:13:19.863579 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 13 01:13:19.864663 master-0 kubenswrapper[7110]: I0313 01:13:19.863837 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 13 01:13:19.864663 master-0 kubenswrapper[7110]: I0313 01:13:19.864016 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 13 01:13:19.869737 master-0 kubenswrapper[7110]: I0313 01:13:19.869697 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 13 01:13:19.871404 master-0 kubenswrapper[7110]: I0313 01:13:19.871340 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 13 01:13:19.871546 master-0 kubenswrapper[7110]: I0313 01:13:19.871356 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 13 01:13:19.872121 master-0 kubenswrapper[7110]: I0313 01:13:19.872085 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 13 01:13:19.872347 master-0 kubenswrapper[7110]: I0313 01:13:19.872324 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 13 01:13:19.872400 master-0 kubenswrapper[7110]: I0313 01:13:19.872330 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 13 01:13:19.872400 master-0 kubenswrapper[7110]: I0313 01:13:19.872395 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 13 01:13:19.872458 master-0 kubenswrapper[7110]: I0313 01:13:19.872416 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 13 01:13:19.872485 master-0 kubenswrapper[7110]: I0313 01:13:19.872450 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 13 01:13:19.872515 master-0 kubenswrapper[7110]: I0313 01:13:19.872509 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt"
Mar 13 01:13:19.876977 master-0 kubenswrapper[7110]: I0313 01:13:19.876950 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 13 01:13:19.880471 master-0 kubenswrapper[7110]: I0313 01:13:19.879943 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 13 01:13:19.880620 master-0 kubenswrapper[7110]: I0313 01:13:19.880481 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 13 01:13:19.880620 master-0 kubenswrapper[7110]: I0313 01:13:19.880495 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 13 01:13:19.880620 master-0 kubenswrapper[7110]: I0313 01:13:19.880595 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 13 01:13:19.880750 master-0 kubenswrapper[7110]: I0313 01:13:19.880620 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 13 01:13:19.880750 master-0 kubenswrapper[7110]: I0313 01:13:19.880604 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Mar 13 01:13:19.880750 master-0 kubenswrapper[7110]: I0313 01:13:19.880684 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 13 01:13:19.880874 master-0 kubenswrapper[7110]: I0313 01:13:19.880805 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 13 01:13:19.880874 master-0 kubenswrapper[7110]: I0313 01:13:19.880819 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 13 01:13:19.880874 master-0 kubenswrapper[7110]: I0313 01:13:19.880867 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 13 01:13:19.881147 master-0 kubenswrapper[7110]: I0313 01:13:19.881126 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt"
Mar 13 01:13:19.881361 master-0 kubenswrapper[7110]: I0313 01:13:19.881333 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 13 01:13:19.881412 master-0 kubenswrapper[7110]: I0313 01:13:19.881388 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 13 01:13:19.881450 master-0 kubenswrapper[7110]: I0313 01:13:19.881393 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 13 01:13:19.881587 master-0 kubenswrapper[7110]: I0313 01:13:19.881555 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt"
Mar 13 01:13:19.881587 master-0 kubenswrapper[7110]: I0313 01:13:19.881568 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 13 01:13:19.881690 master-0 kubenswrapper[7110]: I0313 01:13:19.881619 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 13 01:13:19.881690 master-0 kubenswrapper[7110]: I0313 01:13:19.881625 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 13 01:13:19.881690 master-0 kubenswrapper[7110]: I0313 01:13:19.881674 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 13 01:13:19.881905 master-0 kubenswrapper[7110]: I0313 01:13:19.881674 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt"
Mar 13 01:13:19.881905 master-0 kubenswrapper[7110]: I0313 01:13:19.881715 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 13 01:13:19.881905 master-0 kubenswrapper[7110]: I0313 01:13:19.881721 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 13 01:13:19.881905 master-0 kubenswrapper[7110]: I0313 01:13:19.881784 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 13 01:13:19.882054 master-0 kubenswrapper[7110]: I0313 01:13:19.882030 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 13 01:13:19.882107 master-0 kubenswrapper[7110]: I0313 01:13:19.882068 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 13 01:13:19.882255 master-0 kubenswrapper[7110]: I0313 01:13:19.882226 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 13 01:13:19.882306 master-0 kubenswrapper[7110]: I0313 01:13:19.882265 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 13 01:13:19.882501 master-0 kubenswrapper[7110]: I0313 01:13:19.882471 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 13 01:13:19.882538 master-0 kubenswrapper[7110]: I0313 01:13:19.882513 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert"
Mar 13 01:13:19.882598 master-0 kubenswrapper[7110]: I0313 01:13:19.882472 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 13 01:13:19.882791 master-0 kubenswrapper[7110]: I0313 01:13:19.882774 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls"
Mar 13 01:13:19.883018 master-0 kubenswrapper[7110]: I0313 01:13:19.882991 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 13 01:13:19.883119 master-0 kubenswrapper[7110]: I0313 01:13:19.883022 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 13 01:13:19.883198 master-0 kubenswrapper[7110]: I0313 01:13:19.883181 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 13 01:13:19.883257 master-0 kubenswrapper[7110]: I0313 01:13:19.883243 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt"
Mar 13 01:13:19.883365 master-0 kubenswrapper[7110]: I0313 01:13:19.883294 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 13 01:13:19.883425 master-0 kubenswrapper[7110]: I0313 01:13:19.883400 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 13 01:13:19.883456 master-0 kubenswrapper[7110]: I0313 01:13:19.883437 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy"
Mar 13 01:13:19.883456 master-0 kubenswrapper[7110]: I0313 01:13:19.883457 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 13 01:13:19.883456 master-0 kubenswrapper[7110]: I0313 01:13:19.883488 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 13 01:13:19.883456 master-0 kubenswrapper[7110]: I0313 01:13:19.883600 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 13 01:13:19.883777 master-0 kubenswrapper[7110]: I0313 01:13:19.883670 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 13 01:13:19.883777 master-0 kubenswrapper[7110]: I0313 01:13:19.883708 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 13 01:13:19.883777 master-0 kubenswrapper[7110]: I0313 01:13:19.883737 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 13 01:13:19.883861 master-0 kubenswrapper[7110]: I0313 01:13:19.883790 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 13 01:13:19.883861 master-0 kubenswrapper[7110]: I0313 01:13:19.883838 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert"
Mar 13 01:13:19.884370 master-0 kubenswrapper[7110]: I0313 01:13:19.883932 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 13 01:13:19.884370 master-0 kubenswrapper[7110]: I0313 01:13:19.884026 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 13 01:13:19.884370 master-0 kubenswrapper[7110]: I0313 01:13:19.883990 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 13 01:13:19.884370 master-0 kubenswrapper[7110]: I0313 01:13:19.884116 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 13 01:13:19.884370 master-0 kubenswrapper[7110]: I0313 01:13:19.884187 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 13 01:13:19.884370 master-0 kubenswrapper[7110]: I0313 01:13:19.884294 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 13 01:13:19.884370 master-0 kubenswrapper[7110]: I0313 01:13:19.884329 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 13 01:13:19.884370 master-0 kubenswrapper[7110]: I0313 01:13:19.884332 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 13 01:13:19.884765 master-0 kubenswrapper[7110]: I0313 01:13:19.884422 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 13 01:13:19.884765 master-0 kubenswrapper[7110]: I0313 01:13:19.884450 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 13 01:13:19.884765 master-0 kubenswrapper[7110]: I0313 01:13:19.884503 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images"
Mar 13 01:13:19.884765 master-0 kubenswrapper[7110]: I0313 01:13:19.884530 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 13 01:13:19.884765 master-0 kubenswrapper[7110]: I0313 01:13:19.884568 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 13 01:13:19.884765 master-0 kubenswrapper[7110]: I0313 01:13:19.884554 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 13 01:13:19.884765 master-0 kubenswrapper[7110]: I0313 01:13:19.884584 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 13 01:13:19.884765 master-0 kubenswrapper[7110]: I0313 01:13:19.884713 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 13 01:13:19.884765 master-0 kubenswrapper[7110]: I0313 01:13:19.884725 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt"
Mar 13 01:13:19.885315 master-0 kubenswrapper[7110]: I0313 01:13:19.884869 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 13 01:13:19.885315 master-0 kubenswrapper[7110]: I0313 01:13:19.884885 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 13 01:13:19.885315 master-0 kubenswrapper[7110]: I0313 01:13:19.884931 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 13 01:13:19.885315 master-0 kubenswrapper[7110]: I0313 01:13:19.885002 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 13 01:13:19.885315 master-0 kubenswrapper[7110]: I0313 01:13:19.885013 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 13 01:13:19.885315 master-0 kubenswrapper[7110]: I0313 01:13:19.885058 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 13 01:13:19.885315 master-0 kubenswrapper[7110]: I0313 01:13:19.885176 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 13 01:13:19.885315 master-0 kubenswrapper[7110]: I0313 01:13:19.885174 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz7v9\" (UniqueName: \"kubernetes.io/projected/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-kube-api-access-bz7v9\") pod \"cluster-monitoring-operator-674cbfbd9d-2tr2t\" (UID: \"7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t"
Mar 13 01:13:19.885315 master-0 kubenswrapper[7110]: I0313 01:13:19.885309 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 13 01:13:19.886070 master-0 kubenswrapper[7110]: I0313 01:13:19.885304 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58035e42-37d8-48f6-9861-9b4ce6014119-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-v9pv6\" (UID: \"58035e42-37d8-48f6-9861-9b4ce6014119\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-v9pv6"
Mar 13 01:13:19.886070 master-0 kubenswrapper[7110]: I0313 01:13:19.885545 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxctn\" (UniqueName: \"kubernetes.io/projected/78d2cd80-23b9-426d-a7ac-1daa27668a47-kube-api-access-mxctn\") pod \"marketplace-operator-64bf9778cb-dszg5\" (UID: \"78d2cd80-23b9-426d-a7ac-1daa27668a47\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5"
Mar 13 01:13:19.886070 master-0 kubenswrapper[7110]: I0313 01:13:19.885604 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdzjn\" (UniqueName: \"kubernetes.io/projected/bfc49699-9428-4bff-804d-da0e60551759-kube-api-access-zdzjn\") pod \"network-operator-7c649bf6d4-bdc4j\" (UID: \"bfc49699-9428-4bff-804d-da0e60551759\") " pod="openshift-network-operator/network-operator-7c649bf6d4-bdc4j"
Mar 13 01:13:19.886070 master-0 kubenswrapper[7110]: I0313 01:13:19.885666 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58035e42-37d8-48f6-9861-9b4ce6014119-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-v9pv6\" (UID: \"58035e42-37d8-48f6-9861-9b4ce6014119\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-v9pv6"
Mar 13 01:13:19.886070 master-0 kubenswrapper[7110]: I0313 01:13:19.885674 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9b713fb-64ce-4a01-951c-1f31df62e1ae-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-bxqp2\" (UID: \"f9b713fb-64ce-4a01-951c-1f31df62e1ae\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2"
Mar 13 01:13:19.886070 master-0 kubenswrapper[7110]: I0313 01:13:19.885559 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 13 01:13:19.886070 master-0 kubenswrapper[7110]: I0313 01:13:19.885713 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 13 01:13:19.886070 master-0 kubenswrapper[7110]: I0313 01:13:19.885907 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22587300-2448-4862-9fd8-68197d17a9f2-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-wbmqn\" (UID: \"22587300-2448-4862-9fd8-68197d17a9f2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-wbmqn"
Mar 13 01:13:19.886070 master-0 kubenswrapper[7110]: I0313 01:13:19.885937 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg7x6\" (UniqueName: \"kubernetes.io/projected/916d9fc9-388b-4506-a17c-36a7f626356a-kube-api-access-jg7x6\") pod \"kube-storage-version-migrator-operator-7f65c457f5-v9nfg\" (UID: \"916d9fc9-388b-4506-a17c-36a7f626356a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-v9nfg"
Mar 13 01:13:19.886070 master-0 kubenswrapper[7110]: I0313 01:13:19.885960 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr"
Mar 13 01:13:19.886070 master-0 kubenswrapper[7110]: I0313 01:13:19.885981 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/21cbea73-f779-43e4-b5ba-d6fa06275d34-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: \"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj"
Mar 13 01:13:19.886070 master-0 kubenswrapper[7110]: I0313 01:13:19.886002 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-6slg8\" (UID: \"4976e608-07a0-4cef-8fdd-7cec3324b4b5\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8"
Mar 13 01:13:19.886070 master-0 kubenswrapper[7110]: I0313 01:13:19.886020 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrf2s\" (UniqueName: \"kubernetes.io/projected/61fb4b86-f978-4ae1-80bc-18d2f386cbc2-kube-api-access-lrf2s\") pod \"cluster-olm-operator-77899cf6d-ck7rt\" (UID: \"61fb4b86-f978-4ae1-80bc-18d2f386cbc2\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-ck7rt"
Mar 13 01:13:19.886070 master-0 kubenswrapper[7110]: I0313 01:13:19.885979 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 13 01:13:19.886550 master-0 kubenswrapper[7110]: I0313 01:13:19.886037 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-2tr2t\" (UID: \"7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t"
Mar 13 01:13:19.886550 master-0 kubenswrapper[7110]: I0313 01:13:19.886123 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-service-ca\") pod \"cluster-version-operator-745944c6b7-zc6gt\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt"
Mar 13 01:13:19.886550 master-0 kubenswrapper[7110]: I0313 01:13:19.886142 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-dszg5\" (UID: \"78d2cd80-23b9-426d-a7ac-1daa27668a47\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5"
Mar 13 01:13:19.886550 master-0 kubenswrapper[7110]: I0313 01:13:19.886165 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f97819d0-2840-4352-a435-19ef1a8c22c9-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-jjdk8\" (UID: \"f97819d0-2840-4352-a435-19ef1a8c22c9\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8"
Mar 13 01:13:19.886550 master-0 kubenswrapper[7110]: I0313 01:13:19.886185 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-trusted-ca\") pod \"ingress-operator-677db989d6-kdn2l\" (UID: \"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l"
Mar 13 01:13:19.886550 master-0 kubenswrapper[7110]: I0313 01:13:19.886197 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/21cbea73-f779-43e4-b5ba-d6fa06275d34-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: \"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj"
Mar 13 01:13:19.886550 master-0 kubenswrapper[7110]: I0313 01:13:19.886204 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95d4e785-6663-417d-b380-6905773613c8-config\") pod \"service-ca-operator-69b6fc6b88-2v42g\" (UID: \"95d4e785-6663-417d-b380-6905773613c8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-2v42g"
Mar 13 01:13:19.886550 master-0 kubenswrapper[7110]: I0313 01:13:19.886250 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4976e608-07a0-4cef-8fdd-7cec3324b4b5-images\") pod \"machine-config-operator-fdb5c78b5-6slg8\" (UID: \"4976e608-07a0-4cef-8fdd-7cec3324b4b5\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8"
Mar 13 01:13:19.886550 master-0 kubenswrapper[7110]: I0313 01:13:19.886274 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4chtg\" (UniqueName: \"kubernetes.io/projected/f9b713fb-64ce-4a01-951c-1f31df62e1ae-kube-api-access-4chtg\") pod \"authentication-operator-7c6989d6c4-bxqp2\" (UID: \"f9b713fb-64ce-4a01-951c-1f31df62e1ae\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2"
Mar 13 01:13:19.886550 master-0 kubenswrapper[7110]: I0313 01:13:19.886296 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn6w7\" (UniqueName: \"kubernetes.io/projected/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-kube-api-access-gn6w7\") pod \"package-server-manager-854648ff6d-nrzpj\" (UID: \"0d4e6150-432c-4a11-b5a6-4d62dd701fc8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj"
Mar 13 01:13:19.886550 master-0 kubenswrapper[7110]: I0313 01:13:19.886319 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/21cbea73-f779-43e4-b5ba-d6fa06275d34-etcd-client\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: \"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj"
Mar 13 01:13:19.886550 master-0 kubenswrapper[7110]: I0313 01:13:19.886392 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95d4e785-6663-417d-b380-6905773613c8-config\") pod \"service-ca-operator-69b6fc6b88-2v42g\" (UID: \"95d4e785-6663-417d-b380-6905773613c8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-2v42g"
Mar 13 01:13:19.887019 master-0 kubenswrapper[7110]: I0313 01:13:19.886647 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-service-ca\") pod \"cluster-version-operator-745944c6b7-zc6gt\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt"
Mar 13 01:13:19.887019 master-0 kubenswrapper[7110]: I0313 01:13:19.886669 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/916d9fc9-388b-4506-a17c-36a7f626356a-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-v9nfg\" (UID: \"916d9fc9-388b-4506-a17c-36a7f626356a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-v9nfg"
Mar 13 01:13:19.887019 master-0 kubenswrapper[7110]: I0313 01:13:19.886707 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 13 01:13:19.887019 master-0 kubenswrapper[7110]: I0313 01:13:19.886724 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/58035e42-37d8-48f6-9861-9b4ce6014119-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-v9pv6\" (UID: \"58035e42-37d8-48f6-9861-9b4ce6014119\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-v9pv6"
Mar 13 01:13:19.887019 master-0 kubenswrapper[7110]: I0313 01:13:19.886798 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxf58\" (UniqueName: \"kubernetes.io/projected/8c6bf2d5-1881-4b63-b247-7e7426707fa1-kube-api-access-vxf58\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr"
Mar 13 01:13:19.887019 master-0 kubenswrapper[7110]: I0313 01:13:19.886832 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c6bf2d5-1881-4b63-b247-7e7426707fa1-config\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr"
Mar 13 01:13:19.887019 master-0 kubenswrapper[7110]: I0313 01:13:19.886870 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh2bv\" (UniqueName: \"kubernetes.io/projected/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-kube-api-access-wh2bv\") pod \"dns-operator-589895fbb7-qvl2k\" (UID: \"13fac7b0-ce55-467d-9d0c-6a122d87cb3c\") " pod="openshift-dns-operator/dns-operator-589895fbb7-qvl2k"
Mar 13 01:13:19.887019 master-0 kubenswrapper[7110]: I0313 01:13:19.886904 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f97819d0-2840-4352-a435-19ef1a8c22c9-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-jjdk8\" (UID: \"f97819d0-2840-4352-a435-19ef1a8c22c9\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8"
Mar 13 01:13:19.887019 master-0 kubenswrapper[7110]: I0313 01:13:19.886899 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4976e608-07a0-4cef-8fdd-7cec3324b4b5-images\") pod \"machine-config-operator-fdb5c78b5-6slg8\" (UID: \"4976e608-07a0-4cef-8fdd-7cec3324b4b5\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8"
Mar 13 01:13:19.887019 master-0 kubenswrapper[7110]: I0313 01:13:19.886996 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/21cbea73-f779-43e4-b5ba-d6fa06275d34-etcd-client\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: \"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj"
Mar 13 01:13:19.887647 master-0 kubenswrapper[7110]: I0313 01:13:19.887000 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/916d9fc9-388b-4506-a17c-36a7f626356a-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-v9nfg\" (UID:
\"916d9fc9-388b-4506-a17c-36a7f626356a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-v9nfg" Mar 13 01:13:19.887647 master-0 kubenswrapper[7110]: I0313 01:13:19.887044 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c6bf2d5-1881-4b63-b247-7e7426707fa1-config\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" Mar 13 01:13:19.887647 master-0 kubenswrapper[7110]: I0313 01:13:19.887083 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-metrics-tls\") pod \"ingress-operator-677db989d6-kdn2l\" (UID: \"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l" Mar 13 01:13:19.887647 master-0 kubenswrapper[7110]: I0313 01:13:19.887151 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-bound-sa-token\") pod \"ingress-operator-677db989d6-kdn2l\" (UID: \"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l" Mar 13 01:13:19.887647 master-0 kubenswrapper[7110]: I0313 01:13:19.887216 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22587300-2448-4862-9fd8-68197d17a9f2-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-wbmqn\" (UID: \"22587300-2448-4862-9fd8-68197d17a9f2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-wbmqn" Mar 13 01:13:19.887647 master-0 kubenswrapper[7110]: I0313 01:13:19.887270 7110 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkdbm\" (UniqueName: \"kubernetes.io/projected/039acb44-a9b3-4ad6-a091-be4d18edc34f-kube-api-access-kkdbm\") pod \"openshift-controller-manager-operator-8565d84698-sslxh\" (UID: \"039acb44-a9b3-4ad6-a091-be4d18edc34f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-sslxh" Mar 13 01:13:19.887647 master-0 kubenswrapper[7110]: I0313 01:13:19.887322 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert\") pod \"cluster-version-operator-745944c6b7-zc6gt\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt" Mar 13 01:13:19.887647 master-0 kubenswrapper[7110]: I0313 01:13:19.887349 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f97819d0-2840-4352-a435-19ef1a8c22c9-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-jjdk8\" (UID: \"f97819d0-2840-4352-a435-19ef1a8c22c9\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8" Mar 13 01:13:19.887647 master-0 kubenswrapper[7110]: I0313 01:13:19.887372 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95d4e785-6663-417d-b380-6905773613c8-serving-cert\") pod \"service-ca-operator-69b6fc6b88-2v42g\" (UID: \"95d4e785-6663-417d-b380-6905773613c8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-2v42g" Mar 13 01:13:19.887647 master-0 kubenswrapper[7110]: I0313 01:13:19.887423 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-zc6gt\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt" Mar 13 01:13:19.887647 master-0 kubenswrapper[7110]: I0313 01:13:19.887478 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzklz\" (UniqueName: \"kubernetes.io/projected/f97819d0-2840-4352-a435-19ef1a8c22c9-kube-api-access-fzklz\") pod \"cluster-image-registry-operator-86d6d77c7c-jjdk8\" (UID: \"f97819d0-2840-4352-a435-19ef1a8c22c9\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8" Mar 13 01:13:19.887647 master-0 kubenswrapper[7110]: I0313 01:13:19.887516 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95d4e785-6663-417d-b380-6905773613c8-serving-cert\") pod \"service-ca-operator-69b6fc6b88-2v42g\" (UID: \"95d4e785-6663-417d-b380-6905773613c8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-2v42g" Mar 13 01:13:19.887647 master-0 kubenswrapper[7110]: I0313 01:13:19.887524 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22587300-2448-4862-9fd8-68197d17a9f2-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-wbmqn\" (UID: \"22587300-2448-4862-9fd8-68197d17a9f2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-wbmqn" Mar 13 01:13:19.887647 master-0 kubenswrapper[7110]: I0313 01:13:19.887524 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/61fb4b86-f978-4ae1-80bc-18d2f386cbc2-operand-assets\") pod \"cluster-olm-operator-77899cf6d-ck7rt\" (UID: \"61fb4b86-f978-4ae1-80bc-18d2f386cbc2\") " 
pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-ck7rt" Mar 13 01:13:19.887647 master-0 kubenswrapper[7110]: I0313 01:13:19.887584 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21cbea73-f779-43e4-b5ba-d6fa06275d34-config\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: \"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj" Mar 13 01:13:19.887647 master-0 kubenswrapper[7110]: I0313 01:13:19.887616 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/916d9fc9-388b-4506-a17c-36a7f626356a-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-v9nfg\" (UID: \"916d9fc9-388b-4506-a17c-36a7f626356a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-v9nfg" Mar 13 01:13:19.888123 master-0 kubenswrapper[7110]: I0313 01:13:19.887682 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz8jz\" (UniqueName: \"kubernetes.io/projected/ae44526f-5858-42a0-ba77-3a22f171456f-kube-api-access-mz8jz\") pod \"csi-snapshot-controller-operator-5685fbc7d-7nstm\" (UID: \"ae44526f-5858-42a0-ba77-3a22f171456f\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-7nstm" Mar 13 01:13:19.888123 master-0 kubenswrapper[7110]: I0313 01:13:19.887719 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/486c7e33-3dd8-4a98-87e3-8216ee2e05ef-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-mvmt2\" (UID: \"486c7e33-3dd8-4a98-87e3-8216ee2e05ef\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-mvmt2" Mar 13 01:13:19.888123 master-0 kubenswrapper[7110]: I0313 01:13:19.887766 7110 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22587300-2448-4862-9fd8-68197d17a9f2-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-wbmqn\" (UID: \"22587300-2448-4862-9fd8-68197d17a9f2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-wbmqn" Mar 13 01:13:19.888123 master-0 kubenswrapper[7110]: I0313 01:13:19.887798 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/039acb44-a9b3-4ad6-a091-be4d18edc34f-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-sslxh\" (UID: \"039acb44-a9b3-4ad6-a091-be4d18edc34f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-sslxh" Mar 13 01:13:19.888123 master-0 kubenswrapper[7110]: I0313 01:13:19.887622 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Mar 13 01:13:19.888123 master-0 kubenswrapper[7110]: I0313 01:13:19.887834 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21cbea73-f779-43e4-b5ba-d6fa06275d34-config\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: \"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj" Mar 13 01:13:19.888123 master-0 kubenswrapper[7110]: I0313 01:13:19.887832 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zd92\" (UniqueName: \"kubernetes.io/projected/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-kube-api-access-5zd92\") pod \"catalog-operator-7d9c49f57b-h46pz\" (UID: \"7f35cc1e-3376-4dbd-b215-2a32bf62cc71\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz" Mar 13 01:13:19.888123 master-0 kubenswrapper[7110]: I0313 01:13:19.887867 7110 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-w2m48\" (UniqueName: \"kubernetes.io/projected/4976e608-07a0-4cef-8fdd-7cec3324b4b5-kube-api-access-w2m48\") pod \"machine-config-operator-fdb5c78b5-6slg8\" (UID: \"4976e608-07a0-4cef-8fdd-7cec3324b4b5\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8" Mar 13 01:13:19.888123 master-0 kubenswrapper[7110]: I0313 01:13:19.887684 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 13 01:13:19.888123 master-0 kubenswrapper[7110]: I0313 01:13:19.887889 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/61fb4b86-f978-4ae1-80bc-18d2f386cbc2-operand-assets\") pod \"cluster-olm-operator-77899cf6d-ck7rt\" (UID: \"61fb4b86-f978-4ae1-80bc-18d2f386cbc2\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-ck7rt" Mar 13 01:13:19.888123 master-0 kubenswrapper[7110]: I0313 01:13:19.888002 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/61fb4b86-f978-4ae1-80bc-18d2f386cbc2-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-ck7rt\" (UID: \"61fb4b86-f978-4ae1-80bc-18d2f386cbc2\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-ck7rt" Mar 13 01:13:19.888123 master-0 kubenswrapper[7110]: I0313 01:13:19.888022 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/21cbea73-f779-43e4-b5ba-d6fa06275d34-etcd-ca\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: \"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj" Mar 13 01:13:19.888123 master-0 kubenswrapper[7110]: I0313 01:13:19.888040 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/039acb44-a9b3-4ad6-a091-be4d18edc34f-config\") pod \"openshift-controller-manager-operator-8565d84698-sslxh\" (UID: \"039acb44-a9b3-4ad6-a091-be4d18edc34f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-sslxh" Mar 13 01:13:19.888123 master-0 kubenswrapper[7110]: I0313 01:13:19.888080 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/039acb44-a9b3-4ad6-a091-be4d18edc34f-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-sslxh\" (UID: \"039acb44-a9b3-4ad6-a091-be4d18edc34f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-sslxh" Mar 13 01:13:19.888460 master-0 kubenswrapper[7110]: I0313 01:13:19.888193 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/916d9fc9-388b-4506-a17c-36a7f626356a-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-v9nfg\" (UID: \"916d9fc9-388b-4506-a17c-36a7f626356a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-v9nfg" Mar 13 01:13:19.888460 master-0 kubenswrapper[7110]: I0313 01:13:19.888345 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9b713fb-64ce-4a01-951c-1f31df62e1ae-config\") pod \"authentication-operator-7c6989d6c4-bxqp2\" (UID: \"f9b713fb-64ce-4a01-951c-1f31df62e1ae\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" Mar 13 01:13:19.888460 master-0 kubenswrapper[7110]: I0313 01:13:19.888362 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/21cbea73-f779-43e4-b5ba-d6fa06275d34-etcd-ca\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: 
\"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj" Mar 13 01:13:19.888460 master-0 kubenswrapper[7110]: I0313 01:13:19.888393 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22587300-2448-4862-9fd8-68197d17a9f2-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-wbmqn\" (UID: \"22587300-2448-4862-9fd8-68197d17a9f2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-wbmqn" Mar 13 01:13:19.888713 master-0 kubenswrapper[7110]: I0313 01:13:19.888486 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/486c7e33-3dd8-4a98-87e3-8216ee2e05ef-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-mvmt2\" (UID: \"486c7e33-3dd8-4a98-87e3-8216ee2e05ef\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-mvmt2" Mar 13 01:13:19.888713 master-0 kubenswrapper[7110]: I0313 01:13:19.888539 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9b713fb-64ce-4a01-951c-1f31df62e1ae-config\") pod \"authentication-operator-7c6989d6c4-bxqp2\" (UID: \"f9b713fb-64ce-4a01-951c-1f31df62e1ae\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" Mar 13 01:13:19.888713 master-0 kubenswrapper[7110]: I0313 01:13:19.888611 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/61fb4b86-f978-4ae1-80bc-18d2f386cbc2-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-ck7rt\" (UID: \"61fb4b86-f978-4ae1-80bc-18d2f386cbc2\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-ck7rt" Mar 13 01:13:19.888713 master-0 kubenswrapper[7110]: I0313 01:13:19.888688 7110 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 13 01:13:19.888713 master-0 kubenswrapper[7110]: I0313 01:13:19.888709 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58035e42-37d8-48f6-9861-9b4ce6014119-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-v9pv6\" (UID: \"58035e42-37d8-48f6-9861-9b4ce6014119\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-v9pv6" Mar 13 01:13:19.889589 master-0 kubenswrapper[7110]: I0313 01:13:19.888715 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/039acb44-a9b3-4ad6-a091-be4d18edc34f-config\") pod \"openshift-controller-manager-operator-8565d84698-sslxh\" (UID: \"039acb44-a9b3-4ad6-a091-be4d18edc34f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-sslxh" Mar 13 01:13:19.889589 master-0 kubenswrapper[7110]: I0313 01:13:19.888395 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58035e42-37d8-48f6-9861-9b4ce6014119-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-v9pv6\" (UID: \"58035e42-37d8-48f6-9861-9b4ce6014119\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-v9pv6" Mar 13 01:13:19.889589 master-0 kubenswrapper[7110]: I0313 01:13:19.888805 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-kube-api-access\") pod \"cluster-version-operator-745944c6b7-zc6gt\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt" Mar 13 01:13:19.889589 master-0 kubenswrapper[7110]: I0313 01:13:19.888847 7110 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" Mar 13 01:13:19.889589 master-0 kubenswrapper[7110]: I0313 01:13:19.888888 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9b713fb-64ce-4a01-951c-1f31df62e1ae-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-bxqp2\" (UID: \"f9b713fb-64ce-4a01-951c-1f31df62e1ae\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" Mar 13 01:13:19.889589 master-0 kubenswrapper[7110]: I0313 01:13:19.888968 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f97819d0-2840-4352-a435-19ef1a8c22c9-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-jjdk8\" (UID: \"f97819d0-2840-4352-a435-19ef1a8c22c9\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8" Mar 13 01:13:19.889589 master-0 kubenswrapper[7110]: I0313 01:13:19.889039 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-metrics-tls\") pod \"dns-operator-589895fbb7-qvl2k\" (UID: \"13fac7b0-ce55-467d-9d0c-6a122d87cb3c\") " pod="openshift-dns-operator/dns-operator-589895fbb7-qvl2k" Mar 13 01:13:19.889589 master-0 kubenswrapper[7110]: I0313 01:13:19.889049 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/f9b713fb-64ce-4a01-951c-1f31df62e1ae-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-bxqp2\" (UID: \"f9b713fb-64ce-4a01-951c-1f31df62e1ae\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" Mar 13 01:13:19.889589 master-0 kubenswrapper[7110]: I0313 01:13:19.889145 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29w76\" (UniqueName: \"kubernetes.io/projected/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-kube-api-access-29w76\") pod \"ingress-operator-677db989d6-kdn2l\" (UID: \"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l" Mar 13 01:13:19.889589 master-0 kubenswrapper[7110]: I0313 01:13:19.889203 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-zc6gt\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt" Mar 13 01:13:19.889589 master-0 kubenswrapper[7110]: I0313 01:13:19.889258 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhcll\" (UniqueName: \"kubernetes.io/projected/95d4e785-6663-417d-b380-6905773613c8-kube-api-access-nhcll\") pod \"service-ca-operator-69b6fc6b88-2v42g\" (UID: \"95d4e785-6663-417d-b380-6905773613c8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-2v42g" Mar 13 01:13:19.889589 master-0 kubenswrapper[7110]: I0313 01:13:19.889312 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4976e608-07a0-4cef-8fdd-7cec3324b4b5-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-6slg8\" (UID: \"4976e608-07a0-4cef-8fdd-7cec3324b4b5\") " 
pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8" Mar 13 01:13:19.889589 master-0 kubenswrapper[7110]: I0313 01:13:19.889340 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 13 01:13:19.889589 master-0 kubenswrapper[7110]: I0313 01:13:19.889365 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9b713fb-64ce-4a01-951c-1f31df62e1ae-serving-cert\") pod \"authentication-operator-7c6989d6c4-bxqp2\" (UID: \"f9b713fb-64ce-4a01-951c-1f31df62e1ae\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" Mar 13 01:13:19.889589 master-0 kubenswrapper[7110]: I0313 01:13:19.889421 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-nrzpj\" (UID: \"0d4e6150-432c-4a11-b5a6-4d62dd701fc8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj" Mar 13 01:13:19.889589 master-0 kubenswrapper[7110]: I0313 01:13:19.889485 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg54c\" (UniqueName: \"kubernetes.io/projected/21cbea73-f779-43e4-b5ba-d6fa06275d34-kube-api-access-wg54c\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: \"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj" Mar 13 01:13:19.889589 master-0 kubenswrapper[7110]: I0313 01:13:19.889593 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/486c7e33-3dd8-4a98-87e3-8216ee2e05ef-config\") pod \"openshift-apiserver-operator-799b6db4d7-mvmt2\" (UID: 
\"486c7e33-3dd8-4a98-87e3-8216ee2e05ef\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-mvmt2" Mar 13 01:13:19.890117 master-0 kubenswrapper[7110]: I0313 01:13:19.889676 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/bfc49699-9428-4bff-804d-da0e60551759-host-etc-kube\") pod \"network-operator-7c649bf6d4-bdc4j\" (UID: \"bfc49699-9428-4bff-804d-da0e60551759\") " pod="openshift-network-operator/network-operator-7c649bf6d4-bdc4j" Mar 13 01:13:19.890117 master-0 kubenswrapper[7110]: I0313 01:13:19.889716 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9b713fb-64ce-4a01-951c-1f31df62e1ae-serving-cert\") pod \"authentication-operator-7c6989d6c4-bxqp2\" (UID: \"f9b713fb-64ce-4a01-951c-1f31df62e1ae\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" Mar 13 01:13:19.890117 master-0 kubenswrapper[7110]: I0313 01:13:19.889729 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert\") pod \"catalog-operator-7d9c49f57b-h46pz\" (UID: \"7f35cc1e-3376-4dbd-b215-2a32bf62cc71\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz" Mar 13 01:13:19.890117 master-0 kubenswrapper[7110]: I0313 01:13:19.889786 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8c6bf2d5-1881-4b63-b247-7e7426707fa1-images\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" Mar 13 01:13:19.890117 master-0 kubenswrapper[7110]: I0313 01:13:19.889855 7110 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/486c7e33-3dd8-4a98-87e3-8216ee2e05ef-config\") pod \"openshift-apiserver-operator-799b6db4d7-mvmt2\" (UID: \"486c7e33-3dd8-4a98-87e3-8216ee2e05ef\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-mvmt2" Mar 13 01:13:19.890117 master-0 kubenswrapper[7110]: I0313 01:13:19.889800 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 13 01:13:19.890117 master-0 kubenswrapper[7110]: I0313 01:13:19.889964 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4976e608-07a0-4cef-8fdd-7cec3324b4b5-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-6slg8\" (UID: \"4976e608-07a0-4cef-8fdd-7cec3324b4b5\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8" Mar 13 01:13:19.890117 master-0 kubenswrapper[7110]: I0313 01:13:19.890029 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-dszg5\" (UID: \"78d2cd80-23b9-426d-a7ac-1daa27668a47\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" Mar 13 01:13:19.890117 master-0 kubenswrapper[7110]: I0313 01:13:19.890092 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-2tr2t\" (UID: \"7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t" Mar 13 01:13:19.890117 master-0 kubenswrapper[7110]: I0313 01:13:19.890114 7110 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Mar 13 01:13:19.890392 master-0 kubenswrapper[7110]: I0313 01:13:19.890122 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8c6bf2d5-1881-4b63-b247-7e7426707fa1-images\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" Mar 13 01:13:19.890392 master-0 kubenswrapper[7110]: I0313 01:13:19.890145 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74tvv\" (UniqueName: \"kubernetes.io/projected/486c7e33-3dd8-4a98-87e3-8216ee2e05ef-kube-api-access-74tvv\") pod \"openshift-apiserver-operator-799b6db4d7-mvmt2\" (UID: \"486c7e33-3dd8-4a98-87e3-8216ee2e05ef\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-mvmt2" Mar 13 01:13:19.890392 master-0 kubenswrapper[7110]: I0313 01:13:19.890202 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21cbea73-f779-43e4-b5ba-d6fa06275d34-serving-cert\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: \"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj" Mar 13 01:13:19.890392 master-0 kubenswrapper[7110]: I0313 01:13:19.890251 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfc49699-9428-4bff-804d-da0e60551759-metrics-tls\") pod \"network-operator-7c649bf6d4-bdc4j\" (UID: \"bfc49699-9428-4bff-804d-da0e60551759\") " pod="openshift-network-operator/network-operator-7c649bf6d4-bdc4j" Mar 13 01:13:19.890490 master-0 kubenswrapper[7110]: I0313 01:13:19.890428 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/21cbea73-f779-43e4-b5ba-d6fa06275d34-serving-cert\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: \"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj" Mar 13 01:13:19.890553 master-0 kubenswrapper[7110]: I0313 01:13:19.890515 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfc49699-9428-4bff-804d-da0e60551759-metrics-tls\") pod \"network-operator-7c649bf6d4-bdc4j\" (UID: \"bfc49699-9428-4bff-804d-da0e60551759\") " pod="openshift-network-operator/network-operator-7c649bf6d4-bdc4j" Mar 13 01:13:19.896890 master-0 kubenswrapper[7110]: I0313 01:13:19.896857 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-2tr2t\" (UID: \"7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t" Mar 13 01:13:19.898822 master-0 kubenswrapper[7110]: I0313 01:13:19.897735 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 13 01:13:19.904146 master-0 kubenswrapper[7110]: I0313 01:13:19.904102 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 13 01:13:19.905815 master-0 kubenswrapper[7110]: I0313 01:13:19.905789 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 13 01:13:19.906011 master-0 kubenswrapper[7110]: I0313 01:13:19.905756 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 13 01:13:19.908937 master-0 kubenswrapper[7110]: I0313 01:13:19.908890 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9b713fb-64ce-4a01-951c-1f31df62e1ae-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-bxqp2\" (UID: \"f9b713fb-64ce-4a01-951c-1f31df62e1ae\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" Mar 13 01:13:19.910078 master-0 kubenswrapper[7110]: I0313 01:13:19.910017 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-trusted-ca\") pod \"ingress-operator-677db989d6-kdn2l\" (UID: \"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l" Mar 13 01:13:19.911066 master-0 kubenswrapper[7110]: I0313 01:13:19.910927 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-dszg5\" (UID: \"78d2cd80-23b9-426d-a7ac-1daa27668a47\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" Mar 13 01:13:19.913911 master-0 kubenswrapper[7110]: I0313 01:13:19.913886 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Mar 13 01:13:19.925409 master-0 kubenswrapper[7110]: I0313 01:13:19.925279 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 13 01:13:19.945226 master-0 kubenswrapper[7110]: I0313 01:13:19.945165 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Mar 13 01:13:19.962370 master-0 kubenswrapper[7110]: I0313 01:13:19.962319 7110 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Mar 13 01:13:19.964920 master-0 
kubenswrapper[7110]: I0313 01:13:19.964896 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Mar 13 01:13:19.985602 master-0 kubenswrapper[7110]: I0313 01:13:19.985547 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 13 01:13:19.991829 master-0 kubenswrapper[7110]: E0313 01:13:19.991780 7110 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 13 01:13:19.991992 master-0 kubenswrapper[7110]: I0313 01:13:19.991586 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-2tr2t\" (UID: \"7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t" Mar 13 01:13:19.991992 master-0 kubenswrapper[7110]: E0313 01:13:19.991884 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls podName:7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:20.491862484 +0000 UTC m=+1.776888960 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-2tr2t" (UID: "7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71") : secret "cluster-monitoring-operator-tls" not found Mar 13 01:13:19.991992 master-0 kubenswrapper[7110]: I0313 01:13:19.991916 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2937cbe2-3125-4c3f-96f8-2febeb5942cc-multus-daemon-config\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:19.992210 master-0 kubenswrapper[7110]: I0313 01:13:19.992049 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" Mar 13 01:13:19.992383 master-0 kubenswrapper[7110]: I0313 01:13:19.992337 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2937cbe2-3125-4c3f-96f8-2febeb5942cc-multus-daemon-config\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:19.993215 master-0 kubenswrapper[7110]: I0313 01:13:19.993184 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " 
pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" Mar 13 01:13:19.993329 master-0 kubenswrapper[7110]: I0313 01:13:19.993259 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4738c93d-62e6-44ce-a289-e646b9302e71-cni-binary-copy\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5" Mar 13 01:13:19.993425 master-0 kubenswrapper[7110]: I0313 01:13:19.993387 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4738c93d-62e6-44ce-a289-e646b9302e71-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5" Mar 13 01:13:19.993538 master-0 kubenswrapper[7110]: I0313 01:13:19.993448 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tfdv\" (UniqueName: \"kubernetes.io/projected/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-kube-api-access-6tfdv\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:19.993538 master-0 kubenswrapper[7110]: I0313 01:13:19.993478 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-6slg8\" (UID: \"4976e608-07a0-4cef-8fdd-7cec3324b4b5\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8" Mar 13 01:13:19.993538 master-0 kubenswrapper[7110]: I0313 01:13:19.993500 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" 
(UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-node-log\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:19.993538 master-0 kubenswrapper[7110]: I0313 01:13:19.993522 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2937cbe2-3125-4c3f-96f8-2febeb5942cc-cni-binary-copy\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:19.993810 master-0 kubenswrapper[7110]: I0313 01:13:19.993546 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1308fba1-a50d-48b3-b272-7bef44727b7f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-cjmvd\" (UID: \"1308fba1-a50d-48b3-b272-7bef44727b7f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd" Mar 13 01:13:19.993810 master-0 kubenswrapper[7110]: I0313 01:13:19.993571 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d56480e0-0885-41e5-a1fc-931a068fbadb-serving-cert\") pod \"openshift-config-operator-64488f9d78-bqmmf\" (UID: \"d56480e0-0885-41e5-a1fc-931a068fbadb\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" Mar 13 01:13:19.993810 master-0 kubenswrapper[7110]: E0313 01:13:19.993755 7110 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: secret "mco-proxy-tls" not found Mar 13 01:13:19.993810 master-0 kubenswrapper[7110]: I0313 01:13:19.993791 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4738c93d-62e6-44ce-a289-e646b9302e71-cni-binary-copy\") pod \"multus-additional-cni-plugins-xn5t5\" 
(UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5" Mar 13 01:13:19.994039 master-0 kubenswrapper[7110]: I0313 01:13:19.993794 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" Mar 13 01:13:19.994039 master-0 kubenswrapper[7110]: E0313 01:13:19.993920 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls podName:4976e608-07a0-4cef-8fdd-7cec3324b4b5 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:20.49387447 +0000 UTC m=+1.778900946 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls") pod "machine-config-operator-fdb5c78b5-6slg8" (UID: "4976e608-07a0-4cef-8fdd-7cec3324b4b5") : secret "mco-proxy-tls" not found Mar 13 01:13:19.994039 master-0 kubenswrapper[7110]: I0313 01:13:19.993946 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx5mx\" (UniqueName: \"kubernetes.io/projected/2bd94289-7109-4419-9a51-bd289082b9f5-kube-api-access-fx5mx\") pod \"multus-admission-controller-8d675b596-tq7n6\" (UID: \"2bd94289-7109-4419-9a51-bd289082b9f5\") " pod="openshift-multus/multus-admission-controller-8d675b596-tq7n6" Mar 13 01:13:19.994039 master-0 kubenswrapper[7110]: I0313 01:13:19.994003 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-var-lib-cni-multus\") pod \"multus-rvt5h\" (UID: 
\"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:19.994039 master-0 kubenswrapper[7110]: I0313 01:13:19.993949 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1308fba1-a50d-48b3-b272-7bef44727b7f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-cjmvd\" (UID: \"1308fba1-a50d-48b3-b272-7bef44727b7f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd" Mar 13 01:13:19.994305 master-0 kubenswrapper[7110]: I0313 01:13:19.994031 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-kubelet\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:19.994305 master-0 kubenswrapper[7110]: I0313 01:13:19.994138 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-run-netns\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:19.994409 master-0 kubenswrapper[7110]: I0313 01:13:19.994322 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca2fa86b-a966-49dc-8577-d2b54b111d14-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-6fbfc8dc8f-c2xl8\" (UID: \"ca2fa86b-a966-49dc-8577-d2b54b111d14\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-c2xl8" Mar 13 01:13:19.994409 master-0 kubenswrapper[7110]: I0313 01:13:19.994355 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-multus-socket-dir-parent\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:19.994409 master-0 kubenswrapper[7110]: I0313 01:13:19.994389 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d56480e0-0885-41e5-a1fc-931a068fbadb-serving-cert\") pod \"openshift-config-operator-64488f9d78-bqmmf\" (UID: \"d56480e0-0885-41e5-a1fc-931a068fbadb\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" Mar 13 01:13:19.994585 master-0 kubenswrapper[7110]: I0313 01:13:19.994509 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert\") pod \"cluster-version-operator-745944c6b7-zc6gt\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt" Mar 13 01:13:19.994585 master-0 kubenswrapper[7110]: I0313 01:13:19.994533 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-log-socket\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:19.994585 master-0 kubenswrapper[7110]: I0313 01:13:19.994550 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-env-overrides\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:19.994585 master-0 kubenswrapper[7110]: I0313 01:13:19.994564 7110 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2937cbe2-3125-4c3f-96f8-2febeb5942cc-cni-binary-copy\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:19.994585 master-0 kubenswrapper[7110]: I0313 01:13:19.994571 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-zc6gt\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt" Mar 13 01:13:19.994896 master-0 kubenswrapper[7110]: I0313 01:13:19.994661 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d1153bb3-30dd-458f-b0a4-c05358a8b3f8-env-overrides\") pod \"network-node-identity-znqwc\" (UID: \"d1153bb3-30dd-458f-b0a4-c05358a8b3f8\") " pod="openshift-network-node-identity/network-node-identity-znqwc" Mar 13 01:13:19.994896 master-0 kubenswrapper[7110]: E0313 01:13:19.994673 7110 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 13 01:13:19.994896 master-0 kubenswrapper[7110]: E0313 01:13:19.994709 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert podName:bbf0bd4d-3387-43c3-b9d5-61a044fa2138 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:20.494698253 +0000 UTC m=+1.779724729 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert") pod "cluster-version-operator-745944c6b7-zc6gt" (UID: "bbf0bd4d-3387-43c3-b9d5-61a044fa2138") : secret "cluster-version-operator-serving-cert" not found Mar 13 01:13:19.994896 master-0 kubenswrapper[7110]: I0313 01:13:19.994704 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwm5w\" (UniqueName: \"kubernetes.io/projected/ca2fa86b-a966-49dc-8577-d2b54b111d14-kube-api-access-gwm5w\") pod \"cluster-storage-operator-6fbfc8dc8f-c2xl8\" (UID: \"ca2fa86b-a966-49dc-8577-d2b54b111d14\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-c2xl8" Mar 13 01:13:19.994896 master-0 kubenswrapper[7110]: I0313 01:13:19.994768 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-var-lib-kubelet\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:19.994896 master-0 kubenswrapper[7110]: I0313 01:13:19.994701 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-zc6gt\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt" Mar 13 01:13:19.994896 master-0 kubenswrapper[7110]: I0313 01:13:19.994780 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-env-overrides\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:19.994896 master-0 kubenswrapper[7110]: I0313 01:13:19.994829 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mnf5\" (UniqueName: \"kubernetes.io/projected/6c88187c-d011-4043-a6d3-4a8a7ec4e204-kube-api-access-7mnf5\") pod \"olm-operator-d64cfc9db-8l7kq\" (UID: \"6c88187c-d011-4043-a6d3-4a8a7ec4e204\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq" Mar 13 01:13:19.994896 master-0 kubenswrapper[7110]: I0313 01:13:19.994838 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d1153bb3-30dd-458f-b0a4-c05358a8b3f8-env-overrides\") pod \"network-node-identity-znqwc\" (UID: \"d1153bb3-30dd-458f-b0a4-c05358a8b3f8\") " pod="openshift-network-node-identity/network-node-identity-znqwc" Mar 13 01:13:19.994896 master-0 kubenswrapper[7110]: I0313 01:13:19.994854 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-cni-netd\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:19.995402 master-0 kubenswrapper[7110]: I0313 01:13:19.994953 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-ovn-node-metrics-cert\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:19.995402 master-0 kubenswrapper[7110]: I0313 01:13:19.995007 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-os-release\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:19.995402 master-0 kubenswrapper[7110]: I0313 01:13:19.995034 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4738c93d-62e6-44ce-a289-e646b9302e71-os-release\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5" Mar 13 01:13:19.995402 master-0 kubenswrapper[7110]: I0313 01:13:19.995083 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmvs5\" (UniqueName: \"kubernetes.io/projected/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-kube-api-access-mmvs5\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" Mar 13 01:13:19.995402 master-0 kubenswrapper[7110]: I0313 01:13:19.995124 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-run-systemd\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:19.995402 master-0 kubenswrapper[7110]: I0313 01:13:19.995175 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-mlslx\" (UID: \"b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mlslx" Mar 13 01:13:19.995402 master-0 
kubenswrapper[7110]: I0313 01:13:19.995197 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/46662e51-44af-4732-83a1-9509a579b373-iptables-alerter-script\") pod \"iptables-alerter-qclwv\" (UID: \"46662e51-44af-4732-83a1-9509a579b373\") " pod="openshift-network-operator/iptables-alerter-qclwv" Mar 13 01:13:19.995402 master-0 kubenswrapper[7110]: I0313 01:13:19.995248 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnxgm\" (UniqueName: \"kubernetes.io/projected/d5456c8b-3c98-4824-8700-a04e9c12fb2e-kube-api-access-mnxgm\") pod \"network-check-target-xs8pt\" (UID: \"d5456c8b-3c98-4824-8700-a04e9c12fb2e\") " pod="openshift-network-diagnostics/network-check-target-xs8pt" Mar 13 01:13:19.995867 master-0 kubenswrapper[7110]: I0313 01:13:19.995475 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-run-netns\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:19.995867 master-0 kubenswrapper[7110]: I0313 01:13:19.995606 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35-serving-cert\") pod \"kube-apiserver-operator-68bd585b-mlslx\" (UID: \"b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mlslx" Mar 13 01:13:19.995867 master-0 kubenswrapper[7110]: I0313 01:13:19.995656 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srlst\" (UniqueName: \"kubernetes.io/projected/d1153bb3-30dd-458f-b0a4-c05358a8b3f8-kube-api-access-srlst\") pod 
\"network-node-identity-znqwc\" (UID: \"d1153bb3-30dd-458f-b0a4-c05358a8b3f8\") " pod="openshift-network-node-identity/network-node-identity-znqwc" Mar 13 01:13:19.995867 master-0 kubenswrapper[7110]: I0313 01:13:19.995679 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35-config\") pod \"kube-apiserver-operator-68bd585b-mlslx\" (UID: \"b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mlslx" Mar 13 01:13:19.995867 master-0 kubenswrapper[7110]: I0313 01:13:19.995715 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d1153bb3-30dd-458f-b0a4-c05358a8b3f8-webhook-cert\") pod \"network-node-identity-znqwc\" (UID: \"d1153bb3-30dd-458f-b0a4-c05358a8b3f8\") " pod="openshift-network-node-identity/network-node-identity-znqwc" Mar 13 01:13:19.995867 master-0 kubenswrapper[7110]: I0313 01:13:19.995737 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-run-ovn\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:19.995867 master-0 kubenswrapper[7110]: I0313 01:13:19.995758 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fppkf\" (UniqueName: \"kubernetes.io/projected/d56480e0-0885-41e5-a1fc-931a068fbadb-kube-api-access-fppkf\") pod \"openshift-config-operator-64488f9d78-bqmmf\" (UID: \"d56480e0-0885-41e5-a1fc-931a068fbadb\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" Mar 13 01:13:19.995867 master-0 kubenswrapper[7110]: I0313 01:13:19.995782 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert\") pod \"catalog-operator-7d9c49f57b-h46pz\" (UID: \"7f35cc1e-3376-4dbd-b215-2a32bf62cc71\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz" Mar 13 01:13:19.995867 master-0 kubenswrapper[7110]: I0313 01:13:19.995809 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-var-lib-openvswitch\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:19.996352 master-0 kubenswrapper[7110]: E0313 01:13:19.995897 7110 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 13 01:13:19.996352 master-0 kubenswrapper[7110]: I0313 01:13:19.995944 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-etc-openvswitch\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:19.996352 master-0 kubenswrapper[7110]: I0313 01:13:19.995962 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35-serving-cert\") pod \"kube-apiserver-operator-68bd585b-mlslx\" (UID: \"b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mlslx" Mar 13 01:13:19.996352 master-0 kubenswrapper[7110]: E0313 01:13:19.996000 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert 
podName:7f35cc1e-3376-4dbd-b215-2a32bf62cc71 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:20.495986759 +0000 UTC m=+1.781013235 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert") pod "catalog-operator-7d9c49f57b-h46pz" (UID: "7f35cc1e-3376-4dbd-b215-2a32bf62cc71") : secret "catalog-operator-serving-cert" not found Mar 13 01:13:19.996352 master-0 kubenswrapper[7110]: I0313 01:13:19.996007 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35-config\") pod \"kube-apiserver-operator-68bd585b-mlslx\" (UID: \"b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mlslx" Mar 13 01:13:19.996352 master-0 kubenswrapper[7110]: I0313 01:13:19.996048 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-cni-bin\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:19.996352 master-0 kubenswrapper[7110]: I0313 01:13:19.996121 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-run-k8s-cni-cncf-io\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:19.996352 master-0 kubenswrapper[7110]: I0313 01:13:19.996182 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-multus-conf-dir\") pod 
\"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:19.996352 master-0 kubenswrapper[7110]: I0313 01:13:19.996247 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d1153bb3-30dd-458f-b0a4-c05358a8b3f8-webhook-cert\") pod \"network-node-identity-znqwc\" (UID: \"d1153bb3-30dd-458f-b0a4-c05358a8b3f8\") " pod="openshift-network-node-identity/network-node-identity-znqwc" Mar 13 01:13:19.996352 master-0 kubenswrapper[7110]: I0313 01:13:19.996349 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjhjj\" (UniqueName: \"kubernetes.io/projected/e68ab3cb-c372-45d9-a758-beaf4c213714-kube-api-access-zjhjj\") pod \"network-metrics-daemon-zh5fh\" (UID: \"e68ab3cb-c372-45d9-a758-beaf4c213714\") " pod="openshift-multus/network-metrics-daemon-zh5fh" Mar 13 01:13:19.996916 master-0 kubenswrapper[7110]: I0313 01:13:19.996405 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" Mar 13 01:13:19.996916 master-0 kubenswrapper[7110]: I0313 01:13:19.996437 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-slash\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:19.996916 master-0 kubenswrapper[7110]: I0313 01:13:19.996471 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5n7m\" (UniqueName: 
\"kubernetes.io/projected/46662e51-44af-4732-83a1-9509a579b373-kube-api-access-m5n7m\") pod \"iptables-alerter-qclwv\" (UID: \"46662e51-44af-4732-83a1-9509a579b373\") " pod="openshift-network-operator/iptables-alerter-qclwv" Mar 13 01:13:19.996916 master-0 kubenswrapper[7110]: E0313 01:13:19.996604 7110 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 13 01:13:19.996916 master-0 kubenswrapper[7110]: E0313 01:13:19.996705 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cert podName:8c6bf2d5-1881-4b63-b247-7e7426707fa1 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:20.496680139 +0000 UTC m=+1.781706645 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cert") pod "cluster-baremetal-operator-5cdb4c5598-47sjr" (UID: "8c6bf2d5-1881-4b63-b247-7e7426707fa1") : secret "cluster-baremetal-webhook-server-cert" not found Mar 13 01:13:19.996916 master-0 kubenswrapper[7110]: I0313 01:13:19.996707 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-cnibin\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:19.996916 master-0 kubenswrapper[7110]: I0313 01:13:19.996789 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert\") pod \"olm-operator-d64cfc9db-8l7kq\" (UID: \"6c88187c-d011-4043-a6d3-4a8a7ec4e204\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq" Mar 13 01:13:19.996916 master-0 kubenswrapper[7110]: I0313 
01:13:19.996835 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-var-lib-cni-bin\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:19.996916 master-0 kubenswrapper[7110]: I0313 01:13:19.996876 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4738c93d-62e6-44ce-a289-e646b9302e71-system-cni-dir\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5" Mar 13 01:13:19.996916 master-0 kubenswrapper[7110]: I0313 01:13:19.996927 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-dszg5\" (UID: \"78d2cd80-23b9-426d-a7ac-1daa27668a47\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" Mar 13 01:13:19.997707 master-0 kubenswrapper[7110]: I0313 01:13:19.996955 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" Mar 13 01:13:19.997707 master-0 kubenswrapper[7110]: I0313 01:13:19.996979 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: 
\"kubernetes.io/configmap/d1153bb3-30dd-458f-b0a4-c05358a8b3f8-ovnkube-identity-cm\") pod \"network-node-identity-znqwc\" (UID: \"d1153bb3-30dd-458f-b0a4-c05358a8b3f8\") " pod="openshift-network-node-identity/network-node-identity-znqwc" Mar 13 01:13:19.997707 master-0 kubenswrapper[7110]: I0313 01:13:19.997001 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/46662e51-44af-4732-83a1-9509a579b373-host-slash\") pod \"iptables-alerter-qclwv\" (UID: \"46662e51-44af-4732-83a1-9509a579b373\") " pod="openshift-network-operator/iptables-alerter-qclwv" Mar 13 01:13:19.997707 master-0 kubenswrapper[7110]: I0313 01:13:19.997023 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4738c93d-62e6-44ce-a289-e646b9302e71-cnibin\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5" Mar 13 01:13:19.997707 master-0 kubenswrapper[7110]: E0313 01:13:19.997067 7110 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 13 01:13:19.997707 master-0 kubenswrapper[7110]: I0313 01:13:19.997088 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1308fba1-a50d-48b3-b272-7bef44727b7f-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-cjmvd\" (UID: \"1308fba1-a50d-48b3-b272-7bef44727b7f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd" Mar 13 01:13:19.997707 master-0 kubenswrapper[7110]: I0313 01:13:19.997222 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-ovnkube-script-lib\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:19.997707 master-0 kubenswrapper[7110]: E0313 01:13:19.997245 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics podName:78d2cd80-23b9-426d-a7ac-1daa27668a47 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:20.497223634 +0000 UTC m=+1.782250110 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-dszg5" (UID: "78d2cd80-23b9-426d-a7ac-1daa27668a47") : secret "marketplace-operator-metrics" not found Mar 13 01:13:19.997707 master-0 kubenswrapper[7110]: I0313 01:13:19.997286 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-system-cni-dir\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:19.997707 master-0 kubenswrapper[7110]: I0313 01:13:19.997390 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1308fba1-a50d-48b3-b272-7bef44727b7f-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-cjmvd\" (UID: \"1308fba1-a50d-48b3-b272-7bef44727b7f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd" Mar 13 01:13:19.997707 master-0 kubenswrapper[7110]: I0313 01:13:19.997417 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: 
\"kubernetes.io/configmap/d1153bb3-30dd-458f-b0a4-c05358a8b3f8-ovnkube-identity-cm\") pod \"network-node-identity-znqwc\" (UID: \"d1153bb3-30dd-458f-b0a4-c05358a8b3f8\") " pod="openshift-network-node-identity/network-node-identity-znqwc" Mar 13 01:13:19.997707 master-0 kubenswrapper[7110]: I0313 01:13:19.997442 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:19.997707 master-0 kubenswrapper[7110]: I0313 01:13:19.997484 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-multus-cni-dir\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:19.997707 master-0 kubenswrapper[7110]: I0313 01:13:19.997519 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-run-multus-certs\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:19.997707 master-0 kubenswrapper[7110]: I0313 01:13:19.997451 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1308fba1-a50d-48b3-b272-7bef44727b7f-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-cjmvd\" (UID: \"1308fba1-a50d-48b3-b272-7bef44727b7f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd" Mar 13 01:13:19.997707 master-0 kubenswrapper[7110]: I0313 
01:13:19.997549 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-etc-kubernetes\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:19.997707 master-0 kubenswrapper[7110]: I0313 01:13:19.997612 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4738c93d-62e6-44ce-a289-e646b9302e71-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5" Mar 13 01:13:19.997707 master-0 kubenswrapper[7110]: I0313 01:13:19.997653 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1308fba1-a50d-48b3-b272-7bef44727b7f-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-cjmvd\" (UID: \"1308fba1-a50d-48b3-b272-7bef44727b7f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd" Mar 13 01:13:19.997707 master-0 kubenswrapper[7110]: I0313 01:13:19.997677 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-metrics-tls\") pod \"ingress-operator-677db989d6-kdn2l\" (UID: \"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l" Mar 13 01:13:19.999299 master-0 kubenswrapper[7110]: I0313 01:13:19.997751 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv2rb\" (UniqueName: \"kubernetes.io/projected/1308fba1-a50d-48b3-b272-7bef44727b7f-kube-api-access-zv2rb\") pod \"ovnkube-control-plane-66b55d57d-cjmvd\" (UID: \"1308fba1-a50d-48b3-b272-7bef44727b7f\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd" Mar 13 01:13:19.999299 master-0 kubenswrapper[7110]: I0313 01:13:19.997802 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spxfj\" (UniqueName: \"kubernetes.io/projected/2937cbe2-3125-4c3f-96f8-2febeb5942cc-kube-api-access-spxfj\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:19.999299 master-0 kubenswrapper[7110]: I0313 01:13:19.997837 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/4738c93d-62e6-44ce-a289-e646b9302e71-whereabouts-configmap\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5" Mar 13 01:13:19.999299 master-0 kubenswrapper[7110]: E0313 01:13:19.997851 7110 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 13 01:13:19.999299 master-0 kubenswrapper[7110]: E0313 01:13:19.997931 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-metrics-tls podName:70c097a1-90d9-4344-b0ae-5a59ec2ad8ad nodeName:}" failed. No retries permitted until 2026-03-13 01:13:20.497884713 +0000 UTC m=+1.782911189 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-metrics-tls") pod "ingress-operator-677db989d6-kdn2l" (UID: "70c097a1-90d9-4344-b0ae-5a59ec2ad8ad") : secret "metrics-tls" not found Mar 13 01:13:19.999299 master-0 kubenswrapper[7110]: I0313 01:13:19.998046 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs\") pod \"multus-admission-controller-8d675b596-tq7n6\" (UID: \"2bd94289-7109-4419-9a51-bd289082b9f5\") " pod="openshift-multus/multus-admission-controller-8d675b596-tq7n6" Mar 13 01:13:19.999299 master-0 kubenswrapper[7110]: I0313 01:13:19.998195 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs\") pod \"network-metrics-daemon-zh5fh\" (UID: \"e68ab3cb-c372-45d9-a758-beaf4c213714\") " pod="openshift-multus/network-metrics-daemon-zh5fh" Mar 13 01:13:19.999299 master-0 kubenswrapper[7110]: I0313 01:13:19.998241 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gt69\" (UniqueName: \"kubernetes.io/projected/4738c93d-62e6-44ce-a289-e646b9302e71-kube-api-access-9gt69\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5" Mar 13 01:13:19.999299 master-0 kubenswrapper[7110]: I0313 01:13:19.998297 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-hostroot\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:19.999299 master-0 
kubenswrapper[7110]: I0313 01:13:19.998338 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d56480e0-0885-41e5-a1fc-931a068fbadb-available-featuregates\") pod \"openshift-config-operator-64488f9d78-bqmmf\" (UID: \"d56480e0-0885-41e5-a1fc-931a068fbadb\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" Mar 13 01:13:19.999299 master-0 kubenswrapper[7110]: I0313 01:13:19.998454 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" Mar 13 01:13:19.999299 master-0 kubenswrapper[7110]: I0313 01:13:19.998460 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d56480e0-0885-41e5-a1fc-931a068fbadb-available-featuregates\") pod \"openshift-config-operator-64488f9d78-bqmmf\" (UID: \"d56480e0-0885-41e5-a1fc-931a068fbadb\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" Mar 13 01:13:19.999299 master-0 kubenswrapper[7110]: I0313 01:13:19.998503 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f97819d0-2840-4352-a435-19ef1a8c22c9-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-jjdk8\" (UID: \"f97819d0-2840-4352-a435-19ef1a8c22c9\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8" Mar 13 01:13:19.999299 master-0 kubenswrapper[7110]: E0313 01:13:19.998543 7110 secret.go:189] Couldn't get secret 
openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found Mar 13 01:13:19.999299 master-0 kubenswrapper[7110]: I0313 01:13:19.998567 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-run-openvswitch\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:19.999299 master-0 kubenswrapper[7110]: E0313 01:13:19.998576 7110 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 13 01:13:19.999299 master-0 kubenswrapper[7110]: E0313 01:13:19.998588 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cluster-baremetal-operator-tls podName:8c6bf2d5-1881-4b63-b247-7e7426707fa1 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:20.498575232 +0000 UTC m=+1.783601718 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-47sjr" (UID: "8c6bf2d5-1881-4b63-b247-7e7426707fa1") : secret "cluster-baremetal-operator-tls" not found Mar 13 01:13:19.999299 master-0 kubenswrapper[7110]: I0313 01:13:19.998703 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-run-ovn-kubernetes\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:19.999299 master-0 kubenswrapper[7110]: I0313 01:13:19.998751 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-ovnkube-config\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:19.999299 master-0 kubenswrapper[7110]: E0313 01:13:19.998813 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f97819d0-2840-4352-a435-19ef1a8c22c9-image-registry-operator-tls podName:f97819d0-2840-4352-a435-19ef1a8c22c9 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:20.498796428 +0000 UTC m=+1.783822904 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/f97819d0-2840-4352-a435-19ef1a8c22c9-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-jjdk8" (UID: "f97819d0-2840-4352-a435-19ef1a8c22c9") : secret "image-registry-operator-tls" not found Mar 13 01:13:19.999299 master-0 kubenswrapper[7110]: I0313 01:13:19.998844 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-zc6gt\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt" Mar 13 01:13:19.999299 master-0 kubenswrapper[7110]: I0313 01:13:19.998965 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-zc6gt\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt" Mar 13 01:13:19.999299 master-0 kubenswrapper[7110]: E0313 01:13:19.999063 7110 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 13 01:13:19.999299 master-0 kubenswrapper[7110]: I0313 01:13:19.999086 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-ovnkube-config\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:19.999299 master-0 kubenswrapper[7110]: E0313 01:13:19.999107 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-metrics-tls 
podName:13fac7b0-ce55-467d-9d0c-6a122d87cb3c nodeName:}" failed. No retries permitted until 2026-03-13 01:13:20.499095596 +0000 UTC m=+1.784122082 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-metrics-tls") pod "dns-operator-589895fbb7-qvl2k" (UID: "13fac7b0-ce55-467d-9d0c-6a122d87cb3c") : secret "metrics-tls" not found Mar 13 01:13:19.999299 master-0 kubenswrapper[7110]: I0313 01:13:19.999186 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-metrics-tls\") pod \"dns-operator-589895fbb7-qvl2k\" (UID: \"13fac7b0-ce55-467d-9d0c-6a122d87cb3c\") " pod="openshift-dns-operator/dns-operator-589895fbb7-qvl2k" Mar 13 01:13:19.999299 master-0 kubenswrapper[7110]: I0313 01:13:19.999226 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-systemd-units\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:19.999299 master-0 kubenswrapper[7110]: I0313 01:13:19.999307 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-nrzpj\" (UID: \"0d4e6150-432c-4a11-b5a6-4d62dd701fc8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj" Mar 13 01:13:20.001045 master-0 kubenswrapper[7110]: E0313 01:13:19.999385 7110 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 13 
01:13:20.001045 master-0 kubenswrapper[7110]: I0313 01:13:19.999421 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/bfc49699-9428-4bff-804d-da0e60551759-host-etc-kube\") pod \"network-operator-7c649bf6d4-bdc4j\" (UID: \"bfc49699-9428-4bff-804d-da0e60551759\") " pod="openshift-network-operator/network-operator-7c649bf6d4-bdc4j" Mar 13 01:13:20.001045 master-0 kubenswrapper[7110]: E0313 01:13:19.999445 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert podName:0d4e6150-432c-4a11-b5a6-4d62dd701fc8 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:20.499421356 +0000 UTC m=+1.784447862 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-nrzpj" (UID: "0d4e6150-432c-4a11-b5a6-4d62dd701fc8") : secret "package-server-manager-serving-cert" not found Mar 13 01:13:20.001045 master-0 kubenswrapper[7110]: I0313 01:13:19.999527 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/bfc49699-9428-4bff-804d-da0e60551759-host-etc-kube\") pod \"network-operator-7c649bf6d4-bdc4j\" (UID: \"bfc49699-9428-4bff-804d-da0e60551759\") " pod="openshift-network-operator/network-operator-7c649bf6d4-bdc4j" Mar 13 01:13:20.005497 master-0 kubenswrapper[7110]: I0313 01:13:20.005464 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Mar 13 01:13:20.024809 master-0 kubenswrapper[7110]: I0313 01:13:20.024746 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 13 
01:13:20.045106 master-0 kubenswrapper[7110]: I0313 01:13:20.045071 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Mar 13 01:13:20.054926 master-0 kubenswrapper[7110]: I0313 01:13:20.054893 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca2fa86b-a966-49dc-8577-d2b54b111d14-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-6fbfc8dc8f-c2xl8\" (UID: \"ca2fa86b-a966-49dc-8577-d2b54b111d14\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-c2xl8" Mar 13 01:13:20.064758 master-0 kubenswrapper[7110]: I0313 01:13:20.064693 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config" Mar 13 01:13:20.068969 master-0 kubenswrapper[7110]: I0313 01:13:20.068903 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/4738c93d-62e6-44ce-a289-e646b9302e71-whereabouts-configmap\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5" Mar 13 01:13:20.085908 master-0 kubenswrapper[7110]: I0313 01:13:20.085869 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 13 01:13:20.100961 master-0 kubenswrapper[7110]: I0313 01:13:20.100847 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-run-multus-certs\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:20.100961 master-0 kubenswrapper[7110]: I0313 01:13:20.100937 7110 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-run-multus-certs\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:20.101109 master-0 kubenswrapper[7110]: I0313 01:13:20.100966 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:20.101109 master-0 kubenswrapper[7110]: I0313 01:13:20.101004 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-multus-cni-dir\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:20.101109 master-0 kubenswrapper[7110]: I0313 01:13:20.101035 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-etc-kubernetes\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:20.101109 master-0 kubenswrapper[7110]: I0313 01:13:20.101050 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:20.101109 master-0 kubenswrapper[7110]: 
I0313 01:13:20.101107 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-multus-cni-dir\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:20.101311 master-0 kubenswrapper[7110]: I0313 01:13:20.101167 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs\") pod \"multus-admission-controller-8d675b596-tq7n6\" (UID: \"2bd94289-7109-4419-9a51-bd289082b9f5\") " pod="openshift-multus/multus-admission-controller-8d675b596-tq7n6" Mar 13 01:13:20.101311 master-0 kubenswrapper[7110]: I0313 01:13:20.101231 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs\") pod \"network-metrics-daemon-zh5fh\" (UID: \"e68ab3cb-c372-45d9-a758-beaf4c213714\") " pod="openshift-multus/network-metrics-daemon-zh5fh" Mar 13 01:13:20.101311 master-0 kubenswrapper[7110]: I0313 01:13:20.101234 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-etc-kubernetes\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:20.101311 master-0 kubenswrapper[7110]: I0313 01:13:20.101276 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-hostroot\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:20.101445 master-0 kubenswrapper[7110]: I0313 01:13:20.101357 7110 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-hostroot\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:20.101489 master-0 kubenswrapper[7110]: E0313 01:13:20.101457 7110 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 13 01:13:20.101527 master-0 kubenswrapper[7110]: I0313 01:13:20.101515 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-run-openvswitch\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:20.101569 master-0 kubenswrapper[7110]: I0313 01:13:20.101537 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-run-ovn-kubernetes\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:20.101608 master-0 kubenswrapper[7110]: I0313 01:13:20.101572 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-run-ovn-kubernetes\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:20.101608 master-0 kubenswrapper[7110]: E0313 01:13:20.101577 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs podName:e68ab3cb-c372-45d9-a758-beaf4c213714 nodeName:}" failed. 
No retries permitted until 2026-03-13 01:13:20.601551387 +0000 UTC m=+1.886577863 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs") pod "network-metrics-daemon-zh5fh" (UID: "e68ab3cb-c372-45d9-a758-beaf4c213714") : secret "metrics-daemon-secret" not found Mar 13 01:13:20.101736 master-0 kubenswrapper[7110]: I0313 01:13:20.101590 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-run-openvswitch\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:20.101736 master-0 kubenswrapper[7110]: I0313 01:13:20.101666 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-systemd-units\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:20.101736 master-0 kubenswrapper[7110]: I0313 01:13:20.101715 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" Mar 13 01:13:20.101736 master-0 kubenswrapper[7110]: I0313 01:13:20.101733 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4738c93d-62e6-44ce-a289-e646b9302e71-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " 
pod="openshift-multus/multus-additional-cni-plugins-xn5t5" Mar 13 01:13:20.101873 master-0 kubenswrapper[7110]: I0313 01:13:20.101767 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-node-log\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:20.101873 master-0 kubenswrapper[7110]: I0313 01:13:20.101799 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-var-lib-cni-multus\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:20.101873 master-0 kubenswrapper[7110]: I0313 01:13:20.101816 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-kubelet\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:20.101873 master-0 kubenswrapper[7110]: I0313 01:13:20.101838 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-run-netns\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:20.101873 master-0 kubenswrapper[7110]: I0313 01:13:20.101865 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-multus-socket-dir-parent\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " 
pod="openshift-multus/multus-rvt5h" Mar 13 01:13:20.102156 master-0 kubenswrapper[7110]: I0313 01:13:20.101944 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-node-log\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:20.102370 master-0 kubenswrapper[7110]: I0313 01:13:20.102337 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-log-socket\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:20.102437 master-0 kubenswrapper[7110]: I0313 01:13:20.102380 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-multus-socket-dir-parent\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:20.102437 master-0 kubenswrapper[7110]: I0313 01:13:20.102397 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-var-lib-kubelet\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:20.102437 master-0 kubenswrapper[7110]: I0313 01:13:20.102416 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-var-lib-cni-multus\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:20.102437 master-0 
kubenswrapper[7110]: I0313 01:13:20.102436 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-cni-netd\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:20.102704 master-0 kubenswrapper[7110]: I0313 01:13:20.102457 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-run-netns\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:20.102704 master-0 kubenswrapper[7110]: I0313 01:13:20.102472 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-os-release\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:20.102704 master-0 kubenswrapper[7110]: I0313 01:13:20.102484 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-log-socket\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:20.102704 master-0 kubenswrapper[7110]: I0313 01:13:20.102500 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4738c93d-62e6-44ce-a289-e646b9302e71-os-release\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5" Mar 13 01:13:20.102704 master-0 kubenswrapper[7110]: I0313 01:13:20.102440 7110 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-kubelet\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:20.102704 master-0 kubenswrapper[7110]: I0313 01:13:20.102544 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-var-lib-kubelet\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:20.102704 master-0 kubenswrapper[7110]: I0313 01:13:20.102574 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4738c93d-62e6-44ce-a289-e646b9302e71-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5" Mar 13 01:13:20.102704 master-0 kubenswrapper[7110]: I0313 01:13:20.102657 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-run-systemd\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:20.102704 master-0 kubenswrapper[7110]: I0313 01:13:20.102698 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnxgm\" (UniqueName: \"kubernetes.io/projected/d5456c8b-3c98-4824-8700-a04e9c12fb2e-kube-api-access-mnxgm\") pod \"network-check-target-xs8pt\" (UID: \"d5456c8b-3c98-4824-8700-a04e9c12fb2e\") " pod="openshift-network-diagnostics/network-check-target-xs8pt" Mar 13 01:13:20.103084 master-0 kubenswrapper[7110]: I0313 01:13:20.102720 7110 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-run-netns\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:20.103084 master-0 kubenswrapper[7110]: I0313 01:13:20.102744 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4738c93d-62e6-44ce-a289-e646b9302e71-os-release\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5" Mar 13 01:13:20.103084 master-0 kubenswrapper[7110]: I0313 01:13:20.102773 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-run-netns\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:20.103084 master-0 kubenswrapper[7110]: I0313 01:13:20.102777 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-run-ovn\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:20.103084 master-0 kubenswrapper[7110]: I0313 01:13:20.102806 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-var-lib-openvswitch\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:20.103084 master-0 kubenswrapper[7110]: I0313 01:13:20.102812 7110 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-os-release\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:20.103084 master-0 kubenswrapper[7110]: I0313 01:13:20.102825 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-etc-openvswitch\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:20.103084 master-0 kubenswrapper[7110]: I0313 01:13:20.102841 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-cni-netd\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:20.103084 master-0 kubenswrapper[7110]: E0313 01:13:20.102849 7110 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 13 01:13:20.103084 master-0 kubenswrapper[7110]: I0313 01:13:20.102868 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-run-ovn\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:20.103084 master-0 kubenswrapper[7110]: I0313 01:13:20.102871 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-cni-bin\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:20.103084 master-0 kubenswrapper[7110]: I0313 01:13:20.102846 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-cni-bin\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:20.103084 master-0 kubenswrapper[7110]: E0313 01:13:20.102891 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-apiservice-cert podName:4c5174b9-ca9e-4917-ab3a-ca403ce4f017 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:20.602875674 +0000 UTC m=+1.887902130 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-4m9c9" (UID: "4c5174b9-ca9e-4917-ab3a-ca403ce4f017") : secret "performance-addon-operator-webhook-cert" not found Mar 13 01:13:20.103084 master-0 kubenswrapper[7110]: I0313 01:13:20.102907 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-var-lib-openvswitch\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:20.103084 master-0 kubenswrapper[7110]: I0313 01:13:20.102911 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-run-k8s-cni-cncf-io\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:20.103084 master-0 kubenswrapper[7110]: I0313 
01:13:20.102890 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-systemd-units\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:20.103084 master-0 kubenswrapper[7110]: I0313 01:13:20.102935 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-multus-conf-dir\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:20.103084 master-0 kubenswrapper[7110]: I0313 01:13:20.102957 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-run-k8s-cni-cncf-io\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:20.103084 master-0 kubenswrapper[7110]: I0313 01:13:20.102976 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-etc-openvswitch\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:20.103084 master-0 kubenswrapper[7110]: I0313 01:13:20.103002 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-multus-conf-dir\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:20.103084 master-0 kubenswrapper[7110]: I0313 01:13:20.103019 7110 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-slash\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:20.103084 master-0 kubenswrapper[7110]: I0313 01:13:20.103045 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-cnibin\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:20.103084 master-0 kubenswrapper[7110]: I0313 01:13:20.103066 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert\") pod \"olm-operator-d64cfc9db-8l7kq\" (UID: \"6c88187c-d011-4043-a6d3-4a8a7ec4e204\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq" Mar 13 01:13:20.103084 master-0 kubenswrapper[7110]: I0313 01:13:20.103099 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-slash\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:20.104042 master-0 kubenswrapper[7110]: I0313 01:13:20.103130 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-var-lib-cni-bin\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:20.104042 master-0 kubenswrapper[7110]: I0313 01:13:20.103150 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/4738c93d-62e6-44ce-a289-e646b9302e71-system-cni-dir\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5" Mar 13 01:13:20.104042 master-0 kubenswrapper[7110]: E0313 01:13:20.103154 7110 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 13 01:13:20.104042 master-0 kubenswrapper[7110]: E0313 01:13:20.103202 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert podName:6c88187c-d011-4043-a6d3-4a8a7ec4e204 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:20.603191343 +0000 UTC m=+1.888217819 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert") pod "olm-operator-d64cfc9db-8l7kq" (UID: "6c88187c-d011-4043-a6d3-4a8a7ec4e204") : secret "olm-operator-serving-cert" not found Mar 13 01:13:20.104042 master-0 kubenswrapper[7110]: I0313 01:13:20.103206 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4738c93d-62e6-44ce-a289-e646b9302e71-system-cni-dir\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5" Mar 13 01:13:20.104042 master-0 kubenswrapper[7110]: I0313 01:13:20.102915 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-run-systemd\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:20.104042 master-0 kubenswrapper[7110]: I0313 01:13:20.103222 7110 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" Mar 13 01:13:20.104042 master-0 kubenswrapper[7110]: I0313 01:13:20.103240 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-var-lib-cni-bin\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:20.104042 master-0 kubenswrapper[7110]: I0313 01:13:20.103253 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/46662e51-44af-4732-83a1-9509a579b373-host-slash\") pod \"iptables-alerter-qclwv\" (UID: \"46662e51-44af-4732-83a1-9509a579b373\") " pod="openshift-network-operator/iptables-alerter-qclwv" Mar 13 01:13:20.104042 master-0 kubenswrapper[7110]: I0313 01:13:20.103279 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4738c93d-62e6-44ce-a289-e646b9302e71-cnibin\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5" Mar 13 01:13:20.104042 master-0 kubenswrapper[7110]: I0313 01:13:20.103312 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-system-cni-dir\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:20.104042 master-0 
kubenswrapper[7110]: E0313 01:13:20.103322 7110 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 13 01:13:20.104042 master-0 kubenswrapper[7110]: E0313 01:13:20.103358 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-node-tuning-operator-tls podName:4c5174b9-ca9e-4917-ab3a-ca403ce4f017 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:20.603346397 +0000 UTC m=+1.888372863 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-4m9c9" (UID: "4c5174b9-ca9e-4917-ab3a-ca403ce4f017") : secret "node-tuning-operator-tls" not found Mar 13 01:13:20.104042 master-0 kubenswrapper[7110]: I0313 01:13:20.103280 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-cnibin\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:20.104042 master-0 kubenswrapper[7110]: I0313 01:13:20.103381 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4738c93d-62e6-44ce-a289-e646b9302e71-cnibin\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5" Mar 13 01:13:20.104042 master-0 kubenswrapper[7110]: I0313 01:13:20.103411 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/46662e51-44af-4732-83a1-9509a579b373-host-slash\") pod \"iptables-alerter-qclwv\" (UID: \"46662e51-44af-4732-83a1-9509a579b373\") " 
pod="openshift-network-operator/iptables-alerter-qclwv" Mar 13 01:13:20.104042 master-0 kubenswrapper[7110]: I0313 01:13:20.103421 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-system-cni-dir\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:20.105133 master-0 kubenswrapper[7110]: I0313 01:13:20.105097 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 13 01:13:20.116096 master-0 kubenswrapper[7110]: I0313 01:13:20.116062 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-ovn-node-metrics-cert\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:20.125739 master-0 kubenswrapper[7110]: I0313 01:13:20.125706 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 13 01:13:20.145030 master-0 kubenswrapper[7110]: I0313 01:13:20.144972 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 13 01:13:20.147734 master-0 kubenswrapper[7110]: I0313 01:13:20.147684 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-ovnkube-script-lib\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:20.165600 master-0 kubenswrapper[7110]: I0313 01:13:20.165540 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 
13 01:13:20.167926 master-0 kubenswrapper[7110]: I0313 01:13:20.167897 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4738c93d-62e6-44ce-a289-e646b9302e71-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5"
Mar 13 01:13:20.185012 master-0 kubenswrapper[7110]: I0313 01:13:20.184963 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 13 01:13:20.191743 master-0 kubenswrapper[7110]: E0313 01:13:20.191694 7110 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 13 01:13:20.191850 master-0 kubenswrapper[7110]: E0313 01:13:20.191749 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs podName:2bd94289-7109-4419-9a51-bd289082b9f5 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:20.691733653 +0000 UTC m=+1.976760119 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs") pod "multus-admission-controller-8d675b596-tq7n6" (UID: "2bd94289-7109-4419-9a51-bd289082b9f5") : secret "multus-admission-controller-secret" not found
Mar 13 01:13:20.205839 master-0 kubenswrapper[7110]: I0313 01:13:20.205782 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 13 01:13:20.216357 master-0 kubenswrapper[7110]: I0313 01:13:20.216306 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/46662e51-44af-4732-83a1-9509a579b373-iptables-alerter-script\") pod \"iptables-alerter-qclwv\" (UID: \"46662e51-44af-4732-83a1-9509a579b373\") " pod="openshift-network-operator/iptables-alerter-qclwv"
Mar 13 01:13:20.277105 master-0 kubenswrapper[7110]: I0313 01:13:20.276608 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 01:13:20.279268 master-0 kubenswrapper[7110]: I0313 01:13:20.279201 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz7v9\" (UniqueName: \"kubernetes.io/projected/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-kube-api-access-bz7v9\") pod \"cluster-monitoring-operator-674cbfbd9d-2tr2t\" (UID: \"7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t"
Mar 13 01:13:20.293077 master-0 kubenswrapper[7110]: I0313 01:13:20.293002 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxctn\" (UniqueName: \"kubernetes.io/projected/78d2cd80-23b9-426d-a7ac-1daa27668a47-kube-api-access-mxctn\") pod \"marketplace-operator-64bf9778cb-dszg5\" (UID: \"78d2cd80-23b9-426d-a7ac-1daa27668a47\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5"
Mar 13 01:13:20.297755 master-0 kubenswrapper[7110]: I0313 01:13:20.297670 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdzjn\" (UniqueName: \"kubernetes.io/projected/bfc49699-9428-4bff-804d-da0e60551759-kube-api-access-zdzjn\") pod \"network-operator-7c649bf6d4-bdc4j\" (UID: \"bfc49699-9428-4bff-804d-da0e60551759\") " pod="openshift-network-operator/network-operator-7c649bf6d4-bdc4j"
Mar 13 01:13:20.315581 master-0 kubenswrapper[7110]: I0313 01:13:20.315522 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22587300-2448-4862-9fd8-68197d17a9f2-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-wbmqn\" (UID: \"22587300-2448-4862-9fd8-68197d17a9f2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-wbmqn"
Mar 13 01:13:20.346958 master-0 kubenswrapper[7110]: I0313 01:13:20.346900 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrf2s\" (UniqueName: \"kubernetes.io/projected/61fb4b86-f978-4ae1-80bc-18d2f386cbc2-kube-api-access-lrf2s\") pod \"cluster-olm-operator-77899cf6d-ck7rt\" (UID: \"61fb4b86-f978-4ae1-80bc-18d2f386cbc2\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-ck7rt"
Mar 13 01:13:20.359363 master-0 kubenswrapper[7110]: I0313 01:13:20.359252 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg7x6\" (UniqueName: \"kubernetes.io/projected/916d9fc9-388b-4506-a17c-36a7f626356a-kube-api-access-jg7x6\") pod \"kube-storage-version-migrator-operator-7f65c457f5-v9nfg\" (UID: \"916d9fc9-388b-4506-a17c-36a7f626356a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-v9nfg"
Mar 13 01:13:20.379973 master-0 kubenswrapper[7110]: I0313 01:13:20.379914 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn6w7\" (UniqueName: \"kubernetes.io/projected/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-kube-api-access-gn6w7\") pod \"package-server-manager-854648ff6d-nrzpj\" (UID: \"0d4e6150-432c-4a11-b5a6-4d62dd701fc8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj"
Mar 13 01:13:20.399065 master-0 kubenswrapper[7110]: I0313 01:13:20.398982 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4chtg\" (UniqueName: \"kubernetes.io/projected/f9b713fb-64ce-4a01-951c-1f31df62e1ae-kube-api-access-4chtg\") pod \"authentication-operator-7c6989d6c4-bxqp2\" (UID: \"f9b713fb-64ce-4a01-951c-1f31df62e1ae\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2"
Mar 13 01:13:20.424512 master-0 kubenswrapper[7110]: I0313 01:13:20.424464 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxf58\" (UniqueName: \"kubernetes.io/projected/8c6bf2d5-1881-4b63-b247-7e7426707fa1-kube-api-access-vxf58\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr"
Mar 13 01:13:20.440507 master-0 kubenswrapper[7110]: I0313 01:13:20.440468 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f97819d0-2840-4352-a435-19ef1a8c22c9-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-jjdk8\" (UID: \"f97819d0-2840-4352-a435-19ef1a8c22c9\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8"
Mar 13 01:13:20.460178 master-0 kubenswrapper[7110]: I0313 01:13:20.460146 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh2bv\" (UniqueName: \"kubernetes.io/projected/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-kube-api-access-wh2bv\") pod \"dns-operator-589895fbb7-qvl2k\" (UID: \"13fac7b0-ce55-467d-9d0c-6a122d87cb3c\") " pod="openshift-dns-operator/dns-operator-589895fbb7-qvl2k"
Mar 13 01:13:20.462922 master-0 kubenswrapper[7110]: I0313 01:13:20.462266 7110 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 13 01:13:20.475407 master-0 kubenswrapper[7110]: I0313 01:13:20.475356 7110 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 01:13:20.483025 master-0 kubenswrapper[7110]: I0313 01:13:20.482989 7110 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 01:13:20.499678 master-0 kubenswrapper[7110]: I0313 01:13:20.492888 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/58035e42-37d8-48f6-9861-9b4ce6014119-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-v9pv6\" (UID: \"58035e42-37d8-48f6-9861-9b4ce6014119\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-v9pv6"
Mar 13 01:13:20.500539 master-0 kubenswrapper[7110]: I0313 01:13:20.500492 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkdbm\" (UniqueName: \"kubernetes.io/projected/039acb44-a9b3-4ad6-a091-be4d18edc34f-kube-api-access-kkdbm\") pod \"openshift-controller-manager-operator-8565d84698-sslxh\" (UID: \"039acb44-a9b3-4ad6-a091-be4d18edc34f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-sslxh"
Mar 13 01:13:20.520137 master-0 kubenswrapper[7110]: I0313 01:13:20.519977 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr"
Mar 13 01:13:20.520313 master-0 kubenswrapper[7110]: E0313 01:13:20.520085 7110 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found
Mar 13 01:13:20.520313 master-0 kubenswrapper[7110]: E0313 01:13:20.520240 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cert podName:8c6bf2d5-1881-4b63-b247-7e7426707fa1 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:21.520224527 +0000 UTC m=+2.805250993 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cert") pod "cluster-baremetal-operator-5cdb4c5598-47sjr" (UID: "8c6bf2d5-1881-4b63-b247-7e7426707fa1") : secret "cluster-baremetal-webhook-server-cert" not found
Mar 13 01:13:20.520313 master-0 kubenswrapper[7110]: E0313 01:13:20.520269 7110 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 13 01:13:20.520313 master-0 kubenswrapper[7110]: I0313 01:13:20.520187 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-dszg5\" (UID: \"78d2cd80-23b9-426d-a7ac-1daa27668a47\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5"
Mar 13 01:13:20.520555 master-0 kubenswrapper[7110]: E0313 01:13:20.520333 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics podName:78d2cd80-23b9-426d-a7ac-1daa27668a47 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:21.520315 +0000 UTC m=+2.805341506 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-dszg5" (UID: "78d2cd80-23b9-426d-a7ac-1daa27668a47") : secret "marketplace-operator-metrics" not found
Mar 13 01:13:20.520555 master-0 kubenswrapper[7110]: I0313 01:13:20.520408 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-metrics-tls\") pod \"ingress-operator-677db989d6-kdn2l\" (UID: \"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l"
Mar 13 01:13:20.520555 master-0 kubenswrapper[7110]: I0313 01:13:20.520504 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f97819d0-2840-4352-a435-19ef1a8c22c9-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-jjdk8\" (UID: \"f97819d0-2840-4352-a435-19ef1a8c22c9\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8"
Mar 13 01:13:20.520555 master-0 kubenswrapper[7110]: I0313 01:13:20.520550 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr"
Mar 13 01:13:20.520814 master-0 kubenswrapper[7110]: E0313 01:13:20.520660 7110 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 13 01:13:20.520814 master-0 kubenswrapper[7110]: E0313 01:13:20.520697 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f97819d0-2840-4352-a435-19ef1a8c22c9-image-registry-operator-tls podName:f97819d0-2840-4352-a435-19ef1a8c22c9 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:21.52068425 +0000 UTC m=+2.805710716 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/f97819d0-2840-4352-a435-19ef1a8c22c9-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-jjdk8" (UID: "f97819d0-2840-4352-a435-19ef1a8c22c9") : secret "image-registry-operator-tls" not found
Mar 13 01:13:20.520814 master-0 kubenswrapper[7110]: E0313 01:13:20.520749 7110 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 13 01:13:20.520814 master-0 kubenswrapper[7110]: E0313 01:13:20.520767 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-metrics-tls podName:70c097a1-90d9-4344-b0ae-5a59ec2ad8ad nodeName:}" failed. No retries permitted until 2026-03-13 01:13:21.520761272 +0000 UTC m=+2.805787738 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-metrics-tls") pod "ingress-operator-677db989d6-kdn2l" (UID: "70c097a1-90d9-4344-b0ae-5a59ec2ad8ad") : secret "metrics-tls" not found
Mar 13 01:13:20.520814 master-0 kubenswrapper[7110]: E0313 01:13:20.520798 7110 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found
Mar 13 01:13:20.520814 master-0 kubenswrapper[7110]: E0313 01:13:20.520815 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cluster-baremetal-operator-tls podName:8c6bf2d5-1881-4b63-b247-7e7426707fa1 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:21.520809244 +0000 UTC m=+2.805835710 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-47sjr" (UID: "8c6bf2d5-1881-4b63-b247-7e7426707fa1") : secret "cluster-baremetal-operator-tls" not found
Mar 13 01:13:20.521150 master-0 kubenswrapper[7110]: E0313 01:13:20.520925 7110 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Mar 13 01:13:20.521150 master-0 kubenswrapper[7110]: E0313 01:13:20.520948 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-metrics-tls podName:13fac7b0-ce55-467d-9d0c-6a122d87cb3c nodeName:}" failed. No retries permitted until 2026-03-13 01:13:21.520942037 +0000 UTC m=+2.805968503 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-metrics-tls") pod "dns-operator-589895fbb7-qvl2k" (UID: "13fac7b0-ce55-467d-9d0c-6a122d87cb3c") : secret "metrics-tls" not found
Mar 13 01:13:20.524214 master-0 kubenswrapper[7110]: I0313 01:13:20.522539 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-metrics-tls\") pod \"dns-operator-589895fbb7-qvl2k\" (UID: \"13fac7b0-ce55-467d-9d0c-6a122d87cb3c\") " pod="openshift-dns-operator/dns-operator-589895fbb7-qvl2k"
Mar 13 01:13:20.524214 master-0 kubenswrapper[7110]: I0313 01:13:20.522769 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-nrzpj\" (UID: \"0d4e6150-432c-4a11-b5a6-4d62dd701fc8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj"
Mar 13 01:13:20.524214 master-0 kubenswrapper[7110]: E0313 01:13:20.522906 7110 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 13 01:13:20.524214 master-0 kubenswrapper[7110]: E0313 01:13:20.522969 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert podName:0d4e6150-432c-4a11-b5a6-4d62dd701fc8 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:21.522949734 +0000 UTC m=+2.807976240 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-nrzpj" (UID: "0d4e6150-432c-4a11-b5a6-4d62dd701fc8") : secret "package-server-manager-serving-cert" not found
Mar 13 01:13:20.524214 master-0 kubenswrapper[7110]: I0313 01:13:20.523034 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-2tr2t\" (UID: \"7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t"
Mar 13 01:13:20.524214 master-0 kubenswrapper[7110]: E0313 01:13:20.523135 7110 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 13 01:13:20.524214 master-0 kubenswrapper[7110]: I0313 01:13:20.523143 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-6slg8\" (UID: \"4976e608-07a0-4cef-8fdd-7cec3324b4b5\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8"
Mar 13 01:13:20.524214 master-0 kubenswrapper[7110]: E0313 01:13:20.523163 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls podName:7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:21.523155279 +0000 UTC m=+2.808181745 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-2tr2t" (UID: "7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71") : secret "cluster-monitoring-operator-tls" not found
Mar 13 01:13:20.524214 master-0 kubenswrapper[7110]: E0313 01:13:20.523257 7110 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: secret "mco-proxy-tls" not found
Mar 13 01:13:20.524214 master-0 kubenswrapper[7110]: E0313 01:13:20.523300 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls podName:4976e608-07a0-4cef-8fdd-7cec3324b4b5 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:21.523286303 +0000 UTC m=+2.808312799 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls") pod "machine-config-operator-fdb5c78b5-6slg8" (UID: "4976e608-07a0-4cef-8fdd-7cec3324b4b5") : secret "mco-proxy-tls" not found
Mar 13 01:13:20.524214 master-0 kubenswrapper[7110]: I0313 01:13:20.523543 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert\") pod \"cluster-version-operator-745944c6b7-zc6gt\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt"
Mar 13 01:13:20.524214 master-0 kubenswrapper[7110]: E0313 01:13:20.523721 7110 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 13 01:13:20.524214 master-0 kubenswrapper[7110]: E0313 01:13:20.523775 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert podName:bbf0bd4d-3387-43c3-b9d5-61a044fa2138 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:21.523760676 +0000 UTC m=+2.808787182 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert") pod "cluster-version-operator-745944c6b7-zc6gt" (UID: "bbf0bd4d-3387-43c3-b9d5-61a044fa2138") : secret "cluster-version-operator-serving-cert" not found
Mar 13 01:13:20.524214 master-0 kubenswrapper[7110]: I0313 01:13:20.523951 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert\") pod \"catalog-operator-7d9c49f57b-h46pz\" (UID: \"7f35cc1e-3376-4dbd-b215-2a32bf62cc71\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz"
Mar 13 01:13:20.524214 master-0 kubenswrapper[7110]: E0313 01:13:20.524056 7110 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 13 01:13:20.524214 master-0 kubenswrapper[7110]: E0313 01:13:20.524103 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert podName:7f35cc1e-3376-4dbd-b215-2a32bf62cc71 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:21.524090685 +0000 UTC m=+2.809117181 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert") pod "catalog-operator-7d9c49f57b-h46pz" (UID: "7f35cc1e-3376-4dbd-b215-2a32bf62cc71") : secret "catalog-operator-serving-cert" not found
Mar 13 01:13:20.529787 master-0 kubenswrapper[7110]: I0313 01:13:20.529714 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzklz\" (UniqueName: \"kubernetes.io/projected/f97819d0-2840-4352-a435-19ef1a8c22c9-kube-api-access-fzklz\") pod \"cluster-image-registry-operator-86d6d77c7c-jjdk8\" (UID: \"f97819d0-2840-4352-a435-19ef1a8c22c9\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8"
Mar 13 01:13:20.546671 master-0 kubenswrapper[7110]: I0313 01:13:20.546604 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-bound-sa-token\") pod \"ingress-operator-677db989d6-kdn2l\" (UID: \"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l"
Mar 13 01:13:20.558676 master-0 kubenswrapper[7110]: I0313 01:13:20.558601 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zd92\" (UniqueName: \"kubernetes.io/projected/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-kube-api-access-5zd92\") pod \"catalog-operator-7d9c49f57b-h46pz\" (UID: \"7f35cc1e-3376-4dbd-b215-2a32bf62cc71\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz"
Mar 13 01:13:20.597342 master-0 kubenswrapper[7110]: I0313 01:13:20.597245 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz8jz\" (UniqueName: \"kubernetes.io/projected/ae44526f-5858-42a0-ba77-3a22f171456f-kube-api-access-mz8jz\") pod \"csi-snapshot-controller-operator-5685fbc7d-7nstm\" (UID: \"ae44526f-5858-42a0-ba77-3a22f171456f\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-7nstm"
Mar 13 01:13:20.597926 master-0 kubenswrapper[7110]: I0313 01:13:20.597872 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2m48\" (UniqueName: \"kubernetes.io/projected/4976e608-07a0-4cef-8fdd-7cec3324b4b5-kube-api-access-w2m48\") pod \"machine-config-operator-fdb5c78b5-6slg8\" (UID: \"4976e608-07a0-4cef-8fdd-7cec3324b4b5\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8"
Mar 13 01:13:20.617520 master-0 kubenswrapper[7110]: I0313 01:13:20.617350 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-kube-api-access\") pod \"cluster-version-operator-745944c6b7-zc6gt\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt"
Mar 13 01:13:20.624699 master-0 kubenswrapper[7110]: I0313 01:13:20.624651 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert\") pod \"olm-operator-d64cfc9db-8l7kq\" (UID: \"6c88187c-d011-4043-a6d3-4a8a7ec4e204\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq"
Mar 13 01:13:20.624837 master-0 kubenswrapper[7110]: I0313 01:13:20.624706 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9"
Mar 13 01:13:20.624837 master-0 kubenswrapper[7110]: E0313 01:13:20.624807 7110 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 13 01:13:20.624952 master-0 kubenswrapper[7110]: E0313 01:13:20.624862 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert podName:6c88187c-d011-4043-a6d3-4a8a7ec4e204 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:21.624847135 +0000 UTC m=+2.909873611 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert") pod "olm-operator-d64cfc9db-8l7kq" (UID: "6c88187c-d011-4043-a6d3-4a8a7ec4e204") : secret "olm-operator-serving-cert" not found
Mar 13 01:13:20.624952 master-0 kubenswrapper[7110]: I0313 01:13:20.624932 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs\") pod \"network-metrics-daemon-zh5fh\" (UID: \"e68ab3cb-c372-45d9-a758-beaf4c213714\") " pod="openshift-multus/network-metrics-daemon-zh5fh"
Mar 13 01:13:20.624952 master-0 kubenswrapper[7110]: E0313 01:13:20.624942 7110 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 13 01:13:20.625132 master-0 kubenswrapper[7110]: E0313 01:13:20.625020 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-node-tuning-operator-tls podName:4c5174b9-ca9e-4917-ab3a-ca403ce4f017 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:21.624989249 +0000 UTC m=+2.910015755 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-4m9c9" (UID: "4c5174b9-ca9e-4917-ab3a-ca403ce4f017") : secret "node-tuning-operator-tls" not found
Mar 13 01:13:20.625132 master-0 kubenswrapper[7110]: E0313 01:13:20.625030 7110 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 13 01:13:20.625132 master-0 kubenswrapper[7110]: E0313 01:13:20.625056 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs podName:e68ab3cb-c372-45d9-a758-beaf4c213714 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:21.625047761 +0000 UTC m=+2.910074247 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs") pod "network-metrics-daemon-zh5fh" (UID: "e68ab3cb-c372-45d9-a758-beaf4c213714") : secret "metrics-daemon-secret" not found
Mar 13 01:13:20.625132 master-0 kubenswrapper[7110]: I0313 01:13:20.625091 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9"
Mar 13 01:13:20.625388 master-0 kubenswrapper[7110]: E0313 01:13:20.625326 7110 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 13 01:13:20.625496 master-0 kubenswrapper[7110]: E0313 01:13:20.625417 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-apiservice-cert podName:4c5174b9-ca9e-4917-ab3a-ca403ce4f017 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:21.625391831 +0000 UTC m=+2.910418297 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-4m9c9" (UID: "4c5174b9-ca9e-4917-ab3a-ca403ce4f017") : secret "performance-addon-operator-webhook-cert" not found
Mar 13 01:13:20.649356 master-0 kubenswrapper[7110]: I0313 01:13:20.649277 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29w76\" (UniqueName: \"kubernetes.io/projected/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-kube-api-access-29w76\") pod \"ingress-operator-677db989d6-kdn2l\" (UID: \"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l"
Mar 13 01:13:20.668478 master-0 kubenswrapper[7110]: I0313 01:13:20.668424 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhcll\" (UniqueName: \"kubernetes.io/projected/95d4e785-6663-417d-b380-6905773613c8-kube-api-access-nhcll\") pod \"service-ca-operator-69b6fc6b88-2v42g\" (UID: \"95d4e785-6663-417d-b380-6905773613c8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-2v42g"
Mar 13 01:13:20.676701 master-0 kubenswrapper[7110]: I0313 01:13:20.676606 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg54c\" (UniqueName: \"kubernetes.io/projected/21cbea73-f779-43e4-b5ba-d6fa06275d34-kube-api-access-wg54c\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: \"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj"
Mar 13 01:13:20.697482 master-0 kubenswrapper[7110]: I0313 01:13:20.697407 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74tvv\" (UniqueName: \"kubernetes.io/projected/486c7e33-3dd8-4a98-87e3-8216ee2e05ef-kube-api-access-74tvv\") pod \"openshift-apiserver-operator-799b6db4d7-mvmt2\" (UID: \"486c7e33-3dd8-4a98-87e3-8216ee2e05ef\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-mvmt2"
Mar 13 01:13:20.711701 master-0 kubenswrapper[7110]: E0313 01:13:20.711660 7110 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-scheduler-master-0\" already exists" pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 13 01:13:20.711701 master-0 kubenswrapper[7110]: I0313 01:13:20.711678 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 01:13:20.716510 master-0 kubenswrapper[7110]: I0313 01:13:20.716471 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 01:13:20.726461 master-0 kubenswrapper[7110]: I0313 01:13:20.726419 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs\") pod \"multus-admission-controller-8d675b596-tq7n6\" (UID: \"2bd94289-7109-4419-9a51-bd289082b9f5\") " pod="openshift-multus/multus-admission-controller-8d675b596-tq7n6"
Mar 13 01:13:20.726598 master-0 kubenswrapper[7110]: E0313 01:13:20.726569 7110 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 13 01:13:20.726690 master-0 kubenswrapper[7110]: E0313 01:13:20.726665 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs podName:2bd94289-7109-4419-9a51-bd289082b9f5 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:21.726624134 +0000 UTC m=+3.011650600 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs") pod "multus-admission-controller-8d675b596-tq7n6" (UID: "2bd94289-7109-4419-9a51-bd289082b9f5") : secret "multus-admission-controller-secret" not found
Mar 13 01:13:20.735259 master-0 kubenswrapper[7110]: E0313 01:13:20.735222 7110 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-apiserver-master-0\" already exists" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 01:13:20.752416 master-0 kubenswrapper[7110]: E0313 01:13:20.752336 7110 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-controller-manager-master-0\" already exists" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 01:13:20.775839 master-0 kubenswrapper[7110]: W0313 01:13:20.775781 7110 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost")
Mar 13 01:13:20.776251 master-0 kubenswrapper[7110]: E0313 01:13:20.776206 7110 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0-master-0\" already exists" pod="openshift-etcd/etcd-master-0-master-0"
Mar 13 01:13:20.794786 master-0 kubenswrapper[7110]: E0313 01:13:20.794745 7110 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 13 01:13:20.828570 master-0 kubenswrapper[7110]: I0313 01:13:20.828521 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tfdv\" (UniqueName: \"kubernetes.io/projected/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-kube-api-access-6tfdv\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct"
Mar 13 01:13:20.837667 master-0 kubenswrapper[7110]: I0313 01:13:20.837563 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx5mx\" (UniqueName: \"kubernetes.io/projected/2bd94289-7109-4419-9a51-bd289082b9f5-kube-api-access-fx5mx\") pod \"multus-admission-controller-8d675b596-tq7n6\" (UID: \"2bd94289-7109-4419-9a51-bd289082b9f5\") " pod="openshift-multus/multus-admission-controller-8d675b596-tq7n6"
Mar 13 01:13:20.873952 master-0 kubenswrapper[7110]: I0313 01:13:20.873825 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwm5w\" (UniqueName: \"kubernetes.io/projected/ca2fa86b-a966-49dc-8577-d2b54b111d14-kube-api-access-gwm5w\") pod \"cluster-storage-operator-6fbfc8dc8f-c2xl8\" (UID: \"ca2fa86b-a966-49dc-8577-d2b54b111d14\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-c2xl8"
Mar 13 01:13:20.885981 master-0 kubenswrapper[7110]: I0313 01:13:20.885948 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mnf5\" (UniqueName: \"kubernetes.io/projected/6c88187c-d011-4043-a6d3-4a8a7ec4e204-kube-api-access-7mnf5\") pod \"olm-operator-d64cfc9db-8l7kq\" (UID: \"6c88187c-d011-4043-a6d3-4a8a7ec4e204\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq"
Mar 13 01:13:20.901159 master-0 kubenswrapper[7110]: I0313 01:13:20.901098 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmvs5\" (UniqueName: \"kubernetes.io/projected/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-kube-api-access-mmvs5\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9"
Mar 13 01:13:20.920239 master-0 kubenswrapper[7110]: I0313 01:13:20.920179 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-mlslx\" (UID: \"b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mlslx"
Mar 13 01:13:20.959620 master-0 kubenswrapper[7110]: I0313 01:13:20.951794 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srlst\" (UniqueName: \"kubernetes.io/projected/d1153bb3-30dd-458f-b0a4-c05358a8b3f8-kube-api-access-srlst\") pod \"network-node-identity-znqwc\" (UID: \"d1153bb3-30dd-458f-b0a4-c05358a8b3f8\") " pod="openshift-network-node-identity/network-node-identity-znqwc"
Mar 13 01:13:20.960432 master-0 kubenswrapper[7110]: I0313 01:13:20.960373 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fppkf\" (UniqueName: \"kubernetes.io/projected/d56480e0-0885-41e5-a1fc-931a068fbadb-kube-api-access-fppkf\") pod \"openshift-config-operator-64488f9d78-bqmmf\" (UID: \"d56480e0-0885-41e5-a1fc-931a068fbadb\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf"
Mar 13 01:13:20.979341 master-0 kubenswrapper[7110]: I0313 01:13:20.979279 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-zjhjj\" (UniqueName: \"kubernetes.io/projected/e68ab3cb-c372-45d9-a758-beaf4c213714-kube-api-access-zjhjj\") pod \"network-metrics-daemon-zh5fh\" (UID: \"e68ab3cb-c372-45d9-a758-beaf4c213714\") " pod="openshift-multus/network-metrics-daemon-zh5fh" Mar 13 01:13:21.003953 master-0 kubenswrapper[7110]: I0313 01:13:21.003908 7110 request.go:700] Waited for 1.00596546s due to client-side throttling, not priority and fairness, request: POST:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-ovn-kubernetes/serviceaccounts/ovn-kubernetes-control-plane/token Mar 13 01:13:21.006953 master-0 kubenswrapper[7110]: I0313 01:13:21.006920 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5n7m\" (UniqueName: \"kubernetes.io/projected/46662e51-44af-4732-83a1-9509a579b373-kube-api-access-m5n7m\") pod \"iptables-alerter-qclwv\" (UID: \"46662e51-44af-4732-83a1-9509a579b373\") " pod="openshift-network-operator/iptables-alerter-qclwv" Mar 13 01:13:21.019800 master-0 kubenswrapper[7110]: I0313 01:13:21.019760 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv2rb\" (UniqueName: \"kubernetes.io/projected/1308fba1-a50d-48b3-b272-7bef44727b7f-kube-api-access-zv2rb\") pod \"ovnkube-control-plane-66b55d57d-cjmvd\" (UID: \"1308fba1-a50d-48b3-b272-7bef44727b7f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd" Mar 13 01:13:21.049114 master-0 kubenswrapper[7110]: I0313 01:13:21.049049 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spxfj\" (UniqueName: \"kubernetes.io/projected/2937cbe2-3125-4c3f-96f8-2febeb5942cc-kube-api-access-spxfj\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:13:21.073300 master-0 kubenswrapper[7110]: I0313 01:13:21.070116 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9gt69\" (UniqueName: \"kubernetes.io/projected/4738c93d-62e6-44ce-a289-e646b9302e71-kube-api-access-9gt69\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5" Mar 13 01:13:21.091440 master-0 kubenswrapper[7110]: I0313 01:13:21.091392 7110 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 13 01:13:21.096381 master-0 kubenswrapper[7110]: I0313 01:13:21.096334 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnxgm\" (UniqueName: \"kubernetes.io/projected/d5456c8b-3c98-4824-8700-a04e9c12fb2e-kube-api-access-mnxgm\") pod \"network-check-target-xs8pt\" (UID: \"d5456c8b-3c98-4824-8700-a04e9c12fb2e\") " pod="openshift-network-diagnostics/network-check-target-xs8pt" Mar 13 01:13:21.212181 master-0 kubenswrapper[7110]: E0313 01:13:21.212081 7110 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d11f13e867f4df046ca6789bb7273da5d0c08895b3dea00949c8a5458f9e22f9" Mar 13 01:13:21.212411 master-0 kubenswrapper[7110]: E0313 01:13:21.212309 7110 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-storage-version-migrator-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d11f13e867f4df046ca6789bb7273da5d0c08895b3dea00949c8a5458f9e22f9,Command:[cluster-kube-storage-version-migrator-operator 
start],Args:[--config=/var/run/configmaps/config/config.yaml],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cf9670d0f269f8d49fd9ef4981999be195f6624a4146aa93d9201eb8acc81053,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d11f13e867f4df046ca6789bb7273da5d0c08895b3dea00949c8a5458f9e22f9,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jg7x6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
kube-storage-version-migrator-operator-7f65c457f5-v9nfg_openshift-kube-storage-version-migrator-operator(916d9fc9-388b-4506-a17c-36a7f626356a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 01:13:21.214377 master-0 kubenswrapper[7110]: E0313 01:13:21.214322 7110 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-storage-version-migrator-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-v9nfg" podUID="916d9fc9-388b-4506-a17c-36a7f626356a" Mar 13 01:13:21.364706 master-0 kubenswrapper[7110]: I0313 01:13:21.363810 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xs8pt" Mar 13 01:13:21.543351 master-0 kubenswrapper[7110]: I0313 01:13:21.543249 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" Mar 13 01:13:21.543351 master-0 kubenswrapper[7110]: I0313 01:13:21.543323 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-dszg5\" (UID: \"78d2cd80-23b9-426d-a7ac-1daa27668a47\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" Mar 13 01:13:21.543351 master-0 kubenswrapper[7110]: I0313 01:13:21.543348 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-metrics-tls\") pod \"ingress-operator-677db989d6-kdn2l\" (UID: \"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l" Mar 13 01:13:21.543560 master-0 kubenswrapper[7110]: E0313 01:13:21.543480 7110 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 13 01:13:21.543560 master-0 kubenswrapper[7110]: E0313 01:13:21.543551 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cert podName:8c6bf2d5-1881-4b63-b247-7e7426707fa1 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:23.543529938 +0000 UTC m=+4.828556424 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cert") pod "cluster-baremetal-operator-5cdb4c5598-47sjr" (UID: "8c6bf2d5-1881-4b63-b247-7e7426707fa1") : secret "cluster-baremetal-webhook-server-cert" not found Mar 13 01:13:21.543625 master-0 kubenswrapper[7110]: E0313 01:13:21.543568 7110 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found Mar 13 01:13:21.543625 master-0 kubenswrapper[7110]: E0313 01:13:21.543620 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cluster-baremetal-operator-tls podName:8c6bf2d5-1881-4b63-b247-7e7426707fa1 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:23.54360381 +0000 UTC m=+4.828630356 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-47sjr" (UID: "8c6bf2d5-1881-4b63-b247-7e7426707fa1") : secret "cluster-baremetal-operator-tls" not found Mar 13 01:13:21.543701 master-0 kubenswrapper[7110]: E0313 01:13:21.543680 7110 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 13 01:13:21.543729 master-0 kubenswrapper[7110]: E0313 01:13:21.543703 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-metrics-tls podName:70c097a1-90d9-4344-b0ae-5a59ec2ad8ad nodeName:}" failed. No retries permitted until 2026-03-13 01:13:23.543694913 +0000 UTC m=+4.828721379 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-metrics-tls") pod "ingress-operator-677db989d6-kdn2l" (UID: "70c097a1-90d9-4344-b0ae-5a59ec2ad8ad") : secret "metrics-tls" not found Mar 13 01:13:21.543819 master-0 kubenswrapper[7110]: E0313 01:13:21.543734 7110 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 13 01:13:21.543819 master-0 kubenswrapper[7110]: E0313 01:13:21.543752 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics podName:78d2cd80-23b9-426d-a7ac-1daa27668a47 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:23.543747294 +0000 UTC m=+4.828773760 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-dszg5" (UID: "78d2cd80-23b9-426d-a7ac-1daa27668a47") : secret "marketplace-operator-metrics" not found Mar 13 01:13:21.543819 master-0 kubenswrapper[7110]: I0313 01:13:21.543494 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" Mar 13 01:13:21.543819 master-0 kubenswrapper[7110]: I0313 01:13:21.543774 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f97819d0-2840-4352-a435-19ef1a8c22c9-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-jjdk8\" (UID: \"f97819d0-2840-4352-a435-19ef1a8c22c9\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8" Mar 13 01:13:21.543819 master-0 kubenswrapper[7110]: I0313 01:13:21.543792 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-metrics-tls\") pod \"dns-operator-589895fbb7-qvl2k\" (UID: \"13fac7b0-ce55-467d-9d0c-6a122d87cb3c\") " pod="openshift-dns-operator/dns-operator-589895fbb7-qvl2k" Mar 13 01:13:21.543819 master-0 kubenswrapper[7110]: I0313 01:13:21.543809 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert\") pod 
\"package-server-manager-854648ff6d-nrzpj\" (UID: \"0d4e6150-432c-4a11-b5a6-4d62dd701fc8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj" Mar 13 01:13:21.543967 master-0 kubenswrapper[7110]: I0313 01:13:21.543827 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-2tr2t\" (UID: \"7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t" Mar 13 01:13:21.543967 master-0 kubenswrapper[7110]: I0313 01:13:21.543851 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-6slg8\" (UID: \"4976e608-07a0-4cef-8fdd-7cec3324b4b5\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8" Mar 13 01:13:21.543967 master-0 kubenswrapper[7110]: E0313 01:13:21.543929 7110 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 13 01:13:21.543967 master-0 kubenswrapper[7110]: E0313 01:13:21.543963 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f97819d0-2840-4352-a435-19ef1a8c22c9-image-registry-operator-tls podName:f97819d0-2840-4352-a435-19ef1a8c22c9 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:23.5439527 +0000 UTC m=+4.828979176 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/f97819d0-2840-4352-a435-19ef1a8c22c9-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-jjdk8" (UID: "f97819d0-2840-4352-a435-19ef1a8c22c9") : secret "image-registry-operator-tls" not found Mar 13 01:13:21.544066 master-0 kubenswrapper[7110]: E0313 01:13:21.543979 7110 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 13 01:13:21.544066 master-0 kubenswrapper[7110]: E0313 01:13:21.544003 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls podName:7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:23.543996581 +0000 UTC m=+4.829023047 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-2tr2t" (UID: "7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71") : secret "cluster-monitoring-operator-tls" not found Mar 13 01:13:21.544066 master-0 kubenswrapper[7110]: E0313 01:13:21.544038 7110 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: secret "mco-proxy-tls" not found Mar 13 01:13:21.544066 master-0 kubenswrapper[7110]: E0313 01:13:21.544053 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls podName:4976e608-07a0-4cef-8fdd-7cec3324b4b5 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:23.544048632 +0000 UTC m=+4.829075098 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls") pod "machine-config-operator-fdb5c78b5-6slg8" (UID: "4976e608-07a0-4cef-8fdd-7cec3324b4b5") : secret "mco-proxy-tls" not found Mar 13 01:13:21.544167 master-0 kubenswrapper[7110]: E0313 01:13:21.544101 7110 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 13 01:13:21.544167 master-0 kubenswrapper[7110]: E0313 01:13:21.544139 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert podName:0d4e6150-432c-4a11-b5a6-4d62dd701fc8 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:23.544128195 +0000 UTC m=+4.829154671 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-nrzpj" (UID: "0d4e6150-432c-4a11-b5a6-4d62dd701fc8") : secret "package-server-manager-serving-cert" not found Mar 13 01:13:21.544167 master-0 kubenswrapper[7110]: E0313 01:13:21.544158 7110 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 13 01:13:21.544247 master-0 kubenswrapper[7110]: E0313 01:13:21.544181 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-metrics-tls podName:13fac7b0-ce55-467d-9d0c-6a122d87cb3c nodeName:}" failed. No retries permitted until 2026-03-13 01:13:23.544174226 +0000 UTC m=+4.829200812 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-metrics-tls") pod "dns-operator-589895fbb7-qvl2k" (UID: "13fac7b0-ce55-467d-9d0c-6a122d87cb3c") : secret "metrics-tls" not found Mar 13 01:13:21.544247 master-0 kubenswrapper[7110]: I0313 01:13:21.544200 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert\") pod \"cluster-version-operator-745944c6b7-zc6gt\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt" Mar 13 01:13:21.544247 master-0 kubenswrapper[7110]: I0313 01:13:21.544229 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert\") pod \"catalog-operator-7d9c49f57b-h46pz\" (UID: \"7f35cc1e-3376-4dbd-b215-2a32bf62cc71\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz" Mar 13 01:13:21.544376 master-0 kubenswrapper[7110]: E0313 01:13:21.544358 7110 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 13 01:13:21.544412 master-0 kubenswrapper[7110]: E0313 01:13:21.544359 7110 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 13 01:13:21.544412 master-0 kubenswrapper[7110]: E0313 01:13:21.544389 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert podName:7f35cc1e-3376-4dbd-b215-2a32bf62cc71 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:23.544382652 +0000 UTC m=+4.829409118 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert") pod "catalog-operator-7d9c49f57b-h46pz" (UID: "7f35cc1e-3376-4dbd-b215-2a32bf62cc71") : secret "catalog-operator-serving-cert" not found Mar 13 01:13:21.545021 master-0 kubenswrapper[7110]: E0313 01:13:21.545001 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert podName:bbf0bd4d-3387-43c3-b9d5-61a044fa2138 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:23.544552937 +0000 UTC m=+4.829579423 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert") pod "cluster-version-operator-745944c6b7-zc6gt" (UID: "bbf0bd4d-3387-43c3-b9d5-61a044fa2138") : secret "cluster-version-operator-serving-cert" not found Mar 13 01:13:21.644929 master-0 kubenswrapper[7110]: I0313 01:13:21.644862 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert\") pod \"olm-operator-d64cfc9db-8l7kq\" (UID: \"6c88187c-d011-4043-a6d3-4a8a7ec4e204\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq" Mar 13 01:13:21.644929 master-0 kubenswrapper[7110]: I0313 01:13:21.644924 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" Mar 13 01:13:21.645205 master-0 kubenswrapper[7110]: I0313 01:13:21.644973 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs\") pod \"network-metrics-daemon-zh5fh\" (UID: \"e68ab3cb-c372-45d9-a758-beaf4c213714\") " pod="openshift-multus/network-metrics-daemon-zh5fh" Mar 13 01:13:21.645205 master-0 kubenswrapper[7110]: I0313 01:13:21.645056 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" Mar 13 01:13:21.645283 master-0 kubenswrapper[7110]: E0313 01:13:21.645213 7110 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 13 01:13:21.645283 master-0 kubenswrapper[7110]: E0313 01:13:21.645260 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-apiservice-cert podName:4c5174b9-ca9e-4917-ab3a-ca403ce4f017 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:23.645245275 +0000 UTC m=+4.930271741 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-4m9c9" (UID: "4c5174b9-ca9e-4917-ab3a-ca403ce4f017") : secret "performance-addon-operator-webhook-cert" not found Mar 13 01:13:21.645364 master-0 kubenswrapper[7110]: E0313 01:13:21.645307 7110 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 13 01:13:21.645364 master-0 kubenswrapper[7110]: E0313 01:13:21.645331 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert podName:6c88187c-d011-4043-a6d3-4a8a7ec4e204 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:23.645322897 +0000 UTC m=+4.930349363 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert") pod "olm-operator-d64cfc9db-8l7kq" (UID: "6c88187c-d011-4043-a6d3-4a8a7ec4e204") : secret "olm-operator-serving-cert" not found Mar 13 01:13:21.645444 master-0 kubenswrapper[7110]: E0313 01:13:21.645374 7110 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 13 01:13:21.645444 master-0 kubenswrapper[7110]: E0313 01:13:21.645398 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-node-tuning-operator-tls podName:4c5174b9-ca9e-4917-ab3a-ca403ce4f017 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:23.645391119 +0000 UTC m=+4.930417585 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-4m9c9" (UID: "4c5174b9-ca9e-4917-ab3a-ca403ce4f017") : secret "node-tuning-operator-tls" not found Mar 13 01:13:21.645444 master-0 kubenswrapper[7110]: E0313 01:13:21.645441 7110 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 13 01:13:21.645562 master-0 kubenswrapper[7110]: E0313 01:13:21.645462 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs podName:e68ab3cb-c372-45d9-a758-beaf4c213714 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:23.645455491 +0000 UTC m=+4.930481967 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs") pod "network-metrics-daemon-zh5fh" (UID: "e68ab3cb-c372-45d9-a758-beaf4c213714") : secret "metrics-daemon-secret" not found Mar 13 01:13:21.745693 master-0 kubenswrapper[7110]: I0313 01:13:21.745646 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs\") pod \"multus-admission-controller-8d675b596-tq7n6\" (UID: \"2bd94289-7109-4419-9a51-bd289082b9f5\") " pod="openshift-multus/multus-admission-controller-8d675b596-tq7n6" Mar 13 01:13:21.745929 master-0 kubenswrapper[7110]: E0313 01:13:21.745889 7110 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 13 01:13:21.746009 master-0 kubenswrapper[7110]: E0313 01:13:21.745988 7110 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs podName:2bd94289-7109-4419-9a51-bd289082b9f5 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:23.745962934 +0000 UTC m=+5.030989460 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs") pod "multus-admission-controller-8d675b596-tq7n6" (UID: "2bd94289-7109-4419-9a51-bd289082b9f5") : secret "multus-admission-controller-secret" not found Mar 13 01:13:21.835959 master-0 kubenswrapper[7110]: E0313 01:13:21.835855 7110 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7220d16ea511c0f0410cf45db45aaafcc64847c9cb5732ad1eff39ceb482cdba" Mar 13 01:13:21.836110 master-0 kubenswrapper[7110]: E0313 01:13:21.836011 7110 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:service-ca-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7220d16ea511c0f0410cf45db45aaafcc64847c9cb5732ad1eff39ceb482cdba,Command:[service-ca-operator operator],Args:[--config=/var/run/configmaps/config/operator-config.yaml -v=2],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7220d16ea511c0f0410cf45db45aaafcc64847c9cb5732ad1eff39ceb482cdba,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{83886080 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nhcll,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod service-ca-operator-69b6fc6b88-2v42g_openshift-service-ca-operator(95d4e785-6663-417d-b380-6905773613c8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 01:13:21.838186 master-0 kubenswrapper[7110]: E0313 01:13:21.838093 7110 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"service-ca-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-2v42g" podUID="95d4e785-6663-417d-b380-6905773613c8" Mar 13 01:13:22.469873 master-0 kubenswrapper[7110]: E0313 01:13:22.469808 7110 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: 
context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd0b71d620cf0acbfcd1b58797dc30050bd167cb6b7a7f62c8333dd370c76d5" Mar 13 01:13:22.470498 master-0 kubenswrapper[7110]: E0313 01:13:22.470148 7110 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cluster-storage-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd0b71d620cf0acbfcd1b58797dc30050bd167cb6b7a7f62c8333dd370c76d5,Command:[cluster-storage-operator start],Args:[start -v=2 --terminate-on-files=/var/run/secrets/serving-cert/tls.crt --terminate-on-files=/var/run/secrets/serving-cert/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:AWS_EBS_DRIVER_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d56c7f1d581c34f2ea2f67902eb71069e48523bdb5d964a9f6f63aa99f968876,ValueFrom:nil,},EnvVar{Name:AWS_EBS_DRIVER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06ef6dcf5e3f0f83ed90209f5d3b31dab1debd1049ec97ec92f4f800abea8b78,ValueFrom:nil,},EnvVar{Name:GCP_PD_DRIVER_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3bf7ebb7d731da7f28d37117f4a38c9ee300ffd76e1f237bc1aab40390bbeb1c,ValueFrom:nil,},EnvVar{Name:GCP_PD_DRIVER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf2ee91fb20b7873c456d2a45a997ad3e9bb9f9b879027e61fe4c413ae0d6449,ValueFrom:nil,},EnvVar{Name:OPENSTACK_CINDER_DRIVER_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fb0c6965c3634596b8bfb56605d8b3ffca300481045a3e03524c7a37f62e3875,ValueFrom:nil,},EnvVar{Name:OPENSTACK_CINDER_DRIVER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3c8eb332b349325e5adabd9c1dcce16ff8f0fdb42c6385841206b80f946192d8,ValueFrom:nil,},EnvVar{Name:OVIRT_DRIVER_
OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fc839f06b007c8a18ff270a4677e03bf095fe8750beceeb26fa1bc3c15063ba5,ValueFrom:nil,},EnvVar{Name:OVIRT_DRIVER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:961c2c0c561028c1b0ff7eb979519659bae2ad7ebeda8c31f9790dfff7bcf52c,ValueFrom:nil,},EnvVar{Name:MANILA_DRIVER_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb9a946195c27c7175b8d8edd8de889a89902b3bd07bb0ba2c6bc9f7facb87c,ValueFrom:nil,},EnvVar{Name:MANILA_DRIVER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cd655557446f01d07862cf3ec1a20ff68d6675efb7c485af5c51227444c38ffd,ValueFrom:nil,},EnvVar{Name:MANILA_NFS_DRIVER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3e30f12a259fe5a18431e487e5868c85753d7e22d44c34cfa9c47728a4ac95bc,ValueFrom:nil,},EnvVar{Name:PROVISIONER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:035e2d85907cee77c0fa3a52a6a65f07fd4175bb16072801c5dca7517d1298c9,ValueFrom:nil,},EnvVar{Name:ATTACHER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:81ea02ba3286c8748ae8607730de107584b888c0827f65569310957e6f73f7ff,ValueFrom:nil,},EnvVar{Name:RESIZER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:547a81df21302364dd9d6df89e6c1c665d02d891a2e3853f0747431605210186,ValueFrom:nil,},EnvVar{Name:SNAPSHOTTER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1eaf4783075f35c405b7f1eba3cde9bca9f707da7374315a64ccf764ecbbb47,ValueFrom:nil,},EnvVar{Name:NODE_DRIVER_REGISTRAR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1441b067af19141e16ea5589f525c1f99e6b4bcf91008fd480b517b251dd2dc1,ValueFrom:nil,},EnvVar{Name:LIVENESS_PROBE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:78078bbcd93ceab253456f3d551a382f6e8974f71f7c8d90aa1650aeb61065b7,ValueFrom:nil,},EnvVar{Name:VSPHERE_PROBLEM_DETECTOR_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5bc6341864a48b01600b126452dcceaeb95fe4cc951ac346ddc83a223e414cf3,ValueFrom:nil,},EnvVar{Name:AZURE_DISK_DRIVER_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2cc1330dd935fc183a3ded96f68f265a2f4c2e5ce3ea6838171d2c146c0e69af,ValueFrom:nil,},EnvVar{Name:AZURE_DISK_DRIVER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1757b0d6e876ded828d41fa93b19a7c739275ebfa17883654ff0442dba9bd643,ValueFrom:nil,},EnvVar{Name:AZURE_FILE_DRIVER_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10adf4af9eb4b3e9072b49f2298d0e5275604686d7b01a04c0a1bcb6fc19f291,ValueFrom:nil,},EnvVar{Name:AZURE_FILE_DRIVER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f8a21af582f1840c82bba649d5981193dd88a14595cd7fc37e5722b7178c8921,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501,ValueFrom:nil,},EnvVar{Name:VMWARE_VSPHERE_DRIVER_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f604982bcdbed79ab78ca987b7a394c2376873079e8dbf6eb987880c6675c69f,ValueFrom:nil,},EnvVar{Name:VMWARE_VSPHERE_DRIVER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f9f05c5864803e2582f288838678273d03e807b8b67d036a7cf378b187acc760,ValueFrom:nil,},EnvVar{Name:VMWARE_VSPHERE_SYNCER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ba412af618963fda7eed461b3934b00e62d05c87ced8bf2ec941e62e149808ad,ValueFrom:nil,},EnvVar{Name:CLUSTER_CLOUD_CONTROLLER_MANAGER_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d470dba32064cc62b2ab29303d6e00612304548262eaa2f4e5b40a00a26f71ce,ValueFrom:nil,},EnvVar{Name:IBM_VPC_BLOCK_DRIVER_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a5b86ea1a69ac634d7ed1282fe01b2330ee682a1c4d0fe5c4572f36b4d654ebc,ValueFrom:nil,},EnvVar{Name:IBM_VPC_BLOCK_DRIVER_IMAGE,Value:quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:4a1eb5fee11695dc15d201e6115f420d6e106e15e2e9982335ed8176b504d6e6,ValueFrom:nil,},EnvVar{Name:POWERVS_BLOCK_CSI_DRIVER_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5443f312d8a9a14766a3b82972589cb0b1623c9649be1e6df60f1aa96aa5592f,ValueFrom:nil,},EnvVar{Name:POWERVS_BLOCK_CSI_DRIVER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d36472a77899acc21925f5cf4ec07f11dbfaedf45b6f11623aa751921a5af823,ValueFrom:nil,},EnvVar{Name:TOOLS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35768a0c3eb24134dd38633e8acfc7db69ee96b2fd660e9bba3b8c996452fef7,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cluster-storage-operator-serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gwm5w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cluster-storage-operator-6fbfc8dc8f-c2xl8_openshift-cluster-storage-operator(ca2fa86b-a966-49dc-8577-d2b54b111d14): ErrImagePull: rpc 
error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 01:13:22.471648 master-0 kubenswrapper[7110]: E0313 01:13:22.471567 7110 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-storage-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-c2xl8" podUID="ca2fa86b-a966-49dc-8577-d2b54b111d14" Mar 13 01:13:22.986206 master-0 kubenswrapper[7110]: E0313 01:13:22.986115 7110 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd818e37e1f9dbe5393c557b89e81010d68171408e0e4157a3d92ae0ca1c953" Mar 13 01:13:22.986466 master-0 kubenswrapper[7110]: E0313 01:13:22.986376 7110 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 13 01:13:22.986466 master-0 kubenswrapper[7110]: container &Container{Name:authentication-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd818e37e1f9dbe5393c557b89e81010d68171408e0e4157a3d92ae0ca1c953,Command:[/bin/bash -ec],Args:[if [ -s /var/run/configmaps/trusted-ca-bundle/ca-bundle.crt ]; then Mar 13 01:13:22.986466 master-0 kubenswrapper[7110]: echo "Copying system trust bundle" Mar 13 01:13:22.986466 master-0 kubenswrapper[7110]: cp -f /var/run/configmaps/trusted-ca-bundle/ca-bundle.crt /etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem Mar 13 01:13:22.986466 master-0 kubenswrapper[7110]: fi Mar 13 01:13:22.986466 master-0 kubenswrapper[7110]: exec authentication-operator operator --config=/var/run/configmaps/config/operator-config.yaml --v=2 --terminate-on-files=/var/run/configmaps/trusted-ca-bundle/ca-bundle.crt --terminate-on-files=/tmp/terminate Mar 13 01:13:22.986466 master-0 kubenswrapper[7110]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:IMAGE_OAUTH_SERVER,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3d3571ade02a7c61123d62c53fda6a57031a52c058c0571759dc09f96b23978f,ValueFrom:nil,},EnvVar{Name:IMAGE_OAUTH_APISERVER,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5de69354d08184ecd6144facc1461777674674e8304971216d4cf1a5025472b9,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:OPERAND_OAUTH_SERVER_IMAGE_VERSION,Value:4.18.34_openshift,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{209715200 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:trusted-ca-bundle,ReadOnly:true,MountPath:/var/run/configmaps/trusted-ca-bundle,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:service-ca-bundle,ReadOnly:true,MountPath:/var/run/configmaps/service-ca-bundle,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4chtg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:healthz,Port:{0 8443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod authentication-operator-7c6989d6c4-bxqp2_openshift-authentication-operator(f9b713fb-64ce-4a01-951c-1f31df62e1ae): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled Mar 13 01:13:22.986466 master-0 kubenswrapper[7110]: > logger="UnhandledError" Mar 13 01:13:22.987628 master-0 kubenswrapper[7110]: E0313 01:13:22.987579 7110 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"authentication-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" podUID="f9b713fb-64ce-4a01-951c-1f31df62e1ae" Mar 13 01:13:23.570985 master-0 kubenswrapper[7110]: I0313 01:13:23.570939 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-6slg8\" (UID: \"4976e608-07a0-4cef-8fdd-7cec3324b4b5\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8" Mar 13 01:13:23.570985 master-0 
kubenswrapper[7110]: I0313 01:13:23.570989 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert\") pod \"cluster-version-operator-745944c6b7-zc6gt\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt" Mar 13 01:13:23.571452 master-0 kubenswrapper[7110]: I0313 01:13:23.571036 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert\") pod \"catalog-operator-7d9c49f57b-h46pz\" (UID: \"7f35cc1e-3376-4dbd-b215-2a32bf62cc71\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz" Mar 13 01:13:23.571452 master-0 kubenswrapper[7110]: E0313 01:13:23.571152 7110 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: secret "mco-proxy-tls" not found Mar 13 01:13:23.571452 master-0 kubenswrapper[7110]: E0313 01:13:23.571234 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls podName:4976e608-07a0-4cef-8fdd-7cec3324b4b5 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:27.571213628 +0000 UTC m=+8.856240094 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls") pod "machine-config-operator-fdb5c78b5-6slg8" (UID: "4976e608-07a0-4cef-8fdd-7cec3324b4b5") : secret "mco-proxy-tls" not found Mar 13 01:13:23.571584 master-0 kubenswrapper[7110]: I0313 01:13:23.571553 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" Mar 13 01:13:23.571621 master-0 kubenswrapper[7110]: I0313 01:13:23.571605 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-dszg5\" (UID: \"78d2cd80-23b9-426d-a7ac-1daa27668a47\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" Mar 13 01:13:23.571680 master-0 kubenswrapper[7110]: I0313 01:13:23.571661 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-metrics-tls\") pod \"ingress-operator-677db989d6-kdn2l\" (UID: \"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l" Mar 13 01:13:23.571903 master-0 kubenswrapper[7110]: E0313 01:13:23.571853 7110 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 13 01:13:23.571939 master-0 kubenswrapper[7110]: E0313 01:13:23.571917 7110 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret 
"cluster-version-operator-serving-cert" not found Mar 13 01:13:23.571972 master-0 kubenswrapper[7110]: I0313 01:13:23.571950 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" Mar 13 01:13:23.571972 master-0 kubenswrapper[7110]: E0313 01:13:23.571963 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert podName:7f35cc1e-3376-4dbd-b215-2a32bf62cc71 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:27.571935399 +0000 UTC m=+8.856961955 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert") pod "catalog-operator-7d9c49f57b-h46pz" (UID: "7f35cc1e-3376-4dbd-b215-2a32bf62cc71") : secret "catalog-operator-serving-cert" not found Mar 13 01:13:23.572032 master-0 kubenswrapper[7110]: E0313 01:13:23.571975 7110 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 13 01:13:23.572032 master-0 kubenswrapper[7110]: E0313 01:13:23.571986 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert podName:bbf0bd4d-3387-43c3-b9d5-61a044fa2138 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:27.57197693 +0000 UTC m=+8.857003516 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert") pod "cluster-version-operator-745944c6b7-zc6gt" (UID: "bbf0bd4d-3387-43c3-b9d5-61a044fa2138") : secret "cluster-version-operator-serving-cert" not found Mar 13 01:13:23.572032 master-0 kubenswrapper[7110]: E0313 01:13:23.572021 7110 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 13 01:13:23.572032 master-0 kubenswrapper[7110]: I0313 01:13:23.572023 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f97819d0-2840-4352-a435-19ef1a8c22c9-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-jjdk8\" (UID: \"f97819d0-2840-4352-a435-19ef1a8c22c9\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8" Mar 13 01:13:23.572142 master-0 kubenswrapper[7110]: E0313 01:13:23.572031 7110 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found Mar 13 01:13:23.572142 master-0 kubenswrapper[7110]: E0313 01:13:23.572077 7110 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 13 01:13:23.572142 master-0 kubenswrapper[7110]: E0313 01:13:23.572091 7110 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 13 01:13:23.572142 master-0 kubenswrapper[7110]: E0313 01:13:23.572040 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cert podName:8c6bf2d5-1881-4b63-b247-7e7426707fa1 nodeName:}" failed. 
No retries permitted until 2026-03-13 01:13:27.572032971 +0000 UTC m=+8.857059437 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cert") pod "cluster-baremetal-operator-5cdb4c5598-47sjr" (UID: "8c6bf2d5-1881-4b63-b247-7e7426707fa1") : secret "cluster-baremetal-webhook-server-cert" not found Mar 13 01:13:23.572142 master-0 kubenswrapper[7110]: I0313 01:13:23.572122 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-metrics-tls\") pod \"dns-operator-589895fbb7-qvl2k\" (UID: \"13fac7b0-ce55-467d-9d0c-6a122d87cb3c\") " pod="openshift-dns-operator/dns-operator-589895fbb7-qvl2k" Mar 13 01:13:23.572142 master-0 kubenswrapper[7110]: E0313 01:13:23.572136 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f97819d0-2840-4352-a435-19ef1a8c22c9-image-registry-operator-tls podName:f97819d0-2840-4352-a435-19ef1a8c22c9 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:27.572118454 +0000 UTC m=+8.857145010 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/f97819d0-2840-4352-a435-19ef1a8c22c9-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-jjdk8" (UID: "f97819d0-2840-4352-a435-19ef1a8c22c9") : secret "image-registry-operator-tls" not found Mar 13 01:13:23.572416 master-0 kubenswrapper[7110]: E0313 01:13:23.572159 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cluster-baremetal-operator-tls podName:8c6bf2d5-1881-4b63-b247-7e7426707fa1 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:27.572151215 +0000 UTC m=+8.857177801 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-47sjr" (UID: "8c6bf2d5-1881-4b63-b247-7e7426707fa1") : secret "cluster-baremetal-operator-tls" not found Mar 13 01:13:23.572416 master-0 kubenswrapper[7110]: I0313 01:13:23.572157 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-nrzpj\" (UID: \"0d4e6150-432c-4a11-b5a6-4d62dd701fc8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj" Mar 13 01:13:23.572416 master-0 kubenswrapper[7110]: E0313 01:13:23.572173 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics podName:78d2cd80-23b9-426d-a7ac-1daa27668a47 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:27.572166035 +0000 UTC m=+8.857192611 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-dszg5" (UID: "78d2cd80-23b9-426d-a7ac-1daa27668a47") : secret "marketplace-operator-metrics" not found Mar 13 01:13:23.572416 master-0 kubenswrapper[7110]: I0313 01:13:23.572192 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-2tr2t\" (UID: \"7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t" Mar 13 01:13:23.572416 master-0 kubenswrapper[7110]: E0313 01:13:23.572204 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-metrics-tls podName:70c097a1-90d9-4344-b0ae-5a59ec2ad8ad nodeName:}" failed. No retries permitted until 2026-03-13 01:13:27.572190006 +0000 UTC m=+8.857216562 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-metrics-tls") pod "ingress-operator-677db989d6-kdn2l" (UID: "70c097a1-90d9-4344-b0ae-5a59ec2ad8ad") : secret "metrics-tls" not found Mar 13 01:13:23.572416 master-0 kubenswrapper[7110]: E0313 01:13:23.572233 7110 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 13 01:13:23.572416 master-0 kubenswrapper[7110]: E0313 01:13:23.572262 7110 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 13 01:13:23.572416 master-0 kubenswrapper[7110]: E0313 01:13:23.572270 7110 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 13 01:13:23.572416 master-0 kubenswrapper[7110]: E0313 01:13:23.572296 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls podName:7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:27.572285038 +0000 UTC m=+8.857311604 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-2tr2t" (UID: "7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71") : secret "cluster-monitoring-operator-tls" not found Mar 13 01:13:23.572416 master-0 kubenswrapper[7110]: E0313 01:13:23.572326 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-metrics-tls podName:13fac7b0-ce55-467d-9d0c-6a122d87cb3c nodeName:}" failed. 
No retries permitted until 2026-03-13 01:13:27.572305809 +0000 UTC m=+8.857332405 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-metrics-tls") pod "dns-operator-589895fbb7-qvl2k" (UID: "13fac7b0-ce55-467d-9d0c-6a122d87cb3c") : secret "metrics-tls" not found Mar 13 01:13:23.572416 master-0 kubenswrapper[7110]: E0313 01:13:23.572344 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert podName:0d4e6150-432c-4a11-b5a6-4d62dd701fc8 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:27.57233446 +0000 UTC m=+8.857361056 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-nrzpj" (UID: "0d4e6150-432c-4a11-b5a6-4d62dd701fc8") : secret "package-server-manager-serving-cert" not found Mar 13 01:13:23.603139 master-0 kubenswrapper[7110]: E0313 01:13:23.603068 7110 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6088910bdc1583b275fab261e3234c0b63b4cc16d01bcea697b6a7f6db13bdf3" Mar 13 01:13:23.603345 master-0 kubenswrapper[7110]: E0313 01:13:23.603269 7110 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:etcd-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6088910bdc1583b275fab261e3234c0b63b4cc16d01bcea697b6a7f6db13bdf3,Command:[cluster-etcd-operator operator],Args:[--config=/var/run/configmaps/config/config.yaml --terminate-on-files=/var/run/secrets/serving-cert/tls.crt --terminate-on-files=/var/run/secrets/serving-cert/tls.key 
--terminate-on-files=/var/run/secrets/etcd-client/tls.crt --terminate-on-files=/var/run/secrets/etcd-client/tls.key --terminate-on-files=/var/run/configmaps/etcd-ca/ca-bundle.crt --terminate-on-files=/var/run/configmaps/etcd-service-ca/service-ca.crt],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6088910bdc1583b275fab261e3234c0b63b4cc16d01bcea697b6a7f6db13bdf3,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:OPENSHIFT_PROFILE,Value:web,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etcd-ca,ReadOnly:false,MountPath:/var/run/configmaps/etcd-ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etcd-service-ca,ReadOnly:false,MountPath:/var/run/configmaps/etcd-service-ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etcd-client,ReadOnly:false,MountPath:/var/run/secrets/etcd-client,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wg54c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:30,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
etcd-operator-5884b9cd56-h4kkj_openshift-etcd-operator(21cbea73-f779-43e4-b5ba-d6fa06275d34): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 01:13:23.604552 master-0 kubenswrapper[7110]: E0313 01:13:23.604500 7110 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"etcd-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj" podUID="21cbea73-f779-43e4-b5ba-d6fa06275d34" Mar 13 01:13:23.676232 master-0 kubenswrapper[7110]: I0313 01:13:23.674855 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" Mar 13 01:13:23.676232 master-0 kubenswrapper[7110]: I0313 01:13:23.674992 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert\") pod \"olm-operator-d64cfc9db-8l7kq\" (UID: \"6c88187c-d011-4043-a6d3-4a8a7ec4e204\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq" Mar 13 01:13:23.676232 master-0 kubenswrapper[7110]: I0313 01:13:23.675038 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" Mar 13 01:13:23.676232 master-0 kubenswrapper[7110]: 
I0313 01:13:23.675136 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs\") pod \"network-metrics-daemon-zh5fh\" (UID: \"e68ab3cb-c372-45d9-a758-beaf4c213714\") " pod="openshift-multus/network-metrics-daemon-zh5fh" Mar 13 01:13:23.676232 master-0 kubenswrapper[7110]: E0313 01:13:23.675343 7110 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 13 01:13:23.676232 master-0 kubenswrapper[7110]: E0313 01:13:23.675411 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs podName:e68ab3cb-c372-45d9-a758-beaf4c213714 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:27.675390554 +0000 UTC m=+8.960417030 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs") pod "network-metrics-daemon-zh5fh" (UID: "e68ab3cb-c372-45d9-a758-beaf4c213714") : secret "metrics-daemon-secret" not found Mar 13 01:13:23.676232 master-0 kubenswrapper[7110]: E0313 01:13:23.676106 7110 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 13 01:13:23.676232 master-0 kubenswrapper[7110]: E0313 01:13:23.676152 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-apiservice-cert podName:4c5174b9-ca9e-4917-ab3a-ca403ce4f017 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:27.676130505 +0000 UTC m=+8.961156981 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-4m9c9" (UID: "4c5174b9-ca9e-4917-ab3a-ca403ce4f017") : secret "performance-addon-operator-webhook-cert" not found Mar 13 01:13:23.679873 master-0 kubenswrapper[7110]: E0313 01:13:23.676791 7110 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 13 01:13:23.679873 master-0 kubenswrapper[7110]: E0313 01:13:23.676838 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert podName:6c88187c-d011-4043-a6d3-4a8a7ec4e204 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:27.676827104 +0000 UTC m=+8.961853580 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert") pod "olm-operator-d64cfc9db-8l7kq" (UID: "6c88187c-d011-4043-a6d3-4a8a7ec4e204") : secret "olm-operator-serving-cert" not found Mar 13 01:13:23.679873 master-0 kubenswrapper[7110]: E0313 01:13:23.676913 7110 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 13 01:13:23.679873 master-0 kubenswrapper[7110]: E0313 01:13:23.676946 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-node-tuning-operator-tls podName:4c5174b9-ca9e-4917-ab3a-ca403ce4f017 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:27.676931227 +0000 UTC m=+8.961957713 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-4m9c9" (UID: "4c5174b9-ca9e-4917-ab3a-ca403ce4f017") : secret "node-tuning-operator-tls" not found Mar 13 01:13:23.776558 master-0 kubenswrapper[7110]: I0313 01:13:23.776482 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs\") pod \"multus-admission-controller-8d675b596-tq7n6\" (UID: \"2bd94289-7109-4419-9a51-bd289082b9f5\") " pod="openshift-multus/multus-admission-controller-8d675b596-tq7n6" Mar 13 01:13:23.776800 master-0 kubenswrapper[7110]: E0313 01:13:23.776689 7110 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 13 01:13:23.776800 master-0 kubenswrapper[7110]: E0313 01:13:23.776757 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs podName:2bd94289-7109-4419-9a51-bd289082b9f5 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:27.776738491 +0000 UTC m=+9.061764977 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs") pod "multus-admission-controller-8d675b596-tq7n6" (UID: "2bd94289-7109-4419-9a51-bd289082b9f5") : secret "multus-admission-controller-secret" not found Mar 13 01:13:24.339577 master-0 kubenswrapper[7110]: I0313 01:13:24.339516 7110 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:13:24.347506 master-0 kubenswrapper[7110]: I0313 01:13:24.347413 7110 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:13:24.593131 master-0 kubenswrapper[7110]: E0313 01:13:24.592970 7110 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = reading blob sha256:4b0739386174541c30ab04f7d665f8b4bcc9cc5aba7df6ff75a3dab98a7fa789: Get \"https://quay.io/v2/openshift-release-dev/ocp-v4.0-art-dev/blobs/sha256:4b0739386174541c30ab04f7d665f8b4bcc9cc5aba7df6ff75a3dab98a7fa789\": context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff2db11ce277288befab25ddb86177e832842d2edb5607a2da8f252a030e1cfc" Mar 13 01:13:24.594049 master-0 kubenswrapper[7110]: E0313 01:13:24.593209 7110 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:copy-operator-controller-manifests,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff2db11ce277288befab25ddb86177e832842d2edb5607a2da8f252a030e1cfc,Command:[/bin/sh],Args:[-c cp -a /openshift/manifests 
/operand-assets/operator-controller],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:operand-assets,ReadOnly:false,MountPath:/operand-assets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lrf2s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000380000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cluster-olm-operator-77899cf6d-ck7rt_openshift-cluster-olm-operator(61fb4b86-f978-4ae1-80bc-18d2f386cbc2): ErrImagePull: rpc error: code = Canceled desc = reading blob sha256:4b0739386174541c30ab04f7d665f8b4bcc9cc5aba7df6ff75a3dab98a7fa789: Get \"https://quay.io/v2/openshift-release-dev/ocp-v4.0-art-dev/blobs/sha256:4b0739386174541c30ab04f7d665f8b4bcc9cc5aba7df6ff75a3dab98a7fa789\": context canceled" logger="UnhandledError" Mar 13 01:13:24.594513 master-0 kubenswrapper[7110]: E0313 01:13:24.594434 7110 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"copy-operator-controller-manifests\" with ErrImagePull: \"rpc error: code = Canceled desc = reading blob sha256:4b0739386174541c30ab04f7d665f8b4bcc9cc5aba7df6ff75a3dab98a7fa789: Get 
\\\"https://quay.io/v2/openshift-release-dev/ocp-v4.0-art-dev/blobs/sha256:4b0739386174541c30ab04f7d665f8b4bcc9cc5aba7df6ff75a3dab98a7fa789\\\": context canceled\"" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-ck7rt" podUID="61fb4b86-f978-4ae1-80bc-18d2f386cbc2" Mar 13 01:13:24.628908 master-0 kubenswrapper[7110]: E0313 01:13:24.628459 7110 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1575be013a898f153cbf012aeaf28ce720022f934dc05bdffbe479e30999d460" Mar 13 01:13:24.628908 master-0 kubenswrapper[7110]: E0313 01:13:24.628695 7110 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1575be013a898f153cbf012aeaf28ce720022f934dc05bdffbe479e30999d460,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m5n7m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-qclwv_openshift-network-operator(46662e51-44af-4732-83a1-9509a579b373): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 01:13:24.630706 master-0 kubenswrapper[7110]: E0313 01:13:24.630081 7110 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-network-operator/iptables-alerter-qclwv" podUID="46662e51-44af-4732-83a1-9509a579b373" Mar 13 01:13:25.111120 master-0 kubenswrapper[7110]: E0313 01:13:25.111041 7110 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4010a8f9d932615336227e2fd43325d4fa9025dca4bebe032106efea733fcfc3" Mar 13 01:13:25.111299 master-0 kubenswrapper[7110]: E0313 01:13:25.111251 7110 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:csi-snapshot-controller-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4010a8f9d932615336227e2fd43325d4fa9025dca4bebe032106efea733fcfc3,Command:[],Args:[start -v=2],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPERAND_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a85dab5856916220df6f05ce9d6aa10cd4fa0234093b55355246690bba05ad1,ValueFrom:nil,},EnvVar{Name:WEBHOOK_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5e9989ee0577e930adcd97085176343a881bf92537dda1bf0325a3b1faf96d6,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mz8jz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000160000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-snapshot-controller-operator-5685fbc7d-7nstm_openshift-cluster-storage-operator(ae44526f-5858-42a0-ba77-3a22f171456f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 01:13:25.112690 master-0 kubenswrapper[7110]: E0313 01:13:25.112656 7110 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"csi-snapshot-controller-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-7nstm" podUID="ae44526f-5858-42a0-ba77-3a22f171456f" Mar 13 01:13:25.296409 master-0 kubenswrapper[7110]: I0313 01:13:25.296086 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xs8pt"] Mar 13 01:13:26.016574 master-0 kubenswrapper[7110]: I0313 01:13:26.016260 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xs8pt" 
event={"ID":"d5456c8b-3c98-4824-8700-a04e9c12fb2e","Type":"ContainerStarted","Data":"4e35903107c5db9cc7e1ad31b326fabe4f51fe882ba6656d7c5cc78d9dd54e9b"} Mar 13 01:13:26.017600 master-0 kubenswrapper[7110]: I0313 01:13:26.016590 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xs8pt" event={"ID":"d5456c8b-3c98-4824-8700-a04e9c12fb2e","Type":"ContainerStarted","Data":"f8dd90e4b919a4750dacd366cb8ce8129d02c4f3f75302771450ca85e994151e"} Mar 13 01:13:26.019250 master-0 kubenswrapper[7110]: I0313 01:13:26.019200 7110 generic.go:334] "Generic (PLEG): container finished" podID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerID="77742db3e18b710fed8057a5ff63f6e99d45794674fb37f85d739e62dd3a751e" exitCode=0 Mar 13 01:13:26.019380 master-0 kubenswrapper[7110]: I0313 01:13:26.019257 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" event={"ID":"d56480e0-0885-41e5-a1fc-931a068fbadb","Type":"ContainerDied","Data":"77742db3e18b710fed8057a5ff63f6e99d45794674fb37f85d739e62dd3a751e"} Mar 13 01:13:26.024302 master-0 kubenswrapper[7110]: I0313 01:13:26.024250 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-sslxh" event={"ID":"039acb44-a9b3-4ad6-a091-be4d18edc34f","Type":"ContainerStarted","Data":"757884fd1e54a4728f490aa384fee80c41466484bac2e993d7373c9b6d19ad0a"} Mar 13 01:13:26.036411 master-0 kubenswrapper[7110]: I0313 01:13:26.036358 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-mvmt2" event={"ID":"486c7e33-3dd8-4a98-87e3-8216ee2e05ef","Type":"ContainerStarted","Data":"451217a595b413aec4246a9b014bde1e3a621bb8bc794b9a2470a8f43c1c8d3b"} Mar 13 01:13:26.466311 master-0 kubenswrapper[7110]: I0313 01:13:26.466259 7110 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:13:26.466585 master-0 kubenswrapper[7110]: I0313 01:13:26.466410 7110 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 01:13:26.475138 master-0 kubenswrapper[7110]: I0313 01:13:26.475102 7110 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:13:27.041071 master-0 kubenswrapper[7110]: I0313 01:13:27.041007 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-v9pv6" event={"ID":"58035e42-37d8-48f6-9861-9b4ce6014119","Type":"ContainerStarted","Data":"662b7543988e07c43f9b30d00fca727f77728c7aa21bd39d21414f56d158c6c9"} Mar 13 01:13:27.043537 master-0 kubenswrapper[7110]: I0313 01:13:27.043399 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-wbmqn" event={"ID":"22587300-2448-4862-9fd8-68197d17a9f2","Type":"ContainerStarted","Data":"0e6fdad2e1926f784b1c498cd01186eeb32850cc4a0f69925bc0668ef060c2a8"} Mar 13 01:13:27.043537 master-0 kubenswrapper[7110]: I0313 01:13:27.043462 7110 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 01:13:27.043537 master-0 kubenswrapper[7110]: I0313 01:13:27.043474 7110 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 01:13:27.052717 master-0 kubenswrapper[7110]: I0313 01:13:27.052260 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:13:27.126650 master-0 kubenswrapper[7110]: I0313 01:13:27.125952 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:27.150364 master-0 kubenswrapper[7110]: I0313 01:13:27.150310 7110 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:27.633884 master-0 kubenswrapper[7110]: I0313 01:13:27.633783 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert\") pod \"cluster-version-operator-745944c6b7-zc6gt\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt" Mar 13 01:13:27.634100 master-0 kubenswrapper[7110]: I0313 01:13:27.633903 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert\") pod \"catalog-operator-7d9c49f57b-h46pz\" (UID: \"7f35cc1e-3376-4dbd-b215-2a32bf62cc71\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz" Mar 13 01:13:27.634100 master-0 kubenswrapper[7110]: I0313 01:13:27.633935 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" Mar 13 01:13:27.634100 master-0 kubenswrapper[7110]: I0313 01:13:27.633983 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-dszg5\" (UID: \"78d2cd80-23b9-426d-a7ac-1daa27668a47\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" Mar 13 01:13:27.634100 master-0 kubenswrapper[7110]: E0313 01:13:27.633988 7110 secret.go:189] Couldn't get secret 
openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 13 01:13:27.634100 master-0 kubenswrapper[7110]: I0313 01:13:27.634019 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-metrics-tls\") pod \"ingress-operator-677db989d6-kdn2l\" (UID: \"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l" Mar 13 01:13:27.634100 master-0 kubenswrapper[7110]: E0313 01:13:27.634086 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert podName:bbf0bd4d-3387-43c3-b9d5-61a044fa2138 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:35.63406243 +0000 UTC m=+16.919088966 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert") pod "cluster-version-operator-745944c6b7-zc6gt" (UID: "bbf0bd4d-3387-43c3-b9d5-61a044fa2138") : secret "cluster-version-operator-serving-cert" not found Mar 13 01:13:27.634360 master-0 kubenswrapper[7110]: E0313 01:13:27.634167 7110 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 13 01:13:27.634360 master-0 kubenswrapper[7110]: E0313 01:13:27.634221 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics podName:78d2cd80-23b9-426d-a7ac-1daa27668a47 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:35.634205454 +0000 UTC m=+16.919231920 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-dszg5" (UID: "78d2cd80-23b9-426d-a7ac-1daa27668a47") : secret "marketplace-operator-metrics" not found Mar 13 01:13:27.634360 master-0 kubenswrapper[7110]: E0313 01:13:27.634218 7110 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 13 01:13:27.634360 master-0 kubenswrapper[7110]: E0313 01:13:27.634261 7110 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 13 01:13:27.634360 master-0 kubenswrapper[7110]: E0313 01:13:27.634275 7110 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 13 01:13:27.634360 master-0 kubenswrapper[7110]: E0313 01:13:27.634321 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cert podName:8c6bf2d5-1881-4b63-b247-7e7426707fa1 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:35.634293186 +0000 UTC m=+16.919319732 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cert") pod "cluster-baremetal-operator-5cdb4c5598-47sjr" (UID: "8c6bf2d5-1881-4b63-b247-7e7426707fa1") : secret "cluster-baremetal-webhook-server-cert" not found Mar 13 01:13:27.634360 master-0 kubenswrapper[7110]: E0313 01:13:27.634349 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert podName:7f35cc1e-3376-4dbd-b215-2a32bf62cc71 nodeName:}" failed. 
No retries permitted until 2026-03-13 01:13:35.634336987 +0000 UTC m=+16.919363583 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert") pod "catalog-operator-7d9c49f57b-h46pz" (UID: "7f35cc1e-3376-4dbd-b215-2a32bf62cc71") : secret "catalog-operator-serving-cert" not found Mar 13 01:13:27.634683 master-0 kubenswrapper[7110]: E0313 01:13:27.634449 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-metrics-tls podName:70c097a1-90d9-4344-b0ae-5a59ec2ad8ad nodeName:}" failed. No retries permitted until 2026-03-13 01:13:35.63443669 +0000 UTC m=+16.919463296 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-metrics-tls") pod "ingress-operator-677db989d6-kdn2l" (UID: "70c097a1-90d9-4344-b0ae-5a59ec2ad8ad") : secret "metrics-tls" not found Mar 13 01:13:27.634683 master-0 kubenswrapper[7110]: I0313 01:13:27.634440 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" Mar 13 01:13:27.634683 master-0 kubenswrapper[7110]: I0313 01:13:27.634488 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f97819d0-2840-4352-a435-19ef1a8c22c9-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-jjdk8\" (UID: \"f97819d0-2840-4352-a435-19ef1a8c22c9\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8" Mar 13 
01:13:27.634683 master-0 kubenswrapper[7110]: I0313 01:13:27.634511 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-metrics-tls\") pod \"dns-operator-589895fbb7-qvl2k\" (UID: \"13fac7b0-ce55-467d-9d0c-6a122d87cb3c\") " pod="openshift-dns-operator/dns-operator-589895fbb7-qvl2k" Mar 13 01:13:27.634683 master-0 kubenswrapper[7110]: I0313 01:13:27.634531 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-nrzpj\" (UID: \"0d4e6150-432c-4a11-b5a6-4d62dd701fc8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj" Mar 13 01:13:27.634683 master-0 kubenswrapper[7110]: I0313 01:13:27.634552 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-2tr2t\" (UID: \"7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t" Mar 13 01:13:27.634683 master-0 kubenswrapper[7110]: I0313 01:13:27.634578 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-6slg8\" (UID: \"4976e608-07a0-4cef-8fdd-7cec3324b4b5\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8" Mar 13 01:13:27.634683 master-0 kubenswrapper[7110]: E0313 01:13:27.634513 7110 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret 
"cluster-baremetal-operator-tls" not found Mar 13 01:13:27.634683 master-0 kubenswrapper[7110]: E0313 01:13:27.634674 7110 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 13 01:13:27.634683 master-0 kubenswrapper[7110]: E0313 01:13:27.634689 7110 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 13 01:13:27.634683 master-0 kubenswrapper[7110]: E0313 01:13:27.634704 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cluster-baremetal-operator-tls podName:8c6bf2d5-1881-4b63-b247-7e7426707fa1 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:35.634697698 +0000 UTC m=+16.919724164 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-47sjr" (UID: "8c6bf2d5-1881-4b63-b247-7e7426707fa1") : secret "cluster-baremetal-operator-tls" not found Mar 13 01:13:27.634683 master-0 kubenswrapper[7110]: E0313 01:13:27.634718 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert podName:0d4e6150-432c-4a11-b5a6-4d62dd701fc8 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:35.634711838 +0000 UTC m=+16.919738304 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-nrzpj" (UID: "0d4e6150-432c-4a11-b5a6-4d62dd701fc8") : secret "package-server-manager-serving-cert" not found Mar 13 01:13:27.634683 master-0 kubenswrapper[7110]: E0313 01:13:27.634746 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f97819d0-2840-4352-a435-19ef1a8c22c9-image-registry-operator-tls podName:f97819d0-2840-4352-a435-19ef1a8c22c9 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:35.634723578 +0000 UTC m=+16.919750074 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/f97819d0-2840-4352-a435-19ef1a8c22c9-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-jjdk8" (UID: "f97819d0-2840-4352-a435-19ef1a8c22c9") : secret "image-registry-operator-tls" not found Mar 13 01:13:27.634683 master-0 kubenswrapper[7110]: E0313 01:13:27.634604 7110 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 13 01:13:27.634683 master-0 kubenswrapper[7110]: E0313 01:13:27.634855 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-metrics-tls podName:13fac7b0-ce55-467d-9d0c-6a122d87cb3c nodeName:}" failed. No retries permitted until 2026-03-13 01:13:35.634844482 +0000 UTC m=+16.919871048 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-metrics-tls") pod "dns-operator-589895fbb7-qvl2k" (UID: "13fac7b0-ce55-467d-9d0c-6a122d87cb3c") : secret "metrics-tls" not found Mar 13 01:13:27.634683 master-0 kubenswrapper[7110]: E0313 01:13:27.634666 7110 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 13 01:13:27.634683 master-0 kubenswrapper[7110]: E0313 01:13:27.635029 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls podName:7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:35.635005226 +0000 UTC m=+16.920031732 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-2tr2t" (UID: "7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71") : secret "cluster-monitoring-operator-tls" not found Mar 13 01:13:27.634683 master-0 kubenswrapper[7110]: E0313 01:13:27.634806 7110 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: secret "mco-proxy-tls" not found Mar 13 01:13:27.634683 master-0 kubenswrapper[7110]: E0313 01:13:27.635077 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls podName:4976e608-07a0-4cef-8fdd-7cec3324b4b5 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:35.635064508 +0000 UTC m=+16.920091004 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls") pod "machine-config-operator-fdb5c78b5-6slg8" (UID: "4976e608-07a0-4cef-8fdd-7cec3324b4b5") : secret "mco-proxy-tls" not found Mar 13 01:13:27.708340 master-0 kubenswrapper[7110]: I0313 01:13:27.708295 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:27.737532 master-0 kubenswrapper[7110]: I0313 01:13:27.737486 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs\") pod \"network-metrics-daemon-zh5fh\" (UID: \"e68ab3cb-c372-45d9-a758-beaf4c213714\") " pod="openshift-multus/network-metrics-daemon-zh5fh" Mar 13 01:13:27.737787 master-0 kubenswrapper[7110]: E0313 01:13:27.737758 7110 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 13 01:13:27.737884 master-0 kubenswrapper[7110]: I0313 01:13:27.737835 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" Mar 13 01:13:27.737941 master-0 kubenswrapper[7110]: E0313 01:13:27.737852 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs podName:e68ab3cb-c372-45d9-a758-beaf4c213714 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:35.737830504 +0000 UTC m=+17.022856980 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs") pod "network-metrics-daemon-zh5fh" (UID: "e68ab3cb-c372-45d9-a758-beaf4c213714") : secret "metrics-daemon-secret" not found Mar 13 01:13:27.737980 master-0 kubenswrapper[7110]: E0313 01:13:27.737951 7110 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 13 01:13:27.738009 master-0 kubenswrapper[7110]: E0313 01:13:27.737992 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-apiservice-cert podName:4c5174b9-ca9e-4917-ab3a-ca403ce4f017 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:35.737981098 +0000 UTC m=+17.023007574 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-4m9c9" (UID: "4c5174b9-ca9e-4917-ab3a-ca403ce4f017") : secret "performance-addon-operator-webhook-cert" not found Mar 13 01:13:27.738113 master-0 kubenswrapper[7110]: I0313 01:13:27.738075 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert\") pod \"olm-operator-d64cfc9db-8l7kq\" (UID: \"6c88187c-d011-4043-a6d3-4a8a7ec4e204\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq" Mar 13 01:13:27.738168 master-0 kubenswrapper[7110]: I0313 01:13:27.738151 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: 
\"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" Mar 13 01:13:27.738290 master-0 kubenswrapper[7110]: E0313 01:13:27.738265 7110 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 13 01:13:27.738328 master-0 kubenswrapper[7110]: E0313 01:13:27.738281 7110 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 13 01:13:27.738328 master-0 kubenswrapper[7110]: E0313 01:13:27.738308 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-node-tuning-operator-tls podName:4c5174b9-ca9e-4917-ab3a-ca403ce4f017 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:35.738299497 +0000 UTC m=+17.023325973 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-4m9c9" (UID: "4c5174b9-ca9e-4917-ab3a-ca403ce4f017") : secret "node-tuning-operator-tls" not found Mar 13 01:13:27.738397 master-0 kubenswrapper[7110]: E0313 01:13:27.738369 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert podName:6c88187c-d011-4043-a6d3-4a8a7ec4e204 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:35.738347419 +0000 UTC m=+17.023373895 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert") pod "olm-operator-d64cfc9db-8l7kq" (UID: "6c88187c-d011-4043-a6d3-4a8a7ec4e204") : secret "olm-operator-serving-cert" not found Mar 13 01:13:27.741748 master-0 kubenswrapper[7110]: I0313 01:13:27.741718 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:13:27.839652 master-0 kubenswrapper[7110]: I0313 01:13:27.839549 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs\") pod \"multus-admission-controller-8d675b596-tq7n6\" (UID: \"2bd94289-7109-4419-9a51-bd289082b9f5\") " pod="openshift-multus/multus-admission-controller-8d675b596-tq7n6" Mar 13 01:13:27.840094 master-0 kubenswrapper[7110]: E0313 01:13:27.840062 7110 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 13 01:13:27.840213 master-0 kubenswrapper[7110]: E0313 01:13:27.840162 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs podName:2bd94289-7109-4419-9a51-bd289082b9f5 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:35.840142618 +0000 UTC m=+17.125169094 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs") pod "multus-admission-controller-8d675b596-tq7n6" (UID: "2bd94289-7109-4419-9a51-bd289082b9f5") : secret "multus-admission-controller-secret" not found Mar 13 01:13:28.051099 master-0 kubenswrapper[7110]: I0313 01:13:28.050957 7110 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 01:13:28.059708 master-0 kubenswrapper[7110]: I0313 01:13:28.055554 7110 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 01:13:28.068090 master-0 kubenswrapper[7110]: I0313 01:13:28.068049 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6f7fd6c796-d7hx2"] Mar 13 01:13:28.068227 master-0 kubenswrapper[7110]: E0313 01:13:28.068204 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abf2ead5-b97d-4160-8120-28cb8a3d843e" containerName="assisted-installer-controller" Mar 13 01:13:28.068227 master-0 kubenswrapper[7110]: I0313 01:13:28.068218 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="abf2ead5-b97d-4160-8120-28cb8a3d843e" containerName="assisted-installer-controller" Mar 13 01:13:28.068312 master-0 kubenswrapper[7110]: E0313 01:13:28.068229 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0725849-af6c-4399-9beb-8df68d80963f" containerName="prober" Mar 13 01:13:28.068312 master-0 kubenswrapper[7110]: I0313 01:13:28.068240 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0725849-af6c-4399-9beb-8df68d80963f" containerName="prober" Mar 13 01:13:28.068394 master-0 kubenswrapper[7110]: I0313 01:13:28.068314 7110 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0725849-af6c-4399-9beb-8df68d80963f" containerName="prober" Mar 13 01:13:28.068394 master-0 kubenswrapper[7110]: I0313 01:13:28.068327 7110 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="abf2ead5-b97d-4160-8120-28cb8a3d843e" containerName="assisted-installer-controller" Mar 13 01:13:28.068596 master-0 kubenswrapper[7110]: I0313 01:13:28.068577 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f7fd6c796-d7hx2" Mar 13 01:13:28.071601 master-0 kubenswrapper[7110]: I0313 01:13:28.071563 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 01:13:28.073273 master-0 kubenswrapper[7110]: I0313 01:13:28.072999 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 01:13:28.073333 master-0 kubenswrapper[7110]: I0313 01:13:28.073295 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 01:13:28.073696 master-0 kubenswrapper[7110]: I0313 01:13:28.073664 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 01:13:28.073931 master-0 kubenswrapper[7110]: I0313 01:13:28.073901 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 01:13:28.074883 master-0 kubenswrapper[7110]: I0313 01:13:28.074850 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f7fd6c796-d7hx2"] Mar 13 01:13:28.076954 master-0 kubenswrapper[7110]: I0313 01:13:28.076917 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 01:13:28.144297 master-0 kubenswrapper[7110]: I0313 01:13:28.144030 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ctgk\" (UniqueName: \"kubernetes.io/projected/c2443269-eac7-4808-8774-1b8993963ee0-kube-api-access-4ctgk\") pod 
\"controller-manager-6f7fd6c796-d7hx2\" (UID: \"c2443269-eac7-4808-8774-1b8993963ee0\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-d7hx2" Mar 13 01:13:28.144297 master-0 kubenswrapper[7110]: I0313 01:13:28.144100 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2443269-eac7-4808-8774-1b8993963ee0-config\") pod \"controller-manager-6f7fd6c796-d7hx2\" (UID: \"c2443269-eac7-4808-8774-1b8993963ee0\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-d7hx2" Mar 13 01:13:28.144297 master-0 kubenswrapper[7110]: I0313 01:13:28.144190 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2443269-eac7-4808-8774-1b8993963ee0-serving-cert\") pod \"controller-manager-6f7fd6c796-d7hx2\" (UID: \"c2443269-eac7-4808-8774-1b8993963ee0\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-d7hx2" Mar 13 01:13:28.144297 master-0 kubenswrapper[7110]: I0313 01:13:28.144248 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2443269-eac7-4808-8774-1b8993963ee0-client-ca\") pod \"controller-manager-6f7fd6c796-d7hx2\" (UID: \"c2443269-eac7-4808-8774-1b8993963ee0\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-d7hx2" Mar 13 01:13:28.144778 master-0 kubenswrapper[7110]: I0313 01:13:28.144320 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c2443269-eac7-4808-8774-1b8993963ee0-proxy-ca-bundles\") pod \"controller-manager-6f7fd6c796-d7hx2\" (UID: \"c2443269-eac7-4808-8774-1b8993963ee0\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-d7hx2" Mar 13 01:13:28.168860 master-0 kubenswrapper[7110]: 
I0313 01:13:28.168120 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58959cd4d6-j9tlk"] Mar 13 01:13:28.169021 master-0 kubenswrapper[7110]: I0313 01:13:28.168962 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-j9tlk" Mar 13 01:13:28.175468 master-0 kubenswrapper[7110]: I0313 01:13:28.172149 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 01:13:28.178989 master-0 kubenswrapper[7110]: I0313 01:13:28.178753 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 01:13:28.179193 master-0 kubenswrapper[7110]: I0313 01:13:28.179021 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 01:13:28.179193 master-0 kubenswrapper[7110]: I0313 01:13:28.179042 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 01:13:28.179783 master-0 kubenswrapper[7110]: I0313 01:13:28.179383 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 01:13:28.186729 master-0 kubenswrapper[7110]: I0313 01:13:28.186621 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58959cd4d6-j9tlk"] Mar 13 01:13:28.245010 master-0 kubenswrapper[7110]: I0313 01:13:28.244958 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c2443269-eac7-4808-8774-1b8993963ee0-proxy-ca-bundles\") pod \"controller-manager-6f7fd6c796-d7hx2\" (UID: \"c2443269-eac7-4808-8774-1b8993963ee0\") " 
pod="openshift-controller-manager/controller-manager-6f7fd6c796-d7hx2" Mar 13 01:13:28.245237 master-0 kubenswrapper[7110]: I0313 01:13:28.245031 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vbvs\" (UniqueName: \"kubernetes.io/projected/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1-kube-api-access-2vbvs\") pod \"route-controller-manager-58959cd4d6-j9tlk\" (UID: \"c7d8c84a-e4b2-4099-9802-cb7a1f5906f1\") " pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-j9tlk" Mar 13 01:13:28.245237 master-0 kubenswrapper[7110]: I0313 01:13:28.245101 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ctgk\" (UniqueName: \"kubernetes.io/projected/c2443269-eac7-4808-8774-1b8993963ee0-kube-api-access-4ctgk\") pod \"controller-manager-6f7fd6c796-d7hx2\" (UID: \"c2443269-eac7-4808-8774-1b8993963ee0\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-d7hx2" Mar 13 01:13:28.245237 master-0 kubenswrapper[7110]: E0313 01:13:28.245122 7110 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: configmap "openshift-global-ca" not found Mar 13 01:13:28.245237 master-0 kubenswrapper[7110]: I0313 01:13:28.245149 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2443269-eac7-4808-8774-1b8993963ee0-config\") pod \"controller-manager-6f7fd6c796-d7hx2\" (UID: \"c2443269-eac7-4808-8774-1b8993963ee0\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-d7hx2" Mar 13 01:13:28.245439 master-0 kubenswrapper[7110]: E0313 01:13:28.245252 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c2443269-eac7-4808-8774-1b8993963ee0-proxy-ca-bundles podName:c2443269-eac7-4808-8774-1b8993963ee0 nodeName:}" failed. 
No retries permitted until 2026-03-13 01:13:28.745225255 +0000 UTC m=+10.030251741 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/c2443269-eac7-4808-8774-1b8993963ee0-proxy-ca-bundles") pod "controller-manager-6f7fd6c796-d7hx2" (UID: "c2443269-eac7-4808-8774-1b8993963ee0") : configmap "openshift-global-ca" not found Mar 13 01:13:28.245439 master-0 kubenswrapper[7110]: I0313 01:13:28.245416 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1-config\") pod \"route-controller-manager-58959cd4d6-j9tlk\" (UID: \"c7d8c84a-e4b2-4099-9802-cb7a1f5906f1\") " pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-j9tlk" Mar 13 01:13:28.245560 master-0 kubenswrapper[7110]: I0313 01:13:28.245453 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1-serving-cert\") pod \"route-controller-manager-58959cd4d6-j9tlk\" (UID: \"c7d8c84a-e4b2-4099-9802-cb7a1f5906f1\") " pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-j9tlk" Mar 13 01:13:28.245560 master-0 kubenswrapper[7110]: I0313 01:13:28.245538 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2443269-eac7-4808-8774-1b8993963ee0-serving-cert\") pod \"controller-manager-6f7fd6c796-d7hx2\" (UID: \"c2443269-eac7-4808-8774-1b8993963ee0\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-d7hx2" Mar 13 01:13:28.245667 master-0 kubenswrapper[7110]: I0313 01:13:28.245600 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/c2443269-eac7-4808-8774-1b8993963ee0-client-ca\") pod \"controller-manager-6f7fd6c796-d7hx2\" (UID: \"c2443269-eac7-4808-8774-1b8993963ee0\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-d7hx2" Mar 13 01:13:28.245667 master-0 kubenswrapper[7110]: I0313 01:13:28.245654 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1-client-ca\") pod \"route-controller-manager-58959cd4d6-j9tlk\" (UID: \"c7d8c84a-e4b2-4099-9802-cb7a1f5906f1\") " pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-j9tlk" Mar 13 01:13:28.245840 master-0 kubenswrapper[7110]: E0313 01:13:28.245778 7110 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: configmap "config" not found Mar 13 01:13:28.246091 master-0 kubenswrapper[7110]: E0313 01:13:28.246070 7110 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 13 01:13:28.246143 master-0 kubenswrapper[7110]: E0313 01:13:28.246074 7110 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 13 01:13:28.246143 master-0 kubenswrapper[7110]: E0313 01:13:28.246125 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c2443269-eac7-4808-8774-1b8993963ee0-config podName:c2443269-eac7-4808-8774-1b8993963ee0 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:28.746094089 +0000 UTC m=+10.031120555 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/c2443269-eac7-4808-8774-1b8993963ee0-config") pod "controller-manager-6f7fd6c796-d7hx2" (UID: "c2443269-eac7-4808-8774-1b8993963ee0") : configmap "config" not found Mar 13 01:13:28.246239 master-0 kubenswrapper[7110]: E0313 01:13:28.246160 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c2443269-eac7-4808-8774-1b8993963ee0-client-ca podName:c2443269-eac7-4808-8774-1b8993963ee0 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:28.746152991 +0000 UTC m=+10.031179457 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/c2443269-eac7-4808-8774-1b8993963ee0-client-ca") pod "controller-manager-6f7fd6c796-d7hx2" (UID: "c2443269-eac7-4808-8774-1b8993963ee0") : configmap "client-ca" not found Mar 13 01:13:28.246239 master-0 kubenswrapper[7110]: E0313 01:13:28.246201 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2443269-eac7-4808-8774-1b8993963ee0-serving-cert podName:c2443269-eac7-4808-8774-1b8993963ee0 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:28.746175612 +0000 UTC m=+10.031202118 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/c2443269-eac7-4808-8774-1b8993963ee0-serving-cert") pod "controller-manager-6f7fd6c796-d7hx2" (UID: "c2443269-eac7-4808-8774-1b8993963ee0") : secret "serving-cert" not found Mar 13 01:13:28.285662 master-0 kubenswrapper[7110]: I0313 01:13:28.285250 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ctgk\" (UniqueName: \"kubernetes.io/projected/c2443269-eac7-4808-8774-1b8993963ee0-kube-api-access-4ctgk\") pod \"controller-manager-6f7fd6c796-d7hx2\" (UID: \"c2443269-eac7-4808-8774-1b8993963ee0\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-d7hx2" Mar 13 01:13:28.333907 master-0 kubenswrapper[7110]: I0313 01:13:28.333795 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xs8pt" Mar 13 01:13:28.346983 master-0 kubenswrapper[7110]: I0313 01:13:28.346947 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1-config\") pod \"route-controller-manager-58959cd4d6-j9tlk\" (UID: \"c7d8c84a-e4b2-4099-9802-cb7a1f5906f1\") " pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-j9tlk" Mar 13 01:13:28.347091 master-0 kubenswrapper[7110]: I0313 01:13:28.346986 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1-serving-cert\") pod \"route-controller-manager-58959cd4d6-j9tlk\" (UID: \"c7d8c84a-e4b2-4099-9802-cb7a1f5906f1\") " pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-j9tlk" Mar 13 01:13:28.347091 master-0 kubenswrapper[7110]: I0313 01:13:28.347033 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1-client-ca\") pod \"route-controller-manager-58959cd4d6-j9tlk\" (UID: \"c7d8c84a-e4b2-4099-9802-cb7a1f5906f1\") " pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-j9tlk" Mar 13 01:13:28.347290 master-0 kubenswrapper[7110]: E0313 01:13:28.347243 7110 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/config: configmap "config" not found Mar 13 01:13:28.347403 master-0 kubenswrapper[7110]: E0313 01:13:28.347386 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1-config podName:c7d8c84a-e4b2-4099-9802-cb7a1f5906f1 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:28.847358024 +0000 UTC m=+10.132384520 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1-config") pod "route-controller-manager-58959cd4d6-j9tlk" (UID: "c7d8c84a-e4b2-4099-9802-cb7a1f5906f1") : configmap "config" not found Mar 13 01:13:28.348585 master-0 kubenswrapper[7110]: E0313 01:13:28.348565 7110 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 13 01:13:28.348646 master-0 kubenswrapper[7110]: E0313 01:13:28.348620 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1-client-ca podName:c7d8c84a-e4b2-4099-9802-cb7a1f5906f1 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:28.848604549 +0000 UTC m=+10.133631015 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1-client-ca") pod "route-controller-manager-58959cd4d6-j9tlk" (UID: "c7d8c84a-e4b2-4099-9802-cb7a1f5906f1") : configmap "client-ca" not found Mar 13 01:13:28.348646 master-0 kubenswrapper[7110]: I0313 01:13:28.348556 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vbvs\" (UniqueName: \"kubernetes.io/projected/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1-kube-api-access-2vbvs\") pod \"route-controller-manager-58959cd4d6-j9tlk\" (UID: \"c7d8c84a-e4b2-4099-9802-cb7a1f5906f1\") " pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-j9tlk" Mar 13 01:13:28.348751 master-0 kubenswrapper[7110]: E0313 01:13:28.348736 7110 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 13 01:13:28.348787 master-0 kubenswrapper[7110]: E0313 01:13:28.348760 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1-serving-cert podName:c7d8c84a-e4b2-4099-9802-cb7a1f5906f1 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:28.848754263 +0000 UTC m=+10.133780729 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1-serving-cert") pod "route-controller-manager-58959cd4d6-j9tlk" (UID: "c7d8c84a-e4b2-4099-9802-cb7a1f5906f1") : secret "serving-cert" not found
Mar 13 01:13:28.372336 master-0 kubenswrapper[7110]: I0313 01:13:28.372271 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 01:13:28.378219 master-0 kubenswrapper[7110]: I0313 01:13:28.378175 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vbvs\" (UniqueName: \"kubernetes.io/projected/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1-kube-api-access-2vbvs\") pod \"route-controller-manager-58959cd4d6-j9tlk\" (UID: \"c7d8c84a-e4b2-4099-9802-cb7a1f5906f1\") " pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-j9tlk"
Mar 13 01:13:28.378337 master-0 kubenswrapper[7110]: I0313 01:13:28.378290 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 01:13:28.628422 master-0 kubenswrapper[7110]: I0313 01:13:28.627718 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v56ct"
Mar 13 01:13:28.661291 master-0 kubenswrapper[7110]: I0313 01:13:28.661253 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-v56ct"
Mar 13 01:13:28.754542 master-0 kubenswrapper[7110]: I0313 01:13:28.754479 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c2443269-eac7-4808-8774-1b8993963ee0-proxy-ca-bundles\") pod \"controller-manager-6f7fd6c796-d7hx2\" (UID: \"c2443269-eac7-4808-8774-1b8993963ee0\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-d7hx2"
Mar 13 01:13:28.754836 master-0 kubenswrapper[7110]: E0313 01:13:28.754742 7110 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: configmap "openshift-global-ca" not found
Mar 13 01:13:28.754919 master-0 kubenswrapper[7110]: E0313 01:13:28.754847 7110 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: configmap "config" not found
Mar 13 01:13:28.754919 master-0 kubenswrapper[7110]: E0313 01:13:28.754871 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c2443269-eac7-4808-8774-1b8993963ee0-proxy-ca-bundles podName:c2443269-eac7-4808-8774-1b8993963ee0 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:29.754841308 +0000 UTC m=+11.039867804 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/c2443269-eac7-4808-8774-1b8993963ee0-proxy-ca-bundles") pod "controller-manager-6f7fd6c796-d7hx2" (UID: "c2443269-eac7-4808-8774-1b8993963ee0") : configmap "openshift-global-ca" not found
Mar 13 01:13:28.754919 master-0 kubenswrapper[7110]: I0313 01:13:28.754753 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2443269-eac7-4808-8774-1b8993963ee0-config\") pod \"controller-manager-6f7fd6c796-d7hx2\" (UID: \"c2443269-eac7-4808-8774-1b8993963ee0\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-d7hx2"
Mar 13 01:13:28.755112 master-0 kubenswrapper[7110]: E0313 01:13:28.754990 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c2443269-eac7-4808-8774-1b8993963ee0-config podName:c2443269-eac7-4808-8774-1b8993963ee0 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:29.754961872 +0000 UTC m=+11.039988368 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/c2443269-eac7-4808-8774-1b8993963ee0-config") pod "controller-manager-6f7fd6c796-d7hx2" (UID: "c2443269-eac7-4808-8774-1b8993963ee0") : configmap "config" not found
Mar 13 01:13:28.755186 master-0 kubenswrapper[7110]: I0313 01:13:28.755142 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2443269-eac7-4808-8774-1b8993963ee0-serving-cert\") pod \"controller-manager-6f7fd6c796-d7hx2\" (UID: \"c2443269-eac7-4808-8774-1b8993963ee0\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-d7hx2"
Mar 13 01:13:28.755259 master-0 kubenswrapper[7110]: I0313 01:13:28.755216 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2443269-eac7-4808-8774-1b8993963ee0-client-ca\") pod \"controller-manager-6f7fd6c796-d7hx2\" (UID: \"c2443269-eac7-4808-8774-1b8993963ee0\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-d7hx2"
Mar 13 01:13:28.755371 master-0 kubenswrapper[7110]: E0313 01:13:28.755343 7110 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 13 01:13:28.755451 master-0 kubenswrapper[7110]: E0313 01:13:28.755393 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c2443269-eac7-4808-8774-1b8993963ee0-client-ca podName:c2443269-eac7-4808-8774-1b8993963ee0 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:29.755379894 +0000 UTC m=+11.040406390 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/c2443269-eac7-4808-8774-1b8993963ee0-client-ca") pod "controller-manager-6f7fd6c796-d7hx2" (UID: "c2443269-eac7-4808-8774-1b8993963ee0") : configmap "client-ca" not found
Mar 13 01:13:28.755519 master-0 kubenswrapper[7110]: E0313 01:13:28.755491 7110 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found
Mar 13 01:13:28.755584 master-0 kubenswrapper[7110]: E0313 01:13:28.755548 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2443269-eac7-4808-8774-1b8993963ee0-serving-cert podName:c2443269-eac7-4808-8774-1b8993963ee0 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:29.755531708 +0000 UTC m=+11.040558214 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/c2443269-eac7-4808-8774-1b8993963ee0-serving-cert") pod "controller-manager-6f7fd6c796-d7hx2" (UID: "c2443269-eac7-4808-8774-1b8993963ee0") : secret "serving-cert" not found
Mar 13 01:13:28.856190 master-0 kubenswrapper[7110]: I0313 01:13:28.856125 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1-client-ca\") pod \"route-controller-manager-58959cd4d6-j9tlk\" (UID: \"c7d8c84a-e4b2-4099-9802-cb7a1f5906f1\") " pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-j9tlk"
Mar 13 01:13:28.856400 master-0 kubenswrapper[7110]: E0313 01:13:28.856310 7110 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found
Mar 13 01:13:28.856400 master-0 kubenswrapper[7110]: E0313 01:13:28.856394 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1-client-ca podName:c7d8c84a-e4b2-4099-9802-cb7a1f5906f1 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:29.85637629 +0000 UTC m=+11.141402756 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1-client-ca") pod "route-controller-manager-58959cd4d6-j9tlk" (UID: "c7d8c84a-e4b2-4099-9802-cb7a1f5906f1") : configmap "client-ca" not found
Mar 13 01:13:28.856767 master-0 kubenswrapper[7110]: I0313 01:13:28.856720 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1-config\") pod \"route-controller-manager-58959cd4d6-j9tlk\" (UID: \"c7d8c84a-e4b2-4099-9802-cb7a1f5906f1\") " pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-j9tlk"
Mar 13 01:13:28.856829 master-0 kubenswrapper[7110]: E0313 01:13:28.856792 7110 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/config: configmap "config" not found
Mar 13 01:13:28.856829 master-0 kubenswrapper[7110]: I0313 01:13:28.856805 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1-serving-cert\") pod \"route-controller-manager-58959cd4d6-j9tlk\" (UID: \"c7d8c84a-e4b2-4099-9802-cb7a1f5906f1\") " pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-j9tlk"
Mar 13 01:13:28.856901 master-0 kubenswrapper[7110]: E0313 01:13:28.856817 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1-config podName:c7d8c84a-e4b2-4099-9802-cb7a1f5906f1 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:29.856810182 +0000 UTC m=+11.141836638 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1-config") pod "route-controller-manager-58959cd4d6-j9tlk" (UID: "c7d8c84a-e4b2-4099-9802-cb7a1f5906f1") : configmap "config" not found
Mar 13 01:13:28.856901 master-0 kubenswrapper[7110]: E0313 01:13:28.856868 7110 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found
Mar 13 01:13:28.856901 master-0 kubenswrapper[7110]: E0313 01:13:28.856893 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1-serving-cert podName:c7d8c84a-e4b2-4099-9802-cb7a1f5906f1 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:29.856887565 +0000 UTC m=+11.141914031 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1-serving-cert") pod "route-controller-manager-58959cd4d6-j9tlk" (UID: "c7d8c84a-e4b2-4099-9802-cb7a1f5906f1") : secret "serving-cert" not found
Mar 13 01:13:29.057955 master-0 kubenswrapper[7110]: I0313 01:13:29.057687 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" event={"ID":"d56480e0-0885-41e5-a1fc-931a068fbadb","Type":"ContainerStarted","Data":"61dfbb39eadee51c94c1da7b2c82616d9472cbfa81dbb07d96ebc8cbcec88cf7"}
Mar 13 01:13:29.059004 master-0 kubenswrapper[7110]: I0313 01:13:29.058455 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf"
Mar 13 01:13:29.770209 master-0 kubenswrapper[7110]: I0313 01:13:29.770138 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c2443269-eac7-4808-8774-1b8993963ee0-proxy-ca-bundles\") pod \"controller-manager-6f7fd6c796-d7hx2\" (UID: \"c2443269-eac7-4808-8774-1b8993963ee0\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-d7hx2"
Mar 13 01:13:29.770543 master-0 kubenswrapper[7110]: E0313 01:13:29.770490 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c2443269-eac7-4808-8774-1b8993963ee0-proxy-ca-bundles podName:c2443269-eac7-4808-8774-1b8993963ee0 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:31.770454124 +0000 UTC m=+13.055480680 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/c2443269-eac7-4808-8774-1b8993963ee0-proxy-ca-bundles") pod "controller-manager-6f7fd6c796-d7hx2" (UID: "c2443269-eac7-4808-8774-1b8993963ee0") : configmap references non-existent config key: ca-bundle.crt
Mar 13 01:13:29.770626 master-0 kubenswrapper[7110]: I0313 01:13:29.770593 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2443269-eac7-4808-8774-1b8993963ee0-config\") pod \"controller-manager-6f7fd6c796-d7hx2\" (UID: \"c2443269-eac7-4808-8774-1b8993963ee0\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-d7hx2"
Mar 13 01:13:29.770801 master-0 kubenswrapper[7110]: I0313 01:13:29.770775 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2443269-eac7-4808-8774-1b8993963ee0-serving-cert\") pod \"controller-manager-6f7fd6c796-d7hx2\" (UID: \"c2443269-eac7-4808-8774-1b8993963ee0\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-d7hx2"
Mar 13 01:13:29.770996 master-0 kubenswrapper[7110]: I0313 01:13:29.770942 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2443269-eac7-4808-8774-1b8993963ee0-client-ca\") pod \"controller-manager-6f7fd6c796-d7hx2\" (UID: \"c2443269-eac7-4808-8774-1b8993963ee0\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-d7hx2"
Mar 13 01:13:29.771195 master-0 kubenswrapper[7110]: E0313 01:13:29.771146 7110 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 13 01:13:29.771195 master-0 kubenswrapper[7110]: E0313 01:13:29.771175 7110 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found
Mar 13 01:13:29.771284 master-0 kubenswrapper[7110]: E0313 01:13:29.771242 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c2443269-eac7-4808-8774-1b8993963ee0-client-ca podName:c2443269-eac7-4808-8774-1b8993963ee0 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:31.771211445 +0000 UTC m=+13.056237951 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/c2443269-eac7-4808-8774-1b8993963ee0-client-ca") pod "controller-manager-6f7fd6c796-d7hx2" (UID: "c2443269-eac7-4808-8774-1b8993963ee0") : configmap "client-ca" not found
Mar 13 01:13:29.771284 master-0 kubenswrapper[7110]: E0313 01:13:29.771278 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2443269-eac7-4808-8774-1b8993963ee0-serving-cert podName:c2443269-eac7-4808-8774-1b8993963ee0 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:31.771259896 +0000 UTC m=+13.056286502 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/c2443269-eac7-4808-8774-1b8993963ee0-serving-cert") pod "controller-manager-6f7fd6c796-d7hx2" (UID: "c2443269-eac7-4808-8774-1b8993963ee0") : secret "serving-cert" not found
Mar 13 01:13:29.772045 master-0 kubenswrapper[7110]: I0313 01:13:29.772013 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2443269-eac7-4808-8774-1b8993963ee0-config\") pod \"controller-manager-6f7fd6c796-d7hx2\" (UID: \"c2443269-eac7-4808-8774-1b8993963ee0\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-d7hx2"
Mar 13 01:13:29.868994 master-0 kubenswrapper[7110]: I0313 01:13:29.868945 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6f7fd6c796-d7hx2"]
Mar 13 01:13:29.869208 master-0 kubenswrapper[7110]: E0313 01:13:29.869182 7110 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca proxy-ca-bundles serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-6f7fd6c796-d7hx2" podUID="c2443269-eac7-4808-8774-1b8993963ee0"
Mar 13 01:13:29.871650 master-0 kubenswrapper[7110]: I0313 01:13:29.871604 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1-config\") pod \"route-controller-manager-58959cd4d6-j9tlk\" (UID: \"c7d8c84a-e4b2-4099-9802-cb7a1f5906f1\") " pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-j9tlk"
Mar 13 01:13:29.871650 master-0 kubenswrapper[7110]: I0313 01:13:29.871646 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1-serving-cert\") pod \"route-controller-manager-58959cd4d6-j9tlk\" (UID: \"c7d8c84a-e4b2-4099-9802-cb7a1f5906f1\") " pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-j9tlk"
Mar 13 01:13:29.871766 master-0 kubenswrapper[7110]: I0313 01:13:29.871688 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1-client-ca\") pod \"route-controller-manager-58959cd4d6-j9tlk\" (UID: \"c7d8c84a-e4b2-4099-9802-cb7a1f5906f1\") " pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-j9tlk"
Mar 13 01:13:29.871896 master-0 kubenswrapper[7110]: E0313 01:13:29.871860 7110 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found
Mar 13 01:13:29.871954 master-0 kubenswrapper[7110]: E0313 01:13:29.871902 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1-client-ca podName:c7d8c84a-e4b2-4099-9802-cb7a1f5906f1 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:31.871890772 +0000 UTC m=+13.156917238 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1-client-ca") pod "route-controller-manager-58959cd4d6-j9tlk" (UID: "c7d8c84a-e4b2-4099-9802-cb7a1f5906f1") : configmap "client-ca" not found
Mar 13 01:13:29.872815 master-0 kubenswrapper[7110]: I0313 01:13:29.872796 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1-config\") pod \"route-controller-manager-58959cd4d6-j9tlk\" (UID: \"c7d8c84a-e4b2-4099-9802-cb7a1f5906f1\") " pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-j9tlk"
Mar 13 01:13:29.872881 master-0 kubenswrapper[7110]: E0313 01:13:29.872865 7110 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found
Mar 13 01:13:29.872916 master-0 kubenswrapper[7110]: E0313 01:13:29.872889 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1-serving-cert podName:c7d8c84a-e4b2-4099-9802-cb7a1f5906f1 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:31.872880289 +0000 UTC m=+13.157906755 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1-serving-cert") pod "route-controller-manager-58959cd4d6-j9tlk" (UID: "c7d8c84a-e4b2-4099-9802-cb7a1f5906f1") : secret "serving-cert" not found
Mar 13 01:13:29.883537 master-0 kubenswrapper[7110]: I0313 01:13:29.883485 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58959cd4d6-j9tlk"]
Mar 13 01:13:29.883740 master-0 kubenswrapper[7110]: E0313 01:13:29.883717 7110 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-j9tlk" podUID="c7d8c84a-e4b2-4099-9802-cb7a1f5906f1"
Mar 13 01:13:30.063485 master-0 kubenswrapper[7110]: I0313 01:13:30.063218 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f7fd6c796-d7hx2"
Mar 13 01:13:30.063485 master-0 kubenswrapper[7110]: I0313 01:13:30.063345 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-j9tlk"
Mar 13 01:13:30.075767 master-0 kubenswrapper[7110]: I0313 01:13:30.075710 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-j9tlk"
Mar 13 01:13:30.083205 master-0 kubenswrapper[7110]: I0313 01:13:30.083159 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f7fd6c796-d7hx2"
Mar 13 01:13:30.176041 master-0 kubenswrapper[7110]: I0313 01:13:30.175990 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ctgk\" (UniqueName: \"kubernetes.io/projected/c2443269-eac7-4808-8774-1b8993963ee0-kube-api-access-4ctgk\") pod \"c2443269-eac7-4808-8774-1b8993963ee0\" (UID: \"c2443269-eac7-4808-8774-1b8993963ee0\") "
Mar 13 01:13:30.176250 master-0 kubenswrapper[7110]: I0313 01:13:30.176127 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vbvs\" (UniqueName: \"kubernetes.io/projected/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1-kube-api-access-2vbvs\") pod \"c7d8c84a-e4b2-4099-9802-cb7a1f5906f1\" (UID: \"c7d8c84a-e4b2-4099-9802-cb7a1f5906f1\") "
Mar 13 01:13:30.176250 master-0 kubenswrapper[7110]: I0313 01:13:30.176210 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2443269-eac7-4808-8774-1b8993963ee0-config\") pod \"c2443269-eac7-4808-8774-1b8993963ee0\" (UID: \"c2443269-eac7-4808-8774-1b8993963ee0\") "
Mar 13 01:13:30.177159 master-0 kubenswrapper[7110]: I0313 01:13:30.177111 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2443269-eac7-4808-8774-1b8993963ee0-config" (OuterVolumeSpecName: "config") pod "c2443269-eac7-4808-8774-1b8993963ee0" (UID: "c2443269-eac7-4808-8774-1b8993963ee0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 01:13:30.177239 master-0 kubenswrapper[7110]: I0313 01:13:30.177153 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1-config\") pod \"c7d8c84a-e4b2-4099-9802-cb7a1f5906f1\" (UID: \"c7d8c84a-e4b2-4099-9802-cb7a1f5906f1\") "
Mar 13 01:13:30.177904 master-0 kubenswrapper[7110]: I0313 01:13:30.177838 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1-config" (OuterVolumeSpecName: "config") pod "c7d8c84a-e4b2-4099-9802-cb7a1f5906f1" (UID: "c7d8c84a-e4b2-4099-9802-cb7a1f5906f1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 01:13:30.178240 master-0 kubenswrapper[7110]: I0313 01:13:30.178211 7110 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2443269-eac7-4808-8774-1b8993963ee0-config\") on node \"master-0\" DevicePath \"\""
Mar 13 01:13:30.178297 master-0 kubenswrapper[7110]: I0313 01:13:30.178248 7110 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1-config\") on node \"master-0\" DevicePath \"\""
Mar 13 01:13:30.182203 master-0 kubenswrapper[7110]: I0313 01:13:30.182152 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1-kube-api-access-2vbvs" (OuterVolumeSpecName: "kube-api-access-2vbvs") pod "c7d8c84a-e4b2-4099-9802-cb7a1f5906f1" (UID: "c7d8c84a-e4b2-4099-9802-cb7a1f5906f1"). InnerVolumeSpecName "kube-api-access-2vbvs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 01:13:30.182387 master-0 kubenswrapper[7110]: I0313 01:13:30.182344 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2443269-eac7-4808-8774-1b8993963ee0-kube-api-access-4ctgk" (OuterVolumeSpecName: "kube-api-access-4ctgk") pod "c2443269-eac7-4808-8774-1b8993963ee0" (UID: "c2443269-eac7-4808-8774-1b8993963ee0"). InnerVolumeSpecName "kube-api-access-4ctgk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 01:13:30.279565 master-0 kubenswrapper[7110]: I0313 01:13:30.279453 7110 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vbvs\" (UniqueName: \"kubernetes.io/projected/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1-kube-api-access-2vbvs\") on node \"master-0\" DevicePath \"\""
Mar 13 01:13:30.279565 master-0 kubenswrapper[7110]: I0313 01:13:30.279537 7110 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ctgk\" (UniqueName: \"kubernetes.io/projected/c2443269-eac7-4808-8774-1b8993963ee0-kube-api-access-4ctgk\") on node \"master-0\" DevicePath \"\""
Mar 13 01:13:31.067273 master-0 kubenswrapper[7110]: I0313 01:13:31.067201 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f7fd6c796-d7hx2"
Mar 13 01:13:31.068331 master-0 kubenswrapper[7110]: I0313 01:13:31.067226 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58959cd4d6-j9tlk"
Mar 13 01:13:31.108705 master-0 kubenswrapper[7110]: I0313 01:13:31.108596 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58959cd4d6-j9tlk"]
Mar 13 01:13:31.111544 master-0 kubenswrapper[7110]: I0313 01:13:31.110973 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77c7f858c6-8khnv"]
Mar 13 01:13:31.111544 master-0 kubenswrapper[7110]: I0313 01:13:31.111502 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77c7f858c6-8khnv"
Mar 13 01:13:31.112986 master-0 kubenswrapper[7110]: I0313 01:13:31.112928 7110 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58959cd4d6-j9tlk"]
Mar 13 01:13:31.114405 master-0 kubenswrapper[7110]: I0313 01:13:31.114361 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 13 01:13:31.115613 master-0 kubenswrapper[7110]: I0313 01:13:31.114466 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 13 01:13:31.118156 master-0 kubenswrapper[7110]: I0313 01:13:31.118062 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 13 01:13:31.118156 master-0 kubenswrapper[7110]: I0313 01:13:31.118091 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 13 01:13:31.119497 master-0 kubenswrapper[7110]: I0313 01:13:31.119419 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 13 01:13:31.127435 master-0 kubenswrapper[7110]: I0313 01:13:31.127193 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77c7f858c6-8khnv"]
Mar 13 01:13:31.160249 master-0 kubenswrapper[7110]: I0313 01:13:31.160200 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6f7fd6c796-d7hx2"]
Mar 13 01:13:31.166962 master-0 kubenswrapper[7110]: I0313 01:13:31.166909 7110 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6f7fd6c796-d7hx2"]
Mar 13 01:13:31.293512 master-0 kubenswrapper[7110]: I0313 01:13:31.293370 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a60370fb-bb70-435c-9c5a-781fa1d63468-config\") pod \"route-controller-manager-77c7f858c6-8khnv\" (UID: \"a60370fb-bb70-435c-9c5a-781fa1d63468\") " pod="openshift-route-controller-manager/route-controller-manager-77c7f858c6-8khnv"
Mar 13 01:13:31.293759 master-0 kubenswrapper[7110]: I0313 01:13:31.293511 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a60370fb-bb70-435c-9c5a-781fa1d63468-client-ca\") pod \"route-controller-manager-77c7f858c6-8khnv\" (UID: \"a60370fb-bb70-435c-9c5a-781fa1d63468\") " pod="openshift-route-controller-manager/route-controller-manager-77c7f858c6-8khnv"
Mar 13 01:13:31.293759 master-0 kubenswrapper[7110]: I0313 01:13:31.293684 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv5fz\" (UniqueName: \"kubernetes.io/projected/a60370fb-bb70-435c-9c5a-781fa1d63468-kube-api-access-qv5fz\") pod \"route-controller-manager-77c7f858c6-8khnv\" (UID: \"a60370fb-bb70-435c-9c5a-781fa1d63468\") " pod="openshift-route-controller-manager/route-controller-manager-77c7f858c6-8khnv"
Mar 13 01:13:31.294152 master-0 kubenswrapper[7110]: I0313 01:13:31.294070 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a60370fb-bb70-435c-9c5a-781fa1d63468-serving-cert\") pod \"route-controller-manager-77c7f858c6-8khnv\" (UID: \"a60370fb-bb70-435c-9c5a-781fa1d63468\") " pod="openshift-route-controller-manager/route-controller-manager-77c7f858c6-8khnv"
Mar 13 01:13:31.294233 master-0 kubenswrapper[7110]: I0313 01:13:31.294219 7110 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 13 01:13:31.294309 master-0 kubenswrapper[7110]: I0313 01:13:31.294249 7110 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2443269-eac7-4808-8774-1b8993963ee0-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 13 01:13:31.294309 master-0 kubenswrapper[7110]: I0313 01:13:31.294270 7110 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2443269-eac7-4808-8774-1b8993963ee0-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 13 01:13:31.294309 master-0 kubenswrapper[7110]: I0313 01:13:31.294290 7110 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c2443269-eac7-4808-8774-1b8993963ee0-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\""
Mar 13 01:13:31.294309 master-0 kubenswrapper[7110]: I0313 01:13:31.294312 7110 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 13 01:13:31.395883 master-0 kubenswrapper[7110]: I0313 01:13:31.395806 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a60370fb-bb70-435c-9c5a-781fa1d63468-config\") pod \"route-controller-manager-77c7f858c6-8khnv\" (UID: \"a60370fb-bb70-435c-9c5a-781fa1d63468\") " pod="openshift-route-controller-manager/route-controller-manager-77c7f858c6-8khnv"
Mar 13 01:13:31.396163 master-0 kubenswrapper[7110]: I0313 01:13:31.395902 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a60370fb-bb70-435c-9c5a-781fa1d63468-client-ca\") pod \"route-controller-manager-77c7f858c6-8khnv\" (UID: \"a60370fb-bb70-435c-9c5a-781fa1d63468\") " pod="openshift-route-controller-manager/route-controller-manager-77c7f858c6-8khnv"
Mar 13 01:13:31.396163 master-0 kubenswrapper[7110]: I0313 01:13:31.396018 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv5fz\" (UniqueName: \"kubernetes.io/projected/a60370fb-bb70-435c-9c5a-781fa1d63468-kube-api-access-qv5fz\") pod \"route-controller-manager-77c7f858c6-8khnv\" (UID: \"a60370fb-bb70-435c-9c5a-781fa1d63468\") " pod="openshift-route-controller-manager/route-controller-manager-77c7f858c6-8khnv"
Mar 13 01:13:31.396163 master-0 kubenswrapper[7110]: E0313 01:13:31.396106 7110 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found
Mar 13 01:13:31.396437 master-0 kubenswrapper[7110]: E0313 01:13:31.396232 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a60370fb-bb70-435c-9c5a-781fa1d63468-client-ca podName:a60370fb-bb70-435c-9c5a-781fa1d63468 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:31.896199385 +0000 UTC m=+13.181225911 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/a60370fb-bb70-435c-9c5a-781fa1d63468-client-ca") pod "route-controller-manager-77c7f858c6-8khnv" (UID: "a60370fb-bb70-435c-9c5a-781fa1d63468") : configmap "client-ca" not found
Mar 13 01:13:31.396539 master-0 kubenswrapper[7110]: I0313 01:13:31.396495 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a60370fb-bb70-435c-9c5a-781fa1d63468-serving-cert\") pod \"route-controller-manager-77c7f858c6-8khnv\" (UID: \"a60370fb-bb70-435c-9c5a-781fa1d63468\") " pod="openshift-route-controller-manager/route-controller-manager-77c7f858c6-8khnv"
Mar 13 01:13:31.396748 master-0 kubenswrapper[7110]: E0313 01:13:31.396715 7110 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found
Mar 13 01:13:31.396853 master-0 kubenswrapper[7110]: E0313 01:13:31.396789 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a60370fb-bb70-435c-9c5a-781fa1d63468-serving-cert podName:a60370fb-bb70-435c-9c5a-781fa1d63468 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:31.89676817 +0000 UTC m=+13.181794646 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/a60370fb-bb70-435c-9c5a-781fa1d63468-serving-cert") pod "route-controller-manager-77c7f858c6-8khnv" (UID: "a60370fb-bb70-435c-9c5a-781fa1d63468") : secret "serving-cert" not found
Mar 13 01:13:31.397296 master-0 kubenswrapper[7110]: I0313 01:13:31.397246 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a60370fb-bb70-435c-9c5a-781fa1d63468-config\") pod \"route-controller-manager-77c7f858c6-8khnv\" (UID: \"a60370fb-bb70-435c-9c5a-781fa1d63468\") " pod="openshift-route-controller-manager/route-controller-manager-77c7f858c6-8khnv"
Mar 13 01:13:31.424570 master-0 kubenswrapper[7110]: I0313 01:13:31.424462 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv5fz\" (UniqueName: \"kubernetes.io/projected/a60370fb-bb70-435c-9c5a-781fa1d63468-kube-api-access-qv5fz\") pod \"route-controller-manager-77c7f858c6-8khnv\" (UID: \"a60370fb-bb70-435c-9c5a-781fa1d63468\") " pod="openshift-route-controller-manager/route-controller-manager-77c7f858c6-8khnv"
Mar 13 01:13:31.652889 master-0 kubenswrapper[7110]: I0313 01:13:31.652752 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-85b4d45f77-rw9cf"]
Mar 13 01:13:31.653489 master-0 kubenswrapper[7110]: I0313 01:13:31.653460 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-85b4d45f77-rw9cf"
Mar 13 01:13:31.656459 master-0 kubenswrapper[7110]: I0313 01:13:31.656401 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 13 01:13:31.656981 master-0 kubenswrapper[7110]: I0313 01:13:31.656934 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 13 01:13:31.657616 master-0 kubenswrapper[7110]: I0313 01:13:31.657577 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 13 01:13:31.657860 master-0 kubenswrapper[7110]: I0313 01:13:31.657823 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 13 01:13:31.665883 master-0 kubenswrapper[7110]: I0313 01:13:31.665836 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-85b4d45f77-rw9cf"]
Mar 13 01:13:31.669119 master-0 kubenswrapper[7110]: I0313 01:13:31.669084 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 13 01:13:31.671048 master-0 kubenswrapper[7110]: I0313 01:13:31.670992 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 13 01:13:31.724281 master-0 kubenswrapper[7110]: I0313 01:13:31.724225 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/905de268-3035-4fbe-a190-30b4a944adbf-client-ca\") pod \"controller-manager-85b4d45f77-rw9cf\" (UID: \"905de268-3035-4fbe-a190-30b4a944adbf\") " pod="openshift-controller-manager/controller-manager-85b4d45f77-rw9cf"
Mar 13 01:13:31.724525 master-0 kubenswrapper[7110]: I0313 01:13:31.724489 7110
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/905de268-3035-4fbe-a190-30b4a944adbf-config\") pod \"controller-manager-85b4d45f77-rw9cf\" (UID: \"905de268-3035-4fbe-a190-30b4a944adbf\") " pod="openshift-controller-manager/controller-manager-85b4d45f77-rw9cf" Mar 13 01:13:31.724575 master-0 kubenswrapper[7110]: I0313 01:13:31.724557 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/905de268-3035-4fbe-a190-30b4a944adbf-proxy-ca-bundles\") pod \"controller-manager-85b4d45f77-rw9cf\" (UID: \"905de268-3035-4fbe-a190-30b4a944adbf\") " pod="openshift-controller-manager/controller-manager-85b4d45f77-rw9cf" Mar 13 01:13:31.724653 master-0 kubenswrapper[7110]: I0313 01:13:31.724612 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/905de268-3035-4fbe-a190-30b4a944adbf-serving-cert\") pod \"controller-manager-85b4d45f77-rw9cf\" (UID: \"905de268-3035-4fbe-a190-30b4a944adbf\") " pod="openshift-controller-manager/controller-manager-85b4d45f77-rw9cf" Mar 13 01:13:31.724727 master-0 kubenswrapper[7110]: I0313 01:13:31.724699 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ndxt\" (UniqueName: \"kubernetes.io/projected/905de268-3035-4fbe-a190-30b4a944adbf-kube-api-access-6ndxt\") pod \"controller-manager-85b4d45f77-rw9cf\" (UID: \"905de268-3035-4fbe-a190-30b4a944adbf\") " pod="openshift-controller-manager/controller-manager-85b4d45f77-rw9cf" Mar 13 01:13:31.825621 master-0 kubenswrapper[7110]: I0313 01:13:31.825514 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/905de268-3035-4fbe-a190-30b4a944adbf-serving-cert\") pod 
\"controller-manager-85b4d45f77-rw9cf\" (UID: \"905de268-3035-4fbe-a190-30b4a944adbf\") " pod="openshift-controller-manager/controller-manager-85b4d45f77-rw9cf" Mar 13 01:13:31.825913 master-0 kubenswrapper[7110]: E0313 01:13:31.825777 7110 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 13 01:13:31.825913 master-0 kubenswrapper[7110]: I0313 01:13:31.825780 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ndxt\" (UniqueName: \"kubernetes.io/projected/905de268-3035-4fbe-a190-30b4a944adbf-kube-api-access-6ndxt\") pod \"controller-manager-85b4d45f77-rw9cf\" (UID: \"905de268-3035-4fbe-a190-30b4a944adbf\") " pod="openshift-controller-manager/controller-manager-85b4d45f77-rw9cf" Mar 13 01:13:31.826045 master-0 kubenswrapper[7110]: E0313 01:13:31.825918 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/905de268-3035-4fbe-a190-30b4a944adbf-serving-cert podName:905de268-3035-4fbe-a190-30b4a944adbf nodeName:}" failed. No retries permitted until 2026-03-13 01:13:32.32586634 +0000 UTC m=+13.610892866 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/905de268-3035-4fbe-a190-30b4a944adbf-serving-cert") pod "controller-manager-85b4d45f77-rw9cf" (UID: "905de268-3035-4fbe-a190-30b4a944adbf") : secret "serving-cert" not found Mar 13 01:13:31.826045 master-0 kubenswrapper[7110]: I0313 01:13:31.826007 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/905de268-3035-4fbe-a190-30b4a944adbf-client-ca\") pod \"controller-manager-85b4d45f77-rw9cf\" (UID: \"905de268-3035-4fbe-a190-30b4a944adbf\") " pod="openshift-controller-manager/controller-manager-85b4d45f77-rw9cf" Mar 13 01:13:31.826197 master-0 kubenswrapper[7110]: I0313 01:13:31.826124 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/905de268-3035-4fbe-a190-30b4a944adbf-config\") pod \"controller-manager-85b4d45f77-rw9cf\" (UID: \"905de268-3035-4fbe-a190-30b4a944adbf\") " pod="openshift-controller-manager/controller-manager-85b4d45f77-rw9cf" Mar 13 01:13:31.826197 master-0 kubenswrapper[7110]: I0313 01:13:31.826155 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/905de268-3035-4fbe-a190-30b4a944adbf-proxy-ca-bundles\") pod \"controller-manager-85b4d45f77-rw9cf\" (UID: \"905de268-3035-4fbe-a190-30b4a944adbf\") " pod="openshift-controller-manager/controller-manager-85b4d45f77-rw9cf" Mar 13 01:13:31.826327 master-0 kubenswrapper[7110]: E0313 01:13:31.826211 7110 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 13 01:13:31.826327 master-0 kubenswrapper[7110]: E0313 01:13:31.826264 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/905de268-3035-4fbe-a190-30b4a944adbf-client-ca podName:905de268-3035-4fbe-a190-30b4a944adbf 
nodeName:}" failed. No retries permitted until 2026-03-13 01:13:32.326249491 +0000 UTC m=+13.611275967 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/905de268-3035-4fbe-a190-30b4a944adbf-client-ca") pod "controller-manager-85b4d45f77-rw9cf" (UID: "905de268-3035-4fbe-a190-30b4a944adbf") : configmap "client-ca" not found Mar 13 01:13:31.827516 master-0 kubenswrapper[7110]: I0313 01:13:31.827448 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/905de268-3035-4fbe-a190-30b4a944adbf-config\") pod \"controller-manager-85b4d45f77-rw9cf\" (UID: \"905de268-3035-4fbe-a190-30b4a944adbf\") " pod="openshift-controller-manager/controller-manager-85b4d45f77-rw9cf" Mar 13 01:13:31.827741 master-0 kubenswrapper[7110]: I0313 01:13:31.827525 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/905de268-3035-4fbe-a190-30b4a944adbf-proxy-ca-bundles\") pod \"controller-manager-85b4d45f77-rw9cf\" (UID: \"905de268-3035-4fbe-a190-30b4a944adbf\") " pod="openshift-controller-manager/controller-manager-85b4d45f77-rw9cf" Mar 13 01:13:31.848565 master-0 kubenswrapper[7110]: I0313 01:13:31.848496 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ndxt\" (UniqueName: \"kubernetes.io/projected/905de268-3035-4fbe-a190-30b4a944adbf-kube-api-access-6ndxt\") pod \"controller-manager-85b4d45f77-rw9cf\" (UID: \"905de268-3035-4fbe-a190-30b4a944adbf\") " pod="openshift-controller-manager/controller-manager-85b4d45f77-rw9cf" Mar 13 01:13:31.927001 master-0 kubenswrapper[7110]: I0313 01:13:31.926941 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a60370fb-bb70-435c-9c5a-781fa1d63468-client-ca\") pod \"route-controller-manager-77c7f858c6-8khnv\" (UID: 
\"a60370fb-bb70-435c-9c5a-781fa1d63468\") " pod="openshift-route-controller-manager/route-controller-manager-77c7f858c6-8khnv" Mar 13 01:13:31.927284 master-0 kubenswrapper[7110]: E0313 01:13:31.927225 7110 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 13 01:13:31.927390 master-0 kubenswrapper[7110]: I0313 01:13:31.927350 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a60370fb-bb70-435c-9c5a-781fa1d63468-serving-cert\") pod \"route-controller-manager-77c7f858c6-8khnv\" (UID: \"a60370fb-bb70-435c-9c5a-781fa1d63468\") " pod="openshift-route-controller-manager/route-controller-manager-77c7f858c6-8khnv" Mar 13 01:13:31.927508 master-0 kubenswrapper[7110]: E0313 01:13:31.927391 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a60370fb-bb70-435c-9c5a-781fa1d63468-client-ca podName:a60370fb-bb70-435c-9c5a-781fa1d63468 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:32.927351401 +0000 UTC m=+14.212377927 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/a60370fb-bb70-435c-9c5a-781fa1d63468-client-ca") pod "route-controller-manager-77c7f858c6-8khnv" (UID: "a60370fb-bb70-435c-9c5a-781fa1d63468") : configmap "client-ca" not found Mar 13 01:13:31.927563 master-0 kubenswrapper[7110]: E0313 01:13:31.927537 7110 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 13 01:13:31.927744 master-0 kubenswrapper[7110]: E0313 01:13:31.927708 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a60370fb-bb70-435c-9c5a-781fa1d63468-serving-cert podName:a60370fb-bb70-435c-9c5a-781fa1d63468 nodeName:}" failed. 
No retries permitted until 2026-03-13 01:13:32.927613798 +0000 UTC m=+14.212640294 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/a60370fb-bb70-435c-9c5a-781fa1d63468-serving-cert") pod "route-controller-manager-77c7f858c6-8khnv" (UID: "a60370fb-bb70-435c-9c5a-781fa1d63468") : secret "serving-cert" not found Mar 13 01:13:32.331126 master-0 kubenswrapper[7110]: I0313 01:13:32.330989 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/905de268-3035-4fbe-a190-30b4a944adbf-serving-cert\") pod \"controller-manager-85b4d45f77-rw9cf\" (UID: \"905de268-3035-4fbe-a190-30b4a944adbf\") " pod="openshift-controller-manager/controller-manager-85b4d45f77-rw9cf" Mar 13 01:13:32.331126 master-0 kubenswrapper[7110]: I0313 01:13:32.331088 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/905de268-3035-4fbe-a190-30b4a944adbf-client-ca\") pod \"controller-manager-85b4d45f77-rw9cf\" (UID: \"905de268-3035-4fbe-a190-30b4a944adbf\") " pod="openshift-controller-manager/controller-manager-85b4d45f77-rw9cf" Mar 13 01:13:32.332011 master-0 kubenswrapper[7110]: E0313 01:13:32.331303 7110 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 13 01:13:32.332011 master-0 kubenswrapper[7110]: E0313 01:13:32.331360 7110 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 13 01:13:32.332011 master-0 kubenswrapper[7110]: E0313 01:13:32.331417 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/905de268-3035-4fbe-a190-30b4a944adbf-serving-cert podName:905de268-3035-4fbe-a190-30b4a944adbf nodeName:}" failed. No retries permitted until 2026-03-13 01:13:33.331388069 +0000 UTC m=+14.616414565 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/905de268-3035-4fbe-a190-30b4a944adbf-serving-cert") pod "controller-manager-85b4d45f77-rw9cf" (UID: "905de268-3035-4fbe-a190-30b4a944adbf") : secret "serving-cert" not found Mar 13 01:13:32.332011 master-0 kubenswrapper[7110]: E0313 01:13:32.331459 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/905de268-3035-4fbe-a190-30b4a944adbf-client-ca podName:905de268-3035-4fbe-a190-30b4a944adbf nodeName:}" failed. No retries permitted until 2026-03-13 01:13:33.33143713 +0000 UTC m=+14.616463746 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/905de268-3035-4fbe-a190-30b4a944adbf-client-ca") pod "controller-manager-85b4d45f77-rw9cf" (UID: "905de268-3035-4fbe-a190-30b4a944adbf") : configmap "client-ca" not found Mar 13 01:13:32.774086 master-0 kubenswrapper[7110]: I0313 01:13:32.774019 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" Mar 13 01:13:33.204415 master-0 kubenswrapper[7110]: I0313 01:13:32.938452 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a60370fb-bb70-435c-9c5a-781fa1d63468-serving-cert\") pod \"route-controller-manager-77c7f858c6-8khnv\" (UID: \"a60370fb-bb70-435c-9c5a-781fa1d63468\") " pod="openshift-route-controller-manager/route-controller-manager-77c7f858c6-8khnv" Mar 13 01:13:33.204415 master-0 kubenswrapper[7110]: I0313 01:13:32.938600 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a60370fb-bb70-435c-9c5a-781fa1d63468-client-ca\") pod \"route-controller-manager-77c7f858c6-8khnv\" (UID: \"a60370fb-bb70-435c-9c5a-781fa1d63468\") " 
pod="openshift-route-controller-manager/route-controller-manager-77c7f858c6-8khnv" Mar 13 01:13:33.204415 master-0 kubenswrapper[7110]: E0313 01:13:32.938754 7110 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 13 01:13:33.204415 master-0 kubenswrapper[7110]: E0313 01:13:32.938827 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a60370fb-bb70-435c-9c5a-781fa1d63468-client-ca podName:a60370fb-bb70-435c-9c5a-781fa1d63468 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:34.938805579 +0000 UTC m=+16.223832085 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/a60370fb-bb70-435c-9c5a-781fa1d63468-client-ca") pod "route-controller-manager-77c7f858c6-8khnv" (UID: "a60370fb-bb70-435c-9c5a-781fa1d63468") : configmap "client-ca" not found Mar 13 01:13:33.204415 master-0 kubenswrapper[7110]: E0313 01:13:32.939665 7110 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 13 01:13:33.204415 master-0 kubenswrapper[7110]: E0313 01:13:32.939717 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a60370fb-bb70-435c-9c5a-781fa1d63468-serving-cert podName:a60370fb-bb70-435c-9c5a-781fa1d63468 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:34.939699634 +0000 UTC m=+16.224726140 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/a60370fb-bb70-435c-9c5a-781fa1d63468-serving-cert") pod "route-controller-manager-77c7f858c6-8khnv" (UID: "a60370fb-bb70-435c-9c5a-781fa1d63468") : secret "serving-cert" not found Mar 13 01:13:33.213024 master-0 kubenswrapper[7110]: I0313 01:13:33.212981 7110 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2443269-eac7-4808-8774-1b8993963ee0" path="/var/lib/kubelet/pods/c2443269-eac7-4808-8774-1b8993963ee0/volumes" Mar 13 01:13:33.213554 master-0 kubenswrapper[7110]: I0313 01:13:33.213522 7110 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7d8c84a-e4b2-4099-9802-cb7a1f5906f1" path="/var/lib/kubelet/pods/c7d8c84a-e4b2-4099-9802-cb7a1f5906f1/volumes" Mar 13 01:13:33.344316 master-0 kubenswrapper[7110]: I0313 01:13:33.344247 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/905de268-3035-4fbe-a190-30b4a944adbf-client-ca\") pod \"controller-manager-85b4d45f77-rw9cf\" (UID: \"905de268-3035-4fbe-a190-30b4a944adbf\") " pod="openshift-controller-manager/controller-manager-85b4d45f77-rw9cf" Mar 13 01:13:33.344772 master-0 kubenswrapper[7110]: E0313 01:13:33.344457 7110 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 13 01:13:33.344772 master-0 kubenswrapper[7110]: E0313 01:13:33.344564 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/905de268-3035-4fbe-a190-30b4a944adbf-client-ca podName:905de268-3035-4fbe-a190-30b4a944adbf nodeName:}" failed. No retries permitted until 2026-03-13 01:13:35.344533784 +0000 UTC m=+16.629560280 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/905de268-3035-4fbe-a190-30b4a944adbf-client-ca") pod "controller-manager-85b4d45f77-rw9cf" (UID: "905de268-3035-4fbe-a190-30b4a944adbf") : configmap "client-ca" not found Mar 13 01:13:33.344964 master-0 kubenswrapper[7110]: I0313 01:13:33.344927 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/905de268-3035-4fbe-a190-30b4a944adbf-serving-cert\") pod \"controller-manager-85b4d45f77-rw9cf\" (UID: \"905de268-3035-4fbe-a190-30b4a944adbf\") " pod="openshift-controller-manager/controller-manager-85b4d45f77-rw9cf" Mar 13 01:13:33.345196 master-0 kubenswrapper[7110]: E0313 01:13:33.345153 7110 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 13 01:13:33.345286 master-0 kubenswrapper[7110]: E0313 01:13:33.345265 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/905de268-3035-4fbe-a190-30b4a944adbf-serving-cert podName:905de268-3035-4fbe-a190-30b4a944adbf nodeName:}" failed. No retries permitted until 2026-03-13 01:13:35.345239794 +0000 UTC m=+16.630266300 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/905de268-3035-4fbe-a190-30b4a944adbf-serving-cert") pod "controller-manager-85b4d45f77-rw9cf" (UID: "905de268-3035-4fbe-a190-30b4a944adbf") : secret "serving-cert" not found Mar 13 01:13:34.215094 master-0 kubenswrapper[7110]: I0313 01:13:34.215034 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-v9nfg" event={"ID":"916d9fc9-388b-4506-a17c-36a7f626356a","Type":"ContainerStarted","Data":"a41bcaf653995a95790b4be685f8a8f91dff8546aa69d956c2d939af740c0286"} Mar 13 01:13:34.878160 master-0 kubenswrapper[7110]: I0313 01:13:34.877799 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-57ccdf9b5-kxxzc"] Mar 13 01:13:34.879903 master-0 kubenswrapper[7110]: I0313 01:13:34.879853 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-kxxzc" Mar 13 01:13:34.888333 master-0 kubenswrapper[7110]: I0313 01:13:34.888297 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 13 01:13:34.889167 master-0 kubenswrapper[7110]: I0313 01:13:34.889146 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 13 01:13:34.903130 master-0 kubenswrapper[7110]: I0313 01:13:34.903079 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-57ccdf9b5-kxxzc"] Mar 13 01:13:34.964297 master-0 kubenswrapper[7110]: I0313 01:13:34.964233 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a60370fb-bb70-435c-9c5a-781fa1d63468-client-ca\") pod 
\"route-controller-manager-77c7f858c6-8khnv\" (UID: \"a60370fb-bb70-435c-9c5a-781fa1d63468\") " pod="openshift-route-controller-manager/route-controller-manager-77c7f858c6-8khnv" Mar 13 01:13:34.964496 master-0 kubenswrapper[7110]: I0313 01:13:34.964452 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a60370fb-bb70-435c-9c5a-781fa1d63468-serving-cert\") pod \"route-controller-manager-77c7f858c6-8khnv\" (UID: \"a60370fb-bb70-435c-9c5a-781fa1d63468\") " pod="openshift-route-controller-manager/route-controller-manager-77c7f858c6-8khnv" Mar 13 01:13:34.964576 master-0 kubenswrapper[7110]: I0313 01:13:34.964506 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg77t\" (UniqueName: \"kubernetes.io/projected/64477504-5cb6-42dc-a7eb-662981daec4a-kube-api-access-gg77t\") pod \"migrator-57ccdf9b5-kxxzc\" (UID: \"64477504-5cb6-42dc-a7eb-662981daec4a\") " pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-kxxzc" Mar 13 01:13:34.964887 master-0 kubenswrapper[7110]: E0313 01:13:34.964857 7110 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 13 01:13:34.964955 master-0 kubenswrapper[7110]: E0313 01:13:34.964919 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a60370fb-bb70-435c-9c5a-781fa1d63468-client-ca podName:a60370fb-bb70-435c-9c5a-781fa1d63468 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:38.964900014 +0000 UTC m=+20.249926490 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/a60370fb-bb70-435c-9c5a-781fa1d63468-client-ca") pod "route-controller-manager-77c7f858c6-8khnv" (UID: "a60370fb-bb70-435c-9c5a-781fa1d63468") : configmap "client-ca" not found Mar 13 01:13:34.965335 master-0 kubenswrapper[7110]: E0313 01:13:34.965292 7110 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 13 01:13:34.965408 master-0 kubenswrapper[7110]: E0313 01:13:34.965382 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a60370fb-bb70-435c-9c5a-781fa1d63468-serving-cert podName:a60370fb-bb70-435c-9c5a-781fa1d63468 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:38.965358657 +0000 UTC m=+20.250385193 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/a60370fb-bb70-435c-9c5a-781fa1d63468-serving-cert") pod "route-controller-manager-77c7f858c6-8khnv" (UID: "a60370fb-bb70-435c-9c5a-781fa1d63468") : secret "serving-cert" not found Mar 13 01:13:35.065493 master-0 kubenswrapper[7110]: I0313 01:13:35.065455 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg77t\" (UniqueName: \"kubernetes.io/projected/64477504-5cb6-42dc-a7eb-662981daec4a-kube-api-access-gg77t\") pod \"migrator-57ccdf9b5-kxxzc\" (UID: \"64477504-5cb6-42dc-a7eb-662981daec4a\") " pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-kxxzc" Mar 13 01:13:35.093793 master-0 kubenswrapper[7110]: I0313 01:13:35.093728 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg77t\" (UniqueName: \"kubernetes.io/projected/64477504-5cb6-42dc-a7eb-662981daec4a-kube-api-access-gg77t\") pod \"migrator-57ccdf9b5-kxxzc\" (UID: \"64477504-5cb6-42dc-a7eb-662981daec4a\") " pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-kxxzc" Mar 13 
01:13:35.206900 master-0 kubenswrapper[7110]: I0313 01:13:35.206833 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-kxxzc" Mar 13 01:13:35.226506 master-0 kubenswrapper[7110]: I0313 01:13:35.226459 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" event={"ID":"f9b713fb-64ce-4a01-951c-1f31df62e1ae","Type":"ContainerStarted","Data":"c60b1887494d08fb5df2490e135e1d701bdbe7b6a6e136c3d75f17211fbf551b"} Mar 13 01:13:35.371774 master-0 kubenswrapper[7110]: I0313 01:13:35.371739 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/905de268-3035-4fbe-a190-30b4a944adbf-serving-cert\") pod \"controller-manager-85b4d45f77-rw9cf\" (UID: \"905de268-3035-4fbe-a190-30b4a944adbf\") " pod="openshift-controller-manager/controller-manager-85b4d45f77-rw9cf" Mar 13 01:13:35.371988 master-0 kubenswrapper[7110]: E0313 01:13:35.371949 7110 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 13 01:13:35.372167 master-0 kubenswrapper[7110]: I0313 01:13:35.372147 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/905de268-3035-4fbe-a190-30b4a944adbf-client-ca\") pod \"controller-manager-85b4d45f77-rw9cf\" (UID: \"905de268-3035-4fbe-a190-30b4a944adbf\") " pod="openshift-controller-manager/controller-manager-85b4d45f77-rw9cf" Mar 13 01:13:35.372401 master-0 kubenswrapper[7110]: E0313 01:13:35.372376 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/905de268-3035-4fbe-a190-30b4a944adbf-serving-cert podName:905de268-3035-4fbe-a190-30b4a944adbf nodeName:}" failed. 
No retries permitted until 2026-03-13 01:13:39.372343368 +0000 UTC m=+20.657369844 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/905de268-3035-4fbe-a190-30b4a944adbf-serving-cert") pod "controller-manager-85b4d45f77-rw9cf" (UID: "905de268-3035-4fbe-a190-30b4a944adbf") : secret "serving-cert" not found
Mar 13 01:13:35.372585 master-0 kubenswrapper[7110]: E0313 01:13:35.372560 7110 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 13 01:13:35.372663 master-0 kubenswrapper[7110]: E0313 01:13:35.372596 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/905de268-3035-4fbe-a190-30b4a944adbf-client-ca podName:905de268-3035-4fbe-a190-30b4a944adbf nodeName:}" failed. No retries permitted until 2026-03-13 01:13:39.372587115 +0000 UTC m=+20.657613571 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/905de268-3035-4fbe-a190-30b4a944adbf-client-ca") pod "controller-manager-85b4d45f77-rw9cf" (UID: "905de268-3035-4fbe-a190-30b4a944adbf") : configmap "client-ca" not found
Mar 13 01:13:35.476137 master-0 kubenswrapper[7110]: I0313 01:13:35.476046 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-57ccdf9b5-kxxzc"]
Mar 13 01:13:35.483273 master-0 kubenswrapper[7110]: W0313 01:13:35.483231 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64477504_5cb6_42dc_a7eb_662981daec4a.slice/crio-a60b4d927a33db793635462ea60ad60607d5d184924a1912214647b401b2b973 WatchSource:0}: Error finding container a60b4d927a33db793635462ea60ad60607d5d184924a1912214647b401b2b973: Status 404 returned error can't find the container with id a60b4d927a33db793635462ea60ad60607d5d184924a1912214647b401b2b973
Mar 13 01:13:35.677656 master-0 kubenswrapper[7110]: I0313 01:13:35.677552 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-6slg8\" (UID: \"4976e608-07a0-4cef-8fdd-7cec3324b4b5\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8"
Mar 13 01:13:35.677914 master-0 kubenswrapper[7110]: I0313 01:13:35.677735 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert\") pod \"cluster-version-operator-745944c6b7-zc6gt\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt"
Mar 13 01:13:35.677914 master-0 kubenswrapper[7110]: I0313 01:13:35.677839 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert\") pod \"catalog-operator-7d9c49f57b-h46pz\" (UID: \"7f35cc1e-3376-4dbd-b215-2a32bf62cc71\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz"
Mar 13 01:13:35.678047 master-0 kubenswrapper[7110]: I0313 01:13:35.677908 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr"
Mar 13 01:13:35.678047 master-0 kubenswrapper[7110]: I0313 01:13:35.678005 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-dszg5\" (UID: \"78d2cd80-23b9-426d-a7ac-1daa27668a47\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5"
Mar 13 01:13:35.678154 master-0 kubenswrapper[7110]: I0313 01:13:35.678094 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-metrics-tls\") pod \"ingress-operator-677db989d6-kdn2l\" (UID: \"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l"
Mar 13 01:13:35.678675 master-0 kubenswrapper[7110]: I0313 01:13:35.678193 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr"
Mar 13 01:13:35.678675 master-0 kubenswrapper[7110]: I0313 01:13:35.678267 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f97819d0-2840-4352-a435-19ef1a8c22c9-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-jjdk8\" (UID: \"f97819d0-2840-4352-a435-19ef1a8c22c9\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8"
Mar 13 01:13:35.678675 master-0 kubenswrapper[7110]: I0313 01:13:35.678328 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-metrics-tls\") pod \"dns-operator-589895fbb7-qvl2k\" (UID: \"13fac7b0-ce55-467d-9d0c-6a122d87cb3c\") " pod="openshift-dns-operator/dns-operator-589895fbb7-qvl2k"
Mar 13 01:13:35.678675 master-0 kubenswrapper[7110]: E0313 01:13:35.678392 7110 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found
Mar 13 01:13:35.678675 master-0 kubenswrapper[7110]: E0313 01:13:35.678414 7110 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 13 01:13:35.678675 master-0 kubenswrapper[7110]: E0313 01:13:35.678471 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cert podName:8c6bf2d5-1881-4b63-b247-7e7426707fa1 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:51.678452435 +0000 UTC m=+32.963478901 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cert") pod "cluster-baremetal-operator-5cdb4c5598-47sjr" (UID: "8c6bf2d5-1881-4b63-b247-7e7426707fa1") : secret "cluster-baremetal-webhook-server-cert" not found
Mar 13 01:13:35.678675 master-0 kubenswrapper[7110]: E0313 01:13:35.678500 7110 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Mar 13 01:13:35.678675 master-0 kubenswrapper[7110]: E0313 01:13:35.678514 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-metrics-tls podName:70c097a1-90d9-4344-b0ae-5a59ec2ad8ad nodeName:}" failed. No retries permitted until 2026-03-13 01:13:51.678488486 +0000 UTC m=+32.963515022 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-metrics-tls") pod "ingress-operator-677db989d6-kdn2l" (UID: "70c097a1-90d9-4344-b0ae-5a59ec2ad8ad") : secret "metrics-tls" not found
Mar 13 01:13:35.678675 master-0 kubenswrapper[7110]: E0313 01:13:35.678499 7110 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 13 01:13:35.678675 master-0 kubenswrapper[7110]: E0313 01:13:35.678551 7110 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 13 01:13:35.678675 master-0 kubenswrapper[7110]: E0313 01:13:35.678569 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f97819d0-2840-4352-a435-19ef1a8c22c9-image-registry-operator-tls podName:f97819d0-2840-4352-a435-19ef1a8c22c9 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:51.678556168 +0000 UTC m=+32.963582754 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/f97819d0-2840-4352-a435-19ef1a8c22c9-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-jjdk8" (UID: "f97819d0-2840-4352-a435-19ef1a8c22c9") : secret "image-registry-operator-tls" not found
Mar 13 01:13:35.678675 master-0 kubenswrapper[7110]: E0313 01:13:35.678598 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-metrics-tls podName:13fac7b0-ce55-467d-9d0c-6a122d87cb3c nodeName:}" failed. No retries permitted until 2026-03-13 01:13:51.678588009 +0000 UTC m=+32.963614705 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-metrics-tls") pod "dns-operator-589895fbb7-qvl2k" (UID: "13fac7b0-ce55-467d-9d0c-6a122d87cb3c") : secret "metrics-tls" not found
Mar 13 01:13:35.678675 master-0 kubenswrapper[7110]: E0313 01:13:35.678618 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert podName:7f35cc1e-3376-4dbd-b215-2a32bf62cc71 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:51.67860876 +0000 UTC m=+32.963635336 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert") pod "catalog-operator-7d9c49f57b-h46pz" (UID: "7f35cc1e-3376-4dbd-b215-2a32bf62cc71") : secret "catalog-operator-serving-cert" not found
Mar 13 01:13:35.678675 master-0 kubenswrapper[7110]: E0313 01:13:35.678679 7110 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found
Mar 13 01:13:35.679797 master-0 kubenswrapper[7110]: E0313 01:13:35.678755 7110 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 13 01:13:35.679797 master-0 kubenswrapper[7110]: E0313 01:13:35.678793 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cluster-baremetal-operator-tls podName:8c6bf2d5-1881-4b63-b247-7e7426707fa1 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:51.678761404 +0000 UTC m=+32.963787970 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-47sjr" (UID: "8c6bf2d5-1881-4b63-b247-7e7426707fa1") : secret "cluster-baremetal-operator-tls" not found
Mar 13 01:13:35.679797 master-0 kubenswrapper[7110]: E0313 01:13:35.678812 7110 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 13 01:13:35.679797 master-0 kubenswrapper[7110]: E0313 01:13:35.678829 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics podName:78d2cd80-23b9-426d-a7ac-1daa27668a47 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:51.678807975 +0000 UTC m=+32.963834471 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-dszg5" (UID: "78d2cd80-23b9-426d-a7ac-1daa27668a47") : secret "marketplace-operator-metrics" not found
Mar 13 01:13:35.679797 master-0 kubenswrapper[7110]: E0313 01:13:35.678856 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert podName:bbf0bd4d-3387-43c3-b9d5-61a044fa2138 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:51.678843256 +0000 UTC m=+32.963869752 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert") pod "cluster-version-operator-745944c6b7-zc6gt" (UID: "bbf0bd4d-3387-43c3-b9d5-61a044fa2138") : secret "cluster-version-operator-serving-cert" not found
Mar 13 01:13:35.679797 master-0 kubenswrapper[7110]: I0313 01:13:35.678938 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-nrzpj\" (UID: \"0d4e6150-432c-4a11-b5a6-4d62dd701fc8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj"
Mar 13 01:13:35.679797 master-0 kubenswrapper[7110]: I0313 01:13:35.678977 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-2tr2t\" (UID: \"7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t"
Mar 13 01:13:35.680503 master-0 kubenswrapper[7110]: E0313 01:13:35.679970 7110 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 13 01:13:35.680503 master-0 kubenswrapper[7110]: E0313 01:13:35.679971 7110 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: secret "mco-proxy-tls" not found
Mar 13 01:13:35.680503 master-0 kubenswrapper[7110]: E0313 01:13:35.680001 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls podName:7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:51.679992778 +0000 UTC m=+32.965019244 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-2tr2t" (UID: "7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71") : secret "cluster-monitoring-operator-tls" not found
Mar 13 01:13:35.680503 master-0 kubenswrapper[7110]: E0313 01:13:35.680034 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls podName:4976e608-07a0-4cef-8fdd-7cec3324b4b5 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:51.680015139 +0000 UTC m=+32.965041635 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls") pod "machine-config-operator-fdb5c78b5-6slg8" (UID: "4976e608-07a0-4cef-8fdd-7cec3324b4b5") : secret "mco-proxy-tls" not found
Mar 13 01:13:35.680503 master-0 kubenswrapper[7110]: E0313 01:13:35.680089 7110 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 13 01:13:35.680503 master-0 kubenswrapper[7110]: E0313 01:13:35.680195 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert podName:0d4e6150-432c-4a11-b5a6-4d62dd701fc8 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:51.680166013 +0000 UTC m=+32.965192549 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-nrzpj" (UID: "0d4e6150-432c-4a11-b5a6-4d62dd701fc8") : secret "package-server-manager-serving-cert" not found
Mar 13 01:13:35.780538 master-0 kubenswrapper[7110]: I0313 01:13:35.780421 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs\") pod \"network-metrics-daemon-zh5fh\" (UID: \"e68ab3cb-c372-45d9-a758-beaf4c213714\") " pod="openshift-multus/network-metrics-daemon-zh5fh"
Mar 13 01:13:35.780722 master-0 kubenswrapper[7110]: I0313 01:13:35.780623 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9"
Mar 13 01:13:35.780920 master-0 kubenswrapper[7110]: I0313 01:13:35.780883 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert\") pod \"olm-operator-d64cfc9db-8l7kq\" (UID: \"6c88187c-d011-4043-a6d3-4a8a7ec4e204\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq"
Mar 13 01:13:35.780992 master-0 kubenswrapper[7110]: I0313 01:13:35.780967 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9"
Mar 13 01:13:35.781291 master-0 kubenswrapper[7110]: E0313 01:13:35.781247 7110 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 13 01:13:35.781430 master-0 kubenswrapper[7110]: E0313 01:13:35.781342 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-node-tuning-operator-tls podName:4c5174b9-ca9e-4917-ab3a-ca403ce4f017 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:51.781314214 +0000 UTC m=+33.066340720 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-4m9c9" (UID: "4c5174b9-ca9e-4917-ab3a-ca403ce4f017") : secret "node-tuning-operator-tls" not found
Mar 13 01:13:35.782117 master-0 kubenswrapper[7110]: E0313 01:13:35.782083 7110 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 13 01:13:35.782179 master-0 kubenswrapper[7110]: E0313 01:13:35.782166 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs podName:e68ab3cb-c372-45d9-a758-beaf4c213714 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:51.782141987 +0000 UTC m=+33.067168493 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs") pod "network-metrics-daemon-zh5fh" (UID: "e68ab3cb-c372-45d9-a758-beaf4c213714") : secret "metrics-daemon-secret" not found
Mar 13 01:13:35.782317 master-0 kubenswrapper[7110]: E0313 01:13:35.782274 7110 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 13 01:13:35.782359 master-0 kubenswrapper[7110]: E0313 01:13:35.782339 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-apiservice-cert podName:4c5174b9-ca9e-4917-ab3a-ca403ce4f017 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:51.782318252 +0000 UTC m=+33.067344748 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-4m9c9" (UID: "4c5174b9-ca9e-4917-ab3a-ca403ce4f017") : secret "performance-addon-operator-webhook-cert" not found
Mar 13 01:13:35.782507 master-0 kubenswrapper[7110]: E0313 01:13:35.782469 7110 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 13 01:13:35.782600 master-0 kubenswrapper[7110]: E0313 01:13:35.782580 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert podName:6c88187c-d011-4043-a6d3-4a8a7ec4e204 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:51.782553939 +0000 UTC m=+33.067580445 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert") pod "olm-operator-d64cfc9db-8l7kq" (UID: "6c88187c-d011-4043-a6d3-4a8a7ec4e204") : secret "olm-operator-serving-cert" not found
Mar 13 01:13:35.882711 master-0 kubenswrapper[7110]: I0313 01:13:35.882681 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs\") pod \"multus-admission-controller-8d675b596-tq7n6\" (UID: \"2bd94289-7109-4419-9a51-bd289082b9f5\") " pod="openshift-multus/multus-admission-controller-8d675b596-tq7n6"
Mar 13 01:13:35.884305 master-0 kubenswrapper[7110]: E0313 01:13:35.883795 7110 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 13 01:13:35.884305 master-0 kubenswrapper[7110]: E0313 01:13:35.883870 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs podName:2bd94289-7109-4419-9a51-bd289082b9f5 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:51.883857654 +0000 UTC m=+33.168884120 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs") pod "multus-admission-controller-8d675b596-tq7n6" (UID: "2bd94289-7109-4419-9a51-bd289082b9f5") : secret "multus-admission-controller-secret" not found
Mar 13 01:13:36.239905 master-0 kubenswrapper[7110]: I0313 01:13:36.239843 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-kxxzc" event={"ID":"64477504-5cb6-42dc-a7eb-662981daec4a","Type":"ContainerStarted","Data":"a60b4d927a33db793635462ea60ad60607d5d184924a1912214647b401b2b973"}
Mar 13 01:13:37.244981 master-0 kubenswrapper[7110]: I0313 01:13:37.244925 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-2v42g" event={"ID":"95d4e785-6663-417d-b380-6905773613c8","Type":"ContainerStarted","Data":"4f1d391f9ccf9712ce599023f3ef26e7463c6ad87dcaaba9b59f13a56ea3cd24"}
Mar 13 01:13:37.986035 master-0 kubenswrapper[7110]: I0313 01:13:37.985617 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-74b98ff8f9-m4bbb"]
Mar 13 01:13:37.987312 master-0 kubenswrapper[7110]: I0313 01:13:37.987270 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb"
Mar 13 01:13:37.998504 master-0 kubenswrapper[7110]: I0313 01:13:37.998458 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 13 01:13:37.999071 master-0 kubenswrapper[7110]: I0313 01:13:37.999033 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 13 01:13:37.999356 master-0 kubenswrapper[7110]: I0313 01:13:37.999325 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 13 01:13:37.999356 master-0 kubenswrapper[7110]: I0313 01:13:37.999339 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 13 01:13:37.999725 master-0 kubenswrapper[7110]: I0313 01:13:37.999697 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-0"
Mar 13 01:13:37.999811 master-0 kubenswrapper[7110]: I0313 01:13:37.999723 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 13 01:13:38.000031 master-0 kubenswrapper[7110]: I0313 01:13:37.999696 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-0"
Mar 13 01:13:38.000168 master-0 kubenswrapper[7110]: I0313 01:13:38.000135 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 13 01:13:38.000252 master-0 kubenswrapper[7110]: I0313 01:13:38.000227 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 13 01:13:38.011671 master-0 kubenswrapper[7110]: I0313 01:13:38.008928 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-74b98ff8f9-m4bbb"]
Mar 13 01:13:38.012660 master-0 kubenswrapper[7110]: I0313 01:13:38.012465 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 13 01:13:38.017525 master-0 kubenswrapper[7110]: I0313 01:13:38.017396 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-encryption-config\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb"
Mar 13 01:13:38.017525 master-0 kubenswrapper[7110]: I0313 01:13:38.017447 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-audit-dir\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb"
Mar 13 01:13:38.017525 master-0 kubenswrapper[7110]: I0313 01:13:38.017493 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-serving-cert\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb"
Mar 13 01:13:38.017726 master-0 kubenswrapper[7110]: I0313 01:13:38.017541 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-audit\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb"
Mar 13 01:13:38.017726 master-0 kubenswrapper[7110]: I0313 01:13:38.017564 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-image-import-ca\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb"
Mar 13 01:13:38.017726 master-0 kubenswrapper[7110]: I0313 01:13:38.017589 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpjdf\" (UniqueName: \"kubernetes.io/projected/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-kube-api-access-qpjdf\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb"
Mar 13 01:13:38.017726 master-0 kubenswrapper[7110]: I0313 01:13:38.017609 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-config\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb"
Mar 13 01:13:38.017726 master-0 kubenswrapper[7110]: I0313 01:13:38.017625 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-etcd-serving-ca\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb"
Mar 13 01:13:38.017726 master-0 kubenswrapper[7110]: I0313 01:13:38.017673 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-etcd-client\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb"
Mar 13 01:13:38.017726 master-0 kubenswrapper[7110]: I0313 01:13:38.017688 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-trusted-ca-bundle\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb"
Mar 13 01:13:38.017726 master-0 kubenswrapper[7110]: I0313 01:13:38.017711 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-node-pullsecrets\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb"
Mar 13 01:13:38.118395 master-0 kubenswrapper[7110]: I0313 01:13:38.118266 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-config\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb"
Mar 13 01:13:38.118395 master-0 kubenswrapper[7110]: I0313 01:13:38.118331 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-etcd-serving-ca\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb"
Mar 13 01:13:38.118395 master-0 kubenswrapper[7110]: I0313 01:13:38.118389 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-etcd-client\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb"
Mar 13 01:13:38.118750 master-0 kubenswrapper[7110]: I0313 01:13:38.118413 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-trusted-ca-bundle\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb"
Mar 13 01:13:38.118750 master-0 kubenswrapper[7110]: I0313 01:13:38.118446 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-node-pullsecrets\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb"
Mar 13 01:13:38.118750 master-0 kubenswrapper[7110]: I0313 01:13:38.118514 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-encryption-config\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb"
Mar 13 01:13:38.118750 master-0 kubenswrapper[7110]: E0313 01:13:38.118534 7110 secret.go:189] Couldn't get secret openshift-apiserver/etcd-client: secret "etcd-client" not found
Mar 13 01:13:38.118750 master-0 kubenswrapper[7110]: E0313 01:13:38.118623 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-etcd-client podName:fffa10d1-2e00-44fa-be54-9bd7a9393f3c nodeName:}" failed. No retries permitted until 2026-03-13 01:13:38.61860512 +0000 UTC m=+19.903631586 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-etcd-client") pod "apiserver-74b98ff8f9-m4bbb" (UID: "fffa10d1-2e00-44fa-be54-9bd7a9393f3c") : secret "etcd-client" not found
Mar 13 01:13:38.118750 master-0 kubenswrapper[7110]: I0313 01:13:38.118656 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-node-pullsecrets\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb"
Mar 13 01:13:38.118750 master-0 kubenswrapper[7110]: E0313 01:13:38.118721 7110 configmap.go:193] Couldn't get configMap openshift-apiserver/etcd-serving-ca: configmap "etcd-serving-ca" not found
Mar 13 01:13:38.119025 master-0 kubenswrapper[7110]: E0313 01:13:38.118774 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-etcd-serving-ca podName:fffa10d1-2e00-44fa-be54-9bd7a9393f3c nodeName:}" failed. No retries permitted until 2026-03-13 01:13:38.618759984 +0000 UTC m=+19.903786530 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-etcd-serving-ca") pod "apiserver-74b98ff8f9-m4bbb" (UID: "fffa10d1-2e00-44fa-be54-9bd7a9393f3c") : configmap "etcd-serving-ca" not found
Mar 13 01:13:38.119025 master-0 kubenswrapper[7110]: I0313 01:13:38.118823 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-audit-dir\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb"
Mar 13 01:13:38.119025 master-0 kubenswrapper[7110]: I0313 01:13:38.118899 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-serving-cert\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb"
Mar 13 01:13:38.119025 master-0 kubenswrapper[7110]: I0313 01:13:38.118929 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-audit-dir\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb"
Mar 13 01:13:38.119025 master-0 kubenswrapper[7110]: I0313 01:13:38.118994 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-audit\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb"
Mar 13 01:13:38.119025 master-0 kubenswrapper[7110]: E0313 01:13:38.119005 7110 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found
Mar 13 01:13:38.119274 master-0 kubenswrapper[7110]: I0313 01:13:38.119030 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-image-import-ca\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb"
Mar 13 01:13:38.119274 master-0 kubenswrapper[7110]: E0313 01:13:38.119036 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-serving-cert podName:fffa10d1-2e00-44fa-be54-9bd7a9393f3c nodeName:}" failed. No retries permitted until 2026-03-13 01:13:38.619026802 +0000 UTC m=+19.904053268 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-serving-cert") pod "apiserver-74b98ff8f9-m4bbb" (UID: "fffa10d1-2e00-44fa-be54-9bd7a9393f3c") : secret "serving-cert" not found
Mar 13 01:13:38.119274 master-0 kubenswrapper[7110]: E0313 01:13:38.119067 7110 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found
Mar 13 01:13:38.119274 master-0 kubenswrapper[7110]: E0313 01:13:38.119092 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-audit podName:fffa10d1-2e00-44fa-be54-9bd7a9393f3c nodeName:}" failed. No retries permitted until 2026-03-13 01:13:38.619083544 +0000 UTC m=+19.904110010 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-audit") pod "apiserver-74b98ff8f9-m4bbb" (UID: "fffa10d1-2e00-44fa-be54-9bd7a9393f3c") : configmap "audit-0" not found
Mar 13 01:13:38.119274 master-0 kubenswrapper[7110]: I0313 01:13:38.119148 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpjdf\" (UniqueName: \"kubernetes.io/projected/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-kube-api-access-qpjdf\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb"
Mar 13 01:13:38.119483 master-0 kubenswrapper[7110]: I0313 01:13:38.119456 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-trusted-ca-bundle\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb"
Mar 13 01:13:38.119534 master-0 kubenswrapper[7110]: I0313 01:13:38.119466 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-image-import-ca\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb"
Mar 13 01:13:38.119534 master-0 kubenswrapper[7110]: I0313 01:13:38.119522 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-config\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb"
Mar 13 01:13:38.124435 master-0 kubenswrapper[7110]: I0313 01:13:38.124390 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for
volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-encryption-config\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb" Mar 13 01:13:38.137382 master-0 kubenswrapper[7110]: I0313 01:13:38.137347 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpjdf\" (UniqueName: \"kubernetes.io/projected/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-kube-api-access-qpjdf\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb" Mar 13 01:13:38.251372 master-0 kubenswrapper[7110]: I0313 01:13:38.251281 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-kxxzc" event={"ID":"64477504-5cb6-42dc-a7eb-662981daec4a","Type":"ContainerStarted","Data":"64cf82ec26e7e72865d114924f8764655e701499c4313b25c26bbe5acc40878c"} Mar 13 01:13:38.251372 master-0 kubenswrapper[7110]: I0313 01:13:38.251352 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-kxxzc" event={"ID":"64477504-5cb6-42dc-a7eb-662981daec4a","Type":"ContainerStarted","Data":"4f49595589f1bf5fa753cf5b619410e098b50fc20413b400a546391cb2022bf0"} Mar 13 01:13:38.253464 master-0 kubenswrapper[7110]: I0313 01:13:38.253390 7110 generic.go:334] "Generic (PLEG): container finished" podID="61fb4b86-f978-4ae1-80bc-18d2f386cbc2" containerID="4d990b5e894ae9b6e30a48b23a8c0721805ad4a05730ef2a8a80e7f39e6f738b" exitCode=0 Mar 13 01:13:38.253464 master-0 kubenswrapper[7110]: I0313 01:13:38.253445 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-ck7rt" 
event={"ID":"61fb4b86-f978-4ae1-80bc-18d2f386cbc2","Type":"ContainerDied","Data":"4d990b5e894ae9b6e30a48b23a8c0721805ad4a05730ef2a8a80e7f39e6f738b"} Mar 13 01:13:38.270252 master-0 kubenswrapper[7110]: I0313 01:13:38.269778 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-kxxzc" podStartSLOduration=2.246104544 podStartE2EDuration="4.269754711s" podCreationTimestamp="2026-03-13 01:13:34 +0000 UTC" firstStartedPulling="2026-03-13 01:13:35.49316823 +0000 UTC m=+16.778194696" lastFinishedPulling="2026-03-13 01:13:37.516818397 +0000 UTC m=+18.801844863" observedRunningTime="2026-03-13 01:13:38.269097252 +0000 UTC m=+19.554123798" watchObservedRunningTime="2026-03-13 01:13:38.269754711 +0000 UTC m=+19.554781217" Mar 13 01:13:38.626764 master-0 kubenswrapper[7110]: I0313 01:13:38.626698 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-serving-cert\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb" Mar 13 01:13:38.627007 master-0 kubenswrapper[7110]: I0313 01:13:38.626802 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-audit\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb" Mar 13 01:13:38.627007 master-0 kubenswrapper[7110]: I0313 01:13:38.626863 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-etcd-serving-ca\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " 
pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb" Mar 13 01:13:38.627007 master-0 kubenswrapper[7110]: I0313 01:13:38.626911 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-etcd-client\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb" Mar 13 01:13:38.627007 master-0 kubenswrapper[7110]: E0313 01:13:38.626931 7110 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Mar 13 01:13:38.627293 master-0 kubenswrapper[7110]: E0313 01:13:38.627050 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-serving-cert podName:fffa10d1-2e00-44fa-be54-9bd7a9393f3c nodeName:}" failed. No retries permitted until 2026-03-13 01:13:39.62702554 +0000 UTC m=+20.912052046 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-serving-cert") pod "apiserver-74b98ff8f9-m4bbb" (UID: "fffa10d1-2e00-44fa-be54-9bd7a9393f3c") : secret "serving-cert" not found Mar 13 01:13:38.627293 master-0 kubenswrapper[7110]: E0313 01:13:38.627147 7110 secret.go:189] Couldn't get secret openshift-apiserver/etcd-client: secret "etcd-client" not found Mar 13 01:13:38.627293 master-0 kubenswrapper[7110]: E0313 01:13:38.627206 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-etcd-client podName:fffa10d1-2e00-44fa-be54-9bd7a9393f3c nodeName:}" failed. No retries permitted until 2026-03-13 01:13:39.627189255 +0000 UTC m=+20.912215741 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-etcd-client") pod "apiserver-74b98ff8f9-m4bbb" (UID: "fffa10d1-2e00-44fa-be54-9bd7a9393f3c") : secret "etcd-client" not found Mar 13 01:13:38.627293 master-0 kubenswrapper[7110]: E0313 01:13:38.627248 7110 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Mar 13 01:13:38.627293 master-0 kubenswrapper[7110]: E0313 01:13:38.627273 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-audit podName:fffa10d1-2e00-44fa-be54-9bd7a9393f3c nodeName:}" failed. No retries permitted until 2026-03-13 01:13:39.627264237 +0000 UTC m=+20.912290713 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-audit") pod "apiserver-74b98ff8f9-m4bbb" (UID: "fffa10d1-2e00-44fa-be54-9bd7a9393f3c") : configmap "audit-0" not found Mar 13 01:13:38.627293 master-0 kubenswrapper[7110]: E0313 01:13:38.627302 7110 configmap.go:193] Couldn't get configMap openshift-apiserver/etcd-serving-ca: configmap "etcd-serving-ca" not found Mar 13 01:13:38.627666 master-0 kubenswrapper[7110]: E0313 01:13:38.627328 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-etcd-serving-ca podName:fffa10d1-2e00-44fa-be54-9bd7a9393f3c nodeName:}" failed. No retries permitted until 2026-03-13 01:13:39.627319238 +0000 UTC m=+20.912345714 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-etcd-serving-ca") pod "apiserver-74b98ff8f9-m4bbb" (UID: "fffa10d1-2e00-44fa-be54-9bd7a9393f3c") : configmap "etcd-serving-ca" not found Mar 13 01:13:39.031142 master-0 kubenswrapper[7110]: I0313 01:13:39.030816 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a60370fb-bb70-435c-9c5a-781fa1d63468-serving-cert\") pod \"route-controller-manager-77c7f858c6-8khnv\" (UID: \"a60370fb-bb70-435c-9c5a-781fa1d63468\") " pod="openshift-route-controller-manager/route-controller-manager-77c7f858c6-8khnv" Mar 13 01:13:39.031348 master-0 kubenswrapper[7110]: I0313 01:13:39.031145 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a60370fb-bb70-435c-9c5a-781fa1d63468-client-ca\") pod \"route-controller-manager-77c7f858c6-8khnv\" (UID: \"a60370fb-bb70-435c-9c5a-781fa1d63468\") " pod="openshift-route-controller-manager/route-controller-manager-77c7f858c6-8khnv" Mar 13 01:13:39.031847 master-0 kubenswrapper[7110]: E0313 01:13:39.031009 7110 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 13 01:13:39.031847 master-0 kubenswrapper[7110]: E0313 01:13:39.031413 7110 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 13 01:13:39.031847 master-0 kubenswrapper[7110]: E0313 01:13:39.031461 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a60370fb-bb70-435c-9c5a-781fa1d63468-client-ca podName:a60370fb-bb70-435c-9c5a-781fa1d63468 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:47.031448079 +0000 UTC m=+28.316474545 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/a60370fb-bb70-435c-9c5a-781fa1d63468-client-ca") pod "route-controller-manager-77c7f858c6-8khnv" (UID: "a60370fb-bb70-435c-9c5a-781fa1d63468") : configmap "client-ca" not found Mar 13 01:13:39.031847 master-0 kubenswrapper[7110]: E0313 01:13:39.031472 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a60370fb-bb70-435c-9c5a-781fa1d63468-serving-cert podName:a60370fb-bb70-435c-9c5a-781fa1d63468 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:47.0314678 +0000 UTC m=+28.316494266 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/a60370fb-bb70-435c-9c5a-781fa1d63468-serving-cert") pod "route-controller-manager-77c7f858c6-8khnv" (UID: "a60370fb-bb70-435c-9c5a-781fa1d63468") : secret "serving-cert" not found Mar 13 01:13:39.266068 master-0 kubenswrapper[7110]: I0313 01:13:39.265988 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj" event={"ID":"21cbea73-f779-43e4-b5ba-d6fa06275d34","Type":"ContainerStarted","Data":"d3c0f89339f815b6350fac22cc030760b6b90e8219eb2eb8f1fd3b1e19f0b649"} Mar 13 01:13:39.269419 master-0 kubenswrapper[7110]: I0313 01:13:39.269342 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-c2xl8" event={"ID":"ca2fa86b-a966-49dc-8577-d2b54b111d14","Type":"ContainerStarted","Data":"18882e60a1d7cca045d564f7abc68da51216b8e9104fca3062ca7eec99d17c5e"} Mar 13 01:13:39.442858 master-0 kubenswrapper[7110]: I0313 01:13:39.442799 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/905de268-3035-4fbe-a190-30b4a944adbf-client-ca\") pod \"controller-manager-85b4d45f77-rw9cf\" (UID: \"905de268-3035-4fbe-a190-30b4a944adbf\") " 
pod="openshift-controller-manager/controller-manager-85b4d45f77-rw9cf" Mar 13 01:13:39.443110 master-0 kubenswrapper[7110]: I0313 01:13:39.443082 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/905de268-3035-4fbe-a190-30b4a944adbf-serving-cert\") pod \"controller-manager-85b4d45f77-rw9cf\" (UID: \"905de268-3035-4fbe-a190-30b4a944adbf\") " pod="openshift-controller-manager/controller-manager-85b4d45f77-rw9cf" Mar 13 01:13:39.443292 master-0 kubenswrapper[7110]: E0313 01:13:39.443267 7110 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 13 01:13:39.443683 master-0 kubenswrapper[7110]: E0313 01:13:39.443667 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/905de268-3035-4fbe-a190-30b4a944adbf-client-ca podName:905de268-3035-4fbe-a190-30b4a944adbf nodeName:}" failed. No retries permitted until 2026-03-13 01:13:47.443653226 +0000 UTC m=+28.728679692 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/905de268-3035-4fbe-a190-30b4a944adbf-client-ca") pod "controller-manager-85b4d45f77-rw9cf" (UID: "905de268-3035-4fbe-a190-30b4a944adbf") : configmap "client-ca" not found Mar 13 01:13:39.444120 master-0 kubenswrapper[7110]: E0313 01:13:39.443592 7110 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 13 01:13:39.444229 master-0 kubenswrapper[7110]: E0313 01:13:39.444216 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/905de268-3035-4fbe-a190-30b4a944adbf-serving-cert podName:905de268-3035-4fbe-a190-30b4a944adbf nodeName:}" failed. No retries permitted until 2026-03-13 01:13:47.444204512 +0000 UTC m=+28.729231048 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/905de268-3035-4fbe-a190-30b4a944adbf-serving-cert") pod "controller-manager-85b4d45f77-rw9cf" (UID: "905de268-3035-4fbe-a190-30b4a944adbf") : secret "serving-cert" not found Mar 13 01:13:39.646167 master-0 kubenswrapper[7110]: E0313 01:13:39.645484 7110 configmap.go:193] Couldn't get configMap openshift-apiserver/etcd-serving-ca: configmap "etcd-serving-ca" not found Mar 13 01:13:39.646690 master-0 kubenswrapper[7110]: I0313 01:13:39.646602 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-etcd-serving-ca\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb" Mar 13 01:13:39.646980 master-0 kubenswrapper[7110]: I0313 01:13:39.646944 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-etcd-client\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb" Mar 13 01:13:39.647447 master-0 kubenswrapper[7110]: I0313 01:13:39.647411 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-serving-cert\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb" Mar 13 01:13:39.647774 master-0 kubenswrapper[7110]: E0313 01:13:39.647577 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-etcd-serving-ca podName:fffa10d1-2e00-44fa-be54-9bd7a9393f3c nodeName:}" failed. 
No retries permitted until 2026-03-13 01:13:41.647526352 +0000 UTC m=+22.932552808 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-etcd-serving-ca") pod "apiserver-74b98ff8f9-m4bbb" (UID: "fffa10d1-2e00-44fa-be54-9bd7a9393f3c") : configmap "etcd-serving-ca" not found Mar 13 01:13:39.647912 master-0 kubenswrapper[7110]: I0313 01:13:39.647871 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-audit\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb" Mar 13 01:13:39.648058 master-0 kubenswrapper[7110]: E0313 01:13:39.648016 7110 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Mar 13 01:13:39.648058 master-0 kubenswrapper[7110]: E0313 01:13:39.648059 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-audit podName:fffa10d1-2e00-44fa-be54-9bd7a9393f3c nodeName:}" failed. No retries permitted until 2026-03-13 01:13:41.648050847 +0000 UTC m=+22.933077303 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-audit") pod "apiserver-74b98ff8f9-m4bbb" (UID: "fffa10d1-2e00-44fa-be54-9bd7a9393f3c") : configmap "audit-0" not found Mar 13 01:13:39.648255 master-0 kubenswrapper[7110]: E0313 01:13:39.648209 7110 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Mar 13 01:13:39.648255 master-0 kubenswrapper[7110]: E0313 01:13:39.648239 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-serving-cert podName:fffa10d1-2e00-44fa-be54-9bd7a9393f3c nodeName:}" failed. No retries permitted until 2026-03-13 01:13:41.648232552 +0000 UTC m=+22.933259008 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-serving-cert") pod "apiserver-74b98ff8f9-m4bbb" (UID: "fffa10d1-2e00-44fa-be54-9bd7a9393f3c") : secret "serving-cert" not found Mar 13 01:13:39.648557 master-0 kubenswrapper[7110]: E0313 01:13:39.648530 7110 secret.go:189] Couldn't get secret openshift-apiserver/etcd-client: secret "etcd-client" not found Mar 13 01:13:39.648834 master-0 kubenswrapper[7110]: E0313 01:13:39.648809 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-etcd-client podName:fffa10d1-2e00-44fa-be54-9bd7a9393f3c nodeName:}" failed. No retries permitted until 2026-03-13 01:13:41.648783267 +0000 UTC m=+22.933809763 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-etcd-client") pod "apiserver-74b98ff8f9-m4bbb" (UID: "fffa10d1-2e00-44fa-be54-9bd7a9393f3c") : secret "etcd-client" not found Mar 13 01:13:39.681397 master-0 kubenswrapper[7110]: I0313 01:13:39.681342 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-84bfdbbb7f-qr9tk"] Mar 13 01:13:39.681858 master-0 kubenswrapper[7110]: I0313 01:13:39.681776 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-84bfdbbb7f-qr9tk" Mar 13 01:13:39.685193 master-0 kubenswrapper[7110]: I0313 01:13:39.684558 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 13 01:13:39.685193 master-0 kubenswrapper[7110]: I0313 01:13:39.684801 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 13 01:13:39.685193 master-0 kubenswrapper[7110]: I0313 01:13:39.685061 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 13 01:13:39.685193 master-0 kubenswrapper[7110]: I0313 01:13:39.685149 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 13 01:13:39.702681 master-0 kubenswrapper[7110]: I0313 01:13:39.702575 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-84bfdbbb7f-qr9tk"] Mar 13 01:13:39.748664 master-0 kubenswrapper[7110]: I0313 01:13:39.748575 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4rfg\" (UniqueName: \"kubernetes.io/projected/da44d750-31e5-46f4-b3ef-dd4384c22aaf-kube-api-access-n4rfg\") pod \"service-ca-84bfdbbb7f-qr9tk\" (UID: \"da44d750-31e5-46f4-b3ef-dd4384c22aaf\") " 
pod="openshift-service-ca/service-ca-84bfdbbb7f-qr9tk" Mar 13 01:13:39.748869 master-0 kubenswrapper[7110]: I0313 01:13:39.748744 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/da44d750-31e5-46f4-b3ef-dd4384c22aaf-signing-key\") pod \"service-ca-84bfdbbb7f-qr9tk\" (UID: \"da44d750-31e5-46f4-b3ef-dd4384c22aaf\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-qr9tk" Mar 13 01:13:39.749008 master-0 kubenswrapper[7110]: I0313 01:13:39.748976 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/da44d750-31e5-46f4-b3ef-dd4384c22aaf-signing-cabundle\") pod \"service-ca-84bfdbbb7f-qr9tk\" (UID: \"da44d750-31e5-46f4-b3ef-dd4384c22aaf\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-qr9tk" Mar 13 01:13:39.850124 master-0 kubenswrapper[7110]: I0313 01:13:39.850064 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/da44d750-31e5-46f4-b3ef-dd4384c22aaf-signing-cabundle\") pod \"service-ca-84bfdbbb7f-qr9tk\" (UID: \"da44d750-31e5-46f4-b3ef-dd4384c22aaf\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-qr9tk" Mar 13 01:13:39.850330 master-0 kubenswrapper[7110]: I0313 01:13:39.850225 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4rfg\" (UniqueName: \"kubernetes.io/projected/da44d750-31e5-46f4-b3ef-dd4384c22aaf-kube-api-access-n4rfg\") pod \"service-ca-84bfdbbb7f-qr9tk\" (UID: \"da44d750-31e5-46f4-b3ef-dd4384c22aaf\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-qr9tk" Mar 13 01:13:39.850367 master-0 kubenswrapper[7110]: I0313 01:13:39.850345 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/da44d750-31e5-46f4-b3ef-dd4384c22aaf-signing-key\") pod \"service-ca-84bfdbbb7f-qr9tk\" (UID: \"da44d750-31e5-46f4-b3ef-dd4384c22aaf\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-qr9tk" Mar 13 01:13:39.851600 master-0 kubenswrapper[7110]: I0313 01:13:39.851545 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/da44d750-31e5-46f4-b3ef-dd4384c22aaf-signing-cabundle\") pod \"service-ca-84bfdbbb7f-qr9tk\" (UID: \"da44d750-31e5-46f4-b3ef-dd4384c22aaf\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-qr9tk" Mar 13 01:13:39.861530 master-0 kubenswrapper[7110]: I0313 01:13:39.861490 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/da44d750-31e5-46f4-b3ef-dd4384c22aaf-signing-key\") pod \"service-ca-84bfdbbb7f-qr9tk\" (UID: \"da44d750-31e5-46f4-b3ef-dd4384c22aaf\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-qr9tk" Mar 13 01:13:39.882915 master-0 kubenswrapper[7110]: I0313 01:13:39.882867 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4rfg\" (UniqueName: \"kubernetes.io/projected/da44d750-31e5-46f4-b3ef-dd4384c22aaf-kube-api-access-n4rfg\") pod \"service-ca-84bfdbbb7f-qr9tk\" (UID: \"da44d750-31e5-46f4-b3ef-dd4384c22aaf\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-qr9tk" Mar 13 01:13:40.040103 master-0 kubenswrapper[7110]: I0313 01:13:40.039984 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-84bfdbbb7f-qr9tk" Mar 13 01:13:40.952734 master-0 kubenswrapper[7110]: I0313 01:13:40.952373 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-84bfdbbb7f-qr9tk"] Mar 13 01:13:41.281698 master-0 kubenswrapper[7110]: I0313 01:13:41.281490 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-ck7rt" event={"ID":"61fb4b86-f978-4ae1-80bc-18d2f386cbc2","Type":"ContainerStarted","Data":"85752463126f89fa0e5e1418516974da87fce8b92150573ae7e0d2915937dc43"} Mar 13 01:13:41.283203 master-0 kubenswrapper[7110]: I0313 01:13:41.283146 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-7nstm" event={"ID":"ae44526f-5858-42a0-ba77-3a22f171456f","Type":"ContainerStarted","Data":"250a0e47e1f825144c66cdf0edf6dd832a93865a0f2ebf659c116a8fb949ff67"} Mar 13 01:13:41.285621 master-0 kubenswrapper[7110]: I0313 01:13:41.285565 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-84bfdbbb7f-qr9tk" event={"ID":"da44d750-31e5-46f4-b3ef-dd4384c22aaf","Type":"ContainerStarted","Data":"3745bbc5e84b13f752a8050e8fc01499f2fd5e37e8cd7566db3715cd3974a077"} Mar 13 01:13:41.285621 master-0 kubenswrapper[7110]: I0313 01:13:41.285603 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-84bfdbbb7f-qr9tk" event={"ID":"da44d750-31e5-46f4-b3ef-dd4384c22aaf","Type":"ContainerStarted","Data":"d0d7ba4bdbd45759b508d00f36d1e06281f843bb6e1de6ed64932952a8078e77"} Mar 13 01:13:41.360545 master-0 kubenswrapper[7110]: I0313 01:13:41.359807 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-84bfdbbb7f-qr9tk" podStartSLOduration=2.359779654 podStartE2EDuration="2.359779654s" podCreationTimestamp="2026-03-13 01:13:39 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:13:41.357699536 +0000 UTC m=+22.642726082" watchObservedRunningTime="2026-03-13 01:13:41.359779654 +0000 UTC m=+22.644806150" Mar 13 01:13:41.674212 master-0 kubenswrapper[7110]: I0313 01:13:41.674123 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-etcd-serving-ca\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb" Mar 13 01:13:41.674474 master-0 kubenswrapper[7110]: E0313 01:13:41.674373 7110 configmap.go:193] Couldn't get configMap openshift-apiserver/etcd-serving-ca: configmap "etcd-serving-ca" not found Mar 13 01:13:41.674474 master-0 kubenswrapper[7110]: I0313 01:13:41.674413 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-etcd-client\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb" Mar 13 01:13:41.674610 master-0 kubenswrapper[7110]: E0313 01:13:41.674490 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-etcd-serving-ca podName:fffa10d1-2e00-44fa-be54-9bd7a9393f3c nodeName:}" failed. No retries permitted until 2026-03-13 01:13:45.674462562 +0000 UTC m=+26.959489058 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-etcd-serving-ca") pod "apiserver-74b98ff8f9-m4bbb" (UID: "fffa10d1-2e00-44fa-be54-9bd7a9393f3c") : configmap "etcd-serving-ca" not found Mar 13 01:13:41.674610 master-0 kubenswrapper[7110]: E0313 01:13:41.674594 7110 secret.go:189] Couldn't get secret openshift-apiserver/etcd-client: secret "etcd-client" not found Mar 13 01:13:41.674791 master-0 kubenswrapper[7110]: E0313 01:13:41.674696 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-etcd-client podName:fffa10d1-2e00-44fa-be54-9bd7a9393f3c nodeName:}" failed. No retries permitted until 2026-03-13 01:13:45.674672758 +0000 UTC m=+26.959699264 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-etcd-client") pod "apiserver-74b98ff8f9-m4bbb" (UID: "fffa10d1-2e00-44fa-be54-9bd7a9393f3c") : secret "etcd-client" not found Mar 13 01:13:41.674898 master-0 kubenswrapper[7110]: I0313 01:13:41.674858 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-serving-cert\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb" Mar 13 01:13:41.675147 master-0 kubenswrapper[7110]: E0313 01:13:41.675093 7110 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Mar 13 01:13:41.675240 master-0 kubenswrapper[7110]: E0313 01:13:41.675161 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-serving-cert podName:fffa10d1-2e00-44fa-be54-9bd7a9393f3c nodeName:}" failed. 
No retries permitted until 2026-03-13 01:13:45.675145421 +0000 UTC m=+26.960171887 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-serving-cert") pod "apiserver-74b98ff8f9-m4bbb" (UID: "fffa10d1-2e00-44fa-be54-9bd7a9393f3c") : secret "serving-cert" not found Mar 13 01:13:41.675240 master-0 kubenswrapper[7110]: I0313 01:13:41.675194 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-audit\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb" Mar 13 01:13:41.675361 master-0 kubenswrapper[7110]: E0313 01:13:41.675297 7110 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Mar 13 01:13:41.675361 master-0 kubenswrapper[7110]: E0313 01:13:41.675318 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-audit podName:fffa10d1-2e00-44fa-be54-9bd7a9393f3c nodeName:}" failed. No retries permitted until 2026-03-13 01:13:45.675312686 +0000 UTC m=+26.960339152 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-audit") pod "apiserver-74b98ff8f9-m4bbb" (UID: "fffa10d1-2e00-44fa-be54-9bd7a9393f3c") : configmap "audit-0" not found Mar 13 01:13:42.292865 master-0 kubenswrapper[7110]: I0313 01:13:42.292761 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qclwv" event={"ID":"46662e51-44af-4732-83a1-9509a579b373","Type":"ContainerStarted","Data":"e5e42cc233087fd83bca82d1a9f888115d5c204c4f6708e02a100dc0a15fb91c"} Mar 13 01:13:45.661680 master-0 kubenswrapper[7110]: I0313 01:13:45.660624 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-74b98ff8f9-m4bbb"] Mar 13 01:13:45.661680 master-0 kubenswrapper[7110]: E0313 01:13:45.660941 7110 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[audit etcd-client etcd-serving-ca serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb" podUID="fffa10d1-2e00-44fa-be54-9bd7a9393f3c" Mar 13 01:13:45.706665 master-0 kubenswrapper[7110]: I0313 01:13:45.706250 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 13 01:13:45.711653 master-0 kubenswrapper[7110]: I0313 01:13:45.707529 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Mar 13 01:13:45.719683 master-0 kubenswrapper[7110]: I0313 01:13:45.713483 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Mar 13 01:13:45.736244 master-0 kubenswrapper[7110]: I0313 01:13:45.731284 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 13 01:13:45.736244 master-0 kubenswrapper[7110]: I0313 01:13:45.731424 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-audit\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb" Mar 13 01:13:45.736244 master-0 kubenswrapper[7110]: I0313 01:13:45.731468 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d428cc1c-440b-4cb4-97d3-fe0f80b4d83b-var-lock\") pod \"installer-1-master-0\" (UID: \"d428cc1c-440b-4cb4-97d3-fe0f80b4d83b\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 13 01:13:45.736244 master-0 kubenswrapper[7110]: I0313 01:13:45.731514 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-etcd-serving-ca\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb" Mar 13 01:13:45.736244 master-0 kubenswrapper[7110]: I0313 01:13:45.731545 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-etcd-client\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") 
" pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb" Mar 13 01:13:45.736244 master-0 kubenswrapper[7110]: I0313 01:13:45.731623 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d428cc1c-440b-4cb4-97d3-fe0f80b4d83b-kube-api-access\") pod \"installer-1-master-0\" (UID: \"d428cc1c-440b-4cb4-97d3-fe0f80b4d83b\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 13 01:13:45.736244 master-0 kubenswrapper[7110]: I0313 01:13:45.731717 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-serving-cert\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb" Mar 13 01:13:45.736244 master-0 kubenswrapper[7110]: I0313 01:13:45.731750 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d428cc1c-440b-4cb4-97d3-fe0f80b4d83b-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"d428cc1c-440b-4cb4-97d3-fe0f80b4d83b\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 13 01:13:45.736244 master-0 kubenswrapper[7110]: E0313 01:13:45.731862 7110 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Mar 13 01:13:45.736244 master-0 kubenswrapper[7110]: E0313 01:13:45.731898 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-audit podName:fffa10d1-2e00-44fa-be54-9bd7a9393f3c nodeName:}" failed. No retries permitted until 2026-03-13 01:13:53.731885051 +0000 UTC m=+35.016911517 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-audit") pod "apiserver-74b98ff8f9-m4bbb" (UID: "fffa10d1-2e00-44fa-be54-9bd7a9393f3c") : configmap "audit-0" not found Mar 13 01:13:45.736244 master-0 kubenswrapper[7110]: I0313 01:13:45.732648 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-etcd-serving-ca\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb" Mar 13 01:13:45.736244 master-0 kubenswrapper[7110]: E0313 01:13:45.732720 7110 secret.go:189] Couldn't get secret openshift-apiserver/etcd-client: secret "etcd-client" not found Mar 13 01:13:45.736244 master-0 kubenswrapper[7110]: E0313 01:13:45.732743 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-etcd-client podName:fffa10d1-2e00-44fa-be54-9bd7a9393f3c nodeName:}" failed. No retries permitted until 2026-03-13 01:13:53.732734985 +0000 UTC m=+35.017761451 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-etcd-client") pod "apiserver-74b98ff8f9-m4bbb" (UID: "fffa10d1-2e00-44fa-be54-9bd7a9393f3c") : secret "etcd-client" not found Mar 13 01:13:45.744655 master-0 kubenswrapper[7110]: I0313 01:13:45.740248 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-serving-cert\") pod \"apiserver-74b98ff8f9-m4bbb\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb" Mar 13 01:13:45.815654 master-0 kubenswrapper[7110]: I0313 01:13:45.812980 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-2slj5"] Mar 13 01:13:45.815654 master-0 kubenswrapper[7110]: I0313 01:13:45.813578 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-2slj5" Mar 13 01:13:45.831016 master-0 kubenswrapper[7110]: I0313 01:13:45.830466 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-2slj5"] Mar 13 01:13:45.833198 master-0 kubenswrapper[7110]: I0313 01:13:45.833149 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d428cc1c-440b-4cb4-97d3-fe0f80b4d83b-var-lock\") pod \"installer-1-master-0\" (UID: \"d428cc1c-440b-4cb4-97d3-fe0f80b4d83b\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 13 01:13:45.833339 master-0 kubenswrapper[7110]: I0313 01:13:45.833299 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d428cc1c-440b-4cb4-97d3-fe0f80b4d83b-var-lock\") pod \"installer-1-master-0\" (UID: 
\"d428cc1c-440b-4cb4-97d3-fe0f80b4d83b\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 13 01:13:45.833487 master-0 kubenswrapper[7110]: I0313 01:13:45.833462 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d428cc1c-440b-4cb4-97d3-fe0f80b4d83b-kube-api-access\") pod \"installer-1-master-0\" (UID: \"d428cc1c-440b-4cb4-97d3-fe0f80b4d83b\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 13 01:13:45.833544 master-0 kubenswrapper[7110]: I0313 01:13:45.833533 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d428cc1c-440b-4cb4-97d3-fe0f80b4d83b-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"d428cc1c-440b-4cb4-97d3-fe0f80b4d83b\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 13 01:13:45.833591 master-0 kubenswrapper[7110]: I0313 01:13:45.833578 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkgvg\" (UniqueName: \"kubernetes.io/projected/3d2e7338-a6d6-4872-ab72-a4e631075ab3-kube-api-access-vkgvg\") pod \"csi-snapshot-controller-7577d6f48-2slj5\" (UID: \"3d2e7338-a6d6-4872-ab72-a4e631075ab3\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-2slj5" Mar 13 01:13:45.833826 master-0 kubenswrapper[7110]: I0313 01:13:45.833765 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d428cc1c-440b-4cb4-97d3-fe0f80b4d83b-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"d428cc1c-440b-4cb4-97d3-fe0f80b4d83b\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 13 01:13:45.873257 master-0 kubenswrapper[7110]: I0313 01:13:45.873215 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/d428cc1c-440b-4cb4-97d3-fe0f80b4d83b-kube-api-access\") pod \"installer-1-master-0\" (UID: \"d428cc1c-440b-4cb4-97d3-fe0f80b4d83b\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 13 01:13:45.936714 master-0 kubenswrapper[7110]: I0313 01:13:45.936044 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkgvg\" (UniqueName: \"kubernetes.io/projected/3d2e7338-a6d6-4872-ab72-a4e631075ab3-kube-api-access-vkgvg\") pod \"csi-snapshot-controller-7577d6f48-2slj5\" (UID: \"3d2e7338-a6d6-4872-ab72-a4e631075ab3\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-2slj5" Mar 13 01:13:45.976841 master-0 kubenswrapper[7110]: I0313 01:13:45.976804 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkgvg\" (UniqueName: \"kubernetes.io/projected/3d2e7338-a6d6-4872-ab72-a4e631075ab3-kube-api-access-vkgvg\") pod \"csi-snapshot-controller-7577d6f48-2slj5\" (UID: \"3d2e7338-a6d6-4872-ab72-a4e631075ab3\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-2slj5" Mar 13 01:13:46.099436 master-0 kubenswrapper[7110]: I0313 01:13:46.099383 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Mar 13 01:13:46.182506 master-0 kubenswrapper[7110]: I0313 01:13:46.182060 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-2slj5" Mar 13 01:13:46.293566 master-0 kubenswrapper[7110]: I0313 01:13:46.293517 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 13 01:13:46.308202 master-0 kubenswrapper[7110]: W0313 01:13:46.308161 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd428cc1c_440b_4cb4_97d3_fe0f80b4d83b.slice/crio-57d826aba034860e3eaf21db2b258df77ed968ee0b8c07c04572dbe183f9e872 WatchSource:0}: Error finding container 57d826aba034860e3eaf21db2b258df77ed968ee0b8c07c04572dbe183f9e872: Status 404 returned error can't find the container with id 57d826aba034860e3eaf21db2b258df77ed968ee0b8c07c04572dbe183f9e872 Mar 13 01:13:46.327783 master-0 kubenswrapper[7110]: I0313 01:13:46.327729 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb" Mar 13 01:13:46.328191 master-0 kubenswrapper[7110]: I0313 01:13:46.328161 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"d428cc1c-440b-4cb4-97d3-fe0f80b4d83b","Type":"ContainerStarted","Data":"57d826aba034860e3eaf21db2b258df77ed968ee0b8c07c04572dbe183f9e872"} Mar 13 01:13:46.335490 master-0 kubenswrapper[7110]: I0313 01:13:46.335461 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb" Mar 13 01:13:46.354441 master-0 kubenswrapper[7110]: I0313 01:13:46.354398 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-2slj5"] Mar 13 01:13:46.363036 master-0 kubenswrapper[7110]: W0313 01:13:46.362996 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d2e7338_a6d6_4872_ab72_a4e631075ab3.slice/crio-a98b9ab8613fc067f4fb6c5ed8cb15effc604a80e170f3f4752b7a8241625877 WatchSource:0}: Error finding container a98b9ab8613fc067f4fb6c5ed8cb15effc604a80e170f3f4752b7a8241625877: Status 404 returned error can't find the container with id a98b9ab8613fc067f4fb6c5ed8cb15effc604a80e170f3f4752b7a8241625877 Mar 13 01:13:46.444074 master-0 kubenswrapper[7110]: I0313 01:13:46.441029 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-config\") pod \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " Mar 13 01:13:46.444074 master-0 kubenswrapper[7110]: I0313 01:13:46.441093 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpjdf\" (UniqueName: \"kubernetes.io/projected/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-kube-api-access-qpjdf\") pod \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " Mar 13 01:13:46.444074 master-0 kubenswrapper[7110]: I0313 01:13:46.441137 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-audit-dir\") pod \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " Mar 13 01:13:46.444074 master-0 kubenswrapper[7110]: I0313 
01:13:46.441172 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-encryption-config\") pod \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " Mar 13 01:13:46.444074 master-0 kubenswrapper[7110]: I0313 01:13:46.441196 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-image-import-ca\") pod \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " Mar 13 01:13:46.444074 master-0 kubenswrapper[7110]: I0313 01:13:46.441218 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-etcd-serving-ca\") pod \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " Mar 13 01:13:46.444074 master-0 kubenswrapper[7110]: I0313 01:13:46.441248 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-trusted-ca-bundle\") pod \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " Mar 13 01:13:46.444074 master-0 kubenswrapper[7110]: I0313 01:13:46.441276 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-serving-cert\") pod \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " Mar 13 01:13:46.444074 master-0 kubenswrapper[7110]: I0313 01:13:46.441299 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-node-pullsecrets\") pod \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\" (UID: \"fffa10d1-2e00-44fa-be54-9bd7a9393f3c\") " Mar 13 01:13:46.444074 master-0 kubenswrapper[7110]: I0313 01:13:46.442087 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "fffa10d1-2e00-44fa-be54-9bd7a9393f3c" (UID: "fffa10d1-2e00-44fa-be54-9bd7a9393f3c"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:13:46.444074 master-0 kubenswrapper[7110]: I0313 01:13:46.442254 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "fffa10d1-2e00-44fa-be54-9bd7a9393f3c" (UID: "fffa10d1-2e00-44fa-be54-9bd7a9393f3c"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:13:46.444714 master-0 kubenswrapper[7110]: I0313 01:13:46.444622 7110 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:13:46.444714 master-0 kubenswrapper[7110]: I0313 01:13:46.444677 7110 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-node-pullsecrets\") on node \"master-0\" DevicePath \"\"" Mar 13 01:13:46.447946 master-0 kubenswrapper[7110]: I0313 01:13:46.447766 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-config" (OuterVolumeSpecName: "config") pod "fffa10d1-2e00-44fa-be54-9bd7a9393f3c" (UID: "fffa10d1-2e00-44fa-be54-9bd7a9393f3c"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:13:46.449568 master-0 kubenswrapper[7110]: I0313 01:13:46.449505 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "fffa10d1-2e00-44fa-be54-9bd7a9393f3c" (UID: "fffa10d1-2e00-44fa-be54-9bd7a9393f3c"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:13:46.450117 master-0 kubenswrapper[7110]: I0313 01:13:46.450008 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "fffa10d1-2e00-44fa-be54-9bd7a9393f3c" (UID: "fffa10d1-2e00-44fa-be54-9bd7a9393f3c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:13:46.450409 master-0 kubenswrapper[7110]: I0313 01:13:46.450361 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "fffa10d1-2e00-44fa-be54-9bd7a9393f3c" (UID: "fffa10d1-2e00-44fa-be54-9bd7a9393f3c"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:13:46.452545 master-0 kubenswrapper[7110]: I0313 01:13:46.452471 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fffa10d1-2e00-44fa-be54-9bd7a9393f3c" (UID: "fffa10d1-2e00-44fa-be54-9bd7a9393f3c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:13:46.452955 master-0 kubenswrapper[7110]: I0313 01:13:46.452924 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "fffa10d1-2e00-44fa-be54-9bd7a9393f3c" (UID: "fffa10d1-2e00-44fa-be54-9bd7a9393f3c"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:13:46.454897 master-0 kubenswrapper[7110]: I0313 01:13:46.454870 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-kube-api-access-qpjdf" (OuterVolumeSpecName: "kube-api-access-qpjdf") pod "fffa10d1-2e00-44fa-be54-9bd7a9393f3c" (UID: "fffa10d1-2e00-44fa-be54-9bd7a9393f3c"). InnerVolumeSpecName "kube-api-access-qpjdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:13:46.546928 master-0 kubenswrapper[7110]: I0313 01:13:46.546457 7110 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 13 01:13:46.546928 master-0 kubenswrapper[7110]: I0313 01:13:46.546922 7110 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-config\") on node \"master-0\" DevicePath \"\"" Mar 13 01:13:46.547089 master-0 kubenswrapper[7110]: I0313 01:13:46.546944 7110 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpjdf\" (UniqueName: \"kubernetes.io/projected/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-kube-api-access-qpjdf\") on node \"master-0\" DevicePath \"\"" Mar 13 01:13:46.547089 master-0 kubenswrapper[7110]: I0313 01:13:46.546967 7110 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-encryption-config\") on node \"master-0\" DevicePath \"\"" Mar 13 01:13:46.547089 master-0 kubenswrapper[7110]: I0313 01:13:46.546988 7110 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-image-import-ca\") on node \"master-0\" DevicePath \"\"" Mar 13 01:13:46.547089 master-0 kubenswrapper[7110]: I0313 01:13:46.547005 7110 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-etcd-serving-ca\") on node \"master-0\" DevicePath \"\"" Mar 13 01:13:46.547089 master-0 kubenswrapper[7110]: I0313 01:13:46.547023 7110 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:13:47.055342 master-0 kubenswrapper[7110]: I0313 01:13:47.054764 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a60370fb-bb70-435c-9c5a-781fa1d63468-serving-cert\") pod \"route-controller-manager-77c7f858c6-8khnv\" (UID: \"a60370fb-bb70-435c-9c5a-781fa1d63468\") " pod="openshift-route-controller-manager/route-controller-manager-77c7f858c6-8khnv" Mar 13 01:13:47.055342 master-0 kubenswrapper[7110]: I0313 01:13:47.054869 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a60370fb-bb70-435c-9c5a-781fa1d63468-client-ca\") pod \"route-controller-manager-77c7f858c6-8khnv\" (UID: \"a60370fb-bb70-435c-9c5a-781fa1d63468\") " pod="openshift-route-controller-manager/route-controller-manager-77c7f858c6-8khnv" Mar 13 01:13:47.055342 master-0 kubenswrapper[7110]: E0313 01:13:47.055269 7110 secret.go:189] Couldn't get secret 
openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 13 01:13:47.056548 master-0 kubenswrapper[7110]: E0313 01:13:47.056407 7110 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 13 01:13:47.059887 master-0 kubenswrapper[7110]: E0313 01:13:47.058753 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a60370fb-bb70-435c-9c5a-781fa1d63468-serving-cert podName:a60370fb-bb70-435c-9c5a-781fa1d63468 nodeName:}" failed. No retries permitted until 2026-03-13 01:14:03.058043888 +0000 UTC m=+44.343070384 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/a60370fb-bb70-435c-9c5a-781fa1d63468-serving-cert") pod "route-controller-manager-77c7f858c6-8khnv" (UID: "a60370fb-bb70-435c-9c5a-781fa1d63468") : secret "serving-cert" not found Mar 13 01:13:47.059887 master-0 kubenswrapper[7110]: E0313 01:13:47.058818 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a60370fb-bb70-435c-9c5a-781fa1d63468-client-ca podName:a60370fb-bb70-435c-9c5a-781fa1d63468 nodeName:}" failed. No retries permitted until 2026-03-13 01:14:03.05879896 +0000 UTC m=+44.343825466 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/a60370fb-bb70-435c-9c5a-781fa1d63468-client-ca") pod "route-controller-manager-77c7f858c6-8khnv" (UID: "a60370fb-bb70-435c-9c5a-781fa1d63468") : configmap "client-ca" not found Mar 13 01:13:47.243691 master-0 kubenswrapper[7110]: I0313 01:13:47.242190 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-85b4d45f77-rw9cf"] Mar 13 01:13:47.243691 master-0 kubenswrapper[7110]: E0313 01:13:47.242450 7110 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-85b4d45f77-rw9cf" podUID="905de268-3035-4fbe-a190-30b4a944adbf" Mar 13 01:13:47.333576 master-0 kubenswrapper[7110]: I0313 01:13:47.333379 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"d428cc1c-440b-4cb4-97d3-fe0f80b4d83b","Type":"ContainerStarted","Data":"24ebe348dbe30bb41ef181302eb5af64e442788df5ae3da7b857b8eb59786fee"} Mar 13 01:13:47.334529 master-0 kubenswrapper[7110]: I0313 01:13:47.334474 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-85b4d45f77-rw9cf" Mar 13 01:13:47.334662 master-0 kubenswrapper[7110]: I0313 01:13:47.334550 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-74b98ff8f9-m4bbb" Mar 13 01:13:47.334662 master-0 kubenswrapper[7110]: I0313 01:13:47.334464 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-2slj5" event={"ID":"3d2e7338-a6d6-4872-ab72-a4e631075ab3","Type":"ContainerStarted","Data":"a98b9ab8613fc067f4fb6c5ed8cb15effc604a80e170f3f4752b7a8241625877"} Mar 13 01:13:47.352520 master-0 kubenswrapper[7110]: I0313 01:13:47.352456 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-85b4d45f77-rw9cf" Mar 13 01:13:47.371300 master-0 kubenswrapper[7110]: I0313 01:13:47.369507 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-1-master-0" podStartSLOduration=2.3694892850000002 podStartE2EDuration="2.369489285s" podCreationTimestamp="2026-03-13 01:13:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:13:47.348687793 +0000 UTC m=+28.633714269" watchObservedRunningTime="2026-03-13 01:13:47.369489285 +0000 UTC m=+28.654515761" Mar 13 01:13:47.383258 master-0 kubenswrapper[7110]: I0313 01:13:47.383179 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-65c58d4d64-6dpp5"] Mar 13 01:13:47.384541 master-0 kubenswrapper[7110]: I0313 01:13:47.384500 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-74b98ff8f9-m4bbb"] Mar 13 01:13:47.384671 master-0 kubenswrapper[7110]: I0313 01:13:47.384600 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-65c58d4d64-6dpp5" Mar 13 01:13:47.386452 master-0 kubenswrapper[7110]: I0313 01:13:47.386399 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 13 01:13:47.387740 master-0 kubenswrapper[7110]: I0313 01:13:47.387693 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 13 01:13:47.387944 master-0 kubenswrapper[7110]: I0313 01:13:47.387910 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 13 01:13:47.397705 master-0 kubenswrapper[7110]: I0313 01:13:47.390477 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 13 01:13:47.397705 master-0 kubenswrapper[7110]: I0313 01:13:47.391115 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-65c58d4d64-6dpp5"] Mar 13 01:13:47.397705 master-0 kubenswrapper[7110]: I0313 01:13:47.397028 7110 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-apiserver/apiserver-74b98ff8f9-m4bbb"] Mar 13 01:13:47.401164 master-0 kubenswrapper[7110]: I0313 01:13:47.391010 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 13 01:13:47.401325 master-0 kubenswrapper[7110]: I0313 01:13:47.391175 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 13 01:13:47.401414 master-0 kubenswrapper[7110]: I0313 01:13:47.392169 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 13 01:13:47.401479 master-0 kubenswrapper[7110]: I0313 01:13:47.392212 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 13 01:13:47.401550 master-0 kubenswrapper[7110]: I0313 01:13:47.393679 7110 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 13 01:13:47.401668 master-0 kubenswrapper[7110]: I0313 01:13:47.396886 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 13 01:13:47.462262 master-0 kubenswrapper[7110]: I0313 01:13:47.462186 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/905de268-3035-4fbe-a190-30b4a944adbf-config\") pod \"905de268-3035-4fbe-a190-30b4a944adbf\" (UID: \"905de268-3035-4fbe-a190-30b4a944adbf\") " Mar 13 01:13:47.462262 master-0 kubenswrapper[7110]: I0313 01:13:47.462271 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/905de268-3035-4fbe-a190-30b4a944adbf-proxy-ca-bundles\") pod \"905de268-3035-4fbe-a190-30b4a944adbf\" (UID: \"905de268-3035-4fbe-a190-30b4a944adbf\") " Mar 13 01:13:47.462262 master-0 kubenswrapper[7110]: I0313 01:13:47.462295 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ndxt\" (UniqueName: \"kubernetes.io/projected/905de268-3035-4fbe-a190-30b4a944adbf-kube-api-access-6ndxt\") pod \"905de268-3035-4fbe-a190-30b4a944adbf\" (UID: \"905de268-3035-4fbe-a190-30b4a944adbf\") " Mar 13 01:13:47.462724 master-0 kubenswrapper[7110]: I0313 01:13:47.462477 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/905de268-3035-4fbe-a190-30b4a944adbf-serving-cert\") pod \"controller-manager-85b4d45f77-rw9cf\" (UID: \"905de268-3035-4fbe-a190-30b4a944adbf\") " pod="openshift-controller-manager/controller-manager-85b4d45f77-rw9cf" Mar 13 01:13:47.462724 master-0 kubenswrapper[7110]: I0313 01:13:47.462513 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" 
(UniqueName: \"kubernetes.io/configmap/33e195ca-6747-4a22-a0b3-6bf90a06b215-audit\") pod \"apiserver-65c58d4d64-6dpp5\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " pod="openshift-apiserver/apiserver-65c58d4d64-6dpp5" Mar 13 01:13:47.462724 master-0 kubenswrapper[7110]: I0313 01:13:47.462555 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/33e195ca-6747-4a22-a0b3-6bf90a06b215-encryption-config\") pod \"apiserver-65c58d4d64-6dpp5\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " pod="openshift-apiserver/apiserver-65c58d4d64-6dpp5" Mar 13 01:13:47.462724 master-0 kubenswrapper[7110]: I0313 01:13:47.462588 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/905de268-3035-4fbe-a190-30b4a944adbf-client-ca\") pod \"controller-manager-85b4d45f77-rw9cf\" (UID: \"905de268-3035-4fbe-a190-30b4a944adbf\") " pod="openshift-controller-manager/controller-manager-85b4d45f77-rw9cf" Mar 13 01:13:47.462724 master-0 kubenswrapper[7110]: I0313 01:13:47.462655 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33e195ca-6747-4a22-a0b3-6bf90a06b215-serving-cert\") pod \"apiserver-65c58d4d64-6dpp5\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " pod="openshift-apiserver/apiserver-65c58d4d64-6dpp5" Mar 13 01:13:47.462724 master-0 kubenswrapper[7110]: I0313 01:13:47.462672 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/33e195ca-6747-4a22-a0b3-6bf90a06b215-node-pullsecrets\") pod \"apiserver-65c58d4d64-6dpp5\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " pod="openshift-apiserver/apiserver-65c58d4d64-6dpp5" Mar 13 01:13:47.462724 master-0 kubenswrapper[7110]: I0313 01:13:47.462695 
7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33e195ca-6747-4a22-a0b3-6bf90a06b215-config\") pod \"apiserver-65c58d4d64-6dpp5\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " pod="openshift-apiserver/apiserver-65c58d4d64-6dpp5" Mar 13 01:13:47.462724 master-0 kubenswrapper[7110]: I0313 01:13:47.462711 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/33e195ca-6747-4a22-a0b3-6bf90a06b215-etcd-client\") pod \"apiserver-65c58d4d64-6dpp5\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " pod="openshift-apiserver/apiserver-65c58d4d64-6dpp5" Mar 13 01:13:47.462724 master-0 kubenswrapper[7110]: I0313 01:13:47.462740 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33e195ca-6747-4a22-a0b3-6bf90a06b215-trusted-ca-bundle\") pod \"apiserver-65c58d4d64-6dpp5\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " pod="openshift-apiserver/apiserver-65c58d4d64-6dpp5" Mar 13 01:13:47.463704 master-0 kubenswrapper[7110]: I0313 01:13:47.462774 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/33e195ca-6747-4a22-a0b3-6bf90a06b215-audit-dir\") pod \"apiserver-65c58d4d64-6dpp5\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " pod="openshift-apiserver/apiserver-65c58d4d64-6dpp5" Mar 13 01:13:47.463704 master-0 kubenswrapper[7110]: I0313 01:13:47.462800 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dfc8\" (UniqueName: \"kubernetes.io/projected/33e195ca-6747-4a22-a0b3-6bf90a06b215-kube-api-access-6dfc8\") pod \"apiserver-65c58d4d64-6dpp5\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " 
pod="openshift-apiserver/apiserver-65c58d4d64-6dpp5" Mar 13 01:13:47.463704 master-0 kubenswrapper[7110]: I0313 01:13:47.462823 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/33e195ca-6747-4a22-a0b3-6bf90a06b215-image-import-ca\") pod \"apiserver-65c58d4d64-6dpp5\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " pod="openshift-apiserver/apiserver-65c58d4d64-6dpp5" Mar 13 01:13:47.463704 master-0 kubenswrapper[7110]: I0313 01:13:47.462855 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/33e195ca-6747-4a22-a0b3-6bf90a06b215-etcd-serving-ca\") pod \"apiserver-65c58d4d64-6dpp5\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " pod="openshift-apiserver/apiserver-65c58d4d64-6dpp5" Mar 13 01:13:47.463704 master-0 kubenswrapper[7110]: I0313 01:13:47.462884 7110 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-etcd-client\") on node \"master-0\" DevicePath \"\"" Mar 13 01:13:47.463704 master-0 kubenswrapper[7110]: I0313 01:13:47.462895 7110 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/fffa10d1-2e00-44fa-be54-9bd7a9393f3c-audit\") on node \"master-0\" DevicePath \"\"" Mar 13 01:13:47.463704 master-0 kubenswrapper[7110]: I0313 01:13:47.462952 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/905de268-3035-4fbe-a190-30b4a944adbf-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "905de268-3035-4fbe-a190-30b4a944adbf" (UID: "905de268-3035-4fbe-a190-30b4a944adbf"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:13:47.463704 master-0 kubenswrapper[7110]: I0313 01:13:47.463035 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/905de268-3035-4fbe-a190-30b4a944adbf-config" (OuterVolumeSpecName: "config") pod "905de268-3035-4fbe-a190-30b4a944adbf" (UID: "905de268-3035-4fbe-a190-30b4a944adbf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:13:47.463704 master-0 kubenswrapper[7110]: E0313 01:13:47.463188 7110 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 13 01:13:47.463704 master-0 kubenswrapper[7110]: E0313 01:13:47.463328 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/905de268-3035-4fbe-a190-30b4a944adbf-client-ca podName:905de268-3035-4fbe-a190-30b4a944adbf nodeName:}" failed. No retries permitted until 2026-03-13 01:14:03.463298991 +0000 UTC m=+44.748325457 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/905de268-3035-4fbe-a190-30b4a944adbf-client-ca") pod "controller-manager-85b4d45f77-rw9cf" (UID: "905de268-3035-4fbe-a190-30b4a944adbf") : configmap "client-ca" not found Mar 13 01:13:47.471810 master-0 kubenswrapper[7110]: I0313 01:13:47.471766 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/905de268-3035-4fbe-a190-30b4a944adbf-kube-api-access-6ndxt" (OuterVolumeSpecName: "kube-api-access-6ndxt") pod "905de268-3035-4fbe-a190-30b4a944adbf" (UID: "905de268-3035-4fbe-a190-30b4a944adbf"). InnerVolumeSpecName "kube-api-access-6ndxt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:13:47.472511 master-0 kubenswrapper[7110]: I0313 01:13:47.472422 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/905de268-3035-4fbe-a190-30b4a944adbf-serving-cert\") pod \"controller-manager-85b4d45f77-rw9cf\" (UID: \"905de268-3035-4fbe-a190-30b4a944adbf\") " pod="openshift-controller-manager/controller-manager-85b4d45f77-rw9cf" Mar 13 01:13:47.564758 master-0 kubenswrapper[7110]: I0313 01:13:47.564330 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/905de268-3035-4fbe-a190-30b4a944adbf-serving-cert\") pod \"905de268-3035-4fbe-a190-30b4a944adbf\" (UID: \"905de268-3035-4fbe-a190-30b4a944adbf\") " Mar 13 01:13:47.564758 master-0 kubenswrapper[7110]: I0313 01:13:47.564616 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/33e195ca-6747-4a22-a0b3-6bf90a06b215-audit\") pod \"apiserver-65c58d4d64-6dpp5\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " pod="openshift-apiserver/apiserver-65c58d4d64-6dpp5" Mar 13 01:13:47.565147 master-0 kubenswrapper[7110]: I0313 01:13:47.564779 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/33e195ca-6747-4a22-a0b3-6bf90a06b215-encryption-config\") pod \"apiserver-65c58d4d64-6dpp5\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " pod="openshift-apiserver/apiserver-65c58d4d64-6dpp5" Mar 13 01:13:47.565147 master-0 kubenswrapper[7110]: I0313 01:13:47.565002 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33e195ca-6747-4a22-a0b3-6bf90a06b215-serving-cert\") pod \"apiserver-65c58d4d64-6dpp5\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " 
pod="openshift-apiserver/apiserver-65c58d4d64-6dpp5" Mar 13 01:13:47.565147 master-0 kubenswrapper[7110]: I0313 01:13:47.565047 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/33e195ca-6747-4a22-a0b3-6bf90a06b215-node-pullsecrets\") pod \"apiserver-65c58d4d64-6dpp5\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " pod="openshift-apiserver/apiserver-65c58d4d64-6dpp5" Mar 13 01:13:47.565147 master-0 kubenswrapper[7110]: I0313 01:13:47.565098 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33e195ca-6747-4a22-a0b3-6bf90a06b215-config\") pod \"apiserver-65c58d4d64-6dpp5\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " pod="openshift-apiserver/apiserver-65c58d4d64-6dpp5" Mar 13 01:13:47.565147 master-0 kubenswrapper[7110]: I0313 01:13:47.565131 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/33e195ca-6747-4a22-a0b3-6bf90a06b215-etcd-client\") pod \"apiserver-65c58d4d64-6dpp5\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " pod="openshift-apiserver/apiserver-65c58d4d64-6dpp5" Mar 13 01:13:47.565466 master-0 kubenswrapper[7110]: I0313 01:13:47.565192 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33e195ca-6747-4a22-a0b3-6bf90a06b215-trusted-ca-bundle\") pod \"apiserver-65c58d4d64-6dpp5\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " pod="openshift-apiserver/apiserver-65c58d4d64-6dpp5" Mar 13 01:13:47.565466 master-0 kubenswrapper[7110]: I0313 01:13:47.565256 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/33e195ca-6747-4a22-a0b3-6bf90a06b215-audit-dir\") pod \"apiserver-65c58d4d64-6dpp5\" (UID: 
\"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " pod="openshift-apiserver/apiserver-65c58d4d64-6dpp5" Mar 13 01:13:47.565466 master-0 kubenswrapper[7110]: I0313 01:13:47.565308 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dfc8\" (UniqueName: \"kubernetes.io/projected/33e195ca-6747-4a22-a0b3-6bf90a06b215-kube-api-access-6dfc8\") pod \"apiserver-65c58d4d64-6dpp5\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " pod="openshift-apiserver/apiserver-65c58d4d64-6dpp5" Mar 13 01:13:47.565466 master-0 kubenswrapper[7110]: I0313 01:13:47.565342 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/33e195ca-6747-4a22-a0b3-6bf90a06b215-image-import-ca\") pod \"apiserver-65c58d4d64-6dpp5\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " pod="openshift-apiserver/apiserver-65c58d4d64-6dpp5" Mar 13 01:13:47.565466 master-0 kubenswrapper[7110]: I0313 01:13:47.565392 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/33e195ca-6747-4a22-a0b3-6bf90a06b215-etcd-serving-ca\") pod \"apiserver-65c58d4d64-6dpp5\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " pod="openshift-apiserver/apiserver-65c58d4d64-6dpp5" Mar 13 01:13:47.565865 master-0 kubenswrapper[7110]: I0313 01:13:47.565763 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/33e195ca-6747-4a22-a0b3-6bf90a06b215-audit\") pod \"apiserver-65c58d4d64-6dpp5\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " pod="openshift-apiserver/apiserver-65c58d4d64-6dpp5" Mar 13 01:13:47.565972 master-0 kubenswrapper[7110]: I0313 01:13:47.565889 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/33e195ca-6747-4a22-a0b3-6bf90a06b215-node-pullsecrets\") pod 
\"apiserver-65c58d4d64-6dpp5\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " pod="openshift-apiserver/apiserver-65c58d4d64-6dpp5" Mar 13 01:13:47.566661 master-0 kubenswrapper[7110]: I0313 01:13:47.566033 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/33e195ca-6747-4a22-a0b3-6bf90a06b215-audit-dir\") pod \"apiserver-65c58d4d64-6dpp5\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " pod="openshift-apiserver/apiserver-65c58d4d64-6dpp5" Mar 13 01:13:47.566661 master-0 kubenswrapper[7110]: I0313 01:13:47.566191 7110 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/905de268-3035-4fbe-a190-30b4a944adbf-config\") on node \"master-0\" DevicePath \"\"" Mar 13 01:13:47.566661 master-0 kubenswrapper[7110]: I0313 01:13:47.566219 7110 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/905de268-3035-4fbe-a190-30b4a944adbf-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Mar 13 01:13:47.566661 master-0 kubenswrapper[7110]: I0313 01:13:47.566240 7110 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ndxt\" (UniqueName: \"kubernetes.io/projected/905de268-3035-4fbe-a190-30b4a944adbf-kube-api-access-6ndxt\") on node \"master-0\" DevicePath \"\"" Mar 13 01:13:47.566661 master-0 kubenswrapper[7110]: I0313 01:13:47.566488 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/33e195ca-6747-4a22-a0b3-6bf90a06b215-etcd-serving-ca\") pod \"apiserver-65c58d4d64-6dpp5\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " pod="openshift-apiserver/apiserver-65c58d4d64-6dpp5" Mar 13 01:13:47.567089 master-0 kubenswrapper[7110]: I0313 01:13:47.566813 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/33e195ca-6747-4a22-a0b3-6bf90a06b215-image-import-ca\") pod \"apiserver-65c58d4d64-6dpp5\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " pod="openshift-apiserver/apiserver-65c58d4d64-6dpp5" Mar 13 01:13:47.567414 master-0 kubenswrapper[7110]: I0313 01:13:47.567351 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33e195ca-6747-4a22-a0b3-6bf90a06b215-config\") pod \"apiserver-65c58d4d64-6dpp5\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " pod="openshift-apiserver/apiserver-65c58d4d64-6dpp5" Mar 13 01:13:47.567485 master-0 kubenswrapper[7110]: I0313 01:13:47.567428 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33e195ca-6747-4a22-a0b3-6bf90a06b215-trusted-ca-bundle\") pod \"apiserver-65c58d4d64-6dpp5\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " pod="openshift-apiserver/apiserver-65c58d4d64-6dpp5" Mar 13 01:13:47.571018 master-0 kubenswrapper[7110]: I0313 01:13:47.570918 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/905de268-3035-4fbe-a190-30b4a944adbf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "905de268-3035-4fbe-a190-30b4a944adbf" (UID: "905de268-3035-4fbe-a190-30b4a944adbf"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:13:47.571607 master-0 kubenswrapper[7110]: I0313 01:13:47.571566 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33e195ca-6747-4a22-a0b3-6bf90a06b215-serving-cert\") pod \"apiserver-65c58d4d64-6dpp5\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " pod="openshift-apiserver/apiserver-65c58d4d64-6dpp5" Mar 13 01:13:47.572200 master-0 kubenswrapper[7110]: I0313 01:13:47.572155 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/33e195ca-6747-4a22-a0b3-6bf90a06b215-encryption-config\") pod \"apiserver-65c58d4d64-6dpp5\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " pod="openshift-apiserver/apiserver-65c58d4d64-6dpp5" Mar 13 01:13:47.577615 master-0 kubenswrapper[7110]: I0313 01:13:47.577548 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/33e195ca-6747-4a22-a0b3-6bf90a06b215-etcd-client\") pod \"apiserver-65c58d4d64-6dpp5\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " pod="openshift-apiserver/apiserver-65c58d4d64-6dpp5" Mar 13 01:13:47.585285 master-0 kubenswrapper[7110]: I0313 01:13:47.585080 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dfc8\" (UniqueName: \"kubernetes.io/projected/33e195ca-6747-4a22-a0b3-6bf90a06b215-kube-api-access-6dfc8\") pod \"apiserver-65c58d4d64-6dpp5\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " pod="openshift-apiserver/apiserver-65c58d4d64-6dpp5" Mar 13 01:13:47.668478 master-0 kubenswrapper[7110]: I0313 01:13:47.667304 7110 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/905de268-3035-4fbe-a190-30b4a944adbf-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 13 01:13:47.706725 master-0 kubenswrapper[7110]: I0313 
01:13:47.706281 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-65c58d4d64-6dpp5" Mar 13 01:13:48.556435 master-0 kubenswrapper[7110]: I0313 01:13:48.555801 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-85b4d45f77-rw9cf" Mar 13 01:13:48.674382 master-0 kubenswrapper[7110]: I0313 01:13:48.672316 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-85b4d45f77-rw9cf"] Mar 13 01:13:48.674514 master-0 kubenswrapper[7110]: I0313 01:13:48.674472 7110 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-85b4d45f77-rw9cf"] Mar 13 01:13:48.778590 master-0 kubenswrapper[7110]: I0313 01:13:48.778435 7110 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/905de268-3035-4fbe-a190-30b4a944adbf-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 13 01:13:48.808443 master-0 kubenswrapper[7110]: I0313 01:13:48.808309 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-65c58d4d64-6dpp5"] Mar 13 01:13:48.818161 master-0 kubenswrapper[7110]: W0313 01:13:48.818108 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33e195ca_6747_4a22_a0b3_6bf90a06b215.slice/crio-51d73c636079d4a0f1079b7f2d1f70e13558638c5bb5766f080dd17dae1e0fb5 WatchSource:0}: Error finding container 51d73c636079d4a0f1079b7f2d1f70e13558638c5bb5766f080dd17dae1e0fb5: Status 404 returned error can't find the container with id 51d73c636079d4a0f1079b7f2d1f70e13558638c5bb5766f080dd17dae1e0fb5 Mar 13 01:13:48.912776 master-0 kubenswrapper[7110]: I0313 01:13:48.912354 7110 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="905de268-3035-4fbe-a190-30b4a944adbf" 
path="/var/lib/kubelet/pods/905de268-3035-4fbe-a190-30b4a944adbf/volumes" Mar 13 01:13:48.913670 master-0 kubenswrapper[7110]: I0313 01:13:48.913193 7110 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fffa10d1-2e00-44fa-be54-9bd7a9393f3c" path="/var/lib/kubelet/pods/fffa10d1-2e00-44fa-be54-9bd7a9393f3c/volumes" Mar 13 01:13:48.965377 master-0 kubenswrapper[7110]: I0313 01:13:48.965330 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-65c58d4d64-6dpp5"] Mar 13 01:13:49.558578 master-0 kubenswrapper[7110]: I0313 01:13:49.558513 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-65c58d4d64-6dpp5" event={"ID":"33e195ca-6747-4a22-a0b3-6bf90a06b215","Type":"ContainerStarted","Data":"51d73c636079d4a0f1079b7f2d1f70e13558638c5bb5766f080dd17dae1e0fb5"} Mar 13 01:13:49.560029 master-0 kubenswrapper[7110]: I0313 01:13:49.560003 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-2slj5" event={"ID":"3d2e7338-a6d6-4872-ab72-a4e631075ab3","Type":"ContainerStarted","Data":"e6af91761c3978d4ee9c9c1a1feccd940ab86be07c53b40abdaf52c178494ea6"} Mar 13 01:13:49.573697 master-0 kubenswrapper[7110]: I0313 01:13:49.573582 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-2slj5" podStartSLOduration=2.321314337 podStartE2EDuration="4.573521302s" podCreationTimestamp="2026-03-13 01:13:45 +0000 UTC" firstStartedPulling="2026-03-13 01:13:46.365218827 +0000 UTC m=+27.650245313" lastFinishedPulling="2026-03-13 01:13:48.617425812 +0000 UTC m=+29.902452278" observedRunningTime="2026-03-13 01:13:49.572618546 +0000 UTC m=+30.857645092" watchObservedRunningTime="2026-03-13 01:13:49.573521302 +0000 UTC m=+30.858547798" Mar 13 01:13:50.676037 master-0 kubenswrapper[7110]: I0313 01:13:50.675876 7110 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-controller-manager/controller-manager-748d7f7c46-r6nmm"] Mar 13 01:13:50.684843 master-0 kubenswrapper[7110]: I0313 01:13:50.677538 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-748d7f7c46-r6nmm" Mar 13 01:13:50.692751 master-0 kubenswrapper[7110]: I0313 01:13:50.692462 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 01:13:50.692948 master-0 kubenswrapper[7110]: I0313 01:13:50.692813 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 01:13:50.693064 master-0 kubenswrapper[7110]: I0313 01:13:50.693019 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 01:13:50.699951 master-0 kubenswrapper[7110]: I0313 01:13:50.697531 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 01:13:50.699951 master-0 kubenswrapper[7110]: I0313 01:13:50.697833 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 01:13:50.702799 master-0 kubenswrapper[7110]: I0313 01:13:50.702722 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-748d7f7c46-r6nmm"] Mar 13 01:13:50.705864 master-0 kubenswrapper[7110]: I0313 01:13:50.705790 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 01:13:50.808242 master-0 kubenswrapper[7110]: I0313 01:13:50.807334 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2bfe96f2-43a9-47cf-9020-918438eb1ae0-serving-cert\") pod \"controller-manager-748d7f7c46-r6nmm\" (UID: 
\"2bfe96f2-43a9-47cf-9020-918438eb1ae0\") " pod="openshift-controller-manager/controller-manager-748d7f7c46-r6nmm" Mar 13 01:13:50.808242 master-0 kubenswrapper[7110]: I0313 01:13:50.807797 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bfe96f2-43a9-47cf-9020-918438eb1ae0-config\") pod \"controller-manager-748d7f7c46-r6nmm\" (UID: \"2bfe96f2-43a9-47cf-9020-918438eb1ae0\") " pod="openshift-controller-manager/controller-manager-748d7f7c46-r6nmm" Mar 13 01:13:50.808242 master-0 kubenswrapper[7110]: I0313 01:13:50.807855 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2bfe96f2-43a9-47cf-9020-918438eb1ae0-client-ca\") pod \"controller-manager-748d7f7c46-r6nmm\" (UID: \"2bfe96f2-43a9-47cf-9020-918438eb1ae0\") " pod="openshift-controller-manager/controller-manager-748d7f7c46-r6nmm" Mar 13 01:13:50.808242 master-0 kubenswrapper[7110]: I0313 01:13:50.807890 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c7mk\" (UniqueName: \"kubernetes.io/projected/2bfe96f2-43a9-47cf-9020-918438eb1ae0-kube-api-access-7c7mk\") pod \"controller-manager-748d7f7c46-r6nmm\" (UID: \"2bfe96f2-43a9-47cf-9020-918438eb1ae0\") " pod="openshift-controller-manager/controller-manager-748d7f7c46-r6nmm" Mar 13 01:13:50.808242 master-0 kubenswrapper[7110]: I0313 01:13:50.808071 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2bfe96f2-43a9-47cf-9020-918438eb1ae0-proxy-ca-bundles\") pod \"controller-manager-748d7f7c46-r6nmm\" (UID: \"2bfe96f2-43a9-47cf-9020-918438eb1ae0\") " pod="openshift-controller-manager/controller-manager-748d7f7c46-r6nmm" Mar 13 01:13:50.909171 master-0 kubenswrapper[7110]: I0313 01:13:50.909087 7110 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2bfe96f2-43a9-47cf-9020-918438eb1ae0-proxy-ca-bundles\") pod \"controller-manager-748d7f7c46-r6nmm\" (UID: \"2bfe96f2-43a9-47cf-9020-918438eb1ae0\") " pod="openshift-controller-manager/controller-manager-748d7f7c46-r6nmm" Mar 13 01:13:50.909171 master-0 kubenswrapper[7110]: I0313 01:13:50.909180 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2bfe96f2-43a9-47cf-9020-918438eb1ae0-serving-cert\") pod \"controller-manager-748d7f7c46-r6nmm\" (UID: \"2bfe96f2-43a9-47cf-9020-918438eb1ae0\") " pod="openshift-controller-manager/controller-manager-748d7f7c46-r6nmm" Mar 13 01:13:50.909531 master-0 kubenswrapper[7110]: I0313 01:13:50.909245 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bfe96f2-43a9-47cf-9020-918438eb1ae0-config\") pod \"controller-manager-748d7f7c46-r6nmm\" (UID: \"2bfe96f2-43a9-47cf-9020-918438eb1ae0\") " pod="openshift-controller-manager/controller-manager-748d7f7c46-r6nmm" Mar 13 01:13:50.909531 master-0 kubenswrapper[7110]: I0313 01:13:50.909282 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2bfe96f2-43a9-47cf-9020-918438eb1ae0-client-ca\") pod \"controller-manager-748d7f7c46-r6nmm\" (UID: \"2bfe96f2-43a9-47cf-9020-918438eb1ae0\") " pod="openshift-controller-manager/controller-manager-748d7f7c46-r6nmm" Mar 13 01:13:50.909531 master-0 kubenswrapper[7110]: I0313 01:13:50.909303 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c7mk\" (UniqueName: \"kubernetes.io/projected/2bfe96f2-43a9-47cf-9020-918438eb1ae0-kube-api-access-7c7mk\") pod \"controller-manager-748d7f7c46-r6nmm\" (UID: 
\"2bfe96f2-43a9-47cf-9020-918438eb1ae0\") " pod="openshift-controller-manager/controller-manager-748d7f7c46-r6nmm" Mar 13 01:13:50.910780 master-0 kubenswrapper[7110]: E0313 01:13:50.910728 7110 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 13 01:13:50.910893 master-0 kubenswrapper[7110]: E0313 01:13:50.910791 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2bfe96f2-43a9-47cf-9020-918438eb1ae0-client-ca podName:2bfe96f2-43a9-47cf-9020-918438eb1ae0 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:51.410775079 +0000 UTC m=+32.695801545 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/2bfe96f2-43a9-47cf-9020-918438eb1ae0-client-ca") pod "controller-manager-748d7f7c46-r6nmm" (UID: "2bfe96f2-43a9-47cf-9020-918438eb1ae0") : configmap "client-ca" not found Mar 13 01:13:50.914360 master-0 kubenswrapper[7110]: I0313 01:13:50.914292 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2bfe96f2-43a9-47cf-9020-918438eb1ae0-proxy-ca-bundles\") pod \"controller-manager-748d7f7c46-r6nmm\" (UID: \"2bfe96f2-43a9-47cf-9020-918438eb1ae0\") " pod="openshift-controller-manager/controller-manager-748d7f7c46-r6nmm" Mar 13 01:13:50.914529 master-0 kubenswrapper[7110]: I0313 01:13:50.914429 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bfe96f2-43a9-47cf-9020-918438eb1ae0-config\") pod \"controller-manager-748d7f7c46-r6nmm\" (UID: \"2bfe96f2-43a9-47cf-9020-918438eb1ae0\") " pod="openshift-controller-manager/controller-manager-748d7f7c46-r6nmm" Mar 13 01:13:50.920544 master-0 kubenswrapper[7110]: I0313 01:13:50.920487 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2bfe96f2-43a9-47cf-9020-918438eb1ae0-serving-cert\") pod \"controller-manager-748d7f7c46-r6nmm\" (UID: \"2bfe96f2-43a9-47cf-9020-918438eb1ae0\") " pod="openshift-controller-manager/controller-manager-748d7f7c46-r6nmm" Mar 13 01:13:50.934780 master-0 kubenswrapper[7110]: I0313 01:13:50.934664 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c7mk\" (UniqueName: \"kubernetes.io/projected/2bfe96f2-43a9-47cf-9020-918438eb1ae0-kube-api-access-7c7mk\") pod \"controller-manager-748d7f7c46-r6nmm\" (UID: \"2bfe96f2-43a9-47cf-9020-918438eb1ae0\") " pod="openshift-controller-manager/controller-manager-748d7f7c46-r6nmm" Mar 13 01:13:51.416577 master-0 kubenswrapper[7110]: I0313 01:13:51.416515 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2bfe96f2-43a9-47cf-9020-918438eb1ae0-client-ca\") pod \"controller-manager-748d7f7c46-r6nmm\" (UID: \"2bfe96f2-43a9-47cf-9020-918438eb1ae0\") " pod="openshift-controller-manager/controller-manager-748d7f7c46-r6nmm" Mar 13 01:13:51.416886 master-0 kubenswrapper[7110]: E0313 01:13:51.416679 7110 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 13 01:13:51.416886 master-0 kubenswrapper[7110]: E0313 01:13:51.416729 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2bfe96f2-43a9-47cf-9020-918438eb1ae0-client-ca podName:2bfe96f2-43a9-47cf-9020-918438eb1ae0 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:52.416713829 +0000 UTC m=+33.701740295 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/2bfe96f2-43a9-47cf-9020-918438eb1ae0-client-ca") pod "controller-manager-748d7f7c46-r6nmm" (UID: "2bfe96f2-43a9-47cf-9020-918438eb1ae0") : configmap "client-ca" not found Mar 13 01:13:51.721716 master-0 kubenswrapper[7110]: I0313 01:13:51.721314 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert\") pod \"catalog-operator-7d9c49f57b-h46pz\" (UID: \"7f35cc1e-3376-4dbd-b215-2a32bf62cc71\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz" Mar 13 01:13:51.721716 master-0 kubenswrapper[7110]: I0313 01:13:51.721361 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" Mar 13 01:13:51.721716 master-0 kubenswrapper[7110]: E0313 01:13:51.721451 7110 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 13 01:13:51.721716 master-0 kubenswrapper[7110]: I0313 01:13:51.721483 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-dszg5\" (UID: \"78d2cd80-23b9-426d-a7ac-1daa27668a47\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" Mar 13 01:13:51.721716 master-0 kubenswrapper[7110]: E0313 01:13:51.721513 7110 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert podName:7f35cc1e-3376-4dbd-b215-2a32bf62cc71 nodeName:}" failed. No retries permitted until 2026-03-13 01:14:23.721494779 +0000 UTC m=+65.006521245 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert") pod "catalog-operator-7d9c49f57b-h46pz" (UID: "7f35cc1e-3376-4dbd-b215-2a32bf62cc71") : secret "catalog-operator-serving-cert" not found Mar 13 01:13:51.722484 master-0 kubenswrapper[7110]: I0313 01:13:51.721761 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-metrics-tls\") pod \"ingress-operator-677db989d6-kdn2l\" (UID: \"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l" Mar 13 01:13:51.722484 master-0 kubenswrapper[7110]: E0313 01:13:51.721799 7110 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 13 01:13:51.722484 master-0 kubenswrapper[7110]: E0313 01:13:51.721837 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics podName:78d2cd80-23b9-426d-a7ac-1daa27668a47 nodeName:}" failed. No retries permitted until 2026-03-13 01:14:23.721825008 +0000 UTC m=+65.006851544 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-dszg5" (UID: "78d2cd80-23b9-426d-a7ac-1daa27668a47") : secret "marketplace-operator-metrics" not found Mar 13 01:13:51.722484 master-0 kubenswrapper[7110]: I0313 01:13:51.721900 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" Mar 13 01:13:51.722484 master-0 kubenswrapper[7110]: I0313 01:13:51.721920 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f97819d0-2840-4352-a435-19ef1a8c22c9-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-jjdk8\" (UID: \"f97819d0-2840-4352-a435-19ef1a8c22c9\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8" Mar 13 01:13:51.722484 master-0 kubenswrapper[7110]: I0313 01:13:51.721944 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-metrics-tls\") pod \"dns-operator-589895fbb7-qvl2k\" (UID: \"13fac7b0-ce55-467d-9d0c-6a122d87cb3c\") " pod="openshift-dns-operator/dns-operator-589895fbb7-qvl2k" Mar 13 01:13:51.722484 master-0 kubenswrapper[7110]: I0313 01:13:51.721967 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert\") pod 
\"package-server-manager-854648ff6d-nrzpj\" (UID: \"0d4e6150-432c-4a11-b5a6-4d62dd701fc8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj" Mar 13 01:13:51.722484 master-0 kubenswrapper[7110]: I0313 01:13:51.722202 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-2tr2t\" (UID: \"7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t" Mar 13 01:13:51.722484 master-0 kubenswrapper[7110]: I0313 01:13:51.722305 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-6slg8\" (UID: \"4976e608-07a0-4cef-8fdd-7cec3324b4b5\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8" Mar 13 01:13:51.722484 master-0 kubenswrapper[7110]: I0313 01:13:51.722353 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert\") pod \"cluster-version-operator-745944c6b7-zc6gt\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt" Mar 13 01:13:51.722484 master-0 kubenswrapper[7110]: E0313 01:13:51.722379 7110 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 13 01:13:51.722484 master-0 kubenswrapper[7110]: E0313 01:13:51.722439 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert 
podName:0d4e6150-432c-4a11-b5a6-4d62dd701fc8 nodeName:}" failed. No retries permitted until 2026-03-13 01:14:23.722418555 +0000 UTC m=+65.007445101 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-nrzpj" (UID: "0d4e6150-432c-4a11-b5a6-4d62dd701fc8") : secret "package-server-manager-serving-cert" not found Mar 13 01:13:51.723669 master-0 kubenswrapper[7110]: E0313 01:13:51.722845 7110 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 13 01:13:51.723669 master-0 kubenswrapper[7110]: E0313 01:13:51.722863 7110 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: secret "mco-proxy-tls" not found Mar 13 01:13:51.723669 master-0 kubenswrapper[7110]: E0313 01:13:51.722903 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls podName:4976e608-07a0-4cef-8fdd-7cec3324b4b5 nodeName:}" failed. No retries permitted until 2026-03-13 01:14:23.722893478 +0000 UTC m=+65.007919944 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls") pod "machine-config-operator-fdb5c78b5-6slg8" (UID: "4976e608-07a0-4cef-8fdd-7cec3324b4b5") : secret "mco-proxy-tls" not found Mar 13 01:13:51.723669 master-0 kubenswrapper[7110]: E0313 01:13:51.722917 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls podName:7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71 nodeName:}" failed. No retries permitted until 2026-03-13 01:14:23.722910789 +0000 UTC m=+65.007937255 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-2tr2t" (UID: "7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71") : secret "cluster-monitoring-operator-tls" not found Mar 13 01:13:51.724560 master-0 kubenswrapper[7110]: I0313 01:13:51.724531 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" Mar 13 01:13:51.725557 master-0 kubenswrapper[7110]: I0313 01:13:51.725097 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-metrics-tls\") pod \"ingress-operator-677db989d6-kdn2l\" (UID: \"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l" Mar 13 01:13:51.725557 master-0 kubenswrapper[7110]: I0313 01:13:51.725099 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" Mar 13 01:13:51.725557 master-0 kubenswrapper[7110]: I0313 01:13:51.725237 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f97819d0-2840-4352-a435-19ef1a8c22c9-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-jjdk8\" (UID: 
\"f97819d0-2840-4352-a435-19ef1a8c22c9\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8" Mar 13 01:13:51.725557 master-0 kubenswrapper[7110]: I0313 01:13:51.725531 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-metrics-tls\") pod \"dns-operator-589895fbb7-qvl2k\" (UID: \"13fac7b0-ce55-467d-9d0c-6a122d87cb3c\") " pod="openshift-dns-operator/dns-operator-589895fbb7-qvl2k" Mar 13 01:13:51.725720 master-0 kubenswrapper[7110]: I0313 01:13:51.725602 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert\") pod \"cluster-version-operator-745944c6b7-zc6gt\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt" Mar 13 01:13:51.823290 master-0 kubenswrapper[7110]: I0313 01:13:51.823239 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" Mar 13 01:13:51.823532 master-0 kubenswrapper[7110]: I0313 01:13:51.823358 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert\") pod \"olm-operator-d64cfc9db-8l7kq\" (UID: \"6c88187c-d011-4043-a6d3-4a8a7ec4e204\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq" Mar 13 01:13:51.823532 master-0 kubenswrapper[7110]: I0313 01:13:51.823393 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" Mar 13 01:13:51.823532 master-0 kubenswrapper[7110]: I0313 01:13:51.823443 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs\") pod \"network-metrics-daemon-zh5fh\" (UID: \"e68ab3cb-c372-45d9-a758-beaf4c213714\") " pod="openshift-multus/network-metrics-daemon-zh5fh" Mar 13 01:13:51.823652 master-0 kubenswrapper[7110]: E0313 01:13:51.823582 7110 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 13 01:13:51.823685 master-0 kubenswrapper[7110]: E0313 01:13:51.823657 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs podName:e68ab3cb-c372-45d9-a758-beaf4c213714 nodeName:}" failed. No retries permitted until 2026-03-13 01:14:23.823619107 +0000 UTC m=+65.108645573 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs") pod "network-metrics-daemon-zh5fh" (UID: "e68ab3cb-c372-45d9-a758-beaf4c213714") : secret "metrics-daemon-secret" not found Mar 13 01:13:51.824291 master-0 kubenswrapper[7110]: E0313 01:13:51.823943 7110 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 13 01:13:51.824291 master-0 kubenswrapper[7110]: E0313 01:13:51.824283 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert podName:6c88187c-d011-4043-a6d3-4a8a7ec4e204 nodeName:}" failed. No retries permitted until 2026-03-13 01:14:23.824245375 +0000 UTC m=+65.109271881 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert") pod "olm-operator-d64cfc9db-8l7kq" (UID: "6c88187c-d011-4043-a6d3-4a8a7ec4e204") : secret "olm-operator-serving-cert" not found Mar 13 01:13:51.827157 master-0 kubenswrapper[7110]: I0313 01:13:51.827123 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" Mar 13 01:13:51.828102 master-0 kubenswrapper[7110]: I0313 01:13:51.827993 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " 
pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" Mar 13 01:13:51.924664 master-0 kubenswrapper[7110]: I0313 01:13:51.924460 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs\") pod \"multus-admission-controller-8d675b596-tq7n6\" (UID: \"2bd94289-7109-4419-9a51-bd289082b9f5\") " pod="openshift-multus/multus-admission-controller-8d675b596-tq7n6" Mar 13 01:13:51.925146 master-0 kubenswrapper[7110]: E0313 01:13:51.925102 7110 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 13 01:13:51.925252 master-0 kubenswrapper[7110]: E0313 01:13:51.925202 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs podName:2bd94289-7109-4419-9a51-bd289082b9f5 nodeName:}" failed. No retries permitted until 2026-03-13 01:14:23.925166969 +0000 UTC m=+65.210193455 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs") pod "multus-admission-controller-8d675b596-tq7n6" (UID: "2bd94289-7109-4419-9a51-bd289082b9f5") : secret "multus-admission-controller-secret" not found Mar 13 01:13:51.958436 master-0 kubenswrapper[7110]: I0313 01:13:51.958369 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8" Mar 13 01:13:51.961581 master-0 kubenswrapper[7110]: I0313 01:13:51.961539 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l" Mar 13 01:13:51.966203 master-0 kubenswrapper[7110]: I0313 01:13:51.962404 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-589895fbb7-qvl2k" Mar 13 01:13:51.966203 master-0 kubenswrapper[7110]: I0313 01:13:51.962472 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" Mar 13 01:13:51.966203 master-0 kubenswrapper[7110]: I0313 01:13:51.963899 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" Mar 13 01:13:51.966203 master-0 kubenswrapper[7110]: I0313 01:13:51.964275 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt" Mar 13 01:13:52.025183 master-0 kubenswrapper[7110]: W0313 01:13:52.025034 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbf0bd4d_3387_43c3_b9d5_61a044fa2138.slice/crio-09ec7b3f0f3ef6c6bfb61d7ac2e6b3febf7fbb4a21a9eaef4ee580930e6e25c7 WatchSource:0}: Error finding container 09ec7b3f0f3ef6c6bfb61d7ac2e6b3febf7fbb4a21a9eaef4ee580930e6e25c7: Status 404 returned error can't find the container with id 09ec7b3f0f3ef6c6bfb61d7ac2e6b3febf7fbb4a21a9eaef4ee580930e6e25c7 Mar 13 01:13:52.229769 master-0 kubenswrapper[7110]: I0313 01:13:52.229694 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8"] Mar 13 01:13:52.230590 master-0 kubenswrapper[7110]: I0313 01:13:52.230540 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr"] Mar 13 01:13:52.276140 master-0 
kubenswrapper[7110]: I0313 01:13:52.274768 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-677db989d6-kdn2l"] Mar 13 01:13:52.286683 master-0 kubenswrapper[7110]: W0313 01:13:52.286026 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70c097a1_90d9_4344_b0ae_5a59ec2ad8ad.slice/crio-e8a8f6655cd50f04721c79f72a49d0a15534cbdaa2b73fedfb36318faac24e0e WatchSource:0}: Error finding container e8a8f6655cd50f04721c79f72a49d0a15534cbdaa2b73fedfb36318faac24e0e: Status 404 returned error can't find the container with id e8a8f6655cd50f04721c79f72a49d0a15534cbdaa2b73fedfb36318faac24e0e Mar 13 01:13:52.296050 master-0 kubenswrapper[7110]: I0313 01:13:52.295927 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-589895fbb7-qvl2k"] Mar 13 01:13:52.300022 master-0 kubenswrapper[7110]: I0313 01:13:52.296744 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9"] Mar 13 01:13:52.303252 master-0 kubenswrapper[7110]: W0313 01:13:52.301708 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13fac7b0_ce55_467d_9d0c_6a122d87cb3c.slice/crio-e2f745a0f2d01e632b68dc10f033e64587a1682e72141091b5e270ae9c9ebd96 WatchSource:0}: Error finding container e2f745a0f2d01e632b68dc10f033e64587a1682e72141091b5e270ae9c9ebd96: Status 404 returned error can't find the container with id e2f745a0f2d01e632b68dc10f033e64587a1682e72141091b5e270ae9c9ebd96 Mar 13 01:13:52.303252 master-0 kubenswrapper[7110]: W0313 01:13:52.302738 7110 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c5174b9_ca9e_4917_ab3a_ca403ce4f017.slice/crio-37ae98838b10d25174f8c8b9b4755c11a0c75ddb83ec17a66bff3690d741da86 WatchSource:0}: Error finding container 37ae98838b10d25174f8c8b9b4755c11a0c75ddb83ec17a66bff3690d741da86: Status 404 returned error can't find the container with id 37ae98838b10d25174f8c8b9b4755c11a0c75ddb83ec17a66bff3690d741da86 Mar 13 01:13:52.430351 master-0 kubenswrapper[7110]: I0313 01:13:52.430287 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2bfe96f2-43a9-47cf-9020-918438eb1ae0-client-ca\") pod \"controller-manager-748d7f7c46-r6nmm\" (UID: \"2bfe96f2-43a9-47cf-9020-918438eb1ae0\") " pod="openshift-controller-manager/controller-manager-748d7f7c46-r6nmm" Mar 13 01:13:52.430528 master-0 kubenswrapper[7110]: E0313 01:13:52.430418 7110 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 13 01:13:52.430528 master-0 kubenswrapper[7110]: E0313 01:13:52.430471 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2bfe96f2-43a9-47cf-9020-918438eb1ae0-client-ca podName:2bfe96f2-43a9-47cf-9020-918438eb1ae0 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:54.430455332 +0000 UTC m=+35.715481798 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/2bfe96f2-43a9-47cf-9020-918438eb1ae0-client-ca") pod "controller-manager-748d7f7c46-r6nmm" (UID: "2bfe96f2-43a9-47cf-9020-918438eb1ae0") : configmap "client-ca" not found Mar 13 01:13:52.576847 master-0 kubenswrapper[7110]: I0313 01:13:52.576365 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" event={"ID":"4c5174b9-ca9e-4917-ab3a-ca403ce4f017","Type":"ContainerStarted","Data":"37ae98838b10d25174f8c8b9b4755c11a0c75ddb83ec17a66bff3690d741da86"} Mar 13 01:13:52.578036 master-0 kubenswrapper[7110]: I0313 01:13:52.578001 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8" event={"ID":"f97819d0-2840-4352-a435-19ef1a8c22c9","Type":"ContainerStarted","Data":"3f750f4eaadd11866936791933f7a3cbf786b838bf1e7a9f9142487b42787b0b"} Mar 13 01:13:52.579275 master-0 kubenswrapper[7110]: I0313 01:13:52.579242 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt" event={"ID":"bbf0bd4d-3387-43c3-b9d5-61a044fa2138","Type":"ContainerStarted","Data":"09ec7b3f0f3ef6c6bfb61d7ac2e6b3febf7fbb4a21a9eaef4ee580930e6e25c7"} Mar 13 01:13:52.580954 master-0 kubenswrapper[7110]: I0313 01:13:52.580921 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-589895fbb7-qvl2k" event={"ID":"13fac7b0-ce55-467d-9d0c-6a122d87cb3c","Type":"ContainerStarted","Data":"e2f745a0f2d01e632b68dc10f033e64587a1682e72141091b5e270ae9c9ebd96"} Mar 13 01:13:52.582405 master-0 kubenswrapper[7110]: I0313 01:13:52.582369 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l" 
event={"ID":"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad","Type":"ContainerStarted","Data":"e8a8f6655cd50f04721c79f72a49d0a15534cbdaa2b73fedfb36318faac24e0e"} Mar 13 01:13:52.584432 master-0 kubenswrapper[7110]: I0313 01:13:52.584399 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" event={"ID":"8c6bf2d5-1881-4b63-b247-7e7426707fa1","Type":"ContainerStarted","Data":"eeedfbb568950a2005b49c940a6eb5e45d4af2d8ddb401839d8110cff9f9ae07"} Mar 13 01:13:54.470731 master-0 kubenswrapper[7110]: I0313 01:13:54.470131 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2bfe96f2-43a9-47cf-9020-918438eb1ae0-client-ca\") pod \"controller-manager-748d7f7c46-r6nmm\" (UID: \"2bfe96f2-43a9-47cf-9020-918438eb1ae0\") " pod="openshift-controller-manager/controller-manager-748d7f7c46-r6nmm" Mar 13 01:13:54.470731 master-0 kubenswrapper[7110]: E0313 01:13:54.470261 7110 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 13 01:13:54.470731 master-0 kubenswrapper[7110]: E0313 01:13:54.470337 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2bfe96f2-43a9-47cf-9020-918438eb1ae0-client-ca podName:2bfe96f2-43a9-47cf-9020-918438eb1ae0 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:58.470319784 +0000 UTC m=+39.755346250 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/2bfe96f2-43a9-47cf-9020-918438eb1ae0-client-ca") pod "controller-manager-748d7f7c46-r6nmm" (UID: "2bfe96f2-43a9-47cf-9020-918438eb1ae0") : configmap "client-ca" not found Mar 13 01:13:54.710092 master-0 kubenswrapper[7110]: I0313 01:13:54.708948 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j"] Mar 13 01:13:54.710092 master-0 kubenswrapper[7110]: I0313 01:13:54.709537 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:13:54.711564 master-0 kubenswrapper[7110]: I0313 01:13:54.711429 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Mar 13 01:13:54.711699 master-0 kubenswrapper[7110]: I0313 01:13:54.711588 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Mar 13 01:13:54.713478 master-0 kubenswrapper[7110]: I0313 01:13:54.713431 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Mar 13 01:13:54.716334 master-0 kubenswrapper[7110]: I0313 01:13:54.716304 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle" Mar 13 01:13:54.722496 master-0 kubenswrapper[7110]: I0313 01:13:54.722421 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j"] Mar 13 01:13:54.774327 master-0 kubenswrapper[7110]: I0313 01:13:54.774254 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2c4c579b-0643-47ac-a729-017c326b0ecc-cache\") pod \"catalogd-controller-manager-7f8b8b6f4c-7fc8j\" (UID: 
\"2c4c579b-0643-47ac-a729-017c326b0ecc\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:13:54.777102 master-0 kubenswrapper[7110]: I0313 01:13:54.777038 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/2c4c579b-0643-47ac-a729-017c326b0ecc-ca-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-7fc8j\" (UID: \"2c4c579b-0643-47ac-a729-017c326b0ecc\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:13:54.777322 master-0 kubenswrapper[7110]: I0313 01:13:54.777263 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/2c4c579b-0643-47ac-a729-017c326b0ecc-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-7fc8j\" (UID: \"2c4c579b-0643-47ac-a729-017c326b0ecc\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:13:54.777450 master-0 kubenswrapper[7110]: I0313 01:13:54.777411 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/2c4c579b-0643-47ac-a729-017c326b0ecc-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-7fc8j\" (UID: \"2c4c579b-0643-47ac-a729-017c326b0ecc\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:13:54.777582 master-0 kubenswrapper[7110]: I0313 01:13:54.777541 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/2c4c579b-0643-47ac-a729-017c326b0ecc-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-7fc8j\" (UID: \"2c4c579b-0643-47ac-a729-017c326b0ecc\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:13:54.777730 master-0 kubenswrapper[7110]: 
I0313 01:13:54.777602 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g44dw\" (UniqueName: \"kubernetes.io/projected/2c4c579b-0643-47ac-a729-017c326b0ecc-kube-api-access-g44dw\") pod \"catalogd-controller-manager-7f8b8b6f4c-7fc8j\" (UID: \"2c4c579b-0643-47ac-a729-017c326b0ecc\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:13:54.807595 master-0 kubenswrapper[7110]: I0313 01:13:54.806227 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w"] Mar 13 01:13:54.807595 master-0 kubenswrapper[7110]: I0313 01:13:54.806841 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" Mar 13 01:13:54.808927 master-0 kubenswrapper[7110]: I0313 01:13:54.808902 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle" Mar 13 01:13:54.809149 master-0 kubenswrapper[7110]: I0313 01:13:54.809097 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt" Mar 13 01:13:54.809256 master-0 kubenswrapper[7110]: I0313 01:13:54.809234 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt" Mar 13 01:13:54.814384 master-0 kubenswrapper[7110]: I0313 01:13:54.814294 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w"] Mar 13 01:13:54.878201 master-0 kubenswrapper[7110]: I0313 01:13:54.878160 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: 
\"kubernetes.io/host-path/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-2wh5w\" (UID: \"30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" Mar 13 01:13:54.878461 master-0 kubenswrapper[7110]: I0313 01:13:54.878446 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/2c4c579b-0643-47ac-a729-017c326b0ecc-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-7fc8j\" (UID: \"2c4c579b-0643-47ac-a729-017c326b0ecc\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:13:54.878602 master-0 kubenswrapper[7110]: I0313 01:13:54.878584 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/2c4c579b-0643-47ac-a729-017c326b0ecc-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-7fc8j\" (UID: \"2c4c579b-0643-47ac-a729-017c326b0ecc\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:13:54.878719 master-0 kubenswrapper[7110]: I0313 01:13:54.878706 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-2wh5w\" (UID: \"30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" Mar 13 01:13:54.878867 master-0 kubenswrapper[7110]: I0313 01:13:54.878719 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/2c4c579b-0643-47ac-a729-017c326b0ecc-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-7fc8j\" (UID: 
\"2c4c579b-0643-47ac-a729-017c326b0ecc\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:13:54.878867 master-0 kubenswrapper[7110]: I0313 01:13:54.878808 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/2c4c579b-0643-47ac-a729-017c326b0ecc-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-7fc8j\" (UID: \"2c4c579b-0643-47ac-a729-017c326b0ecc\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:13:54.878957 master-0 kubenswrapper[7110]: I0313 01:13:54.878858 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/2c4c579b-0643-47ac-a729-017c326b0ecc-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-7fc8j\" (UID: \"2c4c579b-0643-47ac-a729-017c326b0ecc\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:13:54.879043 master-0 kubenswrapper[7110]: E0313 01:13:54.879018 7110 secret.go:189] Couldn't get secret openshift-catalogd/catalogserver-cert: secret "catalogserver-cert" not found Mar 13 01:13:54.879112 master-0 kubenswrapper[7110]: I0313 01:13:54.879097 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3-cache\") pod \"operator-controller-controller-manager-6598bfb6c4-2wh5w\" (UID: \"30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" Mar 13 01:13:54.879193 master-0 kubenswrapper[7110]: I0313 01:13:54.879180 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g44dw\" (UniqueName: \"kubernetes.io/projected/2c4c579b-0643-47ac-a729-017c326b0ecc-kube-api-access-g44dw\") pod 
\"catalogd-controller-manager-7f8b8b6f4c-7fc8j\" (UID: \"2c4c579b-0643-47ac-a729-017c326b0ecc\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:13:54.879292 master-0 kubenswrapper[7110]: I0313 01:13:54.879279 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2c4c579b-0643-47ac-a729-017c326b0ecc-cache\") pod \"catalogd-controller-manager-7f8b8b6f4c-7fc8j\" (UID: \"2c4c579b-0643-47ac-a729-017c326b0ecc\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:13:54.879380 master-0 kubenswrapper[7110]: I0313 01:13:54.879367 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3-ca-certs\") pod \"operator-controller-controller-manager-6598bfb6c4-2wh5w\" (UID: \"30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" Mar 13 01:13:54.879465 master-0 kubenswrapper[7110]: E0313 01:13:54.879446 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c4c579b-0643-47ac-a729-017c326b0ecc-catalogserver-certs podName:2c4c579b-0643-47ac-a729-017c326b0ecc nodeName:}" failed. No retries permitted until 2026-03-13 01:13:55.379424754 +0000 UTC m=+36.664451260 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "catalogserver-certs" (UniqueName: "kubernetes.io/secret/2c4c579b-0643-47ac-a729-017c326b0ecc-catalogserver-certs") pod "catalogd-controller-manager-7f8b8b6f4c-7fc8j" (UID: "2c4c579b-0643-47ac-a729-017c326b0ecc") : secret "catalogserver-cert" not found Mar 13 01:13:54.879752 master-0 kubenswrapper[7110]: I0313 01:13:54.879580 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v79j\" (UniqueName: \"kubernetes.io/projected/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3-kube-api-access-6v79j\") pod \"operator-controller-controller-manager-6598bfb6c4-2wh5w\" (UID: \"30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" Mar 13 01:13:54.879752 master-0 kubenswrapper[7110]: I0313 01:13:54.879670 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/2c4c579b-0643-47ac-a729-017c326b0ecc-ca-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-7fc8j\" (UID: \"2c4c579b-0643-47ac-a729-017c326b0ecc\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:13:54.881992 master-0 kubenswrapper[7110]: I0313 01:13:54.880997 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2c4c579b-0643-47ac-a729-017c326b0ecc-cache\") pod \"catalogd-controller-manager-7f8b8b6f4c-7fc8j\" (UID: \"2c4c579b-0643-47ac-a729-017c326b0ecc\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:13:54.896346 master-0 kubenswrapper[7110]: I0313 01:13:54.896290 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/2c4c579b-0643-47ac-a729-017c326b0ecc-ca-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-7fc8j\" (UID: \"2c4c579b-0643-47ac-a729-017c326b0ecc\") 
" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:13:54.903928 master-0 kubenswrapper[7110]: I0313 01:13:54.903881 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g44dw\" (UniqueName: \"kubernetes.io/projected/2c4c579b-0643-47ac-a729-017c326b0ecc-kube-api-access-g44dw\") pod \"catalogd-controller-manager-7f8b8b6f4c-7fc8j\" (UID: \"2c4c579b-0643-47ac-a729-017c326b0ecc\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:13:54.980648 master-0 kubenswrapper[7110]: I0313 01:13:54.980522 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3-ca-certs\") pod \"operator-controller-controller-manager-6598bfb6c4-2wh5w\" (UID: \"30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" Mar 13 01:13:54.980648 master-0 kubenswrapper[7110]: I0313 01:13:54.980567 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v79j\" (UniqueName: \"kubernetes.io/projected/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3-kube-api-access-6v79j\") pod \"operator-controller-controller-manager-6598bfb6c4-2wh5w\" (UID: \"30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" Mar 13 01:13:54.980867 master-0 kubenswrapper[7110]: E0313 01:13:54.980730 7110 projected.go:288] Couldn't get configMap openshift-operator-controller/operator-controller-trusted-ca-bundle: configmap "operator-controller-trusted-ca-bundle" not found Mar 13 01:13:54.980867 master-0 kubenswrapper[7110]: E0313 01:13:54.980751 7110 projected.go:194] Error preparing data for projected volume ca-certs for pod openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w: configmap 
"operator-controller-trusted-ca-bundle" not found Mar 13 01:13:54.980867 master-0 kubenswrapper[7110]: E0313 01:13:54.980804 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3-ca-certs podName:30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:55.48078975 +0000 UTC m=+36.765816206 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "ca-certs" (UniqueName: "kubernetes.io/projected/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3-ca-certs") pod "operator-controller-controller-manager-6598bfb6c4-2wh5w" (UID: "30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3") : configmap "operator-controller-trusted-ca-bundle" not found Mar 13 01:13:54.980958 master-0 kubenswrapper[7110]: I0313 01:13:54.980928 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-2wh5w\" (UID: \"30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" Mar 13 01:13:54.981057 master-0 kubenswrapper[7110]: I0313 01:13:54.980926 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-2wh5w\" (UID: \"30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" Mar 13 01:13:54.981099 master-0 kubenswrapper[7110]: I0313 01:13:54.981075 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-2wh5w\" 
(UID: \"30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" Mar 13 01:13:54.981137 master-0 kubenswrapper[7110]: I0313 01:13:54.981024 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-2wh5w\" (UID: \"30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" Mar 13 01:13:54.981352 master-0 kubenswrapper[7110]: I0313 01:13:54.981207 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3-cache\") pod \"operator-controller-controller-manager-6598bfb6c4-2wh5w\" (UID: \"30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" Mar 13 01:13:54.981618 master-0 kubenswrapper[7110]: I0313 01:13:54.981582 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3-cache\") pod \"operator-controller-controller-manager-6598bfb6c4-2wh5w\" (UID: \"30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" Mar 13 01:13:55.004102 master-0 kubenswrapper[7110]: I0313 01:13:55.004073 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v79j\" (UniqueName: \"kubernetes.io/projected/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3-kube-api-access-6v79j\") pod \"operator-controller-controller-manager-6598bfb6c4-2wh5w\" (UID: \"30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3\") " 
pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" Mar 13 01:13:55.388472 master-0 kubenswrapper[7110]: I0313 01:13:55.387913 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/2c4c579b-0643-47ac-a729-017c326b0ecc-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-7fc8j\" (UID: \"2c4c579b-0643-47ac-a729-017c326b0ecc\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:13:55.388472 master-0 kubenswrapper[7110]: E0313 01:13:55.388054 7110 secret.go:189] Couldn't get secret openshift-catalogd/catalogserver-cert: secret "catalogserver-cert" not found Mar 13 01:13:55.388472 master-0 kubenswrapper[7110]: E0313 01:13:55.388110 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c4c579b-0643-47ac-a729-017c326b0ecc-catalogserver-certs podName:2c4c579b-0643-47ac-a729-017c326b0ecc nodeName:}" failed. No retries permitted until 2026-03-13 01:13:56.3880941 +0000 UTC m=+37.673120566 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "catalogserver-certs" (UniqueName: "kubernetes.io/secret/2c4c579b-0643-47ac-a729-017c326b0ecc-catalogserver-certs") pod "catalogd-controller-manager-7f8b8b6f4c-7fc8j" (UID: "2c4c579b-0643-47ac-a729-017c326b0ecc") : secret "catalogserver-cert" not found Mar 13 01:13:55.457397 master-0 kubenswrapper[7110]: I0313 01:13:55.456998 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 13 01:13:55.457397 master-0 kubenswrapper[7110]: I0313 01:13:55.457192 7110 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-1-master-0" podUID="d428cc1c-440b-4cb4-97d3-fe0f80b4d83b" containerName="installer" containerID="cri-o://24ebe348dbe30bb41ef181302eb5af64e442788df5ae3da7b857b8eb59786fee" gracePeriod=30 Mar 13 01:13:55.489513 master-0 kubenswrapper[7110]: I0313 01:13:55.489458 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3-ca-certs\") pod \"operator-controller-controller-manager-6598bfb6c4-2wh5w\" (UID: \"30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" Mar 13 01:13:55.490068 master-0 kubenswrapper[7110]: E0313 01:13:55.489604 7110 projected.go:288] Couldn't get configMap openshift-operator-controller/operator-controller-trusted-ca-bundle: configmap "operator-controller-trusted-ca-bundle" not found Mar 13 01:13:55.490068 master-0 kubenswrapper[7110]: E0313 01:13:55.489667 7110 projected.go:194] Error preparing data for projected volume ca-certs for pod openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w: configmap "operator-controller-trusted-ca-bundle" not found Mar 13 01:13:55.490068 master-0 kubenswrapper[7110]: E0313 01:13:55.489731 7110 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3-ca-certs podName:30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3 nodeName:}" failed. No retries permitted until 2026-03-13 01:13:56.489709344 +0000 UTC m=+37.774735880 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "ca-certs" (UniqueName: "kubernetes.io/projected/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3-ca-certs") pod "operator-controller-controller-manager-6598bfb6c4-2wh5w" (UID: "30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3") : configmap "operator-controller-trusted-ca-bundle" not found Mar 13 01:13:56.403525 master-0 kubenswrapper[7110]: I0313 01:13:56.403469 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/2c4c579b-0643-47ac-a729-017c326b0ecc-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-7fc8j\" (UID: \"2c4c579b-0643-47ac-a729-017c326b0ecc\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:13:56.403750 master-0 kubenswrapper[7110]: E0313 01:13:56.403726 7110 secret.go:189] Couldn't get secret openshift-catalogd/catalogserver-cert: secret "catalogserver-cert" not found Mar 13 01:13:56.403802 master-0 kubenswrapper[7110]: E0313 01:13:56.403792 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c4c579b-0643-47ac-a729-017c326b0ecc-catalogserver-certs podName:2c4c579b-0643-47ac-a729-017c326b0ecc nodeName:}" failed. No retries permitted until 2026-03-13 01:13:58.403774437 +0000 UTC m=+39.688800913 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "catalogserver-certs" (UniqueName: "kubernetes.io/secret/2c4c579b-0643-47ac-a729-017c326b0ecc-catalogserver-certs") pod "catalogd-controller-manager-7f8b8b6f4c-7fc8j" (UID: "2c4c579b-0643-47ac-a729-017c326b0ecc") : secret "catalogserver-cert" not found Mar 13 01:13:56.504805 master-0 kubenswrapper[7110]: I0313 01:13:56.504743 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3-ca-certs\") pod \"operator-controller-controller-manager-6598bfb6c4-2wh5w\" (UID: \"30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" Mar 13 01:13:56.510115 master-0 kubenswrapper[7110]: I0313 01:13:56.510086 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3-ca-certs\") pod \"operator-controller-controller-manager-6598bfb6c4-2wh5w\" (UID: \"30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" Mar 13 01:13:56.625420 master-0 kubenswrapper[7110]: I0313 01:13:56.625365 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" Mar 13 01:13:58.066900 master-0 kubenswrapper[7110]: I0313 01:13:58.064798 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 13 01:13:58.066900 master-0 kubenswrapper[7110]: I0313 01:13:58.065376 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Mar 13 01:13:58.084034 master-0 kubenswrapper[7110]: I0313 01:13:58.083915 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 13 01:13:58.126085 master-0 kubenswrapper[7110]: I0313 01:13:58.126014 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3a529580-5405-4c01-9121-f32104edf52a-var-lock\") pod \"installer-2-master-0\" (UID: \"3a529580-5405-4c01-9121-f32104edf52a\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 13 01:13:58.126295 master-0 kubenswrapper[7110]: I0313 01:13:58.126143 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3a529580-5405-4c01-9121-f32104edf52a-kube-api-access\") pod \"installer-2-master-0\" (UID: \"3a529580-5405-4c01-9121-f32104edf52a\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 13 01:13:58.126295 master-0 kubenswrapper[7110]: I0313 01:13:58.126188 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3a529580-5405-4c01-9121-f32104edf52a-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"3a529580-5405-4c01-9121-f32104edf52a\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 13 01:13:58.228053 master-0 kubenswrapper[7110]: I0313 01:13:58.227905 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3a529580-5405-4c01-9121-f32104edf52a-var-lock\") pod \"installer-2-master-0\" (UID: \"3a529580-5405-4c01-9121-f32104edf52a\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 13 01:13:58.228053 master-0 kubenswrapper[7110]: I0313 01:13:58.227983 7110 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3a529580-5405-4c01-9121-f32104edf52a-kube-api-access\") pod \"installer-2-master-0\" (UID: \"3a529580-5405-4c01-9121-f32104edf52a\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 13 01:13:58.228053 master-0 kubenswrapper[7110]: I0313 01:13:58.228002 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3a529580-5405-4c01-9121-f32104edf52a-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"3a529580-5405-4c01-9121-f32104edf52a\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 13 01:13:58.228053 master-0 kubenswrapper[7110]: I0313 01:13:58.228053 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3a529580-5405-4c01-9121-f32104edf52a-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"3a529580-5405-4c01-9121-f32104edf52a\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 13 01:13:58.228447 master-0 kubenswrapper[7110]: I0313 01:13:58.228077 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3a529580-5405-4c01-9121-f32104edf52a-var-lock\") pod \"installer-2-master-0\" (UID: \"3a529580-5405-4c01-9121-f32104edf52a\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 13 01:13:58.249589 master-0 kubenswrapper[7110]: I0313 01:13:58.249488 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3a529580-5405-4c01-9121-f32104edf52a-kube-api-access\") pod \"installer-2-master-0\" (UID: \"3a529580-5405-4c01-9121-f32104edf52a\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 13 01:13:58.336413 master-0 kubenswrapper[7110]: I0313 01:13:58.336357 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-network-diagnostics/network-check-target-xs8pt" Mar 13 01:13:58.410350 master-0 kubenswrapper[7110]: I0313 01:13:58.410281 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Mar 13 01:13:58.430721 master-0 kubenswrapper[7110]: I0313 01:13:58.430376 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/2c4c579b-0643-47ac-a729-017c326b0ecc-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-7fc8j\" (UID: \"2c4c579b-0643-47ac-a729-017c326b0ecc\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:13:58.431003 master-0 kubenswrapper[7110]: E0313 01:13:58.430727 7110 secret.go:189] Couldn't get secret openshift-catalogd/catalogserver-cert: secret "catalogserver-cert" not found Mar 13 01:13:58.431003 master-0 kubenswrapper[7110]: E0313 01:13:58.430837 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c4c579b-0643-47ac-a729-017c326b0ecc-catalogserver-certs podName:2c4c579b-0643-47ac-a729-017c326b0ecc nodeName:}" failed. No retries permitted until 2026-03-13 01:14:02.430810581 +0000 UTC m=+43.715837077 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "catalogserver-certs" (UniqueName: "kubernetes.io/secret/2c4c579b-0643-47ac-a729-017c326b0ecc-catalogserver-certs") pod "catalogd-controller-manager-7f8b8b6f4c-7fc8j" (UID: "2c4c579b-0643-47ac-a729-017c326b0ecc") : secret "catalogserver-cert" not found Mar 13 01:13:58.531203 master-0 kubenswrapper[7110]: I0313 01:13:58.531040 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2bfe96f2-43a9-47cf-9020-918438eb1ae0-client-ca\") pod \"controller-manager-748d7f7c46-r6nmm\" (UID: \"2bfe96f2-43a9-47cf-9020-918438eb1ae0\") " pod="openshift-controller-manager/controller-manager-748d7f7c46-r6nmm" Mar 13 01:13:58.531203 master-0 kubenswrapper[7110]: E0313 01:13:58.531139 7110 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 13 01:13:58.531203 master-0 kubenswrapper[7110]: E0313 01:13:58.531185 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2bfe96f2-43a9-47cf-9020-918438eb1ae0-client-ca podName:2bfe96f2-43a9-47cf-9020-918438eb1ae0 nodeName:}" failed. No retries permitted until 2026-03-13 01:14:06.531172719 +0000 UTC m=+47.816199195 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/2bfe96f2-43a9-47cf-9020-918438eb1ae0-client-ca") pod "controller-manager-748d7f7c46-r6nmm" (UID: "2bfe96f2-43a9-47cf-9020-918438eb1ae0") : configmap "client-ca" not found Mar 13 01:13:59.679452 master-0 kubenswrapper[7110]: I0313 01:13:59.675471 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-78885b775b-jrrjv"] Mar 13 01:13:59.679452 master-0 kubenswrapper[7110]: I0313 01:13:59.677322 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:13:59.686384 master-0 kubenswrapper[7110]: I0313 01:13:59.680858 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 13 01:13:59.686384 master-0 kubenswrapper[7110]: I0313 01:13:59.680919 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 13 01:13:59.686384 master-0 kubenswrapper[7110]: I0313 01:13:59.681107 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 13 01:13:59.686384 master-0 kubenswrapper[7110]: I0313 01:13:59.681214 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 13 01:13:59.686384 master-0 kubenswrapper[7110]: I0313 01:13:59.681288 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 13 01:13:59.686384 master-0 kubenswrapper[7110]: I0313 01:13:59.681357 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 13 01:13:59.686384 master-0 kubenswrapper[7110]: I0313 01:13:59.682219 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 13 01:13:59.686384 master-0 kubenswrapper[7110]: I0313 01:13:59.684868 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 13 01:13:59.687026 master-0 kubenswrapper[7110]: I0313 01:13:59.686779 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-78885b775b-jrrjv"] Mar 13 01:13:59.745597 master-0 kubenswrapper[7110]: I0313 01:13:59.745538 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/57eb2020-1560-4352-8b86-76db59de933a-trusted-ca-bundle\") pod \"apiserver-78885b775b-jrrjv\" (UID: \"57eb2020-1560-4352-8b86-76db59de933a\") " pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:13:59.746108 master-0 kubenswrapper[7110]: I0313 01:13:59.745609 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/57eb2020-1560-4352-8b86-76db59de933a-encryption-config\") pod \"apiserver-78885b775b-jrrjv\" (UID: \"57eb2020-1560-4352-8b86-76db59de933a\") " pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:13:59.746108 master-0 kubenswrapper[7110]: I0313 01:13:59.745650 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57eb2020-1560-4352-8b86-76db59de933a-serving-cert\") pod \"apiserver-78885b775b-jrrjv\" (UID: \"57eb2020-1560-4352-8b86-76db59de933a\") " pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:13:59.746108 master-0 kubenswrapper[7110]: I0313 01:13:59.745685 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/57eb2020-1560-4352-8b86-76db59de933a-etcd-client\") pod \"apiserver-78885b775b-jrrjv\" (UID: \"57eb2020-1560-4352-8b86-76db59de933a\") " pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:13:59.746108 master-0 kubenswrapper[7110]: I0313 01:13:59.745726 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd849\" (UniqueName: \"kubernetes.io/projected/57eb2020-1560-4352-8b86-76db59de933a-kube-api-access-kd849\") pod \"apiserver-78885b775b-jrrjv\" (UID: \"57eb2020-1560-4352-8b86-76db59de933a\") " pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:13:59.746108 
master-0 kubenswrapper[7110]: I0313 01:13:59.745750 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/57eb2020-1560-4352-8b86-76db59de933a-audit-dir\") pod \"apiserver-78885b775b-jrrjv\" (UID: \"57eb2020-1560-4352-8b86-76db59de933a\") " pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:13:59.746108 master-0 kubenswrapper[7110]: I0313 01:13:59.745773 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/57eb2020-1560-4352-8b86-76db59de933a-audit-policies\") pod \"apiserver-78885b775b-jrrjv\" (UID: \"57eb2020-1560-4352-8b86-76db59de933a\") " pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:13:59.746108 master-0 kubenswrapper[7110]: I0313 01:13:59.745808 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/57eb2020-1560-4352-8b86-76db59de933a-etcd-serving-ca\") pod \"apiserver-78885b775b-jrrjv\" (UID: \"57eb2020-1560-4352-8b86-76db59de933a\") " pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:13:59.856676 master-0 kubenswrapper[7110]: I0313 01:13:59.854611 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57eb2020-1560-4352-8b86-76db59de933a-trusted-ca-bundle\") pod \"apiserver-78885b775b-jrrjv\" (UID: \"57eb2020-1560-4352-8b86-76db59de933a\") " pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:13:59.856676 master-0 kubenswrapper[7110]: I0313 01:13:59.854710 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/57eb2020-1560-4352-8b86-76db59de933a-encryption-config\") pod \"apiserver-78885b775b-jrrjv\" (UID: 
\"57eb2020-1560-4352-8b86-76db59de933a\") " pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:13:59.856676 master-0 kubenswrapper[7110]: I0313 01:13:59.854742 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57eb2020-1560-4352-8b86-76db59de933a-serving-cert\") pod \"apiserver-78885b775b-jrrjv\" (UID: \"57eb2020-1560-4352-8b86-76db59de933a\") " pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:13:59.856676 master-0 kubenswrapper[7110]: I0313 01:13:59.854783 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/57eb2020-1560-4352-8b86-76db59de933a-etcd-client\") pod \"apiserver-78885b775b-jrrjv\" (UID: \"57eb2020-1560-4352-8b86-76db59de933a\") " pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:13:59.856676 master-0 kubenswrapper[7110]: I0313 01:13:59.854825 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd849\" (UniqueName: \"kubernetes.io/projected/57eb2020-1560-4352-8b86-76db59de933a-kube-api-access-kd849\") pod \"apiserver-78885b775b-jrrjv\" (UID: \"57eb2020-1560-4352-8b86-76db59de933a\") " pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:13:59.856676 master-0 kubenswrapper[7110]: I0313 01:13:59.854847 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/57eb2020-1560-4352-8b86-76db59de933a-audit-dir\") pod \"apiserver-78885b775b-jrrjv\" (UID: \"57eb2020-1560-4352-8b86-76db59de933a\") " pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:13:59.856676 master-0 kubenswrapper[7110]: I0313 01:13:59.854869 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/57eb2020-1560-4352-8b86-76db59de933a-audit-policies\") pod \"apiserver-78885b775b-jrrjv\" (UID: \"57eb2020-1560-4352-8b86-76db59de933a\") " pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:13:59.856676 master-0 kubenswrapper[7110]: I0313 01:13:59.854904 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/57eb2020-1560-4352-8b86-76db59de933a-etcd-serving-ca\") pod \"apiserver-78885b775b-jrrjv\" (UID: \"57eb2020-1560-4352-8b86-76db59de933a\") " pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:13:59.863662 master-0 kubenswrapper[7110]: I0313 01:13:59.863214 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/57eb2020-1560-4352-8b86-76db59de933a-audit-policies\") pod \"apiserver-78885b775b-jrrjv\" (UID: \"57eb2020-1560-4352-8b86-76db59de933a\") " pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:13:59.863817 master-0 kubenswrapper[7110]: I0313 01:13:59.863732 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57eb2020-1560-4352-8b86-76db59de933a-trusted-ca-bundle\") pod \"apiserver-78885b775b-jrrjv\" (UID: \"57eb2020-1560-4352-8b86-76db59de933a\") " pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:13:59.866662 master-0 kubenswrapper[7110]: I0313 01:13:59.864182 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/57eb2020-1560-4352-8b86-76db59de933a-audit-dir\") pod \"apiserver-78885b775b-jrrjv\" (UID: \"57eb2020-1560-4352-8b86-76db59de933a\") " pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:13:59.877682 master-0 kubenswrapper[7110]: I0313 01:13:59.876434 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/57eb2020-1560-4352-8b86-76db59de933a-etcd-client\") pod \"apiserver-78885b775b-jrrjv\" (UID: \"57eb2020-1560-4352-8b86-76db59de933a\") " pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:13:59.877682 master-0 kubenswrapper[7110]: I0313 01:13:59.876970 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/57eb2020-1560-4352-8b86-76db59de933a-etcd-serving-ca\") pod \"apiserver-78885b775b-jrrjv\" (UID: \"57eb2020-1560-4352-8b86-76db59de933a\") " pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:13:59.887661 master-0 kubenswrapper[7110]: I0313 01:13:59.887085 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/57eb2020-1560-4352-8b86-76db59de933a-encryption-config\") pod \"apiserver-78885b775b-jrrjv\" (UID: \"57eb2020-1560-4352-8b86-76db59de933a\") " pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:13:59.899843 master-0 kubenswrapper[7110]: I0313 01:13:59.896117 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57eb2020-1560-4352-8b86-76db59de933a-serving-cert\") pod \"apiserver-78885b775b-jrrjv\" (UID: \"57eb2020-1560-4352-8b86-76db59de933a\") " pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:13:59.932746 master-0 kubenswrapper[7110]: I0313 01:13:59.931614 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd849\" (UniqueName: \"kubernetes.io/projected/57eb2020-1560-4352-8b86-76db59de933a-kube-api-access-kd849\") pod \"apiserver-78885b775b-jrrjv\" (UID: \"57eb2020-1560-4352-8b86-76db59de933a\") " pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:14:00.040911 master-0 kubenswrapper[7110]: I0313 01:14:00.040865 7110 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:14:00.533932 master-0 kubenswrapper[7110]: I0313 01:14:00.533085 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-78885b775b-jrrjv"] Mar 13 01:14:00.620655 master-0 kubenswrapper[7110]: I0313 01:14:00.620273 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" event={"ID":"4c5174b9-ca9e-4917-ab3a-ca403ce4f017","Type":"ContainerStarted","Data":"4d358f5b4f38fb2c37cd9308f0839588cdfa1ee1a0977394634ae2dfe045b4b7"} Mar 13 01:14:00.632173 master-0 kubenswrapper[7110]: I0313 01:14:00.632146 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8" event={"ID":"f97819d0-2840-4352-a435-19ef1a8c22c9","Type":"ContainerStarted","Data":"4c5fb20dad21e9e3e37f291a5be6622a0a622dd0a6d9ba5a22b729aeb465b9cc"} Mar 13 01:14:00.634777 master-0 kubenswrapper[7110]: I0313 01:14:00.634525 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt" event={"ID":"bbf0bd4d-3387-43c3-b9d5-61a044fa2138","Type":"ContainerStarted","Data":"18f845c6faed83ec932443cccd37f1bfcb02c26f4b7370e1ab46cd6e5a2c2f22"} Mar 13 01:14:00.635797 master-0 kubenswrapper[7110]: I0313 01:14:00.635769 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" event={"ID":"57eb2020-1560-4352-8b86-76db59de933a","Type":"ContainerStarted","Data":"bdd5af34bfad236139e626fcebfb16719c123b2551b988ca1c04bcedf0b2fdb1"} Mar 13 01:14:00.638147 master-0 kubenswrapper[7110]: I0313 01:14:00.638089 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l" 
event={"ID":"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad","Type":"ContainerStarted","Data":"bb00bc21a2b9f11b41d1750186297e3a5ca651c2efe8531d5b69fd560b0ba268"} Mar 13 01:14:00.648581 master-0 kubenswrapper[7110]: I0313 01:14:00.647885 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" event={"ID":"8c6bf2d5-1881-4b63-b247-7e7426707fa1","Type":"ContainerStarted","Data":"81594a611904e6b6b1a33993523b7420d7c605395323b0ff7a70b4475e6f0b5c"} Mar 13 01:14:00.648581 master-0 kubenswrapper[7110]: I0313 01:14:00.647926 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" event={"ID":"8c6bf2d5-1881-4b63-b247-7e7426707fa1","Type":"ContainerStarted","Data":"4f6218cc31120287103af84277d2f84084c2bfa51936dccb9cf930aa141d948d"} Mar 13 01:14:00.770959 master-0 kubenswrapper[7110]: I0313 01:14:00.770288 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 13 01:14:00.785733 master-0 kubenswrapper[7110]: I0313 01:14:00.785690 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w"] Mar 13 01:14:00.808979 master-0 kubenswrapper[7110]: W0313 01:14:00.808861 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30a7d5de_1ca1_46c8_8fbb_f34e4e2358d3.slice/crio-57d04e10dfd8defea76ad2a3d813dd852bfb7b6eaab6dec0d628dfff952603f9 WatchSource:0}: Error finding container 57d04e10dfd8defea76ad2a3d813dd852bfb7b6eaab6dec0d628dfff952603f9: Status 404 returned error can't find the container with id 57d04e10dfd8defea76ad2a3d813dd852bfb7b6eaab6dec0d628dfff952603f9 Mar 13 01:14:00.818822 master-0 kubenswrapper[7110]: I0313 01:14:00.818767 7110 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-cluster-node-tuning-operator/tuned-9vzj5"] Mar 13 01:14:00.819380 master-0 kubenswrapper[7110]: I0313 01:14:00.819286 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.870527 master-0 kubenswrapper[7110]: I0313 01:14:00.870504 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-var-lib-kubelet\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.870695 master-0 kubenswrapper[7110]: I0313 01:14:00.870680 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-host\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.870865 master-0 kubenswrapper[7110]: I0313 01:14:00.870831 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-modprobe-d\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.870992 master-0 kubenswrapper[7110]: I0313 01:14:00.870981 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-sys\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.871083 master-0 kubenswrapper[7110]: I0313 01:14:00.871072 7110 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6a5ab1d5-dabd-45e7-a688-71a282f61e67-tmp\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.871196 master-0 kubenswrapper[7110]: I0313 01:14:00.871185 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-lib-modules\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.871290 master-0 kubenswrapper[7110]: I0313 01:14:00.871278 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-kubernetes\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.871385 master-0 kubenswrapper[7110]: I0313 01:14:00.871374 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-tuned\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.871486 master-0 kubenswrapper[7110]: I0313 01:14:00.871473 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-sysconfig\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.871580 master-0 kubenswrapper[7110]: I0313 
01:14:00.871567 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-sysctl-conf\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.871696 master-0 kubenswrapper[7110]: I0313 01:14:00.871685 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-systemd\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.871784 master-0 kubenswrapper[7110]: I0313 01:14:00.871773 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpvtc\" (UniqueName: \"kubernetes.io/projected/6a5ab1d5-dabd-45e7-a688-71a282f61e67-kube-api-access-lpvtc\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.871895 master-0 kubenswrapper[7110]: I0313 01:14:00.871884 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-sysctl-d\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.871994 master-0 kubenswrapper[7110]: I0313 01:14:00.871980 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-run\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" 
Mar 13 01:14:00.974098 master-0 kubenswrapper[7110]: I0313 01:14:00.974059 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-systemd\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.974187 master-0 kubenswrapper[7110]: I0313 01:14:00.974109 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpvtc\" (UniqueName: \"kubernetes.io/projected/6a5ab1d5-dabd-45e7-a688-71a282f61e67-kube-api-access-lpvtc\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.974187 master-0 kubenswrapper[7110]: I0313 01:14:00.974156 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-sysctl-d\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.974187 master-0 kubenswrapper[7110]: I0313 01:14:00.974170 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-run\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.974187 master-0 kubenswrapper[7110]: I0313 01:14:00.974186 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-var-lib-kubelet\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.974297 master-0 
kubenswrapper[7110]: I0313 01:14:00.974200 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-host\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.974297 master-0 kubenswrapper[7110]: I0313 01:14:00.974238 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-modprobe-d\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.974297 master-0 kubenswrapper[7110]: I0313 01:14:00.974274 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6a5ab1d5-dabd-45e7-a688-71a282f61e67-tmp\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.974297 master-0 kubenswrapper[7110]: I0313 01:14:00.974289 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-sys\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.974406 master-0 kubenswrapper[7110]: I0313 01:14:00.974312 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-lib-modules\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.974406 master-0 kubenswrapper[7110]: I0313 01:14:00.974327 7110 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-kubernetes\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.974406 master-0 kubenswrapper[7110]: I0313 01:14:00.974352 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-tuned\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.974406 master-0 kubenswrapper[7110]: I0313 01:14:00.974375 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-sysconfig\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.974406 master-0 kubenswrapper[7110]: I0313 01:14:00.974393 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-sysctl-conf\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.974547 master-0 kubenswrapper[7110]: I0313 01:14:00.974522 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-sysctl-conf\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.974582 master-0 kubenswrapper[7110]: I0313 01:14:00.974565 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-systemd\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.978015 master-0 kubenswrapper[7110]: I0313 01:14:00.974852 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-sysctl-d\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.978015 master-0 kubenswrapper[7110]: I0313 01:14:00.974926 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-run\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.978015 master-0 kubenswrapper[7110]: I0313 01:14:00.974954 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-var-lib-kubelet\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.978015 master-0 kubenswrapper[7110]: I0313 01:14:00.974977 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-host\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.978015 master-0 kubenswrapper[7110]: I0313 01:14:00.975028 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-modprobe-d\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.978015 master-0 kubenswrapper[7110]: I0313 01:14:00.975625 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-kubernetes\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.978015 master-0 kubenswrapper[7110]: I0313 01:14:00.975742 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-sys\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.978015 master-0 kubenswrapper[7110]: I0313 01:14:00.975925 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-sysconfig\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.978015 master-0 kubenswrapper[7110]: I0313 01:14:00.975958 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-lib-modules\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.982860 master-0 kubenswrapper[7110]: I0313 01:14:00.982837 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6a5ab1d5-dabd-45e7-a688-71a282f61e67-tmp\") pod \"tuned-9vzj5\" (UID: 
\"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.982998 master-0 kubenswrapper[7110]: I0313 01:14:00.982981 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-tuned\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:00.989770 master-0 kubenswrapper[7110]: I0313 01:14:00.989725 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpvtc\" (UniqueName: \"kubernetes.io/projected/6a5ab1d5-dabd-45e7-a688-71a282f61e67-kube-api-access-lpvtc\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:01.163385 master-0 kubenswrapper[7110]: I0313 01:14:01.163290 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:14:01.189879 master-0 kubenswrapper[7110]: W0313 01:14:01.189832 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a5ab1d5_dabd_45e7_a688_71a282f61e67.slice/crio-36c90a63dbc503a0e102d326d7589d3b1f80da94a54e8faa945f71ea63acc949 WatchSource:0}: Error finding container 36c90a63dbc503a0e102d326d7589d3b1f80da94a54e8faa945f71ea63acc949: Status 404 returned error can't find the container with id 36c90a63dbc503a0e102d326d7589d3b1f80da94a54e8faa945f71ea63acc949 Mar 13 01:14:01.336210 master-0 kubenswrapper[7110]: I0313 01:14:01.336171 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-26mfw"] Mar 13 01:14:01.337031 master-0 kubenswrapper[7110]: I0313 01:14:01.337001 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-26mfw" Mar 13 01:14:01.339291 master-0 kubenswrapper[7110]: I0313 01:14:01.338840 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 13 01:14:01.339291 master-0 kubenswrapper[7110]: I0313 01:14:01.339134 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 13 01:14:01.339291 master-0 kubenswrapper[7110]: I0313 01:14:01.339228 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 13 01:14:01.339761 master-0 kubenswrapper[7110]: I0313 01:14:01.339729 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 13 01:14:01.356328 master-0 kubenswrapper[7110]: I0313 01:14:01.356091 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-26mfw"] Mar 13 01:14:01.380729 master-0 kubenswrapper[7110]: I0313 01:14:01.379952 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qlks\" (UniqueName: \"kubernetes.io/projected/58405741-598c-4bf5-bbc8-1ca8e3f10995-kube-api-access-6qlks\") pod \"dns-default-26mfw\" (UID: \"58405741-598c-4bf5-bbc8-1ca8e3f10995\") " pod="openshift-dns/dns-default-26mfw" Mar 13 01:14:01.380729 master-0 kubenswrapper[7110]: I0313 01:14:01.380048 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/58405741-598c-4bf5-bbc8-1ca8e3f10995-config-volume\") pod \"dns-default-26mfw\" (UID: \"58405741-598c-4bf5-bbc8-1ca8e3f10995\") " pod="openshift-dns/dns-default-26mfw" Mar 13 01:14:01.380729 master-0 kubenswrapper[7110]: I0313 01:14:01.380083 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/58405741-598c-4bf5-bbc8-1ca8e3f10995-metrics-tls\") pod \"dns-default-26mfw\" (UID: \"58405741-598c-4bf5-bbc8-1ca8e3f10995\") " pod="openshift-dns/dns-default-26mfw" Mar 13 01:14:01.482012 master-0 kubenswrapper[7110]: I0313 01:14:01.481237 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/58405741-598c-4bf5-bbc8-1ca8e3f10995-config-volume\") pod \"dns-default-26mfw\" (UID: \"58405741-598c-4bf5-bbc8-1ca8e3f10995\") " pod="openshift-dns/dns-default-26mfw" Mar 13 01:14:01.482012 master-0 kubenswrapper[7110]: I0313 01:14:01.481293 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/58405741-598c-4bf5-bbc8-1ca8e3f10995-metrics-tls\") pod \"dns-default-26mfw\" (UID: \"58405741-598c-4bf5-bbc8-1ca8e3f10995\") " pod="openshift-dns/dns-default-26mfw" Mar 13 01:14:01.482012 master-0 kubenswrapper[7110]: I0313 01:14:01.481351 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qlks\" (UniqueName: \"kubernetes.io/projected/58405741-598c-4bf5-bbc8-1ca8e3f10995-kube-api-access-6qlks\") pod \"dns-default-26mfw\" (UID: \"58405741-598c-4bf5-bbc8-1ca8e3f10995\") " pod="openshift-dns/dns-default-26mfw" Mar 13 01:14:01.482239 master-0 kubenswrapper[7110]: I0313 01:14:01.482228 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/58405741-598c-4bf5-bbc8-1ca8e3f10995-config-volume\") pod \"dns-default-26mfw\" (UID: \"58405741-598c-4bf5-bbc8-1ca8e3f10995\") " pod="openshift-dns/dns-default-26mfw" Mar 13 01:14:01.482733 master-0 kubenswrapper[7110]: E0313 01:14:01.482291 7110 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 13 01:14:01.482733 master-0 kubenswrapper[7110]: E0313 01:14:01.482328 7110 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58405741-598c-4bf5-bbc8-1ca8e3f10995-metrics-tls podName:58405741-598c-4bf5-bbc8-1ca8e3f10995 nodeName:}" failed. No retries permitted until 2026-03-13 01:14:01.982318236 +0000 UTC m=+43.267344692 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/58405741-598c-4bf5-bbc8-1ca8e3f10995-metrics-tls") pod "dns-default-26mfw" (UID: "58405741-598c-4bf5-bbc8-1ca8e3f10995") : secret "dns-default-metrics-tls" not found Mar 13 01:14:01.505594 master-0 kubenswrapper[7110]: I0313 01:14:01.505538 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qlks\" (UniqueName: \"kubernetes.io/projected/58405741-598c-4bf5-bbc8-1ca8e3f10995-kube-api-access-6qlks\") pod \"dns-default-26mfw\" (UID: \"58405741-598c-4bf5-bbc8-1ca8e3f10995\") " pod="openshift-dns/dns-default-26mfw" Mar 13 01:14:01.656360 master-0 kubenswrapper[7110]: I0313 01:14:01.655423 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" event={"ID":"6a5ab1d5-dabd-45e7-a688-71a282f61e67","Type":"ContainerStarted","Data":"01455eeb058734a90460549b36e203977c395814caa7b919b04ec224f499fd04"} Mar 13 01:14:01.656360 master-0 kubenswrapper[7110]: I0313 01:14:01.655476 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" event={"ID":"6a5ab1d5-dabd-45e7-a688-71a282f61e67","Type":"ContainerStarted","Data":"36c90a63dbc503a0e102d326d7589d3b1f80da94a54e8faa945f71ea63acc949"} Mar 13 01:14:01.658664 master-0 kubenswrapper[7110]: I0313 01:14:01.658343 7110 generic.go:334] "Generic (PLEG): container finished" podID="33e195ca-6747-4a22-a0b3-6bf90a06b215" containerID="25412677b650df77d4f3d7e62344cb76bd86be5f42e8a3faa920cee36fb40124" exitCode=0 Mar 13 01:14:01.658664 master-0 kubenswrapper[7110]: I0313 01:14:01.658397 7110 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-65c58d4d64-6dpp5" event={"ID":"33e195ca-6747-4a22-a0b3-6bf90a06b215","Type":"ContainerDied","Data":"25412677b650df77d4f3d7e62344cb76bd86be5f42e8a3faa920cee36fb40124"} Mar 13 01:14:01.666709 master-0 kubenswrapper[7110]: I0313 01:14:01.666056 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-589895fbb7-qvl2k" event={"ID":"13fac7b0-ce55-467d-9d0c-6a122d87cb3c","Type":"ContainerStarted","Data":"e57f50a96016f374be74d1fdff3ad902d70b80cfc4848a5c9d8694184d265ad5"} Mar 13 01:14:01.666709 master-0 kubenswrapper[7110]: I0313 01:14:01.666093 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-589895fbb7-qvl2k" event={"ID":"13fac7b0-ce55-467d-9d0c-6a122d87cb3c","Type":"ContainerStarted","Data":"7519572e1e1852f1b86cc906ba51b3ab5a510ff419648327c2c8d44723336143"} Mar 13 01:14:01.674777 master-0 kubenswrapper[7110]: I0313 01:14:01.673399 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l" event={"ID":"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad","Type":"ContainerStarted","Data":"e7ab97f2a4c561db63cb663455e603d6ca1f98998fa007e41050f6e9e2778659"} Mar 13 01:14:01.678711 master-0 kubenswrapper[7110]: I0313 01:14:01.675489 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"3a529580-5405-4c01-9121-f32104edf52a","Type":"ContainerStarted","Data":"e3035386c09a720814e9d83127713eb5261d4f71235d7fec001cc94b709e5608"} Mar 13 01:14:01.678711 master-0 kubenswrapper[7110]: I0313 01:14:01.675525 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"3a529580-5405-4c01-9121-f32104edf52a","Type":"ContainerStarted","Data":"0a716477dd56760c3f4676954f02d2bfd3f2125e5a33a03b10ce95d0e8de3b73"} Mar 13 01:14:01.691466 master-0 kubenswrapper[7110]: I0313 
01:14:01.687671 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" event={"ID":"30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3","Type":"ContainerStarted","Data":"c51cdc52a7907953fa7b2f33b8d5256b59068c681441b3e967443332162a6acd"} Mar 13 01:14:01.691466 master-0 kubenswrapper[7110]: I0313 01:14:01.687712 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" event={"ID":"30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3","Type":"ContainerStarted","Data":"5c71af40ce92aac85f827794e9d207c4ed4bc300599f6601dc56da9d0a8b0d8b"} Mar 13 01:14:01.691466 master-0 kubenswrapper[7110]: I0313 01:14:01.687741 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" event={"ID":"30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3","Type":"ContainerStarted","Data":"57d04e10dfd8defea76ad2a3d813dd852bfb7b6eaab6dec0d628dfff952603f9"} Mar 13 01:14:01.691466 master-0 kubenswrapper[7110]: I0313 01:14:01.688581 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" Mar 13 01:14:01.691466 master-0 kubenswrapper[7110]: I0313 01:14:01.691118 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" podStartSLOduration=1.69110399 podStartE2EDuration="1.69110399s" podCreationTimestamp="2026-03-13 01:14:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:14:01.681016128 +0000 UTC m=+42.966042594" watchObservedRunningTime="2026-03-13 01:14:01.69110399 +0000 UTC m=+42.976130456" Mar 13 01:14:01.691466 master-0 kubenswrapper[7110]: I0313 01:14:01.691407 7110 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-dns/node-resolver-lw6xm"] Mar 13 01:14:01.691909 master-0 kubenswrapper[7110]: I0313 01:14:01.691896 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-lw6xm" Mar 13 01:14:01.796731 master-0 kubenswrapper[7110]: I0313 01:14:01.793897 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7wj9\" (UniqueName: \"kubernetes.io/projected/d81bcb58-efe3-4577-8e88-67f92c645f6f-kube-api-access-k7wj9\") pod \"node-resolver-lw6xm\" (UID: \"d81bcb58-efe3-4577-8e88-67f92c645f6f\") " pod="openshift-dns/node-resolver-lw6xm" Mar 13 01:14:01.796731 master-0 kubenswrapper[7110]: I0313 01:14:01.794010 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d81bcb58-efe3-4577-8e88-67f92c645f6f-hosts-file\") pod \"node-resolver-lw6xm\" (UID: \"d81bcb58-efe3-4577-8e88-67f92c645f6f\") " pod="openshift-dns/node-resolver-lw6xm" Mar 13 01:14:01.796731 master-0 kubenswrapper[7110]: I0313 01:14:01.794215 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-2-master-0" podStartSLOduration=3.794194265 podStartE2EDuration="3.794194265s" podCreationTimestamp="2026-03-13 01:13:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:14:01.792960281 +0000 UTC m=+43.077986737" watchObservedRunningTime="2026-03-13 01:14:01.794194265 +0000 UTC m=+43.079220731" Mar 13 01:14:01.896179 master-0 kubenswrapper[7110]: I0313 01:14:01.896126 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7wj9\" (UniqueName: \"kubernetes.io/projected/d81bcb58-efe3-4577-8e88-67f92c645f6f-kube-api-access-k7wj9\") pod \"node-resolver-lw6xm\" (UID: 
\"d81bcb58-efe3-4577-8e88-67f92c645f6f\") " pod="openshift-dns/node-resolver-lw6xm" Mar 13 01:14:01.896336 master-0 kubenswrapper[7110]: I0313 01:14:01.896189 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d81bcb58-efe3-4577-8e88-67f92c645f6f-hosts-file\") pod \"node-resolver-lw6xm\" (UID: \"d81bcb58-efe3-4577-8e88-67f92c645f6f\") " pod="openshift-dns/node-resolver-lw6xm" Mar 13 01:14:01.896444 master-0 kubenswrapper[7110]: I0313 01:14:01.896418 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d81bcb58-efe3-4577-8e88-67f92c645f6f-hosts-file\") pod \"node-resolver-lw6xm\" (UID: \"d81bcb58-efe3-4577-8e88-67f92c645f6f\") " pod="openshift-dns/node-resolver-lw6xm" Mar 13 01:14:01.916075 master-0 kubenswrapper[7110]: I0313 01:14:01.916041 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7wj9\" (UniqueName: \"kubernetes.io/projected/d81bcb58-efe3-4577-8e88-67f92c645f6f-kube-api-access-k7wj9\") pod \"node-resolver-lw6xm\" (UID: \"d81bcb58-efe3-4577-8e88-67f92c645f6f\") " pod="openshift-dns/node-resolver-lw6xm" Mar 13 01:14:01.923591 master-0 kubenswrapper[7110]: I0313 01:14:01.923524 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" podStartSLOduration=7.9235049239999995 podStartE2EDuration="7.923504924s" podCreationTimestamp="2026-03-13 01:13:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:14:01.923183865 +0000 UTC m=+43.208210351" watchObservedRunningTime="2026-03-13 01:14:01.923504924 +0000 UTC m=+43.208531390" Mar 13 01:14:01.997870 master-0 kubenswrapper[7110]: I0313 01:14:01.997831 7110 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-65c58d4d64-6dpp5" Mar 13 01:14:01.998601 master-0 kubenswrapper[7110]: I0313 01:14:01.998579 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/58405741-598c-4bf5-bbc8-1ca8e3f10995-metrics-tls\") pod \"dns-default-26mfw\" (UID: \"58405741-598c-4bf5-bbc8-1ca8e3f10995\") " pod="openshift-dns/dns-default-26mfw" Mar 13 01:14:02.009307 master-0 kubenswrapper[7110]: I0313 01:14:02.009262 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/58405741-598c-4bf5-bbc8-1ca8e3f10995-metrics-tls\") pod \"dns-default-26mfw\" (UID: \"58405741-598c-4bf5-bbc8-1ca8e3f10995\") " pod="openshift-dns/dns-default-26mfw" Mar 13 01:14:02.047096 master-0 kubenswrapper[7110]: I0313 01:14:02.047022 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-lw6xm" Mar 13 01:14:02.061512 master-0 kubenswrapper[7110]: W0313 01:14:02.061467 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd81bcb58_efe3_4577_8e88_67f92c645f6f.slice/crio-b19253ead78138ad6ad5306377a9183f46463e03e94ad36454e95491cb2e6272 WatchSource:0}: Error finding container b19253ead78138ad6ad5306377a9183f46463e03e94ad36454e95491cb2e6272: Status 404 returned error can't find the container with id b19253ead78138ad6ad5306377a9183f46463e03e94ad36454e95491cb2e6272 Mar 13 01:14:02.100037 master-0 kubenswrapper[7110]: I0313 01:14:02.099538 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/33e195ca-6747-4a22-a0b3-6bf90a06b215-audit\") pod \"33e195ca-6747-4a22-a0b3-6bf90a06b215\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " Mar 13 01:14:02.100037 master-0 kubenswrapper[7110]: I0313 01:14:02.099906 7110 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33e195ca-6747-4a22-a0b3-6bf90a06b215-audit" (OuterVolumeSpecName: "audit") pod "33e195ca-6747-4a22-a0b3-6bf90a06b215" (UID: "33e195ca-6747-4a22-a0b3-6bf90a06b215"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:14:02.100037 master-0 kubenswrapper[7110]: I0313 01:14:02.099951 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/33e195ca-6747-4a22-a0b3-6bf90a06b215-etcd-client\") pod \"33e195ca-6747-4a22-a0b3-6bf90a06b215\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " Mar 13 01:14:02.100037 master-0 kubenswrapper[7110]: I0313 01:14:02.099977 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/33e195ca-6747-4a22-a0b3-6bf90a06b215-etcd-serving-ca\") pod \"33e195ca-6747-4a22-a0b3-6bf90a06b215\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " Mar 13 01:14:02.100037 master-0 kubenswrapper[7110]: I0313 01:14:02.100003 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dfc8\" (UniqueName: \"kubernetes.io/projected/33e195ca-6747-4a22-a0b3-6bf90a06b215-kube-api-access-6dfc8\") pod \"33e195ca-6747-4a22-a0b3-6bf90a06b215\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " Mar 13 01:14:02.100037 master-0 kubenswrapper[7110]: I0313 01:14:02.100033 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/33e195ca-6747-4a22-a0b3-6bf90a06b215-image-import-ca\") pod \"33e195ca-6747-4a22-a0b3-6bf90a06b215\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " Mar 13 01:14:02.100037 master-0 kubenswrapper[7110]: I0313 01:14:02.100061 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/33e195ca-6747-4a22-a0b3-6bf90a06b215-serving-cert\") pod \"33e195ca-6747-4a22-a0b3-6bf90a06b215\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " Mar 13 01:14:02.100468 master-0 kubenswrapper[7110]: I0313 01:14:02.100102 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33e195ca-6747-4a22-a0b3-6bf90a06b215-config\") pod \"33e195ca-6747-4a22-a0b3-6bf90a06b215\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " Mar 13 01:14:02.100468 master-0 kubenswrapper[7110]: I0313 01:14:02.100122 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/33e195ca-6747-4a22-a0b3-6bf90a06b215-audit-dir\") pod \"33e195ca-6747-4a22-a0b3-6bf90a06b215\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " Mar 13 01:14:02.100468 master-0 kubenswrapper[7110]: I0313 01:14:02.100149 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33e195ca-6747-4a22-a0b3-6bf90a06b215-trusted-ca-bundle\") pod \"33e195ca-6747-4a22-a0b3-6bf90a06b215\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " Mar 13 01:14:02.100468 master-0 kubenswrapper[7110]: I0313 01:14:02.100196 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/33e195ca-6747-4a22-a0b3-6bf90a06b215-node-pullsecrets\") pod \"33e195ca-6747-4a22-a0b3-6bf90a06b215\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") " Mar 13 01:14:02.100468 master-0 kubenswrapper[7110]: I0313 01:14:02.100214 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/33e195ca-6747-4a22-a0b3-6bf90a06b215-encryption-config\") pod \"33e195ca-6747-4a22-a0b3-6bf90a06b215\" (UID: \"33e195ca-6747-4a22-a0b3-6bf90a06b215\") 
" Mar 13 01:14:02.100731 master-0 kubenswrapper[7110]: I0313 01:14:02.100578 7110 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/33e195ca-6747-4a22-a0b3-6bf90a06b215-audit\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:02.103673 master-0 kubenswrapper[7110]: I0313 01:14:02.101508 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33e195ca-6747-4a22-a0b3-6bf90a06b215-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "33e195ca-6747-4a22-a0b3-6bf90a06b215" (UID: "33e195ca-6747-4a22-a0b3-6bf90a06b215"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:14:02.103673 master-0 kubenswrapper[7110]: I0313 01:14:02.102031 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33e195ca-6747-4a22-a0b3-6bf90a06b215-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "33e195ca-6747-4a22-a0b3-6bf90a06b215" (UID: "33e195ca-6747-4a22-a0b3-6bf90a06b215"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:14:02.103673 master-0 kubenswrapper[7110]: I0313 01:14:02.102123 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33e195ca-6747-4a22-a0b3-6bf90a06b215-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "33e195ca-6747-4a22-a0b3-6bf90a06b215" (UID: "33e195ca-6747-4a22-a0b3-6bf90a06b215"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:14:02.103673 master-0 kubenswrapper[7110]: I0313 01:14:02.102159 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33e195ca-6747-4a22-a0b3-6bf90a06b215-config" (OuterVolumeSpecName: "config") pod "33e195ca-6747-4a22-a0b3-6bf90a06b215" (UID: "33e195ca-6747-4a22-a0b3-6bf90a06b215"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:14:02.103673 master-0 kubenswrapper[7110]: I0313 01:14:02.102276 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33e195ca-6747-4a22-a0b3-6bf90a06b215-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "33e195ca-6747-4a22-a0b3-6bf90a06b215" (UID: "33e195ca-6747-4a22-a0b3-6bf90a06b215"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:14:02.103673 master-0 kubenswrapper[7110]: I0313 01:14:02.102337 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33e195ca-6747-4a22-a0b3-6bf90a06b215-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "33e195ca-6747-4a22-a0b3-6bf90a06b215" (UID: "33e195ca-6747-4a22-a0b3-6bf90a06b215"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:14:02.103673 master-0 kubenswrapper[7110]: I0313 01:14:02.103614 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33e195ca-6747-4a22-a0b3-6bf90a06b215-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "33e195ca-6747-4a22-a0b3-6bf90a06b215" (UID: "33e195ca-6747-4a22-a0b3-6bf90a06b215"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:14:02.107660 master-0 kubenswrapper[7110]: I0313 01:14:02.104049 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33e195ca-6747-4a22-a0b3-6bf90a06b215-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "33e195ca-6747-4a22-a0b3-6bf90a06b215" (UID: "33e195ca-6747-4a22-a0b3-6bf90a06b215"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:14:02.107660 master-0 kubenswrapper[7110]: I0313 01:14:02.104906 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33e195ca-6747-4a22-a0b3-6bf90a06b215-kube-api-access-6dfc8" (OuterVolumeSpecName: "kube-api-access-6dfc8") pod "33e195ca-6747-4a22-a0b3-6bf90a06b215" (UID: "33e195ca-6747-4a22-a0b3-6bf90a06b215"). InnerVolumeSpecName "kube-api-access-6dfc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:14:02.109656 master-0 kubenswrapper[7110]: I0313 01:14:02.109556 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33e195ca-6747-4a22-a0b3-6bf90a06b215-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "33e195ca-6747-4a22-a0b3-6bf90a06b215" (UID: "33e195ca-6747-4a22-a0b3-6bf90a06b215"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:14:02.201752 master-0 kubenswrapper[7110]: I0313 01:14:02.201720 7110 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/33e195ca-6747-4a22-a0b3-6bf90a06b215-etcd-client\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:02.201752 master-0 kubenswrapper[7110]: I0313 01:14:02.201749 7110 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/33e195ca-6747-4a22-a0b3-6bf90a06b215-etcd-serving-ca\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:02.201868 master-0 kubenswrapper[7110]: I0313 01:14:02.201759 7110 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dfc8\" (UniqueName: \"kubernetes.io/projected/33e195ca-6747-4a22-a0b3-6bf90a06b215-kube-api-access-6dfc8\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:02.201868 master-0 kubenswrapper[7110]: I0313 01:14:02.201768 7110 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/33e195ca-6747-4a22-a0b3-6bf90a06b215-image-import-ca\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:02.201868 master-0 kubenswrapper[7110]: I0313 01:14:02.201777 7110 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33e195ca-6747-4a22-a0b3-6bf90a06b215-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:02.201868 master-0 kubenswrapper[7110]: I0313 01:14:02.201785 7110 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33e195ca-6747-4a22-a0b3-6bf90a06b215-config\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:02.201868 master-0 kubenswrapper[7110]: I0313 01:14:02.201794 7110 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/33e195ca-6747-4a22-a0b3-6bf90a06b215-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:02.201868 master-0 kubenswrapper[7110]: I0313 01:14:02.201802 7110 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33e195ca-6747-4a22-a0b3-6bf90a06b215-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:02.201868 master-0 kubenswrapper[7110]: I0313 01:14:02.201811 7110 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/33e195ca-6747-4a22-a0b3-6bf90a06b215-node-pullsecrets\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:02.201868 master-0 kubenswrapper[7110]: I0313 01:14:02.201819 7110 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/33e195ca-6747-4a22-a0b3-6bf90a06b215-encryption-config\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:02.271075 master-0 kubenswrapper[7110]: I0313 01:14:02.271027 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-26mfw" Mar 13 01:14:02.497091 master-0 kubenswrapper[7110]: I0313 01:14:02.495077 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-26mfw"] Mar 13 01:14:02.504376 master-0 kubenswrapper[7110]: I0313 01:14:02.504345 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/2c4c579b-0643-47ac-a729-017c326b0ecc-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-7fc8j\" (UID: \"2c4c579b-0643-47ac-a729-017c326b0ecc\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:14:02.514228 master-0 kubenswrapper[7110]: I0313 01:14:02.514186 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/2c4c579b-0643-47ac-a729-017c326b0ecc-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-7fc8j\" (UID: \"2c4c579b-0643-47ac-a729-017c326b0ecc\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:14:02.534653 master-0 kubenswrapper[7110]: I0313 01:14:02.531325 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:14:02.692014 master-0 kubenswrapper[7110]: I0313 01:14:02.691965 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lw6xm" event={"ID":"d81bcb58-efe3-4577-8e88-67f92c645f6f","Type":"ContainerStarted","Data":"097e6bfa9c001716d816897e7296052e0bc1aaa96a6b992e354d945a460533bc"} Mar 13 01:14:02.692014 master-0 kubenswrapper[7110]: I0313 01:14:02.692008 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lw6xm" event={"ID":"d81bcb58-efe3-4577-8e88-67f92c645f6f","Type":"ContainerStarted","Data":"b19253ead78138ad6ad5306377a9183f46463e03e94ad36454e95491cb2e6272"} Mar 13 01:14:02.694134 master-0 kubenswrapper[7110]: I0313 01:14:02.694105 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-26mfw" event={"ID":"58405741-598c-4bf5-bbc8-1ca8e3f10995","Type":"ContainerStarted","Data":"2ea2257b817f7a593cf8a5bc18fd54c7de892a301e19617876be4cc31d01237b"} Mar 13 01:14:02.698216 master-0 kubenswrapper[7110]: I0313 01:14:02.695699 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-65c58d4d64-6dpp5" event={"ID":"33e195ca-6747-4a22-a0b3-6bf90a06b215","Type":"ContainerDied","Data":"51d73c636079d4a0f1079b7f2d1f70e13558638c5bb5766f080dd17dae1e0fb5"} Mar 13 01:14:02.698216 master-0 kubenswrapper[7110]: I0313 01:14:02.695735 7110 scope.go:117] "RemoveContainer" containerID="25412677b650df77d4f3d7e62344cb76bd86be5f42e8a3faa920cee36fb40124" Mar 13 01:14:02.698216 master-0 kubenswrapper[7110]: I0313 01:14:02.695780 7110 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-65c58d4d64-6dpp5" Mar 13 01:14:02.702469 master-0 kubenswrapper[7110]: I0313 01:14:02.702412 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-lw6xm" podStartSLOduration=1.702403913 podStartE2EDuration="1.702403913s" podCreationTimestamp="2026-03-13 01:14:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:14:02.702031083 +0000 UTC m=+43.987057539" watchObservedRunningTime="2026-03-13 01:14:02.702403913 +0000 UTC m=+43.987430379" Mar 13 01:14:02.722765 master-0 kubenswrapper[7110]: I0313 01:14:02.722707 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j"] Mar 13 01:14:02.755992 master-0 kubenswrapper[7110]: I0313 01:14:02.755902 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-69c74d8d69-jpj8z"] Mar 13 01:14:02.756408 master-0 kubenswrapper[7110]: E0313 01:14:02.756380 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33e195ca-6747-4a22-a0b3-6bf90a06b215" containerName="fix-audit-permissions" Mar 13 01:14:02.756498 master-0 kubenswrapper[7110]: I0313 01:14:02.756473 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="33e195ca-6747-4a22-a0b3-6bf90a06b215" containerName="fix-audit-permissions" Mar 13 01:14:02.756753 master-0 kubenswrapper[7110]: I0313 01:14:02.756728 7110 memory_manager.go:354] "RemoveStaleState removing state" podUID="33e195ca-6747-4a22-a0b3-6bf90a06b215" containerName="fix-audit-permissions" Mar 13 01:14:02.757988 master-0 kubenswrapper[7110]: I0313 01:14:02.757973 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-65c58d4d64-6dpp5"] Mar 13 01:14:02.758169 master-0 kubenswrapper[7110]: I0313 01:14:02.758158 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:14:02.762680 master-0 kubenswrapper[7110]: I0313 01:14:02.762353 7110 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-apiserver/apiserver-65c58d4d64-6dpp5"] Mar 13 01:14:02.763065 master-0 kubenswrapper[7110]: I0313 01:14:02.763017 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 13 01:14:02.763426 master-0 kubenswrapper[7110]: I0313 01:14:02.763177 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 13 01:14:02.763426 master-0 kubenswrapper[7110]: I0313 01:14:02.763265 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 13 01:14:02.763426 master-0 kubenswrapper[7110]: I0313 01:14:02.763309 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 13 01:14:02.763546 master-0 kubenswrapper[7110]: I0313 01:14:02.763507 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 13 01:14:02.763574 master-0 kubenswrapper[7110]: I0313 01:14:02.763552 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 13 01:14:02.763678 master-0 kubenswrapper[7110]: I0313 01:14:02.763662 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 13 01:14:02.764872 master-0 kubenswrapper[7110]: I0313 01:14:02.764753 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 13 01:14:02.764872 master-0 kubenswrapper[7110]: I0313 01:14:02.764780 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 13 01:14:02.774229 master-0 kubenswrapper[7110]: I0313 01:14:02.774185 7110 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-69c74d8d69-jpj8z"] Mar 13 01:14:02.774452 master-0 kubenswrapper[7110]: I0313 01:14:02.774428 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 13 01:14:02.818076 master-0 kubenswrapper[7110]: I0313 01:14:02.816274 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/738ebdcd-b78b-495a-b8f2-84af11a7d35c-encryption-config\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:14:02.818076 master-0 kubenswrapper[7110]: I0313 01:14:02.816310 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl6k6\" (UniqueName: \"kubernetes.io/projected/738ebdcd-b78b-495a-b8f2-84af11a7d35c-kube-api-access-tl6k6\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:14:02.818076 master-0 kubenswrapper[7110]: I0313 01:14:02.816353 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/738ebdcd-b78b-495a-b8f2-84af11a7d35c-image-import-ca\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:14:02.818076 master-0 kubenswrapper[7110]: I0313 01:14:02.816521 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/738ebdcd-b78b-495a-b8f2-84af11a7d35c-etcd-serving-ca\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " 
pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:14:02.818076 master-0 kubenswrapper[7110]: I0313 01:14:02.816606 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/738ebdcd-b78b-495a-b8f2-84af11a7d35c-audit\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:14:02.818076 master-0 kubenswrapper[7110]: I0313 01:14:02.816646 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/738ebdcd-b78b-495a-b8f2-84af11a7d35c-etcd-client\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:14:02.818076 master-0 kubenswrapper[7110]: I0313 01:14:02.816668 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/738ebdcd-b78b-495a-b8f2-84af11a7d35c-audit-dir\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:14:02.818076 master-0 kubenswrapper[7110]: I0313 01:14:02.816701 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/738ebdcd-b78b-495a-b8f2-84af11a7d35c-node-pullsecrets\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:14:02.818076 master-0 kubenswrapper[7110]: I0313 01:14:02.816744 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/738ebdcd-b78b-495a-b8f2-84af11a7d35c-serving-cert\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:14:02.818076 master-0 kubenswrapper[7110]: I0313 01:14:02.816774 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/738ebdcd-b78b-495a-b8f2-84af11a7d35c-trusted-ca-bundle\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:14:02.818076 master-0 kubenswrapper[7110]: I0313 01:14:02.816791 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/738ebdcd-b78b-495a-b8f2-84af11a7d35c-config\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:14:02.913659 master-0 kubenswrapper[7110]: I0313 01:14:02.913614 7110 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33e195ca-6747-4a22-a0b3-6bf90a06b215" path="/var/lib/kubelet/pods/33e195ca-6747-4a22-a0b3-6bf90a06b215/volumes" Mar 13 01:14:02.917510 master-0 kubenswrapper[7110]: I0313 01:14:02.917465 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/738ebdcd-b78b-495a-b8f2-84af11a7d35c-etcd-client\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:14:02.917560 master-0 kubenswrapper[7110]: I0313 01:14:02.917515 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/738ebdcd-b78b-495a-b8f2-84af11a7d35c-audit-dir\") pod 
\"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:14:02.917610 master-0 kubenswrapper[7110]: I0313 01:14:02.917559 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/738ebdcd-b78b-495a-b8f2-84af11a7d35c-node-pullsecrets\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:14:02.917652 master-0 kubenswrapper[7110]: I0313 01:14:02.917605 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/738ebdcd-b78b-495a-b8f2-84af11a7d35c-serving-cert\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:14:02.917682 master-0 kubenswrapper[7110]: I0313 01:14:02.917651 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/738ebdcd-b78b-495a-b8f2-84af11a7d35c-trusted-ca-bundle\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:14:02.917682 master-0 kubenswrapper[7110]: I0313 01:14:02.917676 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/738ebdcd-b78b-495a-b8f2-84af11a7d35c-config\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:14:02.917736 master-0 kubenswrapper[7110]: I0313 01:14:02.917713 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl6k6\" (UniqueName: 
\"kubernetes.io/projected/738ebdcd-b78b-495a-b8f2-84af11a7d35c-kube-api-access-tl6k6\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:14:02.917771 master-0 kubenswrapper[7110]: I0313 01:14:02.917737 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/738ebdcd-b78b-495a-b8f2-84af11a7d35c-encryption-config\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:14:02.917799 master-0 kubenswrapper[7110]: I0313 01:14:02.917768 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/738ebdcd-b78b-495a-b8f2-84af11a7d35c-image-import-ca\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:14:02.917956 master-0 kubenswrapper[7110]: I0313 01:14:02.917858 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/738ebdcd-b78b-495a-b8f2-84af11a7d35c-etcd-serving-ca\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:14:02.917956 master-0 kubenswrapper[7110]: I0313 01:14:02.917928 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/738ebdcd-b78b-495a-b8f2-84af11a7d35c-audit\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:14:02.918749 master-0 kubenswrapper[7110]: I0313 01:14:02.918721 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"audit\" (UniqueName: \"kubernetes.io/configmap/738ebdcd-b78b-495a-b8f2-84af11a7d35c-audit\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:14:02.919813 master-0 kubenswrapper[7110]: I0313 01:14:02.919776 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/738ebdcd-b78b-495a-b8f2-84af11a7d35c-audit-dir\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:14:02.920511 master-0 kubenswrapper[7110]: I0313 01:14:02.920480 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/738ebdcd-b78b-495a-b8f2-84af11a7d35c-config\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:14:02.920558 master-0 kubenswrapper[7110]: I0313 01:14:02.920527 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/738ebdcd-b78b-495a-b8f2-84af11a7d35c-node-pullsecrets\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:14:02.920655 master-0 kubenswrapper[7110]: I0313 01:14:02.920613 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/738ebdcd-b78b-495a-b8f2-84af11a7d35c-trusted-ca-bundle\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:14:02.920755 master-0 kubenswrapper[7110]: I0313 01:14:02.920726 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/738ebdcd-b78b-495a-b8f2-84af11a7d35c-etcd-serving-ca\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:14:02.920857 master-0 kubenswrapper[7110]: I0313 01:14:02.920825 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/738ebdcd-b78b-495a-b8f2-84af11a7d35c-image-import-ca\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:14:02.924181 master-0 kubenswrapper[7110]: I0313 01:14:02.924154 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/738ebdcd-b78b-495a-b8f2-84af11a7d35c-serving-cert\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:14:02.926601 master-0 kubenswrapper[7110]: I0313 01:14:02.926572 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/738ebdcd-b78b-495a-b8f2-84af11a7d35c-encryption-config\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:14:02.926738 master-0 kubenswrapper[7110]: I0313 01:14:02.926713 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/738ebdcd-b78b-495a-b8f2-84af11a7d35c-etcd-client\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:14:02.934677 master-0 kubenswrapper[7110]: I0313 01:14:02.934620 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl6k6\" (UniqueName: 
\"kubernetes.io/projected/738ebdcd-b78b-495a-b8f2-84af11a7d35c-kube-api-access-tl6k6\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:14:03.091316 master-0 kubenswrapper[7110]: I0313 01:14:03.091265 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:14:03.120433 master-0 kubenswrapper[7110]: I0313 01:14:03.120390 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a60370fb-bb70-435c-9c5a-781fa1d63468-client-ca\") pod \"route-controller-manager-77c7f858c6-8khnv\" (UID: \"a60370fb-bb70-435c-9c5a-781fa1d63468\") " pod="openshift-route-controller-manager/route-controller-manager-77c7f858c6-8khnv" Mar 13 01:14:03.120506 master-0 kubenswrapper[7110]: I0313 01:14:03.120459 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a60370fb-bb70-435c-9c5a-781fa1d63468-serving-cert\") pod \"route-controller-manager-77c7f858c6-8khnv\" (UID: \"a60370fb-bb70-435c-9c5a-781fa1d63468\") " pod="openshift-route-controller-manager/route-controller-manager-77c7f858c6-8khnv" Mar 13 01:14:03.120545 master-0 kubenswrapper[7110]: E0313 01:14:03.120527 7110 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 13 01:14:03.120612 master-0 kubenswrapper[7110]: E0313 01:14:03.120589 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a60370fb-bb70-435c-9c5a-781fa1d63468-client-ca podName:a60370fb-bb70-435c-9c5a-781fa1d63468 nodeName:}" failed. No retries permitted until 2026-03-13 01:14:35.120572147 +0000 UTC m=+76.405598613 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/a60370fb-bb70-435c-9c5a-781fa1d63468-client-ca") pod "route-controller-manager-77c7f858c6-8khnv" (UID: "a60370fb-bb70-435c-9c5a-781fa1d63468") : configmap "client-ca" not found Mar 13 01:14:03.123688 master-0 kubenswrapper[7110]: I0313 01:14:03.123661 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a60370fb-bb70-435c-9c5a-781fa1d63468-serving-cert\") pod \"route-controller-manager-77c7f858c6-8khnv\" (UID: \"a60370fb-bb70-435c-9c5a-781fa1d63468\") " pod="openshift-route-controller-manager/route-controller-manager-77c7f858c6-8khnv" Mar 13 01:14:03.354708 master-0 kubenswrapper[7110]: I0313 01:14:03.353391 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 13 01:14:03.354708 master-0 kubenswrapper[7110]: I0313 01:14:03.354269 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 13 01:14:03.358482 master-0 kubenswrapper[7110]: I0313 01:14:03.358447 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 13 01:14:03.366526 master-0 kubenswrapper[7110]: I0313 01:14:03.366425 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 13 01:14:03.425155 master-0 kubenswrapper[7110]: I0313 01:14:03.424247 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1567be70-7890-440e-b5d5-cae3efec8373-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"1567be70-7890-440e-b5d5-cae3efec8373\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 13 01:14:03.425155 master-0 kubenswrapper[7110]: I0313 01:14:03.424435 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1567be70-7890-440e-b5d5-cae3efec8373-kube-api-access\") pod \"installer-1-master-0\" (UID: \"1567be70-7890-440e-b5d5-cae3efec8373\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 13 01:14:03.425155 master-0 kubenswrapper[7110]: I0313 01:14:03.424472 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1567be70-7890-440e-b5d5-cae3efec8373-var-lock\") pod \"installer-1-master-0\" (UID: \"1567be70-7890-440e-b5d5-cae3efec8373\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 13 01:14:03.526870 master-0 kubenswrapper[7110]: I0313 01:14:03.526819 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1567be70-7890-440e-b5d5-cae3efec8373-kube-api-access\") pod \"installer-1-master-0\" (UID: \"1567be70-7890-440e-b5d5-cae3efec8373\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 13 01:14:03.526870 master-0 kubenswrapper[7110]: I0313 01:14:03.526868 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1567be70-7890-440e-b5d5-cae3efec8373-var-lock\") pod \"installer-1-master-0\" (UID: \"1567be70-7890-440e-b5d5-cae3efec8373\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 13 01:14:03.527117 master-0 kubenswrapper[7110]: I0313 01:14:03.526957 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1567be70-7890-440e-b5d5-cae3efec8373-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"1567be70-7890-440e-b5d5-cae3efec8373\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 13 01:14:03.527117 master-0 kubenswrapper[7110]: I0313 01:14:03.527067 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1567be70-7890-440e-b5d5-cae3efec8373-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"1567be70-7890-440e-b5d5-cae3efec8373\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 13 01:14:03.531738 master-0 kubenswrapper[7110]: I0313 01:14:03.527419 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1567be70-7890-440e-b5d5-cae3efec8373-var-lock\") pod \"installer-1-master-0\" (UID: \"1567be70-7890-440e-b5d5-cae3efec8373\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 13 01:14:03.544396 master-0 kubenswrapper[7110]: I0313 01:14:03.544356 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1567be70-7890-440e-b5d5-cae3efec8373-kube-api-access\") pod \"installer-1-master-0\" (UID: \"1567be70-7890-440e-b5d5-cae3efec8373\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 13 01:14:03.687616 master-0 kubenswrapper[7110]: I0313 01:14:03.687583 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 13 01:14:03.722081 master-0 kubenswrapper[7110]: I0313 01:14:03.721997 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" event={"ID":"2c4c579b-0643-47ac-a729-017c326b0ecc","Type":"ContainerStarted","Data":"dafd5daeb5af37ce7ca2009d2230447bde3ffe690f871d4715ff864e7f41bbd6"} Mar 13 01:14:03.722081 master-0 kubenswrapper[7110]: I0313 01:14:03.722032 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" event={"ID":"2c4c579b-0643-47ac-a729-017c326b0ecc","Type":"ContainerStarted","Data":"6f8519eea623420c40da808c6cfff53da6452162ecb364a1c82aa4dfe3545fe2"} Mar 13 01:14:03.758908 master-0 kubenswrapper[7110]: I0313 01:14:03.757538 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-69c74d8d69-jpj8z"] Mar 13 01:14:03.925667 master-0 kubenswrapper[7110]: I0313 01:14:03.924573 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 13 01:14:03.998114 master-0 kubenswrapper[7110]: W0313 01:14:03.998071 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1567be70_7890_440e_b5d5_cae3efec8373.slice/crio-26e2e0ea83f6424ae6714b415bd5b2d67ab4f3b4a0b91e6a186a712935f305c2 WatchSource:0}: Error finding container 26e2e0ea83f6424ae6714b415bd5b2d67ab4f3b4a0b91e6a186a712935f305c2: Status 404 returned error can't find the container with id 
26e2e0ea83f6424ae6714b415bd5b2d67ab4f3b4a0b91e6a186a712935f305c2 Mar 13 01:14:04.732535 master-0 kubenswrapper[7110]: I0313 01:14:04.732188 7110 generic.go:334] "Generic (PLEG): container finished" podID="57eb2020-1560-4352-8b86-76db59de933a" containerID="408fa86e57d7c0ed7566e66a9206de42b73c3a8d5d5b9b39423211e50e66920f" exitCode=0 Mar 13 01:14:04.732801 master-0 kubenswrapper[7110]: I0313 01:14:04.732320 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" event={"ID":"57eb2020-1560-4352-8b86-76db59de933a","Type":"ContainerDied","Data":"408fa86e57d7c0ed7566e66a9206de42b73c3a8d5d5b9b39423211e50e66920f"} Mar 13 01:14:04.739771 master-0 kubenswrapper[7110]: I0313 01:14:04.739706 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" event={"ID":"2c4c579b-0643-47ac-a729-017c326b0ecc","Type":"ContainerStarted","Data":"123227ec95c803688ea3b73355951521f6c2e9af5b64fd2cfda9372aa3bb75d9"} Mar 13 01:14:04.739885 master-0 kubenswrapper[7110]: I0313 01:14:04.739852 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:14:04.743652 master-0 kubenswrapper[7110]: I0313 01:14:04.742796 7110 generic.go:334] "Generic (PLEG): container finished" podID="738ebdcd-b78b-495a-b8f2-84af11a7d35c" containerID="ec2e5e0e9f2f0d0bb48be3bbf455c597567e5ab58c590a1a48ffa8bb7da7c8c1" exitCode=0 Mar 13 01:14:04.743652 master-0 kubenswrapper[7110]: I0313 01:14:04.742860 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" event={"ID":"738ebdcd-b78b-495a-b8f2-84af11a7d35c","Type":"ContainerDied","Data":"ec2e5e0e9f2f0d0bb48be3bbf455c597567e5ab58c590a1a48ffa8bb7da7c8c1"} Mar 13 01:14:04.743652 master-0 kubenswrapper[7110]: I0313 01:14:04.742909 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" event={"ID":"738ebdcd-b78b-495a-b8f2-84af11a7d35c","Type":"ContainerStarted","Data":"b018527cc19e60b658984a3b2cf8d02fa83e221b23e0763c86d4b53c72e80c7e"} Mar 13 01:14:04.747118 master-0 kubenswrapper[7110]: I0313 01:14:04.747044 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"1567be70-7890-440e-b5d5-cae3efec8373","Type":"ContainerStarted","Data":"dd7ffbe9a1f83cc2aab5aa6ed625a109821b43ecbb8c570e367b00a33fc6e7b1"} Mar 13 01:14:04.747243 master-0 kubenswrapper[7110]: I0313 01:14:04.747118 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"1567be70-7890-440e-b5d5-cae3efec8373","Type":"ContainerStarted","Data":"26e2e0ea83f6424ae6714b415bd5b2d67ab4f3b4a0b91e6a186a712935f305c2"} Mar 13 01:14:04.795016 master-0 kubenswrapper[7110]: I0313 01:14:04.794184 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-1-master-0" podStartSLOduration=1.794155428 podStartE2EDuration="1.794155428s" podCreationTimestamp="2026-03-13 01:14:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:14:04.793839349 +0000 UTC m=+46.078865845" watchObservedRunningTime="2026-03-13 01:14:04.794155428 +0000 UTC m=+46.079181934" Mar 13 01:14:04.798380 master-0 kubenswrapper[7110]: I0313 01:14:04.798286 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" podStartSLOduration=10.798255513 podStartE2EDuration="10.798255513s" podCreationTimestamp="2026-03-13 01:13:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:14:04.774431366 +0000 UTC 
m=+46.059457872" watchObservedRunningTime="2026-03-13 01:14:04.798255513 +0000 UTC m=+46.083282019" Mar 13 01:14:05.667278 master-0 kubenswrapper[7110]: I0313 01:14:05.667225 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 13 01:14:05.668221 master-0 kubenswrapper[7110]: I0313 01:14:05.667457 7110 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-2-master-0" podUID="3a529580-5405-4c01-9121-f32104edf52a" containerName="installer" containerID="cri-o://e3035386c09a720814e9d83127713eb5261d4f71235d7fec001cc94b709e5608" gracePeriod=30 Mar 13 01:14:06.103161 master-0 kubenswrapper[7110]: I0313 01:14:06.103127 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_3a529580-5405-4c01-9121-f32104edf52a/installer/0.log" Mar 13 01:14:06.103295 master-0 kubenswrapper[7110]: I0313 01:14:06.103202 7110 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Mar 13 01:14:06.191830 master-0 kubenswrapper[7110]: I0313 01:14:06.191752 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3a529580-5405-4c01-9121-f32104edf52a-kube-api-access\") pod \"3a529580-5405-4c01-9121-f32104edf52a\" (UID: \"3a529580-5405-4c01-9121-f32104edf52a\") " Mar 13 01:14:06.191830 master-0 kubenswrapper[7110]: I0313 01:14:06.191806 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3a529580-5405-4c01-9121-f32104edf52a-var-lock\") pod \"3a529580-5405-4c01-9121-f32104edf52a\" (UID: \"3a529580-5405-4c01-9121-f32104edf52a\") " Mar 13 01:14:06.191830 master-0 kubenswrapper[7110]: I0313 01:14:06.191821 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3a529580-5405-4c01-9121-f32104edf52a-kubelet-dir\") pod \"3a529580-5405-4c01-9121-f32104edf52a\" (UID: \"3a529580-5405-4c01-9121-f32104edf52a\") " Mar 13 01:14:06.192178 master-0 kubenswrapper[7110]: I0313 01:14:06.192151 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a529580-5405-4c01-9121-f32104edf52a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3a529580-5405-4c01-9121-f32104edf52a" (UID: "3a529580-5405-4c01-9121-f32104edf52a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:14:06.192219 master-0 kubenswrapper[7110]: I0313 01:14:06.192185 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a529580-5405-4c01-9121-f32104edf52a-var-lock" (OuterVolumeSpecName: "var-lock") pod "3a529580-5405-4c01-9121-f32104edf52a" (UID: "3a529580-5405-4c01-9121-f32104edf52a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:14:06.197617 master-0 kubenswrapper[7110]: I0313 01:14:06.197573 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a529580-5405-4c01-9121-f32104edf52a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3a529580-5405-4c01-9121-f32104edf52a" (UID: "3a529580-5405-4c01-9121-f32104edf52a"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:14:06.293658 master-0 kubenswrapper[7110]: I0313 01:14:06.293603 7110 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3a529580-5405-4c01-9121-f32104edf52a-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:06.293658 master-0 kubenswrapper[7110]: I0313 01:14:06.293659 7110 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3a529580-5405-4c01-9121-f32104edf52a-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:06.293658 master-0 kubenswrapper[7110]: I0313 01:14:06.293671 7110 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3a529580-5405-4c01-9121-f32104edf52a-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:06.597797 master-0 kubenswrapper[7110]: I0313 01:14:06.597651 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2bfe96f2-43a9-47cf-9020-918438eb1ae0-client-ca\") pod \"controller-manager-748d7f7c46-r6nmm\" (UID: \"2bfe96f2-43a9-47cf-9020-918438eb1ae0\") " pod="openshift-controller-manager/controller-manager-748d7f7c46-r6nmm" Mar 13 01:14:06.597797 master-0 kubenswrapper[7110]: E0313 01:14:06.597798 7110 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 13 01:14:06.598058 
master-0 kubenswrapper[7110]: E0313 01:14:06.597852 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2bfe96f2-43a9-47cf-9020-918438eb1ae0-client-ca podName:2bfe96f2-43a9-47cf-9020-918438eb1ae0 nodeName:}" failed. No retries permitted until 2026-03-13 01:14:22.597837929 +0000 UTC m=+63.882864395 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/2bfe96f2-43a9-47cf-9020-918438eb1ae0-client-ca") pod "controller-manager-748d7f7c46-r6nmm" (UID: "2bfe96f2-43a9-47cf-9020-918438eb1ae0") : configmap "client-ca" not found Mar 13 01:14:06.628441 master-0 kubenswrapper[7110]: I0313 01:14:06.628384 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" Mar 13 01:14:06.759415 master-0 kubenswrapper[7110]: I0313 01:14:06.759345 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-26mfw" event={"ID":"58405741-598c-4bf5-bbc8-1ca8e3f10995","Type":"ContainerStarted","Data":"9633e444d804baebdb37261934de23f2bd534d4b2872dabf85f686f775c2846b"} Mar 13 01:14:06.759415 master-0 kubenswrapper[7110]: I0313 01:14:06.759401 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-26mfw" event={"ID":"58405741-598c-4bf5-bbc8-1ca8e3f10995","Type":"ContainerStarted","Data":"a2974c6f36c1f49a4f8394ff7b23640dbad58229e8102c2955715ea80ccef7a7"} Mar 13 01:14:06.759977 master-0 kubenswrapper[7110]: I0313 01:14:06.759805 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-26mfw" Mar 13 01:14:06.761326 master-0 kubenswrapper[7110]: I0313 01:14:06.761268 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" 
event={"ID":"57eb2020-1560-4352-8b86-76db59de933a","Type":"ContainerStarted","Data":"caed5b6f3c3e0f496672dafebe4cf87db5086e2ed7b7df39114a4e1a8f3fa33f"} Mar 13 01:14:06.763383 master-0 kubenswrapper[7110]: I0313 01:14:06.763351 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_3a529580-5405-4c01-9121-f32104edf52a/installer/0.log" Mar 13 01:14:06.763458 master-0 kubenswrapper[7110]: I0313 01:14:06.763394 7110 generic.go:334] "Generic (PLEG): container finished" podID="3a529580-5405-4c01-9121-f32104edf52a" containerID="e3035386c09a720814e9d83127713eb5261d4f71235d7fec001cc94b709e5608" exitCode=1 Mar 13 01:14:06.763458 master-0 kubenswrapper[7110]: I0313 01:14:06.763446 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"3a529580-5405-4c01-9121-f32104edf52a","Type":"ContainerDied","Data":"e3035386c09a720814e9d83127713eb5261d4f71235d7fec001cc94b709e5608"} Mar 13 01:14:06.763535 master-0 kubenswrapper[7110]: I0313 01:14:06.763470 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"3a529580-5405-4c01-9121-f32104edf52a","Type":"ContainerDied","Data":"0a716477dd56760c3f4676954f02d2bfd3f2125e5a33a03b10ce95d0e8de3b73"} Mar 13 01:14:06.763535 master-0 kubenswrapper[7110]: I0313 01:14:06.763489 7110 scope.go:117] "RemoveContainer" containerID="e3035386c09a720814e9d83127713eb5261d4f71235d7fec001cc94b709e5608" Mar 13 01:14:06.763617 master-0 kubenswrapper[7110]: I0313 01:14:06.763576 7110 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Mar 13 01:14:06.781300 master-0 kubenswrapper[7110]: I0313 01:14:06.781249 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" event={"ID":"738ebdcd-b78b-495a-b8f2-84af11a7d35c","Type":"ContainerStarted","Data":"bd55753a9d9c25937c1f18505f16f833ab2a86de0b3ecfe2d2a1fb87bd966bf3"} Mar 13 01:14:06.781300 master-0 kubenswrapper[7110]: I0313 01:14:06.781294 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" event={"ID":"738ebdcd-b78b-495a-b8f2-84af11a7d35c","Type":"ContainerStarted","Data":"f9738749750d4de58ccf5e1bc39bada13e04d80581cbfe197867d27ee7a8ad9f"} Mar 13 01:14:06.784314 master-0 kubenswrapper[7110]: I0313 01:14:06.784162 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-26mfw" podStartSLOduration=2.543324759 podStartE2EDuration="5.784145003s" podCreationTimestamp="2026-03-13 01:14:01 +0000 UTC" firstStartedPulling="2026-03-13 01:14:02.50792643 +0000 UTC m=+43.792952896" lastFinishedPulling="2026-03-13 01:14:05.748746674 +0000 UTC m=+47.033773140" observedRunningTime="2026-03-13 01:14:06.782401504 +0000 UTC m=+48.067428010" watchObservedRunningTime="2026-03-13 01:14:06.784145003 +0000 UTC m=+48.069171479" Mar 13 01:14:06.797853 master-0 kubenswrapper[7110]: I0313 01:14:06.796945 7110 scope.go:117] "RemoveContainer" containerID="e3035386c09a720814e9d83127713eb5261d4f71235d7fec001cc94b709e5608" Mar 13 01:14:06.797853 master-0 kubenswrapper[7110]: E0313 01:14:06.797496 7110 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3035386c09a720814e9d83127713eb5261d4f71235d7fec001cc94b709e5608\": container with ID starting with e3035386c09a720814e9d83127713eb5261d4f71235d7fec001cc94b709e5608 not found: ID does not exist" 
containerID="e3035386c09a720814e9d83127713eb5261d4f71235d7fec001cc94b709e5608" Mar 13 01:14:06.797853 master-0 kubenswrapper[7110]: I0313 01:14:06.797541 7110 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3035386c09a720814e9d83127713eb5261d4f71235d7fec001cc94b709e5608"} err="failed to get container status \"e3035386c09a720814e9d83127713eb5261d4f71235d7fec001cc94b709e5608\": rpc error: code = NotFound desc = could not find container \"e3035386c09a720814e9d83127713eb5261d4f71235d7fec001cc94b709e5608\": container with ID starting with e3035386c09a720814e9d83127713eb5261d4f71235d7fec001cc94b709e5608 not found: ID does not exist" Mar 13 01:14:06.810666 master-0 kubenswrapper[7110]: I0313 01:14:06.807543 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" podStartSLOduration=18.807523367 podStartE2EDuration="18.807523367s" podCreationTimestamp="2026-03-13 01:13:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:14:06.805976444 +0000 UTC m=+48.091002970" watchObservedRunningTime="2026-03-13 01:14:06.807523367 +0000 UTC m=+48.092549833" Mar 13 01:14:06.825334 master-0 kubenswrapper[7110]: I0313 01:14:06.825269 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 13 01:14:06.836524 master-0 kubenswrapper[7110]: I0313 01:14:06.836392 7110 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 13 01:14:06.864154 master-0 kubenswrapper[7110]: I0313 01:14:06.863195 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" podStartSLOduration=4.806245667 podStartE2EDuration="7.863178405s" podCreationTimestamp="2026-03-13 01:13:59 +0000 UTC" firstStartedPulling="2026-03-13 
01:14:00.56636049 +0000 UTC m=+41.851386956" lastFinishedPulling="2026-03-13 01:14:03.623293228 +0000 UTC m=+44.908319694" observedRunningTime="2026-03-13 01:14:06.858993918 +0000 UTC m=+48.144020384" watchObservedRunningTime="2026-03-13 01:14:06.863178405 +0000 UTC m=+48.148204871" Mar 13 01:14:06.882745 master-0 kubenswrapper[7110]: I0313 01:14:06.881783 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-748d7f7c46-r6nmm"] Mar 13 01:14:06.882745 master-0 kubenswrapper[7110]: E0313 01:14:06.882200 7110 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-748d7f7c46-r6nmm" podUID="2bfe96f2-43a9-47cf-9020-918438eb1ae0" Mar 13 01:14:06.905423 master-0 kubenswrapper[7110]: I0313 01:14:06.904050 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77c7f858c6-8khnv"] Mar 13 01:14:06.905423 master-0 kubenswrapper[7110]: E0313 01:14:06.904443 7110 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-77c7f858c6-8khnv" podUID="a60370fb-bb70-435c-9c5a-781fa1d63468" Mar 13 01:14:06.916811 master-0 kubenswrapper[7110]: I0313 01:14:06.916767 7110 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a529580-5405-4c01-9121-f32104edf52a" path="/var/lib/kubelet/pods/3a529580-5405-4c01-9121-f32104edf52a/volumes" Mar 13 01:14:07.787776 master-0 kubenswrapper[7110]: I0313 01:14:07.787735 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77c7f858c6-8khnv" Mar 13 01:14:07.788759 master-0 kubenswrapper[7110]: I0313 01:14:07.788728 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-748d7f7c46-r6nmm" Mar 13 01:14:07.795846 master-0 kubenswrapper[7110]: I0313 01:14:07.795804 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77c7f858c6-8khnv" Mar 13 01:14:07.801395 master-0 kubenswrapper[7110]: I0313 01:14:07.801357 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-748d7f7c46-r6nmm" Mar 13 01:14:07.916322 master-0 kubenswrapper[7110]: I0313 01:14:07.916271 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a60370fb-bb70-435c-9c5a-781fa1d63468-serving-cert\") pod \"a60370fb-bb70-435c-9c5a-781fa1d63468\" (UID: \"a60370fb-bb70-435c-9c5a-781fa1d63468\") " Mar 13 01:14:07.916708 master-0 kubenswrapper[7110]: I0313 01:14:07.916679 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a60370fb-bb70-435c-9c5a-781fa1d63468-config\") pod \"a60370fb-bb70-435c-9c5a-781fa1d63468\" (UID: \"a60370fb-bb70-435c-9c5a-781fa1d63468\") " Mar 13 01:14:07.916871 master-0 kubenswrapper[7110]: I0313 01:14:07.916849 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bfe96f2-43a9-47cf-9020-918438eb1ae0-config\") pod \"2bfe96f2-43a9-47cf-9020-918438eb1ae0\" (UID: \"2bfe96f2-43a9-47cf-9020-918438eb1ae0\") " Mar 13 01:14:07.917044 master-0 kubenswrapper[7110]: I0313 01:14:07.917016 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-qv5fz\" (UniqueName: \"kubernetes.io/projected/a60370fb-bb70-435c-9c5a-781fa1d63468-kube-api-access-qv5fz\") pod \"a60370fb-bb70-435c-9c5a-781fa1d63468\" (UID: \"a60370fb-bb70-435c-9c5a-781fa1d63468\") " Mar 13 01:14:07.917212 master-0 kubenswrapper[7110]: I0313 01:14:07.917188 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2bfe96f2-43a9-47cf-9020-918438eb1ae0-serving-cert\") pod \"2bfe96f2-43a9-47cf-9020-918438eb1ae0\" (UID: \"2bfe96f2-43a9-47cf-9020-918438eb1ae0\") " Mar 13 01:14:07.917547 master-0 kubenswrapper[7110]: I0313 01:14:07.917523 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2bfe96f2-43a9-47cf-9020-918438eb1ae0-proxy-ca-bundles\") pod \"2bfe96f2-43a9-47cf-9020-918438eb1ae0\" (UID: \"2bfe96f2-43a9-47cf-9020-918438eb1ae0\") " Mar 13 01:14:07.917718 master-0 kubenswrapper[7110]: I0313 01:14:07.917694 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c7mk\" (UniqueName: \"kubernetes.io/projected/2bfe96f2-43a9-47cf-9020-918438eb1ae0-kube-api-access-7c7mk\") pod \"2bfe96f2-43a9-47cf-9020-918438eb1ae0\" (UID: \"2bfe96f2-43a9-47cf-9020-918438eb1ae0\") " Mar 13 01:14:07.917896 master-0 kubenswrapper[7110]: I0313 01:14:07.917707 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a60370fb-bb70-435c-9c5a-781fa1d63468-config" (OuterVolumeSpecName: "config") pod "a60370fb-bb70-435c-9c5a-781fa1d63468" (UID: "a60370fb-bb70-435c-9c5a-781fa1d63468"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:14:07.918035 master-0 kubenswrapper[7110]: I0313 01:14:07.917815 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bfe96f2-43a9-47cf-9020-918438eb1ae0-config" (OuterVolumeSpecName: "config") pod "2bfe96f2-43a9-47cf-9020-918438eb1ae0" (UID: "2bfe96f2-43a9-47cf-9020-918438eb1ae0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:14:07.918357 master-0 kubenswrapper[7110]: I0313 01:14:07.918305 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bfe96f2-43a9-47cf-9020-918438eb1ae0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2bfe96f2-43a9-47cf-9020-918438eb1ae0" (UID: "2bfe96f2-43a9-47cf-9020-918438eb1ae0"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:14:07.919396 master-0 kubenswrapper[7110]: I0313 01:14:07.919374 7110 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a60370fb-bb70-435c-9c5a-781fa1d63468-config\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:07.919597 master-0 kubenswrapper[7110]: I0313 01:14:07.919581 7110 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bfe96f2-43a9-47cf-9020-918438eb1ae0-config\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:07.919720 master-0 kubenswrapper[7110]: I0313 01:14:07.919701 7110 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2bfe96f2-43a9-47cf-9020-918438eb1ae0-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:07.920747 master-0 kubenswrapper[7110]: I0313 01:14:07.920709 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bfe96f2-43a9-47cf-9020-918438eb1ae0-serving-cert" 
(OuterVolumeSpecName: "serving-cert") pod "2bfe96f2-43a9-47cf-9020-918438eb1ae0" (UID: "2bfe96f2-43a9-47cf-9020-918438eb1ae0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:14:07.922063 master-0 kubenswrapper[7110]: I0313 01:14:07.921994 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a60370fb-bb70-435c-9c5a-781fa1d63468-kube-api-access-qv5fz" (OuterVolumeSpecName: "kube-api-access-qv5fz") pod "a60370fb-bb70-435c-9c5a-781fa1d63468" (UID: "a60370fb-bb70-435c-9c5a-781fa1d63468"). InnerVolumeSpecName "kube-api-access-qv5fz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:14:07.922063 master-0 kubenswrapper[7110]: I0313 01:14:07.922009 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bfe96f2-43a9-47cf-9020-918438eb1ae0-kube-api-access-7c7mk" (OuterVolumeSpecName: "kube-api-access-7c7mk") pod "2bfe96f2-43a9-47cf-9020-918438eb1ae0" (UID: "2bfe96f2-43a9-47cf-9020-918438eb1ae0"). InnerVolumeSpecName "kube-api-access-7c7mk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:14:07.923141 master-0 kubenswrapper[7110]: I0313 01:14:07.923097 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a60370fb-bb70-435c-9c5a-781fa1d63468-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a60370fb-bb70-435c-9c5a-781fa1d63468" (UID: "a60370fb-bb70-435c-9c5a-781fa1d63468"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:14:08.020647 master-0 kubenswrapper[7110]: I0313 01:14:08.020585 7110 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a60370fb-bb70-435c-9c5a-781fa1d63468-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:08.020803 master-0 kubenswrapper[7110]: I0313 01:14:08.020682 7110 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qv5fz\" (UniqueName: \"kubernetes.io/projected/a60370fb-bb70-435c-9c5a-781fa1d63468-kube-api-access-qv5fz\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:08.020803 master-0 kubenswrapper[7110]: I0313 01:14:08.020704 7110 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2bfe96f2-43a9-47cf-9020-918438eb1ae0-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:08.020803 master-0 kubenswrapper[7110]: I0313 01:14:08.020724 7110 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c7mk\" (UniqueName: \"kubernetes.io/projected/2bfe96f2-43a9-47cf-9020-918438eb1ae0-kube-api-access-7c7mk\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:08.086580 master-0 kubenswrapper[7110]: I0313 01:14:08.086439 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-1-master-0"] Mar 13 01:14:08.086751 master-0 kubenswrapper[7110]: E0313 01:14:08.086735 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a529580-5405-4c01-9121-f32104edf52a" containerName="installer" Mar 13 01:14:08.086783 master-0 kubenswrapper[7110]: I0313 01:14:08.086754 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a529580-5405-4c01-9121-f32104edf52a" containerName="installer" Mar 13 01:14:08.086929 master-0 kubenswrapper[7110]: I0313 01:14:08.086900 7110 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a529580-5405-4c01-9121-f32104edf52a" containerName="installer" Mar 
13 01:14:08.087374 master-0 kubenswrapper[7110]: I0313 01:14:08.087342 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 13 01:14:08.090306 master-0 kubenswrapper[7110]: I0313 01:14:08.090232 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"kube-root-ca.crt" Mar 13 01:14:08.094810 master-0 kubenswrapper[7110]: I0313 01:14:08.094751 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:14:08.094951 master-0 kubenswrapper[7110]: I0313 01:14:08.094831 7110 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:14:08.099439 master-0 kubenswrapper[7110]: I0313 01:14:08.099376 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-1-master-0"] Mar 13 01:14:08.109172 master-0 kubenswrapper[7110]: I0313 01:14:08.109108 7110 patch_prober.go:28] interesting pod/apiserver-69c74d8d69-jpj8z container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 13 01:14:08.109172 master-0 kubenswrapper[7110]: [+]log ok Mar 13 01:14:08.109172 master-0 kubenswrapper[7110]: [+]etcd ok Mar 13 01:14:08.109172 master-0 kubenswrapper[7110]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 13 01:14:08.109172 master-0 kubenswrapper[7110]: [+]poststarthook/generic-apiserver-start-informers ok Mar 13 01:14:08.109172 master-0 kubenswrapper[7110]: [+]poststarthook/max-in-flight-filter ok Mar 13 01:14:08.109172 master-0 kubenswrapper[7110]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 13 01:14:08.109172 master-0 kubenswrapper[7110]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 13 01:14:08.109172 master-0 kubenswrapper[7110]: 
[-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 13 01:14:08.109172 master-0 kubenswrapper[7110]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 13 01:14:08.109172 master-0 kubenswrapper[7110]: [+]poststarthook/project.openshift.io-projectcache ok Mar 13 01:14:08.109172 master-0 kubenswrapper[7110]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 13 01:14:08.109172 master-0 kubenswrapper[7110]: [+]poststarthook/openshift.io-startinformers ok Mar 13 01:14:08.109172 master-0 kubenswrapper[7110]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 13 01:14:08.109172 master-0 kubenswrapper[7110]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 13 01:14:08.109172 master-0 kubenswrapper[7110]: livez check failed Mar 13 01:14:08.109838 master-0 kubenswrapper[7110]: I0313 01:14:08.109194 7110 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" podUID="738ebdcd-b78b-495a-b8f2-84af11a7d35c" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 01:14:08.223466 master-0 kubenswrapper[7110]: I0313 01:14:08.223403 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36b2d6ee-3ae7-444b-b327-f024a8a06ab7-kube-api-access\") pod \"installer-1-master-0\" (UID: \"36b2d6ee-3ae7-444b-b327-f024a8a06ab7\") " pod="openshift-etcd/installer-1-master-0" Mar 13 01:14:08.223729 master-0 kubenswrapper[7110]: I0313 01:14:08.223479 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/36b2d6ee-3ae7-444b-b327-f024a8a06ab7-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"36b2d6ee-3ae7-444b-b327-f024a8a06ab7\") " pod="openshift-etcd/installer-1-master-0" 
Mar 13 01:14:08.223856 master-0 kubenswrapper[7110]: I0313 01:14:08.223792 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/36b2d6ee-3ae7-444b-b327-f024a8a06ab7-var-lock\") pod \"installer-1-master-0\" (UID: \"36b2d6ee-3ae7-444b-b327-f024a8a06ab7\") " pod="openshift-etcd/installer-1-master-0" Mar 13 01:14:08.260720 master-0 kubenswrapper[7110]: I0313 01:14:08.260670 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Mar 13 01:14:08.261497 master-0 kubenswrapper[7110]: I0313 01:14:08.261471 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Mar 13 01:14:08.273250 master-0 kubenswrapper[7110]: I0313 01:14:08.273184 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Mar 13 01:14:08.324779 master-0 kubenswrapper[7110]: I0313 01:14:08.324737 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/36b2d6ee-3ae7-444b-b327-f024a8a06ab7-var-lock\") pod \"installer-1-master-0\" (UID: \"36b2d6ee-3ae7-444b-b327-f024a8a06ab7\") " pod="openshift-etcd/installer-1-master-0" Mar 13 01:14:08.324878 master-0 kubenswrapper[7110]: I0313 01:14:08.324809 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22711394-3d1a-4e8d-931b-5c3b23d89519-kube-api-access\") pod \"installer-3-master-0\" (UID: \"22711394-3d1a-4e8d-931b-5c3b23d89519\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 13 01:14:08.324944 master-0 kubenswrapper[7110]: I0313 01:14:08.324917 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/36b2d6ee-3ae7-444b-b327-f024a8a06ab7-kube-api-access\") pod \"installer-1-master-0\" (UID: \"36b2d6ee-3ae7-444b-b327-f024a8a06ab7\") " pod="openshift-etcd/installer-1-master-0" Mar 13 01:14:08.325083 master-0 kubenswrapper[7110]: I0313 01:14:08.325055 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/36b2d6ee-3ae7-444b-b327-f024a8a06ab7-var-lock\") pod \"installer-1-master-0\" (UID: \"36b2d6ee-3ae7-444b-b327-f024a8a06ab7\") " pod="openshift-etcd/installer-1-master-0" Mar 13 01:14:08.325190 master-0 kubenswrapper[7110]: I0313 01:14:08.325176 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/36b2d6ee-3ae7-444b-b327-f024a8a06ab7-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"36b2d6ee-3ae7-444b-b327-f024a8a06ab7\") " pod="openshift-etcd/installer-1-master-0" Mar 13 01:14:08.325353 master-0 kubenswrapper[7110]: I0313 01:14:08.325314 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/22711394-3d1a-4e8d-931b-5c3b23d89519-var-lock\") pod \"installer-3-master-0\" (UID: \"22711394-3d1a-4e8d-931b-5c3b23d89519\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 13 01:14:08.325495 master-0 kubenswrapper[7110]: I0313 01:14:08.325479 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/22711394-3d1a-4e8d-931b-5c3b23d89519-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"22711394-3d1a-4e8d-931b-5c3b23d89519\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 13 01:14:08.325761 master-0 kubenswrapper[7110]: I0313 01:14:08.325713 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/36b2d6ee-3ae7-444b-b327-f024a8a06ab7-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"36b2d6ee-3ae7-444b-b327-f024a8a06ab7\") " pod="openshift-etcd/installer-1-master-0" Mar 13 01:14:08.348691 master-0 kubenswrapper[7110]: I0313 01:14:08.348560 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36b2d6ee-3ae7-444b-b327-f024a8a06ab7-kube-api-access\") pod \"installer-1-master-0\" (UID: \"36b2d6ee-3ae7-444b-b327-f024a8a06ab7\") " pod="openshift-etcd/installer-1-master-0" Mar 13 01:14:08.427333 master-0 kubenswrapper[7110]: I0313 01:14:08.427211 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/22711394-3d1a-4e8d-931b-5c3b23d89519-var-lock\") pod \"installer-3-master-0\" (UID: \"22711394-3d1a-4e8d-931b-5c3b23d89519\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 13 01:14:08.427547 master-0 kubenswrapper[7110]: I0313 01:14:08.427423 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/22711394-3d1a-4e8d-931b-5c3b23d89519-var-lock\") pod \"installer-3-master-0\" (UID: \"22711394-3d1a-4e8d-931b-5c3b23d89519\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 13 01:14:08.427547 master-0 kubenswrapper[7110]: I0313 01:14:08.427531 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/22711394-3d1a-4e8d-931b-5c3b23d89519-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"22711394-3d1a-4e8d-931b-5c3b23d89519\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 13 01:14:08.427661 master-0 kubenswrapper[7110]: I0313 01:14:08.427617 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/22711394-3d1a-4e8d-931b-5c3b23d89519-kube-api-access\") pod \"installer-3-master-0\" (UID: \"22711394-3d1a-4e8d-931b-5c3b23d89519\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 13 01:14:08.429989 master-0 kubenswrapper[7110]: I0313 01:14:08.427880 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/22711394-3d1a-4e8d-931b-5c3b23d89519-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"22711394-3d1a-4e8d-931b-5c3b23d89519\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 13 01:14:08.433760 master-0 kubenswrapper[7110]: I0313 01:14:08.430171 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 13 01:14:08.469150 master-0 kubenswrapper[7110]: I0313 01:14:08.469085 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22711394-3d1a-4e8d-931b-5c3b23d89519-kube-api-access\") pod \"installer-3-master-0\" (UID: \"22711394-3d1a-4e8d-931b-5c3b23d89519\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 13 01:14:08.650542 master-0 kubenswrapper[7110]: I0313 01:14:08.647037 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Mar 13 01:14:08.752503 master-0 kubenswrapper[7110]: I0313 01:14:08.752450 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-1-master-0"] Mar 13 01:14:08.759803 master-0 kubenswrapper[7110]: W0313 01:14:08.759738 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod36b2d6ee_3ae7_444b_b327_f024a8a06ab7.slice/crio-ecaa68379bdd11e9423cf73d3d2c095307d5c8cb6abca3a8566ead3a1f24f111 WatchSource:0}: Error finding container ecaa68379bdd11e9423cf73d3d2c095307d5c8cb6abca3a8566ead3a1f24f111: Status 404 returned error can't find the container with id ecaa68379bdd11e9423cf73d3d2c095307d5c8cb6abca3a8566ead3a1f24f111 Mar 13 01:14:08.793688 master-0 kubenswrapper[7110]: I0313 01:14:08.793575 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"36b2d6ee-3ae7-444b-b327-f024a8a06ab7","Type":"ContainerStarted","Data":"ecaa68379bdd11e9423cf73d3d2c095307d5c8cb6abca3a8566ead3a1f24f111"} Mar 13 01:14:08.793688 master-0 kubenswrapper[7110]: I0313 01:14:08.793598 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77c7f858c6-8khnv" Mar 13 01:14:08.794036 master-0 kubenswrapper[7110]: I0313 01:14:08.793605 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-748d7f7c46-r6nmm" Mar 13 01:14:08.846768 master-0 kubenswrapper[7110]: I0313 01:14:08.846723 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6cc877748f-cvjwm"] Mar 13 01:14:08.847617 master-0 kubenswrapper[7110]: I0313 01:14:08.847336 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-748d7f7c46-r6nmm"] Mar 13 01:14:08.847617 master-0 kubenswrapper[7110]: I0313 01:14:08.847425 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6cc877748f-cvjwm" Mar 13 01:14:08.860056 master-0 kubenswrapper[7110]: I0313 01:14:08.859313 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 01:14:08.860056 master-0 kubenswrapper[7110]: I0313 01:14:08.859595 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 01:14:08.860056 master-0 kubenswrapper[7110]: I0313 01:14:08.859754 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 01:14:08.860056 master-0 kubenswrapper[7110]: I0313 01:14:08.859916 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 01:14:08.860056 master-0 kubenswrapper[7110]: I0313 01:14:08.859929 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 01:14:08.863298 master-0 kubenswrapper[7110]: I0313 01:14:08.863250 7110 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-748d7f7c46-r6nmm"] Mar 13 01:14:08.870701 master-0 kubenswrapper[7110]: I0313 01:14:08.869619 7110 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-controller-manager/controller-manager-6cc877748f-cvjwm"] Mar 13 01:14:08.875306 master-0 kubenswrapper[7110]: I0313 01:14:08.875251 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 01:14:08.925372 master-0 kubenswrapper[7110]: I0313 01:14:08.924541 7110 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bfe96f2-43a9-47cf-9020-918438eb1ae0" path="/var/lib/kubelet/pods/2bfe96f2-43a9-47cf-9020-918438eb1ae0/volumes" Mar 13 01:14:08.925934 master-0 kubenswrapper[7110]: I0313 01:14:08.925781 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77c7f858c6-8khnv"] Mar 13 01:14:08.925934 master-0 kubenswrapper[7110]: I0313 01:14:08.925817 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Mar 13 01:14:08.925934 master-0 kubenswrapper[7110]: I0313 01:14:08.925831 7110 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77c7f858c6-8khnv"] Mar 13 01:14:08.933931 master-0 kubenswrapper[7110]: I0313 01:14:08.933890 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dd9c693-1890-427b-a011-7a3cc5f04bb5-config\") pod \"controller-manager-6cc877748f-cvjwm\" (UID: \"2dd9c693-1890-427b-a011-7a3cc5f04bb5\") " pod="openshift-controller-manager/controller-manager-6cc877748f-cvjwm" Mar 13 01:14:08.934172 master-0 kubenswrapper[7110]: I0313 01:14:08.933947 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqrcn\" (UniqueName: \"kubernetes.io/projected/2dd9c693-1890-427b-a011-7a3cc5f04bb5-kube-api-access-sqrcn\") pod \"controller-manager-6cc877748f-cvjwm\" (UID: \"2dd9c693-1890-427b-a011-7a3cc5f04bb5\") " 
pod="openshift-controller-manager/controller-manager-6cc877748f-cvjwm" Mar 13 01:14:08.935068 master-0 kubenswrapper[7110]: I0313 01:14:08.934248 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2dd9c693-1890-427b-a011-7a3cc5f04bb5-proxy-ca-bundles\") pod \"controller-manager-6cc877748f-cvjwm\" (UID: \"2dd9c693-1890-427b-a011-7a3cc5f04bb5\") " pod="openshift-controller-manager/controller-manager-6cc877748f-cvjwm" Mar 13 01:14:08.935068 master-0 kubenswrapper[7110]: I0313 01:14:08.934689 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dd9c693-1890-427b-a011-7a3cc5f04bb5-serving-cert\") pod \"controller-manager-6cc877748f-cvjwm\" (UID: \"2dd9c693-1890-427b-a011-7a3cc5f04bb5\") " pod="openshift-controller-manager/controller-manager-6cc877748f-cvjwm" Mar 13 01:14:08.935180 master-0 kubenswrapper[7110]: I0313 01:14:08.935136 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2dd9c693-1890-427b-a011-7a3cc5f04bb5-client-ca\") pod \"controller-manager-6cc877748f-cvjwm\" (UID: \"2dd9c693-1890-427b-a011-7a3cc5f04bb5\") " pod="openshift-controller-manager/controller-manager-6cc877748f-cvjwm" Mar 13 01:14:08.935262 master-0 kubenswrapper[7110]: I0313 01:14:08.935226 7110 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2bfe96f2-43a9-47cf-9020-918438eb1ae0-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:09.036559 master-0 kubenswrapper[7110]: I0313 01:14:09.036284 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2dd9c693-1890-427b-a011-7a3cc5f04bb5-client-ca\") pod \"controller-manager-6cc877748f-cvjwm\" 
(UID: \"2dd9c693-1890-427b-a011-7a3cc5f04bb5\") " pod="openshift-controller-manager/controller-manager-6cc877748f-cvjwm" Mar 13 01:14:09.036559 master-0 kubenswrapper[7110]: I0313 01:14:09.036505 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dd9c693-1890-427b-a011-7a3cc5f04bb5-config\") pod \"controller-manager-6cc877748f-cvjwm\" (UID: \"2dd9c693-1890-427b-a011-7a3cc5f04bb5\") " pod="openshift-controller-manager/controller-manager-6cc877748f-cvjwm" Mar 13 01:14:09.036559 master-0 kubenswrapper[7110]: I0313 01:14:09.036525 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqrcn\" (UniqueName: \"kubernetes.io/projected/2dd9c693-1890-427b-a011-7a3cc5f04bb5-kube-api-access-sqrcn\") pod \"controller-manager-6cc877748f-cvjwm\" (UID: \"2dd9c693-1890-427b-a011-7a3cc5f04bb5\") " pod="openshift-controller-manager/controller-manager-6cc877748f-cvjwm" Mar 13 01:14:09.036820 master-0 kubenswrapper[7110]: I0313 01:14:09.036576 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2dd9c693-1890-427b-a011-7a3cc5f04bb5-proxy-ca-bundles\") pod \"controller-manager-6cc877748f-cvjwm\" (UID: \"2dd9c693-1890-427b-a011-7a3cc5f04bb5\") " pod="openshift-controller-manager/controller-manager-6cc877748f-cvjwm" Mar 13 01:14:09.036820 master-0 kubenswrapper[7110]: I0313 01:14:09.036600 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dd9c693-1890-427b-a011-7a3cc5f04bb5-serving-cert\") pod \"controller-manager-6cc877748f-cvjwm\" (UID: \"2dd9c693-1890-427b-a011-7a3cc5f04bb5\") " pod="openshift-controller-manager/controller-manager-6cc877748f-cvjwm" Mar 13 01:14:09.036820 master-0 kubenswrapper[7110]: I0313 01:14:09.036663 7110 reconciler_common.go:293] "Volume detached for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/a60370fb-bb70-435c-9c5a-781fa1d63468-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:09.038451 master-0 kubenswrapper[7110]: I0313 01:14:09.038415 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2dd9c693-1890-427b-a011-7a3cc5f04bb5-client-ca\") pod \"controller-manager-6cc877748f-cvjwm\" (UID: \"2dd9c693-1890-427b-a011-7a3cc5f04bb5\") " pod="openshift-controller-manager/controller-manager-6cc877748f-cvjwm" Mar 13 01:14:09.038561 master-0 kubenswrapper[7110]: I0313 01:14:09.038531 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dd9c693-1890-427b-a011-7a3cc5f04bb5-config\") pod \"controller-manager-6cc877748f-cvjwm\" (UID: \"2dd9c693-1890-427b-a011-7a3cc5f04bb5\") " pod="openshift-controller-manager/controller-manager-6cc877748f-cvjwm" Mar 13 01:14:09.038696 master-0 kubenswrapper[7110]: I0313 01:14:09.038670 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2dd9c693-1890-427b-a011-7a3cc5f04bb5-proxy-ca-bundles\") pod \"controller-manager-6cc877748f-cvjwm\" (UID: \"2dd9c693-1890-427b-a011-7a3cc5f04bb5\") " pod="openshift-controller-manager/controller-manager-6cc877748f-cvjwm" Mar 13 01:14:09.040834 master-0 kubenswrapper[7110]: I0313 01:14:09.040808 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dd9c693-1890-427b-a011-7a3cc5f04bb5-serving-cert\") pod \"controller-manager-6cc877748f-cvjwm\" (UID: \"2dd9c693-1890-427b-a011-7a3cc5f04bb5\") " pod="openshift-controller-manager/controller-manager-6cc877748f-cvjwm" Mar 13 01:14:09.058522 master-0 kubenswrapper[7110]: I0313 01:14:09.058467 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqrcn\" (UniqueName: 
\"kubernetes.io/projected/2dd9c693-1890-427b-a011-7a3cc5f04bb5-kube-api-access-sqrcn\") pod \"controller-manager-6cc877748f-cvjwm\" (UID: \"2dd9c693-1890-427b-a011-7a3cc5f04bb5\") " pod="openshift-controller-manager/controller-manager-6cc877748f-cvjwm" Mar 13 01:14:09.174404 master-0 kubenswrapper[7110]: I0313 01:14:09.174339 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6cc877748f-cvjwm" Mar 13 01:14:09.579776 master-0 kubenswrapper[7110]: I0313 01:14:09.579699 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6cc877748f-cvjwm"] Mar 13 01:14:09.588236 master-0 kubenswrapper[7110]: W0313 01:14:09.588154 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2dd9c693_1890_427b_a011_7a3cc5f04bb5.slice/crio-b48dbccd168c65099a9c5c6976a0c0e5771f2e97e68c7ababb65981cfa725005 WatchSource:0}: Error finding container b48dbccd168c65099a9c5c6976a0c0e5771f2e97e68c7ababb65981cfa725005: Status 404 returned error can't find the container with id b48dbccd168c65099a9c5c6976a0c0e5771f2e97e68c7ababb65981cfa725005 Mar 13 01:14:09.809244 master-0 kubenswrapper[7110]: I0313 01:14:09.809166 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cc877748f-cvjwm" event={"ID":"2dd9c693-1890-427b-a011-7a3cc5f04bb5","Type":"ContainerStarted","Data":"b48dbccd168c65099a9c5c6976a0c0e5771f2e97e68c7ababb65981cfa725005"} Mar 13 01:14:09.811339 master-0 kubenswrapper[7110]: I0313 01:14:09.811291 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"22711394-3d1a-4e8d-931b-5c3b23d89519","Type":"ContainerStarted","Data":"a3248a226bc0b1ce415f9a9164726d8181306febd5cb8c1685d1a066a8daaf94"} Mar 13 01:14:09.811339 master-0 kubenswrapper[7110]: I0313 01:14:09.811327 7110 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"22711394-3d1a-4e8d-931b-5c3b23d89519","Type":"ContainerStarted","Data":"b5e4ebe6aed962e2cf987dbe83f5e029fc26156301238d9d1f2d7caf40c7d55b"} Mar 13 01:14:09.813696 master-0 kubenswrapper[7110]: I0313 01:14:09.813590 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"36b2d6ee-3ae7-444b-b327-f024a8a06ab7","Type":"ContainerStarted","Data":"e8004c606441f404a88824ad0f391a1736bc2bfc8d968e181bc7750c3498d909"} Mar 13 01:14:09.829330 master-0 kubenswrapper[7110]: I0313 01:14:09.829156 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-3-master-0" podStartSLOduration=1.829131906 podStartE2EDuration="1.829131906s" podCreationTimestamp="2026-03-13 01:14:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:14:09.827755177 +0000 UTC m=+51.112781673" watchObservedRunningTime="2026-03-13 01:14:09.829131906 +0000 UTC m=+51.114158402" Mar 13 01:14:09.863681 master-0 kubenswrapper[7110]: I0313 01:14:09.863502 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/installer-1-master-0" podStartSLOduration=1.863475657 podStartE2EDuration="1.863475657s" podCreationTimestamp="2026-03-13 01:14:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:14:09.858880628 +0000 UTC m=+51.143907134" watchObservedRunningTime="2026-03-13 01:14:09.863475657 +0000 UTC m=+51.148502163" Mar 13 01:14:10.041591 master-0 kubenswrapper[7110]: I0313 01:14:10.041524 7110 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:14:10.041816 master-0 
kubenswrapper[7110]: I0313 01:14:10.041778 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:14:10.051834 master-0 kubenswrapper[7110]: I0313 01:14:10.051776 7110 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:14:10.826044 master-0 kubenswrapper[7110]: I0313 01:14:10.825994 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:14:10.929175 master-0 kubenswrapper[7110]: I0313 01:14:10.925946 7110 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a60370fb-bb70-435c-9c5a-781fa1d63468" path="/var/lib/kubelet/pods/a60370fb-bb70-435c-9c5a-781fa1d63468/volumes" Mar 13 01:14:11.695175 master-0 kubenswrapper[7110]: I0313 01:14:11.695085 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d4d56c4b7-ndd42"] Mar 13 01:14:11.696348 master-0 kubenswrapper[7110]: I0313 01:14:11.696300 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d4d56c4b7-ndd42" Mar 13 01:14:11.700906 master-0 kubenswrapper[7110]: I0313 01:14:11.699455 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 01:14:11.702253 master-0 kubenswrapper[7110]: I0313 01:14:11.702088 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 01:14:11.703757 master-0 kubenswrapper[7110]: I0313 01:14:11.703556 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 01:14:11.703757 master-0 kubenswrapper[7110]: I0313 01:14:11.703740 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 01:14:11.704283 master-0 kubenswrapper[7110]: I0313 01:14:11.704033 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 01:14:11.774103 master-0 kubenswrapper[7110]: I0313 01:14:11.774023 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxgpx\" (UniqueName: \"kubernetes.io/projected/0e864dcb-fec1-47a6-9ce9-2bd2541aeb46-kube-api-access-gxgpx\") pod \"route-controller-manager-d4d56c4b7-ndd42\" (UID: \"0e864dcb-fec1-47a6-9ce9-2bd2541aeb46\") " pod="openshift-route-controller-manager/route-controller-manager-d4d56c4b7-ndd42" Mar 13 01:14:11.774296 master-0 kubenswrapper[7110]: I0313 01:14:11.774105 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e864dcb-fec1-47a6-9ce9-2bd2541aeb46-serving-cert\") pod \"route-controller-manager-d4d56c4b7-ndd42\" (UID: \"0e864dcb-fec1-47a6-9ce9-2bd2541aeb46\") " 
pod="openshift-route-controller-manager/route-controller-manager-d4d56c4b7-ndd42" Mar 13 01:14:11.774496 master-0 kubenswrapper[7110]: I0313 01:14:11.774404 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0e864dcb-fec1-47a6-9ce9-2bd2541aeb46-client-ca\") pod \"route-controller-manager-d4d56c4b7-ndd42\" (UID: \"0e864dcb-fec1-47a6-9ce9-2bd2541aeb46\") " pod="openshift-route-controller-manager/route-controller-manager-d4d56c4b7-ndd42" Mar 13 01:14:11.774730 master-0 kubenswrapper[7110]: I0313 01:14:11.774624 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e864dcb-fec1-47a6-9ce9-2bd2541aeb46-config\") pod \"route-controller-manager-d4d56c4b7-ndd42\" (UID: \"0e864dcb-fec1-47a6-9ce9-2bd2541aeb46\") " pod="openshift-route-controller-manager/route-controller-manager-d4d56c4b7-ndd42" Mar 13 01:14:11.875922 master-0 kubenswrapper[7110]: I0313 01:14:11.875858 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e864dcb-fec1-47a6-9ce9-2bd2541aeb46-config\") pod \"route-controller-manager-d4d56c4b7-ndd42\" (UID: \"0e864dcb-fec1-47a6-9ce9-2bd2541aeb46\") " pod="openshift-route-controller-manager/route-controller-manager-d4d56c4b7-ndd42" Mar 13 01:14:11.876457 master-0 kubenswrapper[7110]: I0313 01:14:11.876294 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxgpx\" (UniqueName: \"kubernetes.io/projected/0e864dcb-fec1-47a6-9ce9-2bd2541aeb46-kube-api-access-gxgpx\") pod \"route-controller-manager-d4d56c4b7-ndd42\" (UID: \"0e864dcb-fec1-47a6-9ce9-2bd2541aeb46\") " pod="openshift-route-controller-manager/route-controller-manager-d4d56c4b7-ndd42" Mar 13 01:14:11.876809 master-0 kubenswrapper[7110]: I0313 01:14:11.876577 7110 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e864dcb-fec1-47a6-9ce9-2bd2541aeb46-serving-cert\") pod \"route-controller-manager-d4d56c4b7-ndd42\" (UID: \"0e864dcb-fec1-47a6-9ce9-2bd2541aeb46\") " pod="openshift-route-controller-manager/route-controller-manager-d4d56c4b7-ndd42" Mar 13 01:14:11.877064 master-0 kubenswrapper[7110]: I0313 01:14:11.876845 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0e864dcb-fec1-47a6-9ce9-2bd2541aeb46-client-ca\") pod \"route-controller-manager-d4d56c4b7-ndd42\" (UID: \"0e864dcb-fec1-47a6-9ce9-2bd2541aeb46\") " pod="openshift-route-controller-manager/route-controller-manager-d4d56c4b7-ndd42" Mar 13 01:14:11.877064 master-0 kubenswrapper[7110]: I0313 01:14:11.876965 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e864dcb-fec1-47a6-9ce9-2bd2541aeb46-config\") pod \"route-controller-manager-d4d56c4b7-ndd42\" (UID: \"0e864dcb-fec1-47a6-9ce9-2bd2541aeb46\") " pod="openshift-route-controller-manager/route-controller-manager-d4d56c4b7-ndd42" Mar 13 01:14:11.877672 master-0 kubenswrapper[7110]: I0313 01:14:11.877595 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0e864dcb-fec1-47a6-9ce9-2bd2541aeb46-client-ca\") pod \"route-controller-manager-d4d56c4b7-ndd42\" (UID: \"0e864dcb-fec1-47a6-9ce9-2bd2541aeb46\") " pod="openshift-route-controller-manager/route-controller-manager-d4d56c4b7-ndd42" Mar 13 01:14:11.881679 master-0 kubenswrapper[7110]: I0313 01:14:11.881284 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e864dcb-fec1-47a6-9ce9-2bd2541aeb46-serving-cert\") pod \"route-controller-manager-d4d56c4b7-ndd42\" (UID: \"0e864dcb-fec1-47a6-9ce9-2bd2541aeb46\") " 
pod="openshift-route-controller-manager/route-controller-manager-d4d56c4b7-ndd42" Mar 13 01:14:11.934667 master-0 kubenswrapper[7110]: I0313 01:14:11.933833 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxgpx\" (UniqueName: \"kubernetes.io/projected/0e864dcb-fec1-47a6-9ce9-2bd2541aeb46-kube-api-access-gxgpx\") pod \"route-controller-manager-d4d56c4b7-ndd42\" (UID: \"0e864dcb-fec1-47a6-9ce9-2bd2541aeb46\") " pod="openshift-route-controller-manager/route-controller-manager-d4d56c4b7-ndd42" Mar 13 01:14:11.936006 master-0 kubenswrapper[7110]: I0313 01:14:11.935967 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d4d56c4b7-ndd42"] Mar 13 01:14:12.039163 master-0 kubenswrapper[7110]: I0313 01:14:12.039046 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d4d56c4b7-ndd42" Mar 13 01:14:12.534338 master-0 kubenswrapper[7110]: I0313 01:14:12.534249 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:14:13.043200 master-0 kubenswrapper[7110]: I0313 01:14:13.043127 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d4d56c4b7-ndd42"] Mar 13 01:14:13.059483 master-0 kubenswrapper[7110]: W0313 01:14:13.059432 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e864dcb_fec1_47a6_9ce9_2bd2541aeb46.slice/crio-a0846383dbb010c8f2d8bcb3c1991688b6d64fdcfd195932c43a39ad6e3d6016 WatchSource:0}: Error finding container a0846383dbb010c8f2d8bcb3c1991688b6d64fdcfd195932c43a39ad6e3d6016: Status 404 returned error can't find the container with id a0846383dbb010c8f2d8bcb3c1991688b6d64fdcfd195932c43a39ad6e3d6016 Mar 13 01:14:13.102025 master-0 
kubenswrapper[7110]: I0313 01:14:13.101963 7110 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:14:13.111169 master-0 kubenswrapper[7110]: I0313 01:14:13.111137 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:14:13.836126 master-0 kubenswrapper[7110]: I0313 01:14:13.836057 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cc877748f-cvjwm" event={"ID":"2dd9c693-1890-427b-a011-7a3cc5f04bb5","Type":"ContainerStarted","Data":"cd4a7d811e9bd6dc45032d0de284983ff5924aa9cdac4cfdf6e3b3750db023a5"} Mar 13 01:14:13.837683 master-0 kubenswrapper[7110]: I0313 01:14:13.836819 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6cc877748f-cvjwm" Mar 13 01:14:13.838331 master-0 kubenswrapper[7110]: I0313 01:14:13.838306 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d4d56c4b7-ndd42" event={"ID":"0e864dcb-fec1-47a6-9ce9-2bd2541aeb46","Type":"ContainerStarted","Data":"a0846383dbb010c8f2d8bcb3c1991688b6d64fdcfd195932c43a39ad6e3d6016"} Mar 13 01:14:13.841503 master-0 kubenswrapper[7110]: I0313 01:14:13.841021 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6cc877748f-cvjwm" Mar 13 01:14:13.858570 master-0 kubenswrapper[7110]: I0313 01:14:13.858516 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6cc877748f-cvjwm" podStartSLOduration=4.606875444 podStartE2EDuration="7.85848568s" podCreationTimestamp="2026-03-13 01:14:06 +0000 UTC" firstStartedPulling="2026-03-13 01:14:09.590279311 +0000 UTC m=+50.875305777" lastFinishedPulling="2026-03-13 01:14:12.841889547 +0000 
UTC m=+54.126916013" observedRunningTime="2026-03-13 01:14:13.85632664 +0000 UTC m=+55.141353106" watchObservedRunningTime="2026-03-13 01:14:13.85848568 +0000 UTC m=+55.143512136" Mar 13 01:14:14.981772 master-0 kubenswrapper[7110]: I0313 01:14:14.981079 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt"] Mar 13 01:14:14.981772 master-0 kubenswrapper[7110]: I0313 01:14:14.981269 7110 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt" podUID="bbf0bd4d-3387-43c3-b9d5-61a044fa2138" containerName="cluster-version-operator" containerID="cri-o://18f845c6faed83ec932443cccd37f1bfcb02c26f4b7370e1ab46cd6e5a2c2f22" gracePeriod=130 Mar 13 01:14:15.679141 master-0 kubenswrapper[7110]: I0313 01:14:15.679061 7110 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt" Mar 13 01:14:15.856383 master-0 kubenswrapper[7110]: I0313 01:14:15.856318 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-etc-cvo-updatepayloads\") pod \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " Mar 13 01:14:15.856383 master-0 kubenswrapper[7110]: I0313 01:14:15.856389 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-service-ca\") pod \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " Mar 13 01:14:15.856614 master-0 kubenswrapper[7110]: I0313 01:14:15.856437 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-etc-ssl-certs\") pod \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " Mar 13 01:14:15.856614 master-0 kubenswrapper[7110]: I0313 01:14:15.856435 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-etc-cvo-updatepayloads" (OuterVolumeSpecName: "etc-cvo-updatepayloads") pod "bbf0bd4d-3387-43c3-b9d5-61a044fa2138" (UID: "bbf0bd4d-3387-43c3-b9d5-61a044fa2138"). InnerVolumeSpecName "etc-cvo-updatepayloads". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:14:15.856614 master-0 kubenswrapper[7110]: I0313 01:14:15.856510 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-kube-api-access\") pod \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " Mar 13 01:14:15.856614 master-0 kubenswrapper[7110]: I0313 01:14:15.856560 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert\") pod \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\" (UID: \"bbf0bd4d-3387-43c3-b9d5-61a044fa2138\") " Mar 13 01:14:15.856812 master-0 kubenswrapper[7110]: I0313 01:14:15.856663 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-etc-ssl-certs" (OuterVolumeSpecName: "etc-ssl-certs") pod "bbf0bd4d-3387-43c3-b9d5-61a044fa2138" (UID: "bbf0bd4d-3387-43c3-b9d5-61a044fa2138"). InnerVolumeSpecName "etc-ssl-certs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:14:15.857029 master-0 kubenswrapper[7110]: I0313 01:14:15.856890 7110 reconciler_common.go:293] "Volume detached for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-etc-cvo-updatepayloads\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:15.857029 master-0 kubenswrapper[7110]: I0313 01:14:15.856919 7110 reconciler_common.go:293] "Volume detached for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-etc-ssl-certs\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:15.857029 master-0 kubenswrapper[7110]: I0313 01:14:15.856988 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-service-ca" (OuterVolumeSpecName: "service-ca") pod "bbf0bd4d-3387-43c3-b9d5-61a044fa2138" (UID: "bbf0bd4d-3387-43c3-b9d5-61a044fa2138"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:14:15.858841 master-0 kubenswrapper[7110]: I0313 01:14:15.858808 7110 generic.go:334] "Generic (PLEG): container finished" podID="bbf0bd4d-3387-43c3-b9d5-61a044fa2138" containerID="18f845c6faed83ec932443cccd37f1bfcb02c26f4b7370e1ab46cd6e5a2c2f22" exitCode=0 Mar 13 01:14:15.858916 master-0 kubenswrapper[7110]: I0313 01:14:15.858847 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt" event={"ID":"bbf0bd4d-3387-43c3-b9d5-61a044fa2138","Type":"ContainerDied","Data":"18f845c6faed83ec932443cccd37f1bfcb02c26f4b7370e1ab46cd6e5a2c2f22"} Mar 13 01:14:15.858916 master-0 kubenswrapper[7110]: I0313 01:14:15.858865 7110 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt" Mar 13 01:14:15.858916 master-0 kubenswrapper[7110]: I0313 01:14:15.858886 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt" event={"ID":"bbf0bd4d-3387-43c3-b9d5-61a044fa2138","Type":"ContainerDied","Data":"09ec7b3f0f3ef6c6bfb61d7ac2e6b3febf7fbb4a21a9eaef4ee580930e6e25c7"} Mar 13 01:14:15.858916 master-0 kubenswrapper[7110]: I0313 01:14:15.858908 7110 scope.go:117] "RemoveContainer" containerID="18f845c6faed83ec932443cccd37f1bfcb02c26f4b7370e1ab46cd6e5a2c2f22" Mar 13 01:14:15.860645 master-0 kubenswrapper[7110]: I0313 01:14:15.860592 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "bbf0bd4d-3387-43c3-b9d5-61a044fa2138" (UID: "bbf0bd4d-3387-43c3-b9d5-61a044fa2138"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:14:15.861567 master-0 kubenswrapper[7110]: I0313 01:14:15.861517 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bbf0bd4d-3387-43c3-b9d5-61a044fa2138" (UID: "bbf0bd4d-3387-43c3-b9d5-61a044fa2138"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:14:15.889490 master-0 kubenswrapper[7110]: I0313 01:14:15.889450 7110 scope.go:117] "RemoveContainer" containerID="18f845c6faed83ec932443cccd37f1bfcb02c26f4b7370e1ab46cd6e5a2c2f22" Mar 13 01:14:15.891345 master-0 kubenswrapper[7110]: E0313 01:14:15.891088 7110 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18f845c6faed83ec932443cccd37f1bfcb02c26f4b7370e1ab46cd6e5a2c2f22\": container with ID starting with 18f845c6faed83ec932443cccd37f1bfcb02c26f4b7370e1ab46cd6e5a2c2f22 not found: ID does not exist" containerID="18f845c6faed83ec932443cccd37f1bfcb02c26f4b7370e1ab46cd6e5a2c2f22" Mar 13 01:14:15.891345 master-0 kubenswrapper[7110]: I0313 01:14:15.891226 7110 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18f845c6faed83ec932443cccd37f1bfcb02c26f4b7370e1ab46cd6e5a2c2f22"} err="failed to get container status \"18f845c6faed83ec932443cccd37f1bfcb02c26f4b7370e1ab46cd6e5a2c2f22\": rpc error: code = NotFound desc = could not find container \"18f845c6faed83ec932443cccd37f1bfcb02c26f4b7370e1ab46cd6e5a2c2f22\": container with ID starting with 18f845c6faed83ec932443cccd37f1bfcb02c26f4b7370e1ab46cd6e5a2c2f22 not found: ID does not exist" Mar 13 01:14:15.958560 master-0 kubenswrapper[7110]: I0313 01:14:15.958448 7110 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:15.958560 master-0 kubenswrapper[7110]: I0313 01:14:15.958541 7110 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:15.958822 master-0 kubenswrapper[7110]: I0313 01:14:15.958554 7110 reconciler_common.go:293] 
"Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bbf0bd4d-3387-43c3-b9d5-61a044fa2138-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:16.185797 master-0 kubenswrapper[7110]: I0313 01:14:16.185757 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt"] Mar 13 01:14:16.188300 master-0 kubenswrapper[7110]: I0313 01:14:16.188252 7110 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cluster-version/cluster-version-operator-745944c6b7-zc6gt"] Mar 13 01:14:16.214238 master-0 kubenswrapper[7110]: I0313 01:14:16.214103 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-8c9c967c7-4d6fw"] Mar 13 01:14:16.214453 master-0 kubenswrapper[7110]: E0313 01:14:16.214271 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbf0bd4d-3387-43c3-b9d5-61a044fa2138" containerName="cluster-version-operator" Mar 13 01:14:16.214453 master-0 kubenswrapper[7110]: I0313 01:14:16.214283 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf0bd4d-3387-43c3-b9d5-61a044fa2138" containerName="cluster-version-operator" Mar 13 01:14:16.214453 master-0 kubenswrapper[7110]: I0313 01:14:16.214363 7110 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbf0bd4d-3387-43c3-b9d5-61a044fa2138" containerName="cluster-version-operator" Mar 13 01:14:16.214690 master-0 kubenswrapper[7110]: I0313 01:14:16.214661 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-4d6fw" Mar 13 01:14:16.217345 master-0 kubenswrapper[7110]: I0313 01:14:16.217298 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 13 01:14:16.217486 master-0 kubenswrapper[7110]: I0313 01:14:16.217426 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 13 01:14:16.217557 master-0 kubenswrapper[7110]: I0313 01:14:16.217524 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-5qbrz" Mar 13 01:14:16.217768 master-0 kubenswrapper[7110]: I0313 01:14:16.217677 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 13 01:14:16.260727 master-0 kubenswrapper[7110]: I0313 01:14:16.260675 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Mar 13 01:14:16.260924 master-0 kubenswrapper[7110]: I0313 01:14:16.260887 7110 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-3-master-0" podUID="22711394-3d1a-4e8d-931b-5c3b23d89519" containerName="installer" containerID="cri-o://a3248a226bc0b1ce415f9a9164726d8181306febd5cb8c1685d1a066a8daaf94" gracePeriod=30 Mar 13 01:14:16.362428 master-0 kubenswrapper[7110]: I0313 01:14:16.362357 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/85149f21-7ba8-4891-82ef-0fef3d5d7863-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-4d6fw\" (UID: \"85149f21-7ba8-4891-82ef-0fef3d5d7863\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-4d6fw" Mar 13 01:14:16.363092 master-0 kubenswrapper[7110]: I0313 01:14:16.362474 7110 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/85149f21-7ba8-4891-82ef-0fef3d5d7863-kube-api-access\") pod \"cluster-version-operator-8c9c967c7-4d6fw\" (UID: \"85149f21-7ba8-4891-82ef-0fef3d5d7863\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-4d6fw" Mar 13 01:14:16.363092 master-0 kubenswrapper[7110]: I0313 01:14:16.362526 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/85149f21-7ba8-4891-82ef-0fef3d5d7863-service-ca\") pod \"cluster-version-operator-8c9c967c7-4d6fw\" (UID: \"85149f21-7ba8-4891-82ef-0fef3d5d7863\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-4d6fw" Mar 13 01:14:16.363092 master-0 kubenswrapper[7110]: I0313 01:14:16.362551 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85149f21-7ba8-4891-82ef-0fef3d5d7863-serving-cert\") pod \"cluster-version-operator-8c9c967c7-4d6fw\" (UID: \"85149f21-7ba8-4891-82ef-0fef3d5d7863\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-4d6fw" Mar 13 01:14:16.363092 master-0 kubenswrapper[7110]: I0313 01:14:16.362663 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/85149f21-7ba8-4891-82ef-0fef3d5d7863-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-4d6fw\" (UID: \"85149f21-7ba8-4891-82ef-0fef3d5d7863\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-4d6fw" Mar 13 01:14:16.463495 master-0 kubenswrapper[7110]: I0313 01:14:16.463453 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/85149f21-7ba8-4891-82ef-0fef3d5d7863-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-4d6fw\" (UID: \"85149f21-7ba8-4891-82ef-0fef3d5d7863\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-4d6fw" Mar 13 01:14:16.463495 master-0 kubenswrapper[7110]: I0313 01:14:16.463495 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/85149f21-7ba8-4891-82ef-0fef3d5d7863-kube-api-access\") pod \"cluster-version-operator-8c9c967c7-4d6fw\" (UID: \"85149f21-7ba8-4891-82ef-0fef3d5d7863\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-4d6fw" Mar 13 01:14:16.463662 master-0 kubenswrapper[7110]: I0313 01:14:16.463517 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/85149f21-7ba8-4891-82ef-0fef3d5d7863-service-ca\") pod \"cluster-version-operator-8c9c967c7-4d6fw\" (UID: \"85149f21-7ba8-4891-82ef-0fef3d5d7863\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-4d6fw" Mar 13 01:14:16.463816 master-0 kubenswrapper[7110]: I0313 01:14:16.463757 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85149f21-7ba8-4891-82ef-0fef3d5d7863-serving-cert\") pod \"cluster-version-operator-8c9c967c7-4d6fw\" (UID: \"85149f21-7ba8-4891-82ef-0fef3d5d7863\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-4d6fw" Mar 13 01:14:16.463877 master-0 kubenswrapper[7110]: I0313 01:14:16.463821 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/85149f21-7ba8-4891-82ef-0fef3d5d7863-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-4d6fw\" (UID: \"85149f21-7ba8-4891-82ef-0fef3d5d7863\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-4d6fw" Mar 13 
01:14:16.463967 master-0 kubenswrapper[7110]: I0313 01:14:16.463934 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/85149f21-7ba8-4891-82ef-0fef3d5d7863-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-4d6fw\" (UID: \"85149f21-7ba8-4891-82ef-0fef3d5d7863\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-4d6fw" Mar 13 01:14:16.464218 master-0 kubenswrapper[7110]: I0313 01:14:16.464182 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/85149f21-7ba8-4891-82ef-0fef3d5d7863-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-4d6fw\" (UID: \"85149f21-7ba8-4891-82ef-0fef3d5d7863\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-4d6fw" Mar 13 01:14:16.464547 master-0 kubenswrapper[7110]: I0313 01:14:16.464477 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/85149f21-7ba8-4891-82ef-0fef3d5d7863-service-ca\") pod \"cluster-version-operator-8c9c967c7-4d6fw\" (UID: \"85149f21-7ba8-4891-82ef-0fef3d5d7863\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-4d6fw" Mar 13 01:14:16.466508 master-0 kubenswrapper[7110]: I0313 01:14:16.466470 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85149f21-7ba8-4891-82ef-0fef3d5d7863-serving-cert\") pod \"cluster-version-operator-8c9c967c7-4d6fw\" (UID: \"85149f21-7ba8-4891-82ef-0fef3d5d7863\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-4d6fw" Mar 13 01:14:16.491236 master-0 kubenswrapper[7110]: I0313 01:14:16.491194 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/85149f21-7ba8-4891-82ef-0fef3d5d7863-kube-api-access\") pod \"cluster-version-operator-8c9c967c7-4d6fw\" (UID: \"85149f21-7ba8-4891-82ef-0fef3d5d7863\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-4d6fw" Mar 13 01:14:16.528905 master-0 kubenswrapper[7110]: I0313 01:14:16.528834 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-4d6fw" Mar 13 01:14:16.563278 master-0 kubenswrapper[7110]: W0313 01:14:16.563239 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85149f21_7ba8_4891_82ef_0fef3d5d7863.slice/crio-61a9c110238dfe2ea95596a01db7cf5d6ef10b0b43d6a9827c08c09d64d82e79 WatchSource:0}: Error finding container 61a9c110238dfe2ea95596a01db7cf5d6ef10b0b43d6a9827c08c09d64d82e79: Status 404 returned error can't find the container with id 61a9c110238dfe2ea95596a01db7cf5d6ef10b0b43d6a9827c08c09d64d82e79 Mar 13 01:14:16.642792 master-0 kubenswrapper[7110]: I0313 01:14:16.642761 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-3-master-0_22711394-3d1a-4e8d-931b-5c3b23d89519/installer/0.log" Mar 13 01:14:16.642960 master-0 kubenswrapper[7110]: I0313 01:14:16.642826 7110 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Mar 13 01:14:16.766693 master-0 kubenswrapper[7110]: I0313 01:14:16.766542 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22711394-3d1a-4e8d-931b-5c3b23d89519-kube-api-access\") pod \"22711394-3d1a-4e8d-931b-5c3b23d89519\" (UID: \"22711394-3d1a-4e8d-931b-5c3b23d89519\") " Mar 13 01:14:16.766693 master-0 kubenswrapper[7110]: I0313 01:14:16.766663 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/22711394-3d1a-4e8d-931b-5c3b23d89519-kubelet-dir\") pod \"22711394-3d1a-4e8d-931b-5c3b23d89519\" (UID: \"22711394-3d1a-4e8d-931b-5c3b23d89519\") " Mar 13 01:14:16.766890 master-0 kubenswrapper[7110]: I0313 01:14:16.766762 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/22711394-3d1a-4e8d-931b-5c3b23d89519-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "22711394-3d1a-4e8d-931b-5c3b23d89519" (UID: "22711394-3d1a-4e8d-931b-5c3b23d89519"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:14:16.766924 master-0 kubenswrapper[7110]: I0313 01:14:16.766886 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/22711394-3d1a-4e8d-931b-5c3b23d89519-var-lock\") pod \"22711394-3d1a-4e8d-931b-5c3b23d89519\" (UID: \"22711394-3d1a-4e8d-931b-5c3b23d89519\") " Mar 13 01:14:16.766989 master-0 kubenswrapper[7110]: I0313 01:14:16.766960 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/22711394-3d1a-4e8d-931b-5c3b23d89519-var-lock" (OuterVolumeSpecName: "var-lock") pod "22711394-3d1a-4e8d-931b-5c3b23d89519" (UID: "22711394-3d1a-4e8d-931b-5c3b23d89519"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:14:16.767486 master-0 kubenswrapper[7110]: I0313 01:14:16.767451 7110 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/22711394-3d1a-4e8d-931b-5c3b23d89519-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:16.767526 master-0 kubenswrapper[7110]: I0313 01:14:16.767487 7110 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/22711394-3d1a-4e8d-931b-5c3b23d89519-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:16.769227 master-0 kubenswrapper[7110]: I0313 01:14:16.769178 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22711394-3d1a-4e8d-931b-5c3b23d89519-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "22711394-3d1a-4e8d-931b-5c3b23d89519" (UID: "22711394-3d1a-4e8d-931b-5c3b23d89519"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:14:16.866489 master-0 kubenswrapper[7110]: I0313 01:14:16.866335 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-4d6fw" event={"ID":"85149f21-7ba8-4891-82ef-0fef3d5d7863","Type":"ContainerStarted","Data":"2e8d9bdcd6f94bc5d59dda8365233249d91bac104c7683389de5c7d81691e53d"} Mar 13 01:14:16.866489 master-0 kubenswrapper[7110]: I0313 01:14:16.866397 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-4d6fw" event={"ID":"85149f21-7ba8-4891-82ef-0fef3d5d7863","Type":"ContainerStarted","Data":"61a9c110238dfe2ea95596a01db7cf5d6ef10b0b43d6a9827c08c09d64d82e79"} Mar 13 01:14:16.867937 master-0 kubenswrapper[7110]: I0313 01:14:16.867902 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-3-master-0_22711394-3d1a-4e8d-931b-5c3b23d89519/installer/0.log" Mar 13 01:14:16.867998 master-0 kubenswrapper[7110]: I0313 01:14:16.867958 7110 generic.go:334] "Generic (PLEG): container finished" podID="22711394-3d1a-4e8d-931b-5c3b23d89519" containerID="a3248a226bc0b1ce415f9a9164726d8181306febd5cb8c1685d1a066a8daaf94" exitCode=1 Mar 13 01:14:16.867998 master-0 kubenswrapper[7110]: I0313 01:14:16.867980 7110 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22711394-3d1a-4e8d-931b-5c3b23d89519-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:16.868064 master-0 kubenswrapper[7110]: I0313 01:14:16.867992 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"22711394-3d1a-4e8d-931b-5c3b23d89519","Type":"ContainerDied","Data":"a3248a226bc0b1ce415f9a9164726d8181306febd5cb8c1685d1a066a8daaf94"} Mar 13 01:14:16.868064 master-0 kubenswrapper[7110]: I0313 01:14:16.868031 7110 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"22711394-3d1a-4e8d-931b-5c3b23d89519","Type":"ContainerDied","Data":"b5e4ebe6aed962e2cf987dbe83f5e029fc26156301238d9d1f2d7caf40c7d55b"} Mar 13 01:14:16.868064 master-0 kubenswrapper[7110]: I0313 01:14:16.868053 7110 scope.go:117] "RemoveContainer" containerID="a3248a226bc0b1ce415f9a9164726d8181306febd5cb8c1685d1a066a8daaf94" Mar 13 01:14:16.868140 master-0 kubenswrapper[7110]: I0313 01:14:16.868069 7110 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Mar 13 01:14:16.885189 master-0 kubenswrapper[7110]: I0313 01:14:16.885166 7110 scope.go:117] "RemoveContainer" containerID="a3248a226bc0b1ce415f9a9164726d8181306febd5cb8c1685d1a066a8daaf94" Mar 13 01:14:16.887029 master-0 kubenswrapper[7110]: E0313 01:14:16.887004 7110 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3248a226bc0b1ce415f9a9164726d8181306febd5cb8c1685d1a066a8daaf94\": container with ID starting with a3248a226bc0b1ce415f9a9164726d8181306febd5cb8c1685d1a066a8daaf94 not found: ID does not exist" containerID="a3248a226bc0b1ce415f9a9164726d8181306febd5cb8c1685d1a066a8daaf94" Mar 13 01:14:16.887112 master-0 kubenswrapper[7110]: I0313 01:14:16.887032 7110 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3248a226bc0b1ce415f9a9164726d8181306febd5cb8c1685d1a066a8daaf94"} err="failed to get container status \"a3248a226bc0b1ce415f9a9164726d8181306febd5cb8c1685d1a066a8daaf94\": rpc error: code = NotFound desc = could not find container \"a3248a226bc0b1ce415f9a9164726d8181306febd5cb8c1685d1a066a8daaf94\": container with ID starting with a3248a226bc0b1ce415f9a9164726d8181306febd5cb8c1685d1a066a8daaf94 not found: ID does not exist" Mar 13 01:14:16.895657 master-0 kubenswrapper[7110]: I0313 01:14:16.891458 7110 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-4d6fw" podStartSLOduration=0.891445759 podStartE2EDuration="891.445759ms" podCreationTimestamp="2026-03-13 01:14:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:14:16.890698338 +0000 UTC m=+58.175724804" watchObservedRunningTime="2026-03-13 01:14:16.891445759 +0000 UTC m=+58.176472215" Mar 13 01:14:16.920953 master-0 kubenswrapper[7110]: I0313 01:14:16.920703 7110 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbf0bd4d-3387-43c3-b9d5-61a044fa2138" path="/var/lib/kubelet/pods/bbf0bd4d-3387-43c3-b9d5-61a044fa2138/volumes" Mar 13 01:14:16.921195 master-0 kubenswrapper[7110]: I0313 01:14:16.921172 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Mar 13 01:14:16.921253 master-0 kubenswrapper[7110]: I0313 01:14:16.921206 7110 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Mar 13 01:14:17.274211 master-0 kubenswrapper[7110]: I0313 01:14:17.274165 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-26mfw" Mar 13 01:14:18.316481 master-0 kubenswrapper[7110]: I0313 01:14:18.316106 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_d428cc1c-440b-4cb4-97d3-fe0f80b4d83b/installer/0.log" Mar 13 01:14:18.316481 master-0 kubenswrapper[7110]: I0313 01:14:18.316169 7110 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Mar 13 01:14:18.418008 master-0 kubenswrapper[7110]: I0313 01:14:18.417980 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d428cc1c-440b-4cb4-97d3-fe0f80b4d83b-kube-api-access\") pod \"d428cc1c-440b-4cb4-97d3-fe0f80b4d83b\" (UID: \"d428cc1c-440b-4cb4-97d3-fe0f80b4d83b\") " Mar 13 01:14:18.418237 master-0 kubenswrapper[7110]: I0313 01:14:18.418223 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d428cc1c-440b-4cb4-97d3-fe0f80b4d83b-kubelet-dir\") pod \"d428cc1c-440b-4cb4-97d3-fe0f80b4d83b\" (UID: \"d428cc1c-440b-4cb4-97d3-fe0f80b4d83b\") " Mar 13 01:14:18.418331 master-0 kubenswrapper[7110]: I0313 01:14:18.418320 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d428cc1c-440b-4cb4-97d3-fe0f80b4d83b-var-lock\") pod \"d428cc1c-440b-4cb4-97d3-fe0f80b4d83b\" (UID: \"d428cc1c-440b-4cb4-97d3-fe0f80b4d83b\") " Mar 13 01:14:18.418755 master-0 kubenswrapper[7110]: I0313 01:14:18.418709 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d428cc1c-440b-4cb4-97d3-fe0f80b4d83b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d428cc1c-440b-4cb4-97d3-fe0f80b4d83b" (UID: "d428cc1c-440b-4cb4-97d3-fe0f80b4d83b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:14:18.418871 master-0 kubenswrapper[7110]: I0313 01:14:18.418852 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d428cc1c-440b-4cb4-97d3-fe0f80b4d83b-var-lock" (OuterVolumeSpecName: "var-lock") pod "d428cc1c-440b-4cb4-97d3-fe0f80b4d83b" (UID: "d428cc1c-440b-4cb4-97d3-fe0f80b4d83b"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:14:18.419358 master-0 kubenswrapper[7110]: I0313 01:14:18.419330 7110 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d428cc1c-440b-4cb4-97d3-fe0f80b4d83b-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:18.419509 master-0 kubenswrapper[7110]: I0313 01:14:18.419487 7110 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d428cc1c-440b-4cb4-97d3-fe0f80b4d83b-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:18.430968 master-0 kubenswrapper[7110]: I0313 01:14:18.430922 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d428cc1c-440b-4cb4-97d3-fe0f80b4d83b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d428cc1c-440b-4cb4-97d3-fe0f80b4d83b" (UID: "d428cc1c-440b-4cb4-97d3-fe0f80b4d83b"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:14:18.521025 master-0 kubenswrapper[7110]: I0313 01:14:18.520956 7110 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d428cc1c-440b-4cb4-97d3-fe0f80b4d83b-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:18.893856 master-0 kubenswrapper[7110]: I0313 01:14:18.893545 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_d428cc1c-440b-4cb4-97d3-fe0f80b4d83b/installer/0.log" Mar 13 01:14:18.893856 master-0 kubenswrapper[7110]: I0313 01:14:18.893701 7110 generic.go:334] "Generic (PLEG): container finished" podID="d428cc1c-440b-4cb4-97d3-fe0f80b4d83b" containerID="24ebe348dbe30bb41ef181302eb5af64e442788df5ae3da7b857b8eb59786fee" exitCode=1 Mar 13 01:14:18.893856 master-0 kubenswrapper[7110]: I0313 01:14:18.893819 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"d428cc1c-440b-4cb4-97d3-fe0f80b4d83b","Type":"ContainerDied","Data":"24ebe348dbe30bb41ef181302eb5af64e442788df5ae3da7b857b8eb59786fee"} Mar 13 01:14:18.894228 master-0 kubenswrapper[7110]: I0313 01:14:18.893874 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"d428cc1c-440b-4cb4-97d3-fe0f80b4d83b","Type":"ContainerDied","Data":"57d826aba034860e3eaf21db2b258df77ed968ee0b8c07c04572dbe183f9e872"} Mar 13 01:14:18.894228 master-0 kubenswrapper[7110]: I0313 01:14:18.893913 7110 scope.go:117] "RemoveContainer" containerID="24ebe348dbe30bb41ef181302eb5af64e442788df5ae3da7b857b8eb59786fee" Mar 13 01:14:18.894228 master-0 kubenswrapper[7110]: I0313 01:14:18.894094 7110 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Mar 13 01:14:18.898161 master-0 kubenswrapper[7110]: I0313 01:14:18.898120 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d4d56c4b7-ndd42" event={"ID":"0e864dcb-fec1-47a6-9ce9-2bd2541aeb46","Type":"ContainerStarted","Data":"a72dfd541e68db5ad4503d006f471f64ee52515824cead1c6d79426761a13afa"} Mar 13 01:14:18.898755 master-0 kubenswrapper[7110]: I0313 01:14:18.898664 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-d4d56c4b7-ndd42" Mar 13 01:14:18.920540 master-0 kubenswrapper[7110]: I0313 01:14:18.920500 7110 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22711394-3d1a-4e8d-931b-5c3b23d89519" path="/var/lib/kubelet/pods/22711394-3d1a-4e8d-931b-5c3b23d89519/volumes" Mar 13 01:14:18.921477 master-0 kubenswrapper[7110]: I0313 01:14:18.921453 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-d4d56c4b7-ndd42" Mar 13 01:14:18.936621 master-0 kubenswrapper[7110]: I0313 01:14:18.936563 7110 scope.go:117] "RemoveContainer" containerID="24ebe348dbe30bb41ef181302eb5af64e442788df5ae3da7b857b8eb59786fee" Mar 13 01:14:18.937138 master-0 kubenswrapper[7110]: I0313 01:14:18.937054 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-d4d56c4b7-ndd42" podStartSLOduration=7.948605442 podStartE2EDuration="12.937026419s" podCreationTimestamp="2026-03-13 01:14:06 +0000 UTC" firstStartedPulling="2026-03-13 01:14:13.061460212 +0000 UTC m=+54.346486728" lastFinishedPulling="2026-03-13 01:14:18.049881229 +0000 UTC m=+59.334907705" observedRunningTime="2026-03-13 01:14:18.933047528 +0000 UTC m=+60.218074044" watchObservedRunningTime="2026-03-13 01:14:18.937026419 +0000 
UTC m=+60.222052915" Mar 13 01:14:18.952200 master-0 kubenswrapper[7110]: E0313 01:14:18.952131 7110 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24ebe348dbe30bb41ef181302eb5af64e442788df5ae3da7b857b8eb59786fee\": container with ID starting with 24ebe348dbe30bb41ef181302eb5af64e442788df5ae3da7b857b8eb59786fee not found: ID does not exist" containerID="24ebe348dbe30bb41ef181302eb5af64e442788df5ae3da7b857b8eb59786fee" Mar 13 01:14:18.952340 master-0 kubenswrapper[7110]: I0313 01:14:18.952202 7110 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24ebe348dbe30bb41ef181302eb5af64e442788df5ae3da7b857b8eb59786fee"} err="failed to get container status \"24ebe348dbe30bb41ef181302eb5af64e442788df5ae3da7b857b8eb59786fee\": rpc error: code = NotFound desc = could not find container \"24ebe348dbe30bb41ef181302eb5af64e442788df5ae3da7b857b8eb59786fee\": container with ID starting with 24ebe348dbe30bb41ef181302eb5af64e442788df5ae3da7b857b8eb59786fee not found: ID does not exist" Mar 13 01:14:18.976658 master-0 kubenswrapper[7110]: I0313 01:14:18.975606 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 13 01:14:18.988542 master-0 kubenswrapper[7110]: I0313 01:14:18.988481 7110 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 13 01:14:19.063129 master-0 kubenswrapper[7110]: I0313 01:14:19.063085 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"] Mar 13 01:14:19.063312 master-0 kubenswrapper[7110]: E0313 01:14:19.063287 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22711394-3d1a-4e8d-931b-5c3b23d89519" containerName="installer" Mar 13 01:14:19.063312 master-0 kubenswrapper[7110]: I0313 01:14:19.063304 7110 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="22711394-3d1a-4e8d-931b-5c3b23d89519" containerName="installer" Mar 13 01:14:19.063384 master-0 kubenswrapper[7110]: E0313 01:14:19.063319 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d428cc1c-440b-4cb4-97d3-fe0f80b4d83b" containerName="installer" Mar 13 01:14:19.063384 master-0 kubenswrapper[7110]: I0313 01:14:19.063329 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="d428cc1c-440b-4cb4-97d3-fe0f80b4d83b" containerName="installer" Mar 13 01:14:19.063442 master-0 kubenswrapper[7110]: I0313 01:14:19.063404 7110 memory_manager.go:354] "RemoveStaleState removing state" podUID="d428cc1c-440b-4cb4-97d3-fe0f80b4d83b" containerName="installer" Mar 13 01:14:19.063442 master-0 kubenswrapper[7110]: I0313 01:14:19.063413 7110 memory_manager.go:354] "RemoveStaleState removing state" podUID="22711394-3d1a-4e8d-931b-5c3b23d89519" containerName="installer" Mar 13 01:14:19.063974 master-0 kubenswrapper[7110]: I0313 01:14:19.063932 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 13 01:14:19.070217 master-0 kubenswrapper[7110]: I0313 01:14:19.070172 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Mar 13 01:14:19.070480 master-0 kubenswrapper[7110]: I0313 01:14:19.070196 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-wvk2x" Mar 13 01:14:19.078691 master-0 kubenswrapper[7110]: I0313 01:14:19.078653 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"] Mar 13 01:14:19.228417 master-0 kubenswrapper[7110]: I0313 01:14:19.228334 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/415ee541-898b-41d1-98b0-c5e622776590-kube-api-access\") pod \"installer-4-master-0\" (UID: \"415ee541-898b-41d1-98b0-c5e622776590\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 13 01:14:19.228677 master-0 kubenswrapper[7110]: I0313 01:14:19.228488 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/415ee541-898b-41d1-98b0-c5e622776590-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"415ee541-898b-41d1-98b0-c5e622776590\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 13 01:14:19.228677 master-0 kubenswrapper[7110]: I0313 01:14:19.228597 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/415ee541-898b-41d1-98b0-c5e622776590-var-lock\") pod \"installer-4-master-0\" (UID: \"415ee541-898b-41d1-98b0-c5e622776590\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 13 01:14:19.329557 master-0 kubenswrapper[7110]: I0313 01:14:19.329506 7110 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/415ee541-898b-41d1-98b0-c5e622776590-kube-api-access\") pod \"installer-4-master-0\" (UID: \"415ee541-898b-41d1-98b0-c5e622776590\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 13 01:14:19.330179 master-0 kubenswrapper[7110]: I0313 01:14:19.329581 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/415ee541-898b-41d1-98b0-c5e622776590-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"415ee541-898b-41d1-98b0-c5e622776590\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 13 01:14:19.330179 master-0 kubenswrapper[7110]: I0313 01:14:19.329649 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/415ee541-898b-41d1-98b0-c5e622776590-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"415ee541-898b-41d1-98b0-c5e622776590\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 13 01:14:19.330179 master-0 kubenswrapper[7110]: I0313 01:14:19.329750 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/415ee541-898b-41d1-98b0-c5e622776590-var-lock\") pod \"installer-4-master-0\" (UID: \"415ee541-898b-41d1-98b0-c5e622776590\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 13 01:14:19.330179 master-0 kubenswrapper[7110]: I0313 01:14:19.329801 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/415ee541-898b-41d1-98b0-c5e622776590-var-lock\") pod \"installer-4-master-0\" (UID: \"415ee541-898b-41d1-98b0-c5e622776590\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 13 01:14:19.355647 master-0 kubenswrapper[7110]: I0313 01:14:19.355582 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/415ee541-898b-41d1-98b0-c5e622776590-kube-api-access\") pod \"installer-4-master-0\" (UID: \"415ee541-898b-41d1-98b0-c5e622776590\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 13 01:14:19.393155 master-0 kubenswrapper[7110]: I0313 01:14:19.393096 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 13 01:14:19.549462 master-0 kubenswrapper[7110]: I0313 01:14:19.549354 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 13 01:14:19.549656 master-0 kubenswrapper[7110]: I0313 01:14:19.549595 7110 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/installer-1-master-0" podUID="1567be70-7890-440e-b5d5-cae3efec8373" containerName="installer" containerID="cri-o://dd7ffbe9a1f83cc2aab5aa6ed625a109821b43ecbb8c570e367b00a33fc6e7b1" gracePeriod=30 Mar 13 01:14:19.853133 master-0 kubenswrapper[7110]: I0313 01:14:19.853089 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"] Mar 13 01:14:19.855316 master-0 kubenswrapper[7110]: W0313 01:14:19.855273 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod415ee541_898b_41d1_98b0_c5e622776590.slice/crio-c045eff5fd3303798547b4a3da418ad7c494e1e5ecfe29042e3a182ce13215d4 WatchSource:0}: Error finding container c045eff5fd3303798547b4a3da418ad7c494e1e5ecfe29042e3a182ce13215d4: Status 404 returned error can't find the container with id c045eff5fd3303798547b4a3da418ad7c494e1e5ecfe29042e3a182ce13215d4 Mar 13 01:14:19.912249 master-0 kubenswrapper[7110]: I0313 01:14:19.912185 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" 
event={"ID":"415ee541-898b-41d1-98b0-c5e622776590","Type":"ContainerStarted","Data":"c045eff5fd3303798547b4a3da418ad7c494e1e5ecfe29042e3a182ce13215d4"}
Mar 13 01:14:21.997266 master-0 kubenswrapper[7110]: I0313 01:14:21.997218 7110 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d428cc1c-440b-4cb4-97d3-fe0f80b4d83b" path="/var/lib/kubelet/pods/d428cc1c-440b-4cb4-97d3-fe0f80b4d83b/volumes"
Mar 13 01:14:22.001658 master-0 kubenswrapper[7110]: I0313 01:14:21.999691 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"415ee541-898b-41d1-98b0-c5e622776590","Type":"ContainerStarted","Data":"a98f3e2926a06fea2ad89c2df3c7c9f0cbd15416036c3ec44c6cff4245768ce7"}
Mar 13 01:14:22.032985 master-0 kubenswrapper[7110]: I0313 01:14:22.031908 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-4-master-0" podStartSLOduration=3.031886065 podStartE2EDuration="3.031886065s" podCreationTimestamp="2026-03-13 01:14:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:14:22.028750331 +0000 UTC m=+63.313776817" watchObservedRunningTime="2026-03-13 01:14:22.031886065 +0000 UTC m=+63.316912521"
Mar 13 01:14:22.143609 master-0 kubenswrapper[7110]: I0313 01:14:22.143530 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"]
Mar 13 01:14:22.144093 master-0 kubenswrapper[7110]: I0313 01:14:22.144055 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 13 01:14:22.145899 master-0 kubenswrapper[7110]: I0313 01:14:22.145850 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-ctg9b"
Mar 13 01:14:22.153755 master-0 kubenswrapper[7110]: I0313 01:14:22.153716 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"]
Mar 13 01:14:22.160572 master-0 kubenswrapper[7110]: I0313 01:14:22.160524 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0eeaf64d-deb4-4feb-a29c-44900e8b18ec-var-lock\") pod \"installer-2-master-0\" (UID: \"0eeaf64d-deb4-4feb-a29c-44900e8b18ec\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 13 01:14:22.160741 master-0 kubenswrapper[7110]: I0313 01:14:22.160598 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0eeaf64d-deb4-4feb-a29c-44900e8b18ec-kube-api-access\") pod \"installer-2-master-0\" (UID: \"0eeaf64d-deb4-4feb-a29c-44900e8b18ec\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 13 01:14:22.160741 master-0 kubenswrapper[7110]: I0313 01:14:22.160662 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0eeaf64d-deb4-4feb-a29c-44900e8b18ec-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"0eeaf64d-deb4-4feb-a29c-44900e8b18ec\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 13 01:14:22.261907 master-0 kubenswrapper[7110]: I0313 01:14:22.261762 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0eeaf64d-deb4-4feb-a29c-44900e8b18ec-var-lock\") pod \"installer-2-master-0\" (UID: \"0eeaf64d-deb4-4feb-a29c-44900e8b18ec\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 13 01:14:22.261907 master-0 kubenswrapper[7110]: I0313 01:14:22.261884 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0eeaf64d-deb4-4feb-a29c-44900e8b18ec-var-lock\") pod \"installer-2-master-0\" (UID: \"0eeaf64d-deb4-4feb-a29c-44900e8b18ec\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 13 01:14:22.261907 master-0 kubenswrapper[7110]: I0313 01:14:22.261891 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0eeaf64d-deb4-4feb-a29c-44900e8b18ec-kube-api-access\") pod \"installer-2-master-0\" (UID: \"0eeaf64d-deb4-4feb-a29c-44900e8b18ec\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 13 01:14:22.262248 master-0 kubenswrapper[7110]: I0313 01:14:22.262021 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0eeaf64d-deb4-4feb-a29c-44900e8b18ec-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"0eeaf64d-deb4-4feb-a29c-44900e8b18ec\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 13 01:14:22.262248 master-0 kubenswrapper[7110]: I0313 01:14:22.262080 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0eeaf64d-deb4-4feb-a29c-44900e8b18ec-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"0eeaf64d-deb4-4feb-a29c-44900e8b18ec\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 13 01:14:22.285701 master-0 kubenswrapper[7110]: I0313 01:14:22.285626 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0eeaf64d-deb4-4feb-a29c-44900e8b18ec-kube-api-access\") pod \"installer-2-master-0\" (UID: \"0eeaf64d-deb4-4feb-a29c-44900e8b18ec\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 13 01:14:22.461082 master-0 kubenswrapper[7110]: I0313 01:14:22.460970 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 13 01:14:22.913350 master-0 kubenswrapper[7110]: I0313 01:14:22.913300 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"]
Mar 13 01:14:23.009453 master-0 kubenswrapper[7110]: I0313 01:14:23.009302 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"0eeaf64d-deb4-4feb-a29c-44900e8b18ec","Type":"ContainerStarted","Data":"28e5fb54e7611a9fbc306434944a3d046a20359d0f293063960f0c6e578ab551"}
Mar 13 01:14:23.784220 master-0 kubenswrapper[7110]: I0313 01:14:23.784145 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-dszg5\" (UID: \"78d2cd80-23b9-426d-a7ac-1daa27668a47\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5"
Mar 13 01:14:23.784435 master-0 kubenswrapper[7110]: I0313 01:14:23.784250 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-nrzpj\" (UID: \"0d4e6150-432c-4a11-b5a6-4d62dd701fc8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj"
Mar 13 01:14:23.784486 master-0 kubenswrapper[7110]: I0313 01:14:23.784429 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-2tr2t\" (UID: \"7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t"
Mar 13 01:14:23.784751 master-0 kubenswrapper[7110]: I0313 01:14:23.784719 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-6slg8\" (UID: \"4976e608-07a0-4cef-8fdd-7cec3324b4b5\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8"
Mar 13 01:14:23.784828 master-0 kubenswrapper[7110]: I0313 01:14:23.784792 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert\") pod \"catalog-operator-7d9c49f57b-h46pz\" (UID: \"7f35cc1e-3376-4dbd-b215-2a32bf62cc71\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz"
Mar 13 01:14:23.787257 master-0 kubenswrapper[7110]: I0313 01:14:23.787214 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-2tr2t\" (UID: \"7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t"
Mar 13 01:14:23.787325 master-0 kubenswrapper[7110]: I0313 01:14:23.787259 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-nrzpj\" (UID: \"0d4e6150-432c-4a11-b5a6-4d62dd701fc8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj"
Mar 13 01:14:23.790024 master-0 kubenswrapper[7110]: I0313 01:14:23.789975 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-6slg8\" (UID: \"4976e608-07a0-4cef-8fdd-7cec3324b4b5\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8"
Mar 13 01:14:23.790328 master-0 kubenswrapper[7110]: I0313 01:14:23.790129 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert\") pod \"catalog-operator-7d9c49f57b-h46pz\" (UID: \"7f35cc1e-3376-4dbd-b215-2a32bf62cc71\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz"
Mar 13 01:14:23.791465 master-0 kubenswrapper[7110]: I0313 01:14:23.791405 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-dszg5\" (UID: \"78d2cd80-23b9-426d-a7ac-1daa27668a47\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5"
Mar 13 01:14:23.886283 master-0 kubenswrapper[7110]: I0313 01:14:23.886208 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs\") pod \"network-metrics-daemon-zh5fh\" (UID: \"e68ab3cb-c372-45d9-a758-beaf4c213714\") " pod="openshift-multus/network-metrics-daemon-zh5fh"
Mar 13 01:14:23.886474 master-0 kubenswrapper[7110]: I0313 01:14:23.886317 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert\") pod \"olm-operator-d64cfc9db-8l7kq\" (UID: \"6c88187c-d011-4043-a6d3-4a8a7ec4e204\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq"
Mar 13 01:14:23.889778 master-0 kubenswrapper[7110]: I0313 01:14:23.889722 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert\") pod \"olm-operator-d64cfc9db-8l7kq\" (UID: \"6c88187c-d011-4043-a6d3-4a8a7ec4e204\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq"
Mar 13 01:14:23.891473 master-0 kubenswrapper[7110]: I0313 01:14:23.891418 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs\") pod \"network-metrics-daemon-zh5fh\" (UID: \"e68ab3cb-c372-45d9-a758-beaf4c213714\") " pod="openshift-multus/network-metrics-daemon-zh5fh"
Mar 13 01:14:23.988433 master-0 kubenswrapper[7110]: I0313 01:14:23.988359 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs\") pod \"multus-admission-controller-8d675b596-tq7n6\" (UID: \"2bd94289-7109-4419-9a51-bd289082b9f5\") " pod="openshift-multus/multus-admission-controller-8d675b596-tq7n6"
Mar 13 01:14:23.993563 master-0 kubenswrapper[7110]: I0313 01:14:23.993519 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs\") pod \"multus-admission-controller-8d675b596-tq7n6\" (UID: \"2bd94289-7109-4419-9a51-bd289082b9f5\") " pod="openshift-multus/multus-admission-controller-8d675b596-tq7n6"
Mar 13 01:14:24.016661 master-0 kubenswrapper[7110]: I0313 01:14:24.016589 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"0eeaf64d-deb4-4feb-a29c-44900e8b18ec","Type":"ContainerStarted","Data":"a7e7bc44522c41668981d5bed22716c114ebbce43508729e2e1c0e0baa44f1d7"}
Mar 13 01:14:24.060431 master-0 kubenswrapper[7110]: I0313 01:14:24.060270 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj"
Mar 13 01:14:24.060684 master-0 kubenswrapper[7110]: I0313 01:14:24.060436 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t"
Mar 13 01:14:24.060880 master-0 kubenswrapper[7110]: I0313 01:14:24.060767 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5"
Mar 13 01:14:24.061206 master-0 kubenswrapper[7110]: I0313 01:14:24.061158 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz"
Mar 13 01:14:24.061413 master-0 kubenswrapper[7110]: I0313 01:14:24.061339 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8"
Mar 13 01:14:24.062581 master-0 kubenswrapper[7110]: I0313 01:14:24.062533 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq"
Mar 13 01:14:24.062699 master-0 kubenswrapper[7110]: I0313 01:14:24.062622 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zh5fh"
Mar 13 01:14:24.064835 master-0 kubenswrapper[7110]: I0313 01:14:24.064789 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-8d675b596-tq7n6"
Mar 13 01:14:24.277128 master-0 kubenswrapper[7110]: I0313 01:14:24.276684 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-2-master-0" podStartSLOduration=2.276663497 podStartE2EDuration="2.276663497s" podCreationTimestamp="2026-03-13 01:14:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:14:24.275775483 +0000 UTC m=+65.560801969" watchObservedRunningTime="2026-03-13 01:14:24.276663497 +0000 UTC m=+65.561689983"
Mar 13 01:14:24.615954 master-0 kubenswrapper[7110]: I0313 01:14:24.615287 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t"]
Mar 13 01:14:24.623398 master-0 kubenswrapper[7110]: I0313 01:14:24.623345 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj"]
Mar 13 01:14:24.625125 master-0 kubenswrapper[7110]: I0313 01:14:24.625081 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq"]
Mar 13 01:14:24.626565 master-0 kubenswrapper[7110]: I0313 01:14:24.626514 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz"]
Mar 13 01:14:24.634906 master-0 kubenswrapper[7110]: W0313 01:14:24.634850 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d4e6150_432c_4a11_b5a6_4d62dd701fc8.slice/crio-04292570efd4dfe3d1e939e8bffe6e36601999e7f873882bc75b098b25f400d8 WatchSource:0}: Error finding container 04292570efd4dfe3d1e939e8bffe6e36601999e7f873882bc75b098b25f400d8: Status 404 returned error can't find the container with id 04292570efd4dfe3d1e939e8bffe6e36601999e7f873882bc75b098b25f400d8
Mar 13 01:14:24.811717 master-0 kubenswrapper[7110]: I0313 01:14:24.811666 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-tq7n6"]
Mar 13 01:14:24.816381 master-0 kubenswrapper[7110]: I0313 01:14:24.815601 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zh5fh"]
Mar 13 01:14:24.825305 master-0 kubenswrapper[7110]: W0313 01:14:24.825266 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode68ab3cb_c372_45d9_a758_beaf4c213714.slice/crio-b6d8f47788f03e55d3eeba0c1c7d2d37374cfceb0c113268fcb5709d2ce6f28c WatchSource:0}: Error finding container b6d8f47788f03e55d3eeba0c1c7d2d37374cfceb0c113268fcb5709d2ce6f28c: Status 404 returned error can't find the container with id b6d8f47788f03e55d3eeba0c1c7d2d37374cfceb0c113268fcb5709d2ce6f28c
Mar 13 01:14:24.829406 master-0 kubenswrapper[7110]: I0313 01:14:24.829349 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8"]
Mar 13 01:14:24.833974 master-0 kubenswrapper[7110]: W0313 01:14:24.833945 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bd94289_7109_4419_9a51_bd289082b9f5.slice/crio-ae486c0fd97e5367a80e37dacc128e4883d8d7004786d375adff859d49d1bf07 WatchSource:0}: Error finding container ae486c0fd97e5367a80e37dacc128e4883d8d7004786d375adff859d49d1bf07: Status 404 returned error can't find the container with id ae486c0fd97e5367a80e37dacc128e4883d8d7004786d375adff859d49d1bf07
Mar 13 01:14:24.841937 master-0 kubenswrapper[7110]: I0313 01:14:24.841896 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-64bf9778cb-dszg5"]
Mar 13 01:14:24.842261 master-0 kubenswrapper[7110]: W0313 01:14:24.842194 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4976e608_07a0_4cef_8fdd_7cec3324b4b5.slice/crio-54c97778ddb99650ecad3d1658417118e7e1ab09d346fbb4a70157b6a2cd1822 WatchSource:0}: Error finding container 54c97778ddb99650ecad3d1658417118e7e1ab09d346fbb4a70157b6a2cd1822: Status 404 returned error can't find the container with id 54c97778ddb99650ecad3d1658417118e7e1ab09d346fbb4a70157b6a2cd1822
Mar 13 01:14:25.024679 master-0 kubenswrapper[7110]: I0313 01:14:25.024627 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq" event={"ID":"6c88187c-d011-4043-a6d3-4a8a7ec4e204","Type":"ContainerStarted","Data":"a628e92ac4f34b60f238b76d4fc08c8cab73f3dfd7d9d1150c95d95292472f21"}
Mar 13 01:14:25.039962 master-0 kubenswrapper[7110]: I0313 01:14:25.025845 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t" event={"ID":"7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71","Type":"ContainerStarted","Data":"339cc6449a0020231eef0158a934d4ae19f59a10f226d56a246c3dc49a8eebbe"}
Mar 13 01:14:25.039962 master-0 kubenswrapper[7110]: I0313 01:14:25.026768 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" event={"ID":"78d2cd80-23b9-426d-a7ac-1daa27668a47","Type":"ContainerStarted","Data":"b656a6f1c70c3edb8a88d273e10ec19afe3e617046ee184903275fabe65867b3"}
Mar 13 01:14:25.039962 master-0 kubenswrapper[7110]: I0313 01:14:25.029784 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj" event={"ID":"0d4e6150-432c-4a11-b5a6-4d62dd701fc8","Type":"ContainerStarted","Data":"7c8d8dec582874bddc8ecb01a398186bbbb8f8957e5e30a3464ac033e65b39ee"}
Mar 13 01:14:25.039962 master-0 kubenswrapper[7110]: I0313 01:14:25.029820 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj" event={"ID":"0d4e6150-432c-4a11-b5a6-4d62dd701fc8","Type":"ContainerStarted","Data":"04292570efd4dfe3d1e939e8bffe6e36601999e7f873882bc75b098b25f400d8"}
Mar 13 01:14:25.039962 master-0 kubenswrapper[7110]: I0313 01:14:25.030657 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-tq7n6" event={"ID":"2bd94289-7109-4419-9a51-bd289082b9f5","Type":"ContainerStarted","Data":"ae486c0fd97e5367a80e37dacc128e4883d8d7004786d375adff859d49d1bf07"}
Mar 13 01:14:25.039962 master-0 kubenswrapper[7110]: I0313 01:14:25.035410 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8" event={"ID":"4976e608-07a0-4cef-8fdd-7cec3324b4b5","Type":"ContainerStarted","Data":"deb7cab1203b4a1419f7e0b1b9f09a289fe9cf31f3d2c0d970bf2d1a0aef7884"}
Mar 13 01:14:25.039962 master-0 kubenswrapper[7110]: I0313 01:14:25.035486 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8" event={"ID":"4976e608-07a0-4cef-8fdd-7cec3324b4b5","Type":"ContainerStarted","Data":"54c97778ddb99650ecad3d1658417118e7e1ab09d346fbb4a70157b6a2cd1822"}
Mar 13 01:14:25.039962 master-0 kubenswrapper[7110]: I0313 01:14:25.036673 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz" event={"ID":"7f35cc1e-3376-4dbd-b215-2a32bf62cc71","Type":"ContainerStarted","Data":"d87d8f00b3827a6cc0d679f67563557686bd72d154906a3035b8f36d3110e48e"}
Mar 13 01:14:25.039962 master-0 kubenswrapper[7110]: I0313 01:14:25.038060 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zh5fh" event={"ID":"e68ab3cb-c372-45d9-a758-beaf4c213714","Type":"ContainerStarted","Data":"b6d8f47788f03e55d3eeba0c1c7d2d37374cfceb0c113268fcb5709d2ce6f28c"}
Mar 13 01:14:25.689732 master-0 kubenswrapper[7110]: I0313 01:14:25.689691 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"]
Mar 13 01:14:25.690284 master-0 kubenswrapper[7110]: I0313 01:14:25.690265 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0"
Mar 13 01:14:25.692655 master-0 kubenswrapper[7110]: I0313 01:14:25.692006 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-k6d2j"
Mar 13 01:14:25.692655 master-0 kubenswrapper[7110]: I0313 01:14:25.692198 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 13 01:14:26.372369 master-0 kubenswrapper[7110]: I0313 01:14:25.704351 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"]
Mar 13 01:14:26.374589 master-0 kubenswrapper[7110]: I0313 01:14:26.374331 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/47806631-9d60-4658-832d-f160f93f42ea-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"47806631-9d60-4658-832d-f160f93f42ea\") " pod="openshift-kube-apiserver/installer-1-master-0"
Mar 13 01:14:26.374589 master-0 kubenswrapper[7110]: I0313 01:14:26.374481 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/47806631-9d60-4658-832d-f160f93f42ea-kube-api-access\") pod \"installer-1-master-0\" (UID: \"47806631-9d60-4658-832d-f160f93f42ea\") " pod="openshift-kube-apiserver/installer-1-master-0"
Mar 13 01:14:26.374589 master-0 kubenswrapper[7110]: I0313 01:14:26.374536 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/47806631-9d60-4658-832d-f160f93f42ea-var-lock\") pod \"installer-1-master-0\" (UID: \"47806631-9d60-4658-832d-f160f93f42ea\") " pod="openshift-kube-apiserver/installer-1-master-0"
Mar 13 01:14:26.423400 master-0 kubenswrapper[7110]: I0313 01:14:26.422127 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"]
Mar 13 01:14:26.423400 master-0 kubenswrapper[7110]: I0313 01:14:26.422374 7110 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-4-master-0" podUID="415ee541-898b-41d1-98b0-c5e622776590" containerName="installer" containerID="cri-o://a98f3e2926a06fea2ad89c2df3c7c9f0cbd15416036c3ec44c6cff4245768ce7" gracePeriod=30
Mar 13 01:14:26.433979 master-0 kubenswrapper[7110]: I0313 01:14:26.433939 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8" event={"ID":"4976e608-07a0-4cef-8fdd-7cec3324b4b5","Type":"ContainerStarted","Data":"f24100e4c2fb97c1fbe480478718c8a70d101cd014aa2d5b877af732099a71ef"}
Mar 13 01:14:26.487203 master-0 kubenswrapper[7110]: I0313 01:14:26.481335 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/47806631-9d60-4658-832d-f160f93f42ea-var-lock\") pod \"installer-1-master-0\" (UID: \"47806631-9d60-4658-832d-f160f93f42ea\") " pod="openshift-kube-apiserver/installer-1-master-0"
Mar 13 01:14:26.487203 master-0 kubenswrapper[7110]: I0313 01:14:26.481455 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/47806631-9d60-4658-832d-f160f93f42ea-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"47806631-9d60-4658-832d-f160f93f42ea\") " pod="openshift-kube-apiserver/installer-1-master-0"
Mar 13 01:14:26.487203 master-0 kubenswrapper[7110]: I0313 01:14:26.481488 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/47806631-9d60-4658-832d-f160f93f42ea-kube-api-access\") pod \"installer-1-master-0\" (UID: \"47806631-9d60-4658-832d-f160f93f42ea\") " pod="openshift-kube-apiserver/installer-1-master-0"
Mar 13 01:14:26.487203 master-0 kubenswrapper[7110]: I0313 01:14:26.482517 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/47806631-9d60-4658-832d-f160f93f42ea-var-lock\") pod \"installer-1-master-0\" (UID: \"47806631-9d60-4658-832d-f160f93f42ea\") " pod="openshift-kube-apiserver/installer-1-master-0"
Mar 13 01:14:26.487203 master-0 kubenswrapper[7110]: I0313 01:14:26.483233 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/47806631-9d60-4658-832d-f160f93f42ea-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"47806631-9d60-4658-832d-f160f93f42ea\") " pod="openshift-kube-apiserver/installer-1-master-0"
Mar 13 01:14:26.508189 master-0 kubenswrapper[7110]: I0313 01:14:26.508129 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/47806631-9d60-4658-832d-f160f93f42ea-kube-api-access\") pod \"installer-1-master-0\" (UID: \"47806631-9d60-4658-832d-f160f93f42ea\") " pod="openshift-kube-apiserver/installer-1-master-0"
Mar 13 01:14:26.721434 master-0 kubenswrapper[7110]: I0313 01:14:26.721123 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0"
Mar 13 01:14:26.843890 master-0 kubenswrapper[7110]: I0313 01:14:26.843835 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6cc877748f-cvjwm"]
Mar 13 01:14:26.844305 master-0 kubenswrapper[7110]: I0313 01:14:26.844055 7110 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6cc877748f-cvjwm" podUID="2dd9c693-1890-427b-a011-7a3cc5f04bb5" containerName="controller-manager" containerID="cri-o://cd4a7d811e9bd6dc45032d0de284983ff5924aa9cdac4cfdf6e3b3750db023a5" gracePeriod=30
Mar 13 01:14:26.857686 master-0 kubenswrapper[7110]: I0313 01:14:26.857587 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d4d56c4b7-ndd42"]
Mar 13 01:14:26.858610 master-0 kubenswrapper[7110]: I0313 01:14:26.857922 7110 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-d4d56c4b7-ndd42" podUID="0e864dcb-fec1-47a6-9ce9-2bd2541aeb46" containerName="route-controller-manager" containerID="cri-o://a72dfd541e68db5ad4503d006f471f64ee52515824cead1c6d79426761a13afa" gracePeriod=30
Mar 13 01:14:28.445733 master-0 kubenswrapper[7110]: I0313 01:14:28.445691 7110 generic.go:334] "Generic (PLEG): container finished" podID="b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35" containerID="2703e40a08051a608961078f9b2c331b07b8ffa237b00eb643f4e928fb008663" exitCode=0
Mar 13 01:14:28.446252 master-0 kubenswrapper[7110]: I0313 01:14:28.445764 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mlslx" event={"ID":"b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35","Type":"ContainerDied","Data":"2703e40a08051a608961078f9b2c331b07b8ffa237b00eb643f4e928fb008663"}
Mar 13 01:14:28.446323 master-0 kubenswrapper[7110]: I0313 01:14:28.446301 7110 scope.go:117] "RemoveContainer" containerID="2703e40a08051a608961078f9b2c331b07b8ffa237b00eb643f4e928fb008663"
Mar 13 01:14:28.447804 master-0 kubenswrapper[7110]: I0313 01:14:28.447711 7110 generic.go:334] "Generic (PLEG): container finished" podID="2dd9c693-1890-427b-a011-7a3cc5f04bb5" containerID="cd4a7d811e9bd6dc45032d0de284983ff5924aa9cdac4cfdf6e3b3750db023a5" exitCode=0
Mar 13 01:14:28.447804 master-0 kubenswrapper[7110]: I0313 01:14:28.447789 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cc877748f-cvjwm" event={"ID":"2dd9c693-1890-427b-a011-7a3cc5f04bb5","Type":"ContainerDied","Data":"cd4a7d811e9bd6dc45032d0de284983ff5924aa9cdac4cfdf6e3b3750db023a5"}
Mar 13 01:14:28.450059 master-0 kubenswrapper[7110]: I0313 01:14:28.449978 7110 generic.go:334] "Generic (PLEG): container finished" podID="0e864dcb-fec1-47a6-9ce9-2bd2541aeb46" containerID="a72dfd541e68db5ad4503d006f471f64ee52515824cead1c6d79426761a13afa" exitCode=0
Mar 13 01:14:28.450059 master-0 kubenswrapper[7110]: I0313 01:14:28.450002 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d4d56c4b7-ndd42" event={"ID":"0e864dcb-fec1-47a6-9ce9-2bd2541aeb46","Type":"ContainerDied","Data":"a72dfd541e68db5ad4503d006f471f64ee52515824cead1c6d79426761a13afa"}
Mar 13 01:14:29.094804 master-0 kubenswrapper[7110]: I0313 01:14:29.094680 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"]
Mar 13 01:14:29.102520 master-0 kubenswrapper[7110]: I0313 01:14:29.102487 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0"
Mar 13 01:14:29.127164 master-0 kubenswrapper[7110]: I0313 01:14:29.127108 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/835bf21e-47fc-49b7-b2cd-72e051f2d601-kube-api-access\") pod \"installer-5-master-0\" (UID: \"835bf21e-47fc-49b7-b2cd-72e051f2d601\") " pod="openshift-kube-scheduler/installer-5-master-0"
Mar 13 01:14:29.127343 master-0 kubenswrapper[7110]: I0313 01:14:29.127204 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/835bf21e-47fc-49b7-b2cd-72e051f2d601-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"835bf21e-47fc-49b7-b2cd-72e051f2d601\") " pod="openshift-kube-scheduler/installer-5-master-0"
Mar 13 01:14:29.127377 master-0 kubenswrapper[7110]: I0313 01:14:29.127331 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/835bf21e-47fc-49b7-b2cd-72e051f2d601-var-lock\") pod \"installer-5-master-0\" (UID: \"835bf21e-47fc-49b7-b2cd-72e051f2d601\") " pod="openshift-kube-scheduler/installer-5-master-0"
Mar 13 01:14:29.229043 master-0 kubenswrapper[7110]: I0313 01:14:29.228832 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/835bf21e-47fc-49b7-b2cd-72e051f2d601-kube-api-access\") pod \"installer-5-master-0\" (UID: \"835bf21e-47fc-49b7-b2cd-72e051f2d601\") " pod="openshift-kube-scheduler/installer-5-master-0"
Mar 13 01:14:29.229043 master-0 kubenswrapper[7110]: I0313 01:14:29.228884 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/835bf21e-47fc-49b7-b2cd-72e051f2d601-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"835bf21e-47fc-49b7-b2cd-72e051f2d601\") " pod="openshift-kube-scheduler/installer-5-master-0"
Mar 13 01:14:29.229043 master-0 kubenswrapper[7110]: I0313 01:14:29.228925 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/835bf21e-47fc-49b7-b2cd-72e051f2d601-var-lock\") pod \"installer-5-master-0\" (UID: \"835bf21e-47fc-49b7-b2cd-72e051f2d601\") " pod="openshift-kube-scheduler/installer-5-master-0"
Mar 13 01:14:29.229043 master-0 kubenswrapper[7110]: I0313 01:14:29.229008 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/835bf21e-47fc-49b7-b2cd-72e051f2d601-var-lock\") pod \"installer-5-master-0\" (UID: \"835bf21e-47fc-49b7-b2cd-72e051f2d601\") " pod="openshift-kube-scheduler/installer-5-master-0"
Mar 13 01:14:29.229374 master-0 kubenswrapper[7110]: I0313 01:14:29.229311 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/835bf21e-47fc-49b7-b2cd-72e051f2d601-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"835bf21e-47fc-49b7-b2cd-72e051f2d601\") " pod="openshift-kube-scheduler/installer-5-master-0"
Mar 13 01:14:29.476831 master-0 kubenswrapper[7110]: I0313 01:14:29.471978 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"]
Mar 13 01:14:29.566425 master-0 kubenswrapper[7110]: I0313 01:14:29.566317 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/835bf21e-47fc-49b7-b2cd-72e051f2d601-kube-api-access\") pod \"installer-5-master-0\" (UID: \"835bf21e-47fc-49b7-b2cd-72e051f2d601\") " pod="openshift-kube-scheduler/installer-5-master-0"
Mar 13 01:14:29.733270 master-0 kubenswrapper[7110]: I0313 01:14:29.732651 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0"
Mar 13 01:14:30.175998 master-0 kubenswrapper[7110]: I0313 01:14:30.175954 7110 patch_prober.go:28] interesting pod/controller-manager-6cc877748f-cvjwm container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.50:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 01:14:30.175998 master-0 kubenswrapper[7110]: I0313 01:14:30.176015 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6cc877748f-cvjwm" podUID="2dd9c693-1890-427b-a011-7a3cc5f04bb5" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.50:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 01:14:31.238995 master-0 kubenswrapper[7110]: I0313 01:14:31.238795 7110 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d4d56c4b7-ndd42"
Mar 13 01:14:31.245762 master-0 kubenswrapper[7110]: I0313 01:14:31.245053 7110 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6cc877748f-cvjwm"
Mar 13 01:14:31.281042 master-0 kubenswrapper[7110]: I0313 01:14:31.279808 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-6686554ddc-w6qs7"]
Mar 13 01:14:31.281042 master-0 kubenswrapper[7110]: E0313 01:14:31.280120 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dd9c693-1890-427b-a011-7a3cc5f04bb5" containerName="controller-manager"
Mar 13 01:14:31.281042 master-0 kubenswrapper[7110]: I0313 01:14:31.280141 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dd9c693-1890-427b-a011-7a3cc5f04bb5" containerName="controller-manager"
Mar 13 01:14:31.281042 master-0 kubenswrapper[7110]: E0313 01:14:31.280152 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e864dcb-fec1-47a6-9ce9-2bd2541aeb46" containerName="route-controller-manager"
Mar 13 01:14:31.281042 master-0 kubenswrapper[7110]: I0313 01:14:31.280160 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e864dcb-fec1-47a6-9ce9-2bd2541aeb46" containerName="route-controller-manager"
Mar 13 01:14:31.281042 master-0 kubenswrapper[7110]: I0313 01:14:31.280305 7110 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dd9c693-1890-427b-a011-7a3cc5f04bb5" containerName="controller-manager"
Mar 13 01:14:31.281042 master-0 kubenswrapper[7110]: I0313 01:14:31.280319 7110 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e864dcb-fec1-47a6-9ce9-2bd2541aeb46" containerName="route-controller-manager"
Mar 13 01:14:31.281042 master-0 kubenswrapper[7110]: I0313 01:14:31.280840 7110 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-w6qs7" Mar 13 01:14:31.326204 master-0 kubenswrapper[7110]: I0313 01:14:31.326149 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-c7thh" Mar 13 01:14:31.327508 master-0 kubenswrapper[7110]: I0313 01:14:31.327474 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 13 01:14:31.343677 master-0 kubenswrapper[7110]: I0313 01:14:31.340703 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m"] Mar 13 01:14:31.343677 master-0 kubenswrapper[7110]: I0313 01:14:31.342122 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m" Mar 13 01:14:31.344895 master-0 kubenswrapper[7110]: I0313 01:14:31.344799 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m"] Mar 13 01:14:31.346875 master-0 kubenswrapper[7110]: I0313 01:14:31.346852 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-b6b87" Mar 13 01:14:31.349585 master-0 kubenswrapper[7110]: I0313 01:14:31.349413 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-6686554ddc-w6qs7"] Mar 13 01:14:31.354447 master-0 kubenswrapper[7110]: I0313 01:14:31.354410 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0e864dcb-fec1-47a6-9ce9-2bd2541aeb46-client-ca\") pod \"0e864dcb-fec1-47a6-9ce9-2bd2541aeb46\" (UID: \"0e864dcb-fec1-47a6-9ce9-2bd2541aeb46\") " Mar 13 
01:14:31.354564 master-0 kubenswrapper[7110]: I0313 01:14:31.354466 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxgpx\" (UniqueName: \"kubernetes.io/projected/0e864dcb-fec1-47a6-9ce9-2bd2541aeb46-kube-api-access-gxgpx\") pod \"0e864dcb-fec1-47a6-9ce9-2bd2541aeb46\" (UID: \"0e864dcb-fec1-47a6-9ce9-2bd2541aeb46\") " Mar 13 01:14:31.354564 master-0 kubenswrapper[7110]: I0313 01:14:31.354507 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqrcn\" (UniqueName: \"kubernetes.io/projected/2dd9c693-1890-427b-a011-7a3cc5f04bb5-kube-api-access-sqrcn\") pod \"2dd9c693-1890-427b-a011-7a3cc5f04bb5\" (UID: \"2dd9c693-1890-427b-a011-7a3cc5f04bb5\") " Mar 13 01:14:31.354564 master-0 kubenswrapper[7110]: I0313 01:14:31.354535 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2dd9c693-1890-427b-a011-7a3cc5f04bb5-client-ca\") pod \"2dd9c693-1890-427b-a011-7a3cc5f04bb5\" (UID: \"2dd9c693-1890-427b-a011-7a3cc5f04bb5\") " Mar 13 01:14:31.354564 master-0 kubenswrapper[7110]: I0313 01:14:31.354557 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e864dcb-fec1-47a6-9ce9-2bd2541aeb46-serving-cert\") pod \"0e864dcb-fec1-47a6-9ce9-2bd2541aeb46\" (UID: \"0e864dcb-fec1-47a6-9ce9-2bd2541aeb46\") " Mar 13 01:14:31.354702 master-0 kubenswrapper[7110]: I0313 01:14:31.354590 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dd9c693-1890-427b-a011-7a3cc5f04bb5-config\") pod \"2dd9c693-1890-427b-a011-7a3cc5f04bb5\" (UID: \"2dd9c693-1890-427b-a011-7a3cc5f04bb5\") " Mar 13 01:14:31.354702 master-0 kubenswrapper[7110]: I0313 01:14:31.354615 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/2dd9c693-1890-427b-a011-7a3cc5f04bb5-serving-cert\") pod \"2dd9c693-1890-427b-a011-7a3cc5f04bb5\" (UID: \"2dd9c693-1890-427b-a011-7a3cc5f04bb5\") " Mar 13 01:14:31.354702 master-0 kubenswrapper[7110]: I0313 01:14:31.354691 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2dd9c693-1890-427b-a011-7a3cc5f04bb5-proxy-ca-bundles\") pod \"2dd9c693-1890-427b-a011-7a3cc5f04bb5\" (UID: \"2dd9c693-1890-427b-a011-7a3cc5f04bb5\") " Mar 13 01:14:31.354826 master-0 kubenswrapper[7110]: I0313 01:14:31.354738 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e864dcb-fec1-47a6-9ce9-2bd2541aeb46-config\") pod \"0e864dcb-fec1-47a6-9ce9-2bd2541aeb46\" (UID: \"0e864dcb-fec1-47a6-9ce9-2bd2541aeb46\") " Mar 13 01:14:31.356123 master-0 kubenswrapper[7110]: I0313 01:14:31.355756 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e864dcb-fec1-47a6-9ce9-2bd2541aeb46-config" (OuterVolumeSpecName: "config") pod "0e864dcb-fec1-47a6-9ce9-2bd2541aeb46" (UID: "0e864dcb-fec1-47a6-9ce9-2bd2541aeb46"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:14:31.356476 master-0 kubenswrapper[7110]: I0313 01:14:31.356453 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e864dcb-fec1-47a6-9ce9-2bd2541aeb46-client-ca" (OuterVolumeSpecName: "client-ca") pod "0e864dcb-fec1-47a6-9ce9-2bd2541aeb46" (UID: "0e864dcb-fec1-47a6-9ce9-2bd2541aeb46"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:14:31.361947 master-0 kubenswrapper[7110]: I0313 01:14:31.361842 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dd9c693-1890-427b-a011-7a3cc5f04bb5-kube-api-access-sqrcn" (OuterVolumeSpecName: "kube-api-access-sqrcn") pod "2dd9c693-1890-427b-a011-7a3cc5f04bb5" (UID: "2dd9c693-1890-427b-a011-7a3cc5f04bb5"). InnerVolumeSpecName "kube-api-access-sqrcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:14:31.363737 master-0 kubenswrapper[7110]: I0313 01:14:31.363717 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e864dcb-fec1-47a6-9ce9-2bd2541aeb46-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0e864dcb-fec1-47a6-9ce9-2bd2541aeb46" (UID: "0e864dcb-fec1-47a6-9ce9-2bd2541aeb46"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:14:31.365767 master-0 kubenswrapper[7110]: I0313 01:14:31.365726 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e864dcb-fec1-47a6-9ce9-2bd2541aeb46-kube-api-access-gxgpx" (OuterVolumeSpecName: "kube-api-access-gxgpx") pod "0e864dcb-fec1-47a6-9ce9-2bd2541aeb46" (UID: "0e864dcb-fec1-47a6-9ce9-2bd2541aeb46"). InnerVolumeSpecName "kube-api-access-gxgpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:14:31.367918 master-0 kubenswrapper[7110]: I0313 01:14:31.367896 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dd9c693-1890-427b-a011-7a3cc5f04bb5-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2dd9c693-1890-427b-a011-7a3cc5f04bb5" (UID: "2dd9c693-1890-427b-a011-7a3cc5f04bb5"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:14:31.368215 master-0 kubenswrapper[7110]: I0313 01:14:31.368164 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dd9c693-1890-427b-a011-7a3cc5f04bb5-config" (OuterVolumeSpecName: "config") pod "2dd9c693-1890-427b-a011-7a3cc5f04bb5" (UID: "2dd9c693-1890-427b-a011-7a3cc5f04bb5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:14:31.369878 master-0 kubenswrapper[7110]: I0313 01:14:31.369831 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dd9c693-1890-427b-a011-7a3cc5f04bb5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2dd9c693-1890-427b-a011-7a3cc5f04bb5" (UID: "2dd9c693-1890-427b-a011-7a3cc5f04bb5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:14:31.378574 master-0 kubenswrapper[7110]: I0313 01:14:31.378480 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dd9c693-1890-427b-a011-7a3cc5f04bb5-client-ca" (OuterVolumeSpecName: "client-ca") pod "2dd9c693-1890-427b-a011-7a3cc5f04bb5" (UID: "2dd9c693-1890-427b-a011-7a3cc5f04bb5"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:14:31.456126 master-0 kubenswrapper[7110]: I0313 01:14:31.456091 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9462e2e-728d-4076-a876-31dbbd637581-serving-cert\") pod \"route-controller-manager-5dc55b5d9c-nlg6m\" (UID: \"a9462e2e-728d-4076-a876-31dbbd637581\") " pod="openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m" Mar 13 01:14:31.456292 master-0 kubenswrapper[7110]: I0313 01:14:31.456142 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sddd9\" (UniqueName: \"kubernetes.io/projected/a9462e2e-728d-4076-a876-31dbbd637581-kube-api-access-sddd9\") pod \"route-controller-manager-5dc55b5d9c-nlg6m\" (UID: \"a9462e2e-728d-4076-a876-31dbbd637581\") " pod="openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m" Mar 13 01:14:31.456292 master-0 kubenswrapper[7110]: I0313 01:14:31.456182 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/cada5bf2-e208-4fd8-bdf5-de8cad31a665-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6686554ddc-w6qs7\" (UID: \"cada5bf2-e208-4fd8-bdf5-de8cad31a665\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-w6qs7" Mar 13 01:14:31.456292 master-0 kubenswrapper[7110]: I0313 01:14:31.456203 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9462e2e-728d-4076-a876-31dbbd637581-client-ca\") pod \"route-controller-manager-5dc55b5d9c-nlg6m\" (UID: \"a9462e2e-728d-4076-a876-31dbbd637581\") " pod="openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m" Mar 13 
01:14:31.456292 master-0 kubenswrapper[7110]: I0313 01:14:31.456224 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9462e2e-728d-4076-a876-31dbbd637581-config\") pod \"route-controller-manager-5dc55b5d9c-nlg6m\" (UID: \"a9462e2e-728d-4076-a876-31dbbd637581\") " pod="openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m" Mar 13 01:14:31.456292 master-0 kubenswrapper[7110]: I0313 01:14:31.456242 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-929r9\" (UniqueName: \"kubernetes.io/projected/cada5bf2-e208-4fd8-bdf5-de8cad31a665-kube-api-access-929r9\") pod \"control-plane-machine-set-operator-6686554ddc-w6qs7\" (UID: \"cada5bf2-e208-4fd8-bdf5-de8cad31a665\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-w6qs7" Mar 13 01:14:31.456473 master-0 kubenswrapper[7110]: I0313 01:14:31.456321 7110 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dd9c693-1890-427b-a011-7a3cc5f04bb5-config\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:31.456473 master-0 kubenswrapper[7110]: I0313 01:14:31.456334 7110 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dd9c693-1890-427b-a011-7a3cc5f04bb5-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:31.456473 master-0 kubenswrapper[7110]: I0313 01:14:31.456344 7110 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2dd9c693-1890-427b-a011-7a3cc5f04bb5-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:31.456473 master-0 kubenswrapper[7110]: I0313 01:14:31.456352 7110 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0e864dcb-fec1-47a6-9ce9-2bd2541aeb46-config\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:31.456473 master-0 kubenswrapper[7110]: I0313 01:14:31.456361 7110 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0e864dcb-fec1-47a6-9ce9-2bd2541aeb46-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:31.456473 master-0 kubenswrapper[7110]: I0313 01:14:31.456369 7110 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxgpx\" (UniqueName: \"kubernetes.io/projected/0e864dcb-fec1-47a6-9ce9-2bd2541aeb46-kube-api-access-gxgpx\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:31.456473 master-0 kubenswrapper[7110]: I0313 01:14:31.456377 7110 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqrcn\" (UniqueName: \"kubernetes.io/projected/2dd9c693-1890-427b-a011-7a3cc5f04bb5-kube-api-access-sqrcn\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:31.456473 master-0 kubenswrapper[7110]: I0313 01:14:31.456385 7110 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2dd9c693-1890-427b-a011-7a3cc5f04bb5-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:31.456473 master-0 kubenswrapper[7110]: I0313 01:14:31.456393 7110 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e864dcb-fec1-47a6-9ce9-2bd2541aeb46-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:31.470788 master-0 kubenswrapper[7110]: I0313 01:14:31.470748 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cc877748f-cvjwm" event={"ID":"2dd9c693-1890-427b-a011-7a3cc5f04bb5","Type":"ContainerDied","Data":"b48dbccd168c65099a9c5c6976a0c0e5771f2e97e68c7ababb65981cfa725005"} Mar 13 01:14:31.470935 master-0 kubenswrapper[7110]: I0313 01:14:31.470803 7110 scope.go:117] "RemoveContainer" 
containerID="cd4a7d811e9bd6dc45032d0de284983ff5924aa9cdac4cfdf6e3b3750db023a5" Mar 13 01:14:31.471120 master-0 kubenswrapper[7110]: I0313 01:14:31.471044 7110 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6cc877748f-cvjwm" Mar 13 01:14:31.473813 master-0 kubenswrapper[7110]: I0313 01:14:31.473794 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d4d56c4b7-ndd42" event={"ID":"0e864dcb-fec1-47a6-9ce9-2bd2541aeb46","Type":"ContainerDied","Data":"a0846383dbb010c8f2d8bcb3c1991688b6d64fdcfd195932c43a39ad6e3d6016"} Mar 13 01:14:31.473963 master-0 kubenswrapper[7110]: I0313 01:14:31.473947 7110 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d4d56c4b7-ndd42" Mar 13 01:14:31.504454 master-0 kubenswrapper[7110]: I0313 01:14:31.504412 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6cc877748f-cvjwm"] Mar 13 01:14:31.510993 master-0 kubenswrapper[7110]: I0313 01:14:31.510948 7110 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6cc877748f-cvjwm"] Mar 13 01:14:31.525847 master-0 kubenswrapper[7110]: I0313 01:14:31.525795 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d4d56c4b7-ndd42"] Mar 13 01:14:31.533448 master-0 kubenswrapper[7110]: I0313 01:14:31.533386 7110 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d4d56c4b7-ndd42"] Mar 13 01:14:31.558264 master-0 kubenswrapper[7110]: I0313 01:14:31.557309 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/cada5bf2-e208-4fd8-bdf5-de8cad31a665-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6686554ddc-w6qs7\" (UID: \"cada5bf2-e208-4fd8-bdf5-de8cad31a665\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-w6qs7" Mar 13 01:14:31.558264 master-0 kubenswrapper[7110]: I0313 01:14:31.557394 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9462e2e-728d-4076-a876-31dbbd637581-client-ca\") pod \"route-controller-manager-5dc55b5d9c-nlg6m\" (UID: \"a9462e2e-728d-4076-a876-31dbbd637581\") " pod="openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m" Mar 13 01:14:31.558264 master-0 kubenswrapper[7110]: I0313 01:14:31.557443 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9462e2e-728d-4076-a876-31dbbd637581-config\") pod \"route-controller-manager-5dc55b5d9c-nlg6m\" (UID: \"a9462e2e-728d-4076-a876-31dbbd637581\") " pod="openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m" Mar 13 01:14:31.558264 master-0 kubenswrapper[7110]: I0313 01:14:31.557480 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-929r9\" (UniqueName: \"kubernetes.io/projected/cada5bf2-e208-4fd8-bdf5-de8cad31a665-kube-api-access-929r9\") pod \"control-plane-machine-set-operator-6686554ddc-w6qs7\" (UID: \"cada5bf2-e208-4fd8-bdf5-de8cad31a665\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-w6qs7" Mar 13 01:14:31.558264 master-0 kubenswrapper[7110]: I0313 01:14:31.557551 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9462e2e-728d-4076-a876-31dbbd637581-serving-cert\") pod \"route-controller-manager-5dc55b5d9c-nlg6m\" (UID: 
\"a9462e2e-728d-4076-a876-31dbbd637581\") " pod="openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m" Mar 13 01:14:31.558264 master-0 kubenswrapper[7110]: I0313 01:14:31.557603 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sddd9\" (UniqueName: \"kubernetes.io/projected/a9462e2e-728d-4076-a876-31dbbd637581-kube-api-access-sddd9\") pod \"route-controller-manager-5dc55b5d9c-nlg6m\" (UID: \"a9462e2e-728d-4076-a876-31dbbd637581\") " pod="openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m" Mar 13 01:14:31.559421 master-0 kubenswrapper[7110]: I0313 01:14:31.559379 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9462e2e-728d-4076-a876-31dbbd637581-config\") pod \"route-controller-manager-5dc55b5d9c-nlg6m\" (UID: \"a9462e2e-728d-4076-a876-31dbbd637581\") " pod="openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m" Mar 13 01:14:31.559941 master-0 kubenswrapper[7110]: I0313 01:14:31.559910 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9462e2e-728d-4076-a876-31dbbd637581-client-ca\") pod \"route-controller-manager-5dc55b5d9c-nlg6m\" (UID: \"a9462e2e-728d-4076-a876-31dbbd637581\") " pod="openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m" Mar 13 01:14:31.568936 master-0 kubenswrapper[7110]: I0313 01:14:31.568911 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9462e2e-728d-4076-a876-31dbbd637581-serving-cert\") pod \"route-controller-manager-5dc55b5d9c-nlg6m\" (UID: \"a9462e2e-728d-4076-a876-31dbbd637581\") " pod="openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m" Mar 13 01:14:31.570870 master-0 kubenswrapper[7110]: I0313 01:14:31.570836 7110 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/cada5bf2-e208-4fd8-bdf5-de8cad31a665-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6686554ddc-w6qs7\" (UID: \"cada5bf2-e208-4fd8-bdf5-de8cad31a665\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-w6qs7" Mar 13 01:14:31.577987 master-0 kubenswrapper[7110]: I0313 01:14:31.577966 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sddd9\" (UniqueName: \"kubernetes.io/projected/a9462e2e-728d-4076-a876-31dbbd637581-kube-api-access-sddd9\") pod \"route-controller-manager-5dc55b5d9c-nlg6m\" (UID: \"a9462e2e-728d-4076-a876-31dbbd637581\") " pod="openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m" Mar 13 01:14:31.588329 master-0 kubenswrapper[7110]: I0313 01:14:31.588308 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-929r9\" (UniqueName: \"kubernetes.io/projected/cada5bf2-e208-4fd8-bdf5-de8cad31a665-kube-api-access-929r9\") pod \"control-plane-machine-set-operator-6686554ddc-w6qs7\" (UID: \"cada5bf2-e208-4fd8-bdf5-de8cad31a665\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-w6qs7" Mar 13 01:14:31.651500 master-0 kubenswrapper[7110]: I0313 01:14:31.651391 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-w6qs7" Mar 13 01:14:31.690498 master-0 kubenswrapper[7110]: I0313 01:14:31.690464 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m" Mar 13 01:14:32.752114 master-0 kubenswrapper[7110]: I0313 01:14:32.752065 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"] Mar 13 01:14:32.752543 master-0 kubenswrapper[7110]: I0313 01:14:32.752324 7110 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/installer-2-master-0" podUID="0eeaf64d-deb4-4feb-a29c-44900e8b18ec" containerName="installer" containerID="cri-o://a7e7bc44522c41668981d5bed22716c114ebbce43508729e2e1c0e0baa44f1d7" gracePeriod=30 Mar 13 01:14:32.924490 master-0 kubenswrapper[7110]: I0313 01:14:32.924415 7110 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e864dcb-fec1-47a6-9ce9-2bd2541aeb46" path="/var/lib/kubelet/pods/0e864dcb-fec1-47a6-9ce9-2bd2541aeb46/volumes" Mar 13 01:14:32.925796 master-0 kubenswrapper[7110]: I0313 01:14:32.925754 7110 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dd9c693-1890-427b-a011-7a3cc5f04bb5" path="/var/lib/kubelet/pods/2dd9c693-1890-427b-a011-7a3cc5f04bb5/volumes" Mar 13 01:14:33.384463 master-0 kubenswrapper[7110]: I0313 01:14:33.384360 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-757fb68448-cj9p5"] Mar 13 01:14:33.385202 master-0 kubenswrapper[7110]: I0313 01:14:33.384951 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" Mar 13 01:14:33.387800 master-0 kubenswrapper[7110]: I0313 01:14:33.387742 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-bjp2n" Mar 13 01:14:33.387800 master-0 kubenswrapper[7110]: I0313 01:14:33.387791 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 01:14:33.390754 master-0 kubenswrapper[7110]: I0313 01:14:33.390713 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 01:14:33.391497 master-0 kubenswrapper[7110]: I0313 01:14:33.391413 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 01:14:33.392190 master-0 kubenswrapper[7110]: I0313 01:14:33.392114 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 01:14:33.392852 master-0 kubenswrapper[7110]: I0313 01:14:33.392611 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 01:14:33.398463 master-0 kubenswrapper[7110]: I0313 01:14:33.398435 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 01:14:33.402184 master-0 kubenswrapper[7110]: I0313 01:14:33.402149 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-757fb68448-cj9p5"] Mar 13 01:14:33.491540 master-0 kubenswrapper[7110]: I0313 01:14:33.491489 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/631f5719-2083-4c99-92cb-2ddc04022d86-serving-cert\") pod 
\"controller-manager-757fb68448-cj9p5\" (UID: \"631f5719-2083-4c99-92cb-2ddc04022d86\") " pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" Mar 13 01:14:33.491747 master-0 kubenswrapper[7110]: I0313 01:14:33.491550 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/631f5719-2083-4c99-92cb-2ddc04022d86-proxy-ca-bundles\") pod \"controller-manager-757fb68448-cj9p5\" (UID: \"631f5719-2083-4c99-92cb-2ddc04022d86\") " pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" Mar 13 01:14:33.491747 master-0 kubenswrapper[7110]: I0313 01:14:33.491681 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/631f5719-2083-4c99-92cb-2ddc04022d86-config\") pod \"controller-manager-757fb68448-cj9p5\" (UID: \"631f5719-2083-4c99-92cb-2ddc04022d86\") " pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" Mar 13 01:14:33.491747 master-0 kubenswrapper[7110]: I0313 01:14:33.491735 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvn9n\" (UniqueName: \"kubernetes.io/projected/631f5719-2083-4c99-92cb-2ddc04022d86-kube-api-access-jvn9n\") pod \"controller-manager-757fb68448-cj9p5\" (UID: \"631f5719-2083-4c99-92cb-2ddc04022d86\") " pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" Mar 13 01:14:33.491838 master-0 kubenswrapper[7110]: I0313 01:14:33.491778 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/631f5719-2083-4c99-92cb-2ddc04022d86-client-ca\") pod \"controller-manager-757fb68448-cj9p5\" (UID: \"631f5719-2083-4c99-92cb-2ddc04022d86\") " pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" Mar 13 01:14:33.594960 master-0 
kubenswrapper[7110]: I0313 01:14:33.593252 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvn9n\" (UniqueName: \"kubernetes.io/projected/631f5719-2083-4c99-92cb-2ddc04022d86-kube-api-access-jvn9n\") pod \"controller-manager-757fb68448-cj9p5\" (UID: \"631f5719-2083-4c99-92cb-2ddc04022d86\") " pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" Mar 13 01:14:33.594960 master-0 kubenswrapper[7110]: I0313 01:14:33.593338 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/631f5719-2083-4c99-92cb-2ddc04022d86-client-ca\") pod \"controller-manager-757fb68448-cj9p5\" (UID: \"631f5719-2083-4c99-92cb-2ddc04022d86\") " pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" Mar 13 01:14:33.594960 master-0 kubenswrapper[7110]: I0313 01:14:33.593365 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/631f5719-2083-4c99-92cb-2ddc04022d86-serving-cert\") pod \"controller-manager-757fb68448-cj9p5\" (UID: \"631f5719-2083-4c99-92cb-2ddc04022d86\") " pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" Mar 13 01:14:33.594960 master-0 kubenswrapper[7110]: I0313 01:14:33.593380 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/631f5719-2083-4c99-92cb-2ddc04022d86-proxy-ca-bundles\") pod \"controller-manager-757fb68448-cj9p5\" (UID: \"631f5719-2083-4c99-92cb-2ddc04022d86\") " pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" Mar 13 01:14:33.594960 master-0 kubenswrapper[7110]: I0313 01:14:33.593413 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/631f5719-2083-4c99-92cb-2ddc04022d86-config\") pod 
\"controller-manager-757fb68448-cj9p5\" (UID: \"631f5719-2083-4c99-92cb-2ddc04022d86\") " pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" Mar 13 01:14:33.596247 master-0 kubenswrapper[7110]: I0313 01:14:33.596208 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/631f5719-2083-4c99-92cb-2ddc04022d86-client-ca\") pod \"controller-manager-757fb68448-cj9p5\" (UID: \"631f5719-2083-4c99-92cb-2ddc04022d86\") " pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" Mar 13 01:14:33.597087 master-0 kubenswrapper[7110]: I0313 01:14:33.596938 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/631f5719-2083-4c99-92cb-2ddc04022d86-proxy-ca-bundles\") pod \"controller-manager-757fb68448-cj9p5\" (UID: \"631f5719-2083-4c99-92cb-2ddc04022d86\") " pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" Mar 13 01:14:33.598666 master-0 kubenswrapper[7110]: I0313 01:14:33.597938 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/631f5719-2083-4c99-92cb-2ddc04022d86-config\") pod \"controller-manager-757fb68448-cj9p5\" (UID: \"631f5719-2083-4c99-92cb-2ddc04022d86\") " pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" Mar 13 01:14:33.600022 master-0 kubenswrapper[7110]: I0313 01:14:33.599956 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/631f5719-2083-4c99-92cb-2ddc04022d86-serving-cert\") pod \"controller-manager-757fb68448-cj9p5\" (UID: \"631f5719-2083-4c99-92cb-2ddc04022d86\") " pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" Mar 13 01:14:33.611760 master-0 kubenswrapper[7110]: I0313 01:14:33.611729 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jvn9n\" (UniqueName: \"kubernetes.io/projected/631f5719-2083-4c99-92cb-2ddc04022d86-kube-api-access-jvn9n\") pod \"controller-manager-757fb68448-cj9p5\" (UID: \"631f5719-2083-4c99-92cb-2ddc04022d86\") " pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" Mar 13 01:14:33.704250 master-0 kubenswrapper[7110]: I0313 01:14:33.702349 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" Mar 13 01:14:34.168072 master-0 kubenswrapper[7110]: I0313 01:14:34.168012 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-955fcfb87-jvdz8"] Mar 13 01:14:34.168857 master-0 kubenswrapper[7110]: I0313 01:14:34.168825 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jvdz8" Mar 13 01:14:34.171115 master-0 kubenswrapper[7110]: I0313 01:14:34.171084 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-bclr4" Mar 13 01:14:34.171115 master-0 kubenswrapper[7110]: I0313 01:14:34.171100 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 13 01:14:34.171264 master-0 kubenswrapper[7110]: I0313 01:14:34.171192 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 13 01:14:34.171264 master-0 kubenswrapper[7110]: I0313 01:14:34.171192 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 13 01:14:34.171355 master-0 kubenswrapper[7110]: I0313 01:14:34.171327 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 13 01:14:34.174023 
master-0 kubenswrapper[7110]: I0313 01:14:34.173986 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 13 01:14:34.304026 master-0 kubenswrapper[7110]: I0313 01:14:34.302861 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/23fbbe97-906a-4bce-9ab0-bf633d4f9dd7-auth-proxy-config\") pod \"machine-approver-955fcfb87-jvdz8\" (UID: \"23fbbe97-906a-4bce-9ab0-bf633d4f9dd7\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jvdz8" Mar 13 01:14:34.304026 master-0 kubenswrapper[7110]: I0313 01:14:34.302947 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99zzf\" (UniqueName: \"kubernetes.io/projected/23fbbe97-906a-4bce-9ab0-bf633d4f9dd7-kube-api-access-99zzf\") pod \"machine-approver-955fcfb87-jvdz8\" (UID: \"23fbbe97-906a-4bce-9ab0-bf633d4f9dd7\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jvdz8" Mar 13 01:14:34.304026 master-0 kubenswrapper[7110]: I0313 01:14:34.303219 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23fbbe97-906a-4bce-9ab0-bf633d4f9dd7-config\") pod \"machine-approver-955fcfb87-jvdz8\" (UID: \"23fbbe97-906a-4bce-9ab0-bf633d4f9dd7\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jvdz8" Mar 13 01:14:34.304026 master-0 kubenswrapper[7110]: I0313 01:14:34.303272 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/23fbbe97-906a-4bce-9ab0-bf633d4f9dd7-machine-approver-tls\") pod \"machine-approver-955fcfb87-jvdz8\" (UID: \"23fbbe97-906a-4bce-9ab0-bf633d4f9dd7\") " 
pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jvdz8" Mar 13 01:14:34.404921 master-0 kubenswrapper[7110]: I0313 01:14:34.404880 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/23fbbe97-906a-4bce-9ab0-bf633d4f9dd7-machine-approver-tls\") pod \"machine-approver-955fcfb87-jvdz8\" (UID: \"23fbbe97-906a-4bce-9ab0-bf633d4f9dd7\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jvdz8" Mar 13 01:14:34.405183 master-0 kubenswrapper[7110]: I0313 01:14:34.404934 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/23fbbe97-906a-4bce-9ab0-bf633d4f9dd7-auth-proxy-config\") pod \"machine-approver-955fcfb87-jvdz8\" (UID: \"23fbbe97-906a-4bce-9ab0-bf633d4f9dd7\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jvdz8" Mar 13 01:14:34.405183 master-0 kubenswrapper[7110]: I0313 01:14:34.404970 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99zzf\" (UniqueName: \"kubernetes.io/projected/23fbbe97-906a-4bce-9ab0-bf633d4f9dd7-kube-api-access-99zzf\") pod \"machine-approver-955fcfb87-jvdz8\" (UID: \"23fbbe97-906a-4bce-9ab0-bf633d4f9dd7\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jvdz8" Mar 13 01:14:34.405183 master-0 kubenswrapper[7110]: I0313 01:14:34.405098 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23fbbe97-906a-4bce-9ab0-bf633d4f9dd7-config\") pod \"machine-approver-955fcfb87-jvdz8\" (UID: \"23fbbe97-906a-4bce-9ab0-bf633d4f9dd7\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jvdz8" Mar 13 01:14:34.405794 master-0 kubenswrapper[7110]: I0313 01:14:34.405777 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/23fbbe97-906a-4bce-9ab0-bf633d4f9dd7-config\") pod \"machine-approver-955fcfb87-jvdz8\" (UID: \"23fbbe97-906a-4bce-9ab0-bf633d4f9dd7\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jvdz8" Mar 13 01:14:34.406134 master-0 kubenswrapper[7110]: I0313 01:14:34.406100 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/23fbbe97-906a-4bce-9ab0-bf633d4f9dd7-auth-proxy-config\") pod \"machine-approver-955fcfb87-jvdz8\" (UID: \"23fbbe97-906a-4bce-9ab0-bf633d4f9dd7\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jvdz8" Mar 13 01:14:34.408772 master-0 kubenswrapper[7110]: I0313 01:14:34.408715 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/23fbbe97-906a-4bce-9ab0-bf633d4f9dd7-machine-approver-tls\") pod \"machine-approver-955fcfb87-jvdz8\" (UID: \"23fbbe97-906a-4bce-9ab0-bf633d4f9dd7\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jvdz8" Mar 13 01:14:34.423594 master-0 kubenswrapper[7110]: I0313 01:14:34.423526 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99zzf\" (UniqueName: \"kubernetes.io/projected/23fbbe97-906a-4bce-9ab0-bf633d4f9dd7-kube-api-access-99zzf\") pod \"machine-approver-955fcfb87-jvdz8\" (UID: \"23fbbe97-906a-4bce-9ab0-bf633d4f9dd7\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jvdz8" Mar 13 01:14:34.493918 master-0 kubenswrapper[7110]: I0313 01:14:34.493867 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jvdz8" Mar 13 01:14:34.759001 master-0 kubenswrapper[7110]: I0313 01:14:34.758951 7110 scope.go:117] "RemoveContainer" containerID="a72dfd541e68db5ad4503d006f471f64ee52515824cead1c6d79426761a13afa" Mar 13 01:14:35.076561 master-0 kubenswrapper[7110]: I0313 01:14:35.073146 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"] Mar 13 01:14:35.174461 master-0 kubenswrapper[7110]: I0313 01:14:35.174324 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_0eeaf64d-deb4-4feb-a29c-44900e8b18ec/installer/0.log" Mar 13 01:14:35.174461 master-0 kubenswrapper[7110]: I0313 01:14:35.174397 7110 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Mar 13 01:14:35.281199 master-0 kubenswrapper[7110]: I0313 01:14:35.280300 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"] Mar 13 01:14:35.297481 master-0 kubenswrapper[7110]: I0313 01:14:35.297428 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Mar 13 01:14:35.318143 master-0 kubenswrapper[7110]: I0313 01:14:35.318061 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0eeaf64d-deb4-4feb-a29c-44900e8b18ec-kube-api-access\") pod \"0eeaf64d-deb4-4feb-a29c-44900e8b18ec\" (UID: \"0eeaf64d-deb4-4feb-a29c-44900e8b18ec\") " Mar 13 01:14:35.318197 master-0 kubenswrapper[7110]: I0313 01:14:35.318150 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0eeaf64d-deb4-4feb-a29c-44900e8b18ec-var-lock\") pod \"0eeaf64d-deb4-4feb-a29c-44900e8b18ec\" (UID: 
\"0eeaf64d-deb4-4feb-a29c-44900e8b18ec\") " Mar 13 01:14:35.318229 master-0 kubenswrapper[7110]: I0313 01:14:35.318211 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0eeaf64d-deb4-4feb-a29c-44900e8b18ec-kubelet-dir\") pod \"0eeaf64d-deb4-4feb-a29c-44900e8b18ec\" (UID: \"0eeaf64d-deb4-4feb-a29c-44900e8b18ec\") " Mar 13 01:14:35.318521 master-0 kubenswrapper[7110]: I0313 01:14:35.318432 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0eeaf64d-deb4-4feb-a29c-44900e8b18ec-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0eeaf64d-deb4-4feb-a29c-44900e8b18ec" (UID: "0eeaf64d-deb4-4feb-a29c-44900e8b18ec"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:14:35.318521 master-0 kubenswrapper[7110]: I0313 01:14:35.318474 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0eeaf64d-deb4-4feb-a29c-44900e8b18ec-var-lock" (OuterVolumeSpecName: "var-lock") pod "0eeaf64d-deb4-4feb-a29c-44900e8b18ec" (UID: "0eeaf64d-deb4-4feb-a29c-44900e8b18ec"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:14:35.337461 master-0 kubenswrapper[7110]: I0313 01:14:35.337239 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0eeaf64d-deb4-4feb-a29c-44900e8b18ec-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0eeaf64d-deb4-4feb-a29c-44900e8b18ec" (UID: "0eeaf64d-deb4-4feb-a29c-44900e8b18ec"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:14:35.403760 master-0 kubenswrapper[7110]: I0313 01:14:35.403723 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qslvf"] Mar 13 01:14:35.404099 master-0 kubenswrapper[7110]: E0313 01:14:35.404086 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eeaf64d-deb4-4feb-a29c-44900e8b18ec" containerName="installer" Mar 13 01:14:35.404156 master-0 kubenswrapper[7110]: I0313 01:14:35.404147 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eeaf64d-deb4-4feb-a29c-44900e8b18ec" containerName="installer" Mar 13 01:14:35.404291 master-0 kubenswrapper[7110]: I0313 01:14:35.404279 7110 memory_manager.go:354] "RemoveStaleState removing state" podUID="0eeaf64d-deb4-4feb-a29c-44900e8b18ec" containerName="installer" Mar 13 01:14:35.404818 master-0 kubenswrapper[7110]: I0313 01:14:35.404804 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qslvf" Mar 13 01:14:35.414545 master-0 kubenswrapper[7110]: I0313 01:14:35.413370 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt" Mar 13 01:14:35.420647 master-0 kubenswrapper[7110]: I0313 01:14:35.418556 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Mar 13 01:14:35.420647 master-0 kubenswrapper[7110]: I0313 01:14:35.419333 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt" Mar 13 01:14:35.420647 master-0 kubenswrapper[7110]: I0313 01:14:35.419876 7110 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0eeaf64d-deb4-4feb-a29c-44900e8b18ec-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:35.420647 master-0 kubenswrapper[7110]: I0313 01:14:35.419888 7110 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0eeaf64d-deb4-4feb-a29c-44900e8b18ec-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:35.420647 master-0 kubenswrapper[7110]: I0313 01:14:35.419897 7110 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0eeaf64d-deb4-4feb-a29c-44900e8b18ec-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:35.423815 master-0 kubenswrapper[7110]: I0313 01:14:35.423793 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qslvf"] Mar 13 01:14:35.433266 master-0 kubenswrapper[7110]: I0313 01:14:35.428785 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" 
Mar 13 01:14:35.450360 master-0 kubenswrapper[7110]: I0313 01:14:35.450316 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-6686554ddc-w6qs7"] Mar 13 01:14:35.461989 master-0 kubenswrapper[7110]: I0313 01:14:35.461965 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Mar 13 01:14:35.462586 master-0 kubenswrapper[7110]: I0313 01:14:35.462572 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Mar 13 01:14:35.484944 master-0 kubenswrapper[7110]: I0313 01:14:35.484083 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Mar 13 01:14:35.518784 master-0 kubenswrapper[7110]: I0313 01:14:35.508291 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m"] Mar 13 01:14:35.522334 master-0 kubenswrapper[7110]: I0313 01:14:35.521479 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/90e6e63d-3cf2-4bb5-883f-6219a0b52c3a-cco-trusted-ca\") pod \"cloud-credential-operator-55d85b7b47-qslvf\" (UID: \"90e6e63d-3cf2-4bb5-883f-6219a0b52c3a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qslvf" Mar 13 01:14:35.522334 master-0 kubenswrapper[7110]: I0313 01:14:35.521525 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hhwp\" (UniqueName: \"kubernetes.io/projected/90e6e63d-3cf2-4bb5-883f-6219a0b52c3a-kube-api-access-6hhwp\") pod \"cloud-credential-operator-55d85b7b47-qslvf\" (UID: \"90e6e63d-3cf2-4bb5-883f-6219a0b52c3a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qslvf" Mar 13 
01:14:35.522334 master-0 kubenswrapper[7110]: I0313 01:14:35.521572 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/90e6e63d-3cf2-4bb5-883f-6219a0b52c3a-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-55d85b7b47-qslvf\" (UID: \"90e6e63d-3cf2-4bb5-883f-6219a0b52c3a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qslvf" Mar 13 01:14:35.527982 master-0 kubenswrapper[7110]: W0313 01:14:35.527955 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcada5bf2_e208_4fd8_bdf5_de8cad31a665.slice/crio-fdd7225d6e1fca05e0159dd0bc6d0e1e7a5fef522b8f6ef01baea69ca9c582b3 WatchSource:0}: Error finding container fdd7225d6e1fca05e0159dd0bc6d0e1e7a5fef522b8f6ef01baea69ca9c582b3: Status 404 returned error can't find the container with id fdd7225d6e1fca05e0159dd0bc6d0e1e7a5fef522b8f6ef01baea69ca9c582b3 Mar 13 01:14:35.572985 master-0 kubenswrapper[7110]: I0313 01:14:35.572944 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz" event={"ID":"7f35cc1e-3376-4dbd-b215-2a32bf62cc71","Type":"ContainerStarted","Data":"32e6a9ef39eb23d211cd1a76164dad7d4bb127d13bda96645831dee3624336c5"} Mar 13 01:14:35.574916 master-0 kubenswrapper[7110]: I0313 01:14:35.573830 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz" Mar 13 01:14:35.579962 master-0 kubenswrapper[7110]: I0313 01:14:35.579944 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz" Mar 13 01:14:35.584356 master-0 kubenswrapper[7110]: I0313 01:14:35.584335 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-757fb68448-cj9p5"] Mar 13 01:14:35.588990 master-0 kubenswrapper[7110]: I0313 01:14:35.588964 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jvdz8" event={"ID":"23fbbe97-906a-4bce-9ab0-bf633d4f9dd7","Type":"ContainerStarted","Data":"d17ca8713f562a177a0df2d89877ffc2f34db776972d8955d2c457d6bbc16214"} Mar 13 01:14:35.589139 master-0 kubenswrapper[7110]: I0313 01:14:35.589123 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jvdz8" event={"ID":"23fbbe97-906a-4bce-9ab0-bf633d4f9dd7","Type":"ContainerStarted","Data":"1f0f66c570b42159d22d17bc71dbe363049699dde742de12df165fbf93335972"} Mar 13 01:14:35.615960 master-0 kubenswrapper[7110]: I0313 01:14:35.615925 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zh5fh" event={"ID":"e68ab3cb-c372-45d9-a758-beaf4c213714","Type":"ContainerStarted","Data":"ec1033057b9e888b0f7503a93c72842a5c6f60d6f3a4f15b0b1a235b091ecfbd"} Mar 13 01:14:35.620646 master-0 kubenswrapper[7110]: I0313 01:14:35.619401 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t" event={"ID":"7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71","Type":"ContainerStarted","Data":"4b247a210c27022b1561eb41d301780fd2c27c79755cbefbf94d558d94963294"} Mar 13 01:14:35.629091 master-0 kubenswrapper[7110]: I0313 01:14:35.629029 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/29e096ea-ca9d-477b-b0aa-1d10244d51d9-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"29e096ea-ca9d-477b-b0aa-1d10244d51d9\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 13 01:14:35.629267 master-0 kubenswrapper[7110]: I0313 01:14:35.629099 7110 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/29e096ea-ca9d-477b-b0aa-1d10244d51d9-var-lock\") pod \"installer-3-master-0\" (UID: \"29e096ea-ca9d-477b-b0aa-1d10244d51d9\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 13 01:14:35.629365 master-0 kubenswrapper[7110]: I0313 01:14:35.629337 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_0eeaf64d-deb4-4feb-a29c-44900e8b18ec/installer/0.log" Mar 13 01:14:35.629424 master-0 kubenswrapper[7110]: I0313 01:14:35.629382 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/90e6e63d-3cf2-4bb5-883f-6219a0b52c3a-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-55d85b7b47-qslvf\" (UID: \"90e6e63d-3cf2-4bb5-883f-6219a0b52c3a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qslvf" Mar 13 01:14:35.629424 master-0 kubenswrapper[7110]: I0313 01:14:35.629403 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"0eeaf64d-deb4-4feb-a29c-44900e8b18ec","Type":"ContainerDied","Data":"a7e7bc44522c41668981d5bed22716c114ebbce43508729e2e1c0e0baa44f1d7"} Mar 13 01:14:35.629489 master-0 kubenswrapper[7110]: I0313 01:14:35.629437 7110 scope.go:117] "RemoveContainer" containerID="a7e7bc44522c41668981d5bed22716c114ebbce43508729e2e1c0e0baa44f1d7" Mar 13 01:14:35.629489 master-0 kubenswrapper[7110]: I0313 01:14:35.629451 7110 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Mar 13 01:14:35.629553 master-0 kubenswrapper[7110]: I0313 01:14:35.629493 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/90e6e63d-3cf2-4bb5-883f-6219a0b52c3a-cco-trusted-ca\") pod \"cloud-credential-operator-55d85b7b47-qslvf\" (UID: \"90e6e63d-3cf2-4bb5-883f-6219a0b52c3a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qslvf" Mar 13 01:14:35.629553 master-0 kubenswrapper[7110]: I0313 01:14:35.629383 7110 generic.go:334] "Generic (PLEG): container finished" podID="0eeaf64d-deb4-4feb-a29c-44900e8b18ec" containerID="a7e7bc44522c41668981d5bed22716c114ebbce43508729e2e1c0e0baa44f1d7" exitCode=1 Mar 13 01:14:35.629553 master-0 kubenswrapper[7110]: I0313 01:14:35.629526 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29e096ea-ca9d-477b-b0aa-1d10244d51d9-kube-api-access\") pod \"installer-3-master-0\" (UID: \"29e096ea-ca9d-477b-b0aa-1d10244d51d9\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 13 01:14:35.629654 master-0 kubenswrapper[7110]: I0313 01:14:35.629573 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"0eeaf64d-deb4-4feb-a29c-44900e8b18ec","Type":"ContainerDied","Data":"28e5fb54e7611a9fbc306434944a3d046a20359d0f293063960f0c6e578ab551"} Mar 13 01:14:35.632757 master-0 kubenswrapper[7110]: I0313 01:14:35.629969 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hhwp\" (UniqueName: \"kubernetes.io/projected/90e6e63d-3cf2-4bb5-883f-6219a0b52c3a-kube-api-access-6hhwp\") pod \"cloud-credential-operator-55d85b7b47-qslvf\" (UID: \"90e6e63d-3cf2-4bb5-883f-6219a0b52c3a\") " 
pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qslvf" Mar 13 01:14:35.632757 master-0 kubenswrapper[7110]: I0313 01:14:35.630920 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/90e6e63d-3cf2-4bb5-883f-6219a0b52c3a-cco-trusted-ca\") pod \"cloud-credential-operator-55d85b7b47-qslvf\" (UID: \"90e6e63d-3cf2-4bb5-883f-6219a0b52c3a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qslvf" Mar 13 01:14:35.632971 master-0 kubenswrapper[7110]: I0313 01:14:35.632795 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/90e6e63d-3cf2-4bb5-883f-6219a0b52c3a-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-55d85b7b47-qslvf\" (UID: \"90e6e63d-3cf2-4bb5-883f-6219a0b52c3a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qslvf" Mar 13 01:14:35.669240 master-0 kubenswrapper[7110]: I0313 01:14:35.669215 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_1567be70-7890-440e-b5d5-cae3efec8373/installer/0.log" Mar 13 01:14:35.669321 master-0 kubenswrapper[7110]: I0313 01:14:35.669259 7110 generic.go:334] "Generic (PLEG): container finished" podID="1567be70-7890-440e-b5d5-cae3efec8373" containerID="dd7ffbe9a1f83cc2aab5aa6ed625a109821b43ecbb8c570e367b00a33fc6e7b1" exitCode=1 Mar 13 01:14:35.669385 master-0 kubenswrapper[7110]: I0313 01:14:35.669348 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"1567be70-7890-440e-b5d5-cae3efec8373","Type":"ContainerDied","Data":"dd7ffbe9a1f83cc2aab5aa6ed625a109821b43ecbb8c570e367b00a33fc6e7b1"} Mar 13 01:14:35.681574 master-0 kubenswrapper[7110]: I0313 01:14:35.681259 7110 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"835bf21e-47fc-49b7-b2cd-72e051f2d601","Type":"ContainerStarted","Data":"6881d92dd6bb6a6dee05e911ad4a3fe7a1b45827de1efa1e4930a097fdf5483f"} Mar 13 01:14:35.706754 master-0 kubenswrapper[7110]: I0313 01:14:35.700221 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mlslx" event={"ID":"b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35","Type":"ContainerStarted","Data":"37ec0a7d20e51b2b48319f3f798e03141088e89b283f96688c4513f6cdc84e01"} Mar 13 01:14:35.706754 master-0 kubenswrapper[7110]: I0313 01:14:35.702385 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hhwp\" (UniqueName: \"kubernetes.io/projected/90e6e63d-3cf2-4bb5-883f-6219a0b52c3a-kube-api-access-6hhwp\") pod \"cloud-credential-operator-55d85b7b47-qslvf\" (UID: \"90e6e63d-3cf2-4bb5-883f-6219a0b52c3a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qslvf" Mar 13 01:14:35.708500 master-0 kubenswrapper[7110]: I0313 01:14:35.708454 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"47806631-9d60-4658-832d-f160f93f42ea","Type":"ContainerStarted","Data":"13bf43dd31255e64913f1edd8b9b049ea7f9baf74595bd3516213a0e530b536c"} Mar 13 01:14:35.715917 master-0 kubenswrapper[7110]: I0313 01:14:35.715863 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" event={"ID":"78d2cd80-23b9-426d-a7ac-1daa27668a47","Type":"ContainerStarted","Data":"a7777dcfd1f119b7ebee2758548cf5f87c3fa74673968a6a77a0056cee905b05"} Mar 13 01:14:35.721562 master-0 kubenswrapper[7110]: I0313 01:14:35.717742 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" Mar 13 01:14:35.721562 master-0 kubenswrapper[7110]: I0313 
01:14:35.720906 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-tq7n6" event={"ID":"2bd94289-7109-4419-9a51-bd289082b9f5","Type":"ContainerStarted","Data":"cabc0d0daac0ff5b74f3e06882a4fbae2aaadefec9cc5e2009027b89d0897c41"}
Mar 13 01:14:35.735789 master-0 kubenswrapper[7110]: I0313 01:14:35.732714 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/29e096ea-ca9d-477b-b0aa-1d10244d51d9-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"29e096ea-ca9d-477b-b0aa-1d10244d51d9\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 13 01:14:35.735789 master-0 kubenswrapper[7110]: I0313 01:14:35.732799 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/29e096ea-ca9d-477b-b0aa-1d10244d51d9-var-lock\") pod \"installer-3-master-0\" (UID: \"29e096ea-ca9d-477b-b0aa-1d10244d51d9\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 13 01:14:35.735789 master-0 kubenswrapper[7110]: I0313 01:14:35.732921 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29e096ea-ca9d-477b-b0aa-1d10244d51d9-kube-api-access\") pod \"installer-3-master-0\" (UID: \"29e096ea-ca9d-477b-b0aa-1d10244d51d9\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 13 01:14:35.735789 master-0 kubenswrapper[7110]: I0313 01:14:35.733258 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/29e096ea-ca9d-477b-b0aa-1d10244d51d9-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"29e096ea-ca9d-477b-b0aa-1d10244d51d9\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 13 01:14:35.735789 master-0 kubenswrapper[7110]: I0313 01:14:35.734576 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/29e096ea-ca9d-477b-b0aa-1d10244d51d9-var-lock\") pod \"installer-3-master-0\" (UID: \"29e096ea-ca9d-477b-b0aa-1d10244d51d9\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 13 01:14:35.736555 master-0 kubenswrapper[7110]: I0313 01:14:35.736503 7110 patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-dszg5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: connection refused" start-of-body=
Mar 13 01:14:35.736775 master-0 kubenswrapper[7110]: I0313 01:14:35.736613 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" podUID="78d2cd80-23b9-426d-a7ac-1daa27668a47" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: connection refused"
Mar 13 01:14:35.791066 master-0 kubenswrapper[7110]: I0313 01:14:35.787582 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq" event={"ID":"6c88187c-d011-4043-a6d3-4a8a7ec4e204","Type":"ContainerStarted","Data":"9b9e910c886ae717b561817a7ea8bb0a6f52815840a3145454514557347be4d2"}
Mar 13 01:14:35.791066 master-0 kubenswrapper[7110]: I0313 01:14:35.789204 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq"
Mar 13 01:14:35.792196 master-0 kubenswrapper[7110]: I0313 01:14:35.792175 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qslvf"
Mar 13 01:14:35.801610 master-0 kubenswrapper[7110]: I0313 01:14:35.800334 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29e096ea-ca9d-477b-b0aa-1d10244d51d9-kube-api-access\") pod \"installer-3-master-0\" (UID: \"29e096ea-ca9d-477b-b0aa-1d10244d51d9\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 13 01:14:35.801610 master-0 kubenswrapper[7110]: I0313 01:14:35.801431 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq"
Mar 13 01:14:35.808440 master-0 kubenswrapper[7110]: I0313 01:14:35.808282 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj" event={"ID":"0d4e6150-432c-4a11-b5a6-4d62dd701fc8","Type":"ContainerStarted","Data":"ee09247c905e7ec574cda5fc1232f0742111775db623db5f36b780f4329bda02"}
Mar 13 01:14:35.810747 master-0 kubenswrapper[7110]: I0313 01:14:35.808844 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj"
Mar 13 01:14:35.825794 master-0 kubenswrapper[7110]: I0313 01:14:35.821490 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 13 01:14:35.825794 master-0 kubenswrapper[7110]: I0313 01:14:35.821767 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_1567be70-7890-440e-b5d5-cae3efec8373/installer/0.log"
Mar 13 01:14:35.825794 master-0 kubenswrapper[7110]: I0313 01:14:35.821830 7110 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0"
Mar 13 01:14:35.854672 master-0 kubenswrapper[7110]: I0313 01:14:35.852404 7110 scope.go:117] "RemoveContainer" containerID="a7e7bc44522c41668981d5bed22716c114ebbce43508729e2e1c0e0baa44f1d7"
Mar 13 01:14:35.854672 master-0 kubenswrapper[7110]: E0313 01:14:35.853103 7110 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7e7bc44522c41668981d5bed22716c114ebbce43508729e2e1c0e0baa44f1d7\": container with ID starting with a7e7bc44522c41668981d5bed22716c114ebbce43508729e2e1c0e0baa44f1d7 not found: ID does not exist" containerID="a7e7bc44522c41668981d5bed22716c114ebbce43508729e2e1c0e0baa44f1d7"
Mar 13 01:14:35.854672 master-0 kubenswrapper[7110]: I0313 01:14:35.853235 7110 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7e7bc44522c41668981d5bed22716c114ebbce43508729e2e1c0e0baa44f1d7"} err="failed to get container status \"a7e7bc44522c41668981d5bed22716c114ebbce43508729e2e1c0e0baa44f1d7\": rpc error: code = NotFound desc = could not find container \"a7e7bc44522c41668981d5bed22716c114ebbce43508729e2e1c0e0baa44f1d7\": container with ID starting with a7e7bc44522c41668981d5bed22716c114ebbce43508729e2e1c0e0baa44f1d7 not found: ID does not exist"
Mar 13 01:14:35.945521 master-0 kubenswrapper[7110]: I0313 01:14:35.945349 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1567be70-7890-440e-b5d5-cae3efec8373-kubelet-dir\") pod \"1567be70-7890-440e-b5d5-cae3efec8373\" (UID: \"1567be70-7890-440e-b5d5-cae3efec8373\") "
Mar 13 01:14:35.945521 master-0 kubenswrapper[7110]: I0313 01:14:35.945403 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1567be70-7890-440e-b5d5-cae3efec8373-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1567be70-7890-440e-b5d5-cae3efec8373" (UID: "1567be70-7890-440e-b5d5-cae3efec8373"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 01:14:35.945521 master-0 kubenswrapper[7110]: I0313 01:14:35.945453 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1567be70-7890-440e-b5d5-cae3efec8373-var-lock\") pod \"1567be70-7890-440e-b5d5-cae3efec8373\" (UID: \"1567be70-7890-440e-b5d5-cae3efec8373\") "
Mar 13 01:14:35.946165 master-0 kubenswrapper[7110]: I0313 01:14:35.945529 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1567be70-7890-440e-b5d5-cae3efec8373-var-lock" (OuterVolumeSpecName: "var-lock") pod "1567be70-7890-440e-b5d5-cae3efec8373" (UID: "1567be70-7890-440e-b5d5-cae3efec8373"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 01:14:35.946165 master-0 kubenswrapper[7110]: I0313 01:14:35.945868 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1567be70-7890-440e-b5d5-cae3efec8373-kube-api-access\") pod \"1567be70-7890-440e-b5d5-cae3efec8373\" (UID: \"1567be70-7890-440e-b5d5-cae3efec8373\") "
Mar 13 01:14:35.946251 master-0 kubenswrapper[7110]: I0313 01:14:35.946230 7110 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1567be70-7890-440e-b5d5-cae3efec8373-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 13 01:14:35.946295 master-0 kubenswrapper[7110]: I0313 01:14:35.946250 7110 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1567be70-7890-440e-b5d5-cae3efec8373-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 13 01:14:35.967792 master-0 kubenswrapper[7110]: I0313 01:14:35.967195 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1567be70-7890-440e-b5d5-cae3efec8373-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1567be70-7890-440e-b5d5-cae3efec8373" (UID: "1567be70-7890-440e-b5d5-cae3efec8373"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 01:14:36.050532 master-0 kubenswrapper[7110]: I0313 01:14:36.049783 7110 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1567be70-7890-440e-b5d5-cae3efec8373-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 13 01:14:36.283875 master-0 kubenswrapper[7110]: I0313 01:14:36.283519 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qslvf"]
Mar 13 01:14:36.317813 master-0 kubenswrapper[7110]: W0313 01:14:36.310354 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90e6e63d_3cf2_4bb5_883f_6219a0b52c3a.slice/crio-b4d8f238475f0e2a34d731e501507dcca53539022959bc9fece8f8ff1323377c WatchSource:0}: Error finding container b4d8f238475f0e2a34d731e501507dcca53539022959bc9fece8f8ff1323377c: Status 404 returned error can't find the container with id b4d8f238475f0e2a34d731e501507dcca53539022959bc9fece8f8ff1323377c
Mar 13 01:14:36.368298 master-0 kubenswrapper[7110]: I0313 01:14:36.368220 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"]
Mar 13 01:14:36.383701 master-0 kubenswrapper[7110]: I0313 01:14:36.382761 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"]
Mar 13 01:14:36.386435 master-0 kubenswrapper[7110]: I0313 01:14:36.385766 7110 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"]
Mar 13 01:14:36.633728 master-0 kubenswrapper[7110]: I0313 01:14:36.632110 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-29mns"]
Mar 13 01:14:36.633728 master-0 kubenswrapper[7110]: E0313 01:14:36.632342 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1567be70-7890-440e-b5d5-cae3efec8373" containerName="installer"
Mar 13 01:14:36.633728 master-0 kubenswrapper[7110]: I0313 01:14:36.632356 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="1567be70-7890-440e-b5d5-cae3efec8373" containerName="installer"
Mar 13 01:14:36.633728 master-0 kubenswrapper[7110]: I0313 01:14:36.632460 7110 memory_manager.go:354] "RemoveStaleState removing state" podUID="1567be70-7890-440e-b5d5-cae3efec8373" containerName="installer"
Mar 13 01:14:36.633728 master-0 kubenswrapper[7110]: I0313 01:14:36.633287 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-29mns"
Mar 13 01:14:36.668794 master-0 kubenswrapper[7110]: I0313 01:14:36.667901 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdpfx\" (UniqueName: \"kubernetes.io/projected/19d78757-5081-4711-992e-c8fd7891f9b7-kube-api-access-jdpfx\") pod \"redhat-marketplace-29mns\" (UID: \"19d78757-5081-4711-992e-c8fd7891f9b7\") " pod="openshift-marketplace/redhat-marketplace-29mns"
Mar 13 01:14:36.668794 master-0 kubenswrapper[7110]: I0313 01:14:36.667967 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19d78757-5081-4711-992e-c8fd7891f9b7-utilities\") pod \"redhat-marketplace-29mns\" (UID: \"19d78757-5081-4711-992e-c8fd7891f9b7\") " pod="openshift-marketplace/redhat-marketplace-29mns"
Mar 13 01:14:36.668794 master-0 kubenswrapper[7110]: I0313 01:14:36.667991 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19d78757-5081-4711-992e-c8fd7891f9b7-catalog-content\") pod \"redhat-marketplace-29mns\" (UID: \"19d78757-5081-4711-992e-c8fd7891f9b7\") " pod="openshift-marketplace/redhat-marketplace-29mns"
Mar 13 01:14:36.669977 master-0 kubenswrapper[7110]: I0313 01:14:36.669918 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-29mns"]
Mar 13 01:14:36.731654 master-0 kubenswrapper[7110]: I0313 01:14:36.731169 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-2xfpz"]
Mar 13 01:14:36.750705 master-0 kubenswrapper[7110]: I0313 01:14:36.749830 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-2xfpz"
Mar 13 01:14:36.752590 master-0 kubenswrapper[7110]: I0313 01:14:36.752456 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-2xfpz"]
Mar 13 01:14:36.755668 master-0 kubenswrapper[7110]: I0313 01:14:36.754207 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 13 01:14:36.755668 master-0 kubenswrapper[7110]: I0313 01:14:36.754394 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 13 01:14:36.755668 master-0 kubenswrapper[7110]: I0313 01:14:36.754521 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 13 01:14:36.755668 master-0 kubenswrapper[7110]: I0313 01:14:36.754648 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-6h5r7"
Mar 13 01:14:36.768543 master-0 kubenswrapper[7110]: I0313 01:14:36.768518 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19d78757-5081-4711-992e-c8fd7891f9b7-utilities\") pod \"redhat-marketplace-29mns\" (UID: \"19d78757-5081-4711-992e-c8fd7891f9b7\") " pod="openshift-marketplace/redhat-marketplace-29mns"
Mar 13 01:14:36.768769 master-0 kubenswrapper[7110]: I0313 01:14:36.768734 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19d78757-5081-4711-992e-c8fd7891f9b7-catalog-content\") pod \"redhat-marketplace-29mns\" (UID: \"19d78757-5081-4711-992e-c8fd7891f9b7\") " pod="openshift-marketplace/redhat-marketplace-29mns"
Mar 13 01:14:36.768929 master-0 kubenswrapper[7110]: I0313 01:14:36.768917 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdpfx\" (UniqueName: \"kubernetes.io/projected/19d78757-5081-4711-992e-c8fd7891f9b7-kube-api-access-jdpfx\") pod \"redhat-marketplace-29mns\" (UID: \"19d78757-5081-4711-992e-c8fd7891f9b7\") " pod="openshift-marketplace/redhat-marketplace-29mns"
Mar 13 01:14:36.770299 master-0 kubenswrapper[7110]: I0313 01:14:36.770251 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19d78757-5081-4711-992e-c8fd7891f9b7-utilities\") pod \"redhat-marketplace-29mns\" (UID: \"19d78757-5081-4711-992e-c8fd7891f9b7\") " pod="openshift-marketplace/redhat-marketplace-29mns"
Mar 13 01:14:36.770532 master-0 kubenswrapper[7110]: I0313 01:14:36.770505 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19d78757-5081-4711-992e-c8fd7891f9b7-catalog-content\") pod \"redhat-marketplace-29mns\" (UID: \"19d78757-5081-4711-992e-c8fd7891f9b7\") " pod="openshift-marketplace/redhat-marketplace-29mns"
Mar 13 01:14:36.807571 master-0 kubenswrapper[7110]: I0313 01:14:36.800436 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdpfx\" (UniqueName: \"kubernetes.io/projected/19d78757-5081-4711-992e-c8fd7891f9b7-kube-api-access-jdpfx\") pod \"redhat-marketplace-29mns\" (UID: \"19d78757-5081-4711-992e-c8fd7891f9b7\") " pod="openshift-marketplace/redhat-marketplace-29mns"
Mar 13 01:14:36.807571 master-0 kubenswrapper[7110]: I0313 01:14:36.806140 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lb5rz"]
Mar 13 01:14:36.807571 master-0 kubenswrapper[7110]: I0313 01:14:36.807031 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lb5rz"
Mar 13 01:14:36.817106 master-0 kubenswrapper[7110]: I0313 01:14:36.817058 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qslvf" event={"ID":"90e6e63d-3cf2-4bb5-883f-6219a0b52c3a","Type":"ContainerStarted","Data":"a6d1854232de9168b75b3d01a0fa6c20901eb4deee5086ea18aafc73d6104cff"}
Mar 13 01:14:36.817106 master-0 kubenswrapper[7110]: I0313 01:14:36.817100 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qslvf" event={"ID":"90e6e63d-3cf2-4bb5-883f-6219a0b52c3a","Type":"ContainerStarted","Data":"b4d8f238475f0e2a34d731e501507dcca53539022959bc9fece8f8ff1323377c"}
Mar 13 01:14:36.822853 master-0 kubenswrapper[7110]: I0313 01:14:36.820574 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lb5rz"]
Mar 13 01:14:36.822853 master-0 kubenswrapper[7110]: I0313 01:14:36.822512 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zh5fh" event={"ID":"e68ab3cb-c372-45d9-a758-beaf4c213714","Type":"ContainerStarted","Data":"296a827430c5ddc3613e12467f9f67e5f7d7d6b28473db1f835af9250c8da399"}
Mar 13 01:14:36.824605 master-0 kubenswrapper[7110]: I0313 01:14:36.824577 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8565d84698-sslxh_039acb44-a9b3-4ad6-a091-be4d18edc34f/openshift-controller-manager-operator/0.log"
Mar 13 01:14:36.824688 master-0 kubenswrapper[7110]: I0313 01:14:36.824606 7110 generic.go:334] "Generic (PLEG): container finished" podID="039acb44-a9b3-4ad6-a091-be4d18edc34f" containerID="757884fd1e54a4728f490aa384fee80c41466484bac2e993d7373c9b6d19ad0a" exitCode=1
Mar 13 01:14:36.824688 master-0 kubenswrapper[7110]: I0313 01:14:36.824655 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-sslxh" event={"ID":"039acb44-a9b3-4ad6-a091-be4d18edc34f","Type":"ContainerDied","Data":"757884fd1e54a4728f490aa384fee80c41466484bac2e993d7373c9b6d19ad0a"}
Mar 13 01:14:36.824896 master-0 kubenswrapper[7110]: I0313 01:14:36.824869 7110 scope.go:117] "RemoveContainer" containerID="757884fd1e54a4728f490aa384fee80c41466484bac2e993d7373c9b6d19ad0a"
Mar 13 01:14:36.856797 master-0 kubenswrapper[7110]: I0313 01:14:36.848680 7110 generic.go:334] "Generic (PLEG): container finished" podID="bfc49699-9428-4bff-804d-da0e60551759" containerID="ea2d52d5a4050a1c6648d96fe621d92a024e84b0306d25332de51586b15ec9dd" exitCode=0
Mar 13 01:14:36.856797 master-0 kubenswrapper[7110]: I0313 01:14:36.848745 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-bdc4j" event={"ID":"bfc49699-9428-4bff-804d-da0e60551759","Type":"ContainerDied","Data":"ea2d52d5a4050a1c6648d96fe621d92a024e84b0306d25332de51586b15ec9dd"}
Mar 13 01:14:36.856797 master-0 kubenswrapper[7110]: I0313 01:14:36.849038 7110 scope.go:117] "RemoveContainer" containerID="ea2d52d5a4050a1c6648d96fe621d92a024e84b0306d25332de51586b15ec9dd"
Mar 13 01:14:36.871042 master-0 kubenswrapper[7110]: I0313 01:14:36.869783 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebf338e6-9725-47d9-8c7f-adbf11a44406-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-2xfpz\" (UID: \"ebf338e6-9725-47d9-8c7f-adbf11a44406\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-2xfpz"
Mar 13 01:14:36.871042 master-0 kubenswrapper[7110]: I0313 01:14:36.869841 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltxpc\" (UniqueName: \"kubernetes.io/projected/ebf338e6-9725-47d9-8c7f-adbf11a44406-kube-api-access-ltxpc\") pod \"cluster-samples-operator-664cb58b85-2xfpz\" (UID: \"ebf338e6-9725-47d9-8c7f-adbf11a44406\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-2xfpz"
Mar 13 01:14:36.871042 master-0 kubenswrapper[7110]: I0313 01:14:36.869989 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"835bf21e-47fc-49b7-b2cd-72e051f2d601","Type":"ContainerStarted","Data":"15c8295a7013f090b7f2ef7c2ac7d855db77f246f725cb9e4a64e892a195db74"}
Mar 13 01:14:36.871042 master-0 kubenswrapper[7110]: I0313 01:14:36.870032 7110 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-5-master-0" podUID="835bf21e-47fc-49b7-b2cd-72e051f2d601" containerName="installer" containerID="cri-o://15c8295a7013f090b7f2ef7c2ac7d855db77f246f725cb9e4a64e892a195db74" gracePeriod=30
Mar 13 01:14:36.880176 master-0 kubenswrapper[7110]: I0313 01:14:36.877448 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"47806631-9d60-4658-832d-f160f93f42ea","Type":"ContainerStarted","Data":"71b98806c78a21853872bf216fdc04280da7bf4d8777bb06b2a922047a6a9e8c"}
Mar 13 01:14:36.880176 master-0 kubenswrapper[7110]: I0313 01:14:36.879834 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-w6qs7" event={"ID":"cada5bf2-e208-4fd8-bdf5-de8cad31a665","Type":"ContainerStarted","Data":"fdd7225d6e1fca05e0159dd0bc6d0e1e7a5fef522b8f6ef01baea69ca9c582b3"}
Mar 13 01:14:36.900016 master-0 kubenswrapper[7110]: I0313 01:14:36.899965 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m" event={"ID":"a9462e2e-728d-4076-a876-31dbbd637581","Type":"ContainerStarted","Data":"b84f9a3482a3526292f0e31987fb336bbf05fad12c589b42051a3df366361ed6"}
Mar 13 01:14:36.900016 master-0 kubenswrapper[7110]: I0313 01:14:36.900018 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m" event={"ID":"a9462e2e-728d-4076-a876-31dbbd637581","Type":"ContainerStarted","Data":"35121dd9298456a2fd716c2169a2e2eb4131993976aa53fb1bfd36bc3158f01e"}
Mar 13 01:14:36.900414 master-0 kubenswrapper[7110]: I0313 01:14:36.900371 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m"
Mar 13 01:14:36.947284 master-0 kubenswrapper[7110]: I0313 01:14:36.946166 7110 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0eeaf64d-deb4-4feb-a29c-44900e8b18ec" path="/var/lib/kubelet/pods/0eeaf64d-deb4-4feb-a29c-44900e8b18ec/volumes"
Mar 13 01:14:36.947284 master-0 kubenswrapper[7110]: I0313 01:14:36.946611 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"29e096ea-ca9d-477b-b0aa-1d10244d51d9","Type":"ContainerStarted","Data":"6b3d1f96b7eda0842ce0b60c494ed28d5b1988f57c59ae6dc2d45944467711cc"}
Mar 13 01:14:36.952611 master-0 kubenswrapper[7110]: I0313 01:14:36.952215 7110 generic.go:334] "Generic (PLEG): container finished" podID="58035e42-37d8-48f6-9861-9b4ce6014119" containerID="662b7543988e07c43f9b30d00fca727f77728c7aa21bd39d21414f56d158c6c9" exitCode=0
Mar 13 01:14:36.952611 master-0 kubenswrapper[7110]: I0313 01:14:36.952282 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-v9pv6" event={"ID":"58035e42-37d8-48f6-9861-9b4ce6014119","Type":"ContainerDied","Data":"662b7543988e07c43f9b30d00fca727f77728c7aa21bd39d21414f56d158c6c9"}
Mar 13 01:14:36.952788 master-0 kubenswrapper[7110]: I0313 01:14:36.952684 7110 scope.go:117] "RemoveContainer" containerID="662b7543988e07c43f9b30d00fca727f77728c7aa21bd39d21414f56d158c6c9"
Mar 13 01:14:36.975995 master-0 kubenswrapper[7110]: I0313 01:14:36.970957 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e0572c2-78f9-4c65-8ea0-7242236d641f-utilities\") pod \"redhat-operators-lb5rz\" (UID: \"6e0572c2-78f9-4c65-8ea0-7242236d641f\") " pod="openshift-marketplace/redhat-operators-lb5rz"
Mar 13 01:14:36.975995 master-0 kubenswrapper[7110]: I0313 01:14:36.971053 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebf338e6-9725-47d9-8c7f-adbf11a44406-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-2xfpz\" (UID: \"ebf338e6-9725-47d9-8c7f-adbf11a44406\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-2xfpz"
Mar 13 01:14:36.975995 master-0 kubenswrapper[7110]: I0313 01:14:36.971072 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e0572c2-78f9-4c65-8ea0-7242236d641f-catalog-content\") pod \"redhat-operators-lb5rz\" (UID: \"6e0572c2-78f9-4c65-8ea0-7242236d641f\") " pod="openshift-marketplace/redhat-operators-lb5rz"
Mar 13 01:14:36.975995 master-0 kubenswrapper[7110]: I0313 01:14:36.971087 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8sjh\" (UniqueName: \"kubernetes.io/projected/6e0572c2-78f9-4c65-8ea0-7242236d641f-kube-api-access-l8sjh\") pod \"redhat-operators-lb5rz\" (UID: \"6e0572c2-78f9-4c65-8ea0-7242236d641f\") " pod="openshift-marketplace/redhat-operators-lb5rz"
Mar 13 01:14:36.975995 master-0 kubenswrapper[7110]: I0313 01:14:36.971141 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltxpc\" (UniqueName: \"kubernetes.io/projected/ebf338e6-9725-47d9-8c7f-adbf11a44406-kube-api-access-ltxpc\") pod \"cluster-samples-operator-664cb58b85-2xfpz\" (UID: \"ebf338e6-9725-47d9-8c7f-adbf11a44406\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-2xfpz"
Mar 13 01:14:36.975995 master-0 kubenswrapper[7110]: I0313 01:14:36.972725 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-tq7n6" event={"ID":"2bd94289-7109-4419-9a51-bd289082b9f5","Type":"ContainerStarted","Data":"2feba70148d30ccf8b16cda1bcbd40be5871412538af7a3c70d1cbdb9b96ea4e"}
Mar 13 01:14:36.977361 master-0 kubenswrapper[7110]: I0313 01:14:36.976965 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebf338e6-9725-47d9-8c7f-adbf11a44406-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-2xfpz\" (UID: \"ebf338e6-9725-47d9-8c7f-adbf11a44406\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-2xfpz"
Mar 13 01:14:36.985305 master-0 kubenswrapper[7110]: I0313 01:14:36.982789 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-29mns"
Mar 13 01:14:37.004671 master-0 kubenswrapper[7110]: I0313 01:14:37.004601 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltxpc\" (UniqueName: \"kubernetes.io/projected/ebf338e6-9725-47d9-8c7f-adbf11a44406-kube-api-access-ltxpc\") pod \"cluster-samples-operator-664cb58b85-2xfpz\" (UID: \"ebf338e6-9725-47d9-8c7f-adbf11a44406\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-2xfpz"
Mar 13 01:14:37.019758 master-0 kubenswrapper[7110]: I0313 01:14:37.019606 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-1-master-0" podStartSLOduration=12.019588211 podStartE2EDuration="12.019588211s" podCreationTimestamp="2026-03-13 01:14:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:14:37.018666227 +0000 UTC m=+78.303692713" watchObservedRunningTime="2026-03-13 01:14:37.019588211 +0000 UTC m=+78.304614677"
Mar 13 01:14:37.022308 master-0 kubenswrapper[7110]: I0313 01:14:37.020107 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-5-master-0" podStartSLOduration=8.020102655 podStartE2EDuration="8.020102655s" podCreationTimestamp="2026-03-13 01:14:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:14:36.995776191 +0000 UTC m=+78.280802667" watchObservedRunningTime="2026-03-13 01:14:37.020102655 +0000 UTC m=+78.305129121"
Mar 13 01:14:37.037909 master-0 kubenswrapper[7110]: I0313 01:14:37.035153 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_1567be70-7890-440e-b5d5-cae3efec8373/installer/0.log"
Mar 13 01:14:37.037909 master-0 kubenswrapper[7110]: I0313 01:14:37.035298 7110 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0"
Mar 13 01:14:37.037909 master-0 kubenswrapper[7110]: I0313 01:14:37.035548 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"1567be70-7890-440e-b5d5-cae3efec8373","Type":"ContainerDied","Data":"26e2e0ea83f6424ae6714b415bd5b2d67ab4f3b4a0b91e6a186a712935f305c2"}
Mar 13 01:14:37.037909 master-0 kubenswrapper[7110]: I0313 01:14:37.035577 7110 scope.go:117] "RemoveContainer" containerID="dd7ffbe9a1f83cc2aab5aa6ed625a109821b43ecbb8c570e367b00a33fc6e7b1"
Mar 13 01:14:37.042614 master-0 kubenswrapper[7110]: I0313 01:14:37.042201 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" event={"ID":"631f5719-2083-4c99-92cb-2ddc04022d86","Type":"ContainerStarted","Data":"06ac9b51646581f1c80a1fe6c8f1464c3cd508ac9e8d30abbb2fa8a9cb8c167d"}
Mar 13 01:14:37.042614 master-0 kubenswrapper[7110]: I0313 01:14:37.042341 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5"
Mar 13 01:14:37.042614 master-0 kubenswrapper[7110]: I0313 01:14:37.042375 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" event={"ID":"631f5719-2083-4c99-92cb-2ddc04022d86","Type":"ContainerStarted","Data":"16477c5f389a1fdfcf2af6bfe8b7efe63c0f62df56e3f2ed990e9acc1a597b7d"}
Mar 13 01:14:37.053046 master-0 kubenswrapper[7110]: I0313 01:14:37.052509 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5"
Mar 13 01:14:37.053187 master-0 kubenswrapper[7110]: I0313 01:14:37.053029 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m" podStartSLOduration=11.05301894 podStartE2EDuration="11.05301894s" podCreationTimestamp="2026-03-13 01:14:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:14:37.052180028 +0000 UTC m=+78.337206514" watchObservedRunningTime="2026-03-13 01:14:37.05301894 +0000 UTC m=+78.338045406"
Mar 13 01:14:37.055345 master-0 kubenswrapper[7110]: I0313 01:14:37.055315 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5"
Mar 13 01:14:37.083242 master-0 kubenswrapper[7110]: I0313 01:14:37.082145 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-2xfpz"
Mar 13 01:14:37.091932 master-0 kubenswrapper[7110]: I0313 01:14:37.091893 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e0572c2-78f9-4c65-8ea0-7242236d641f-catalog-content\") pod \"redhat-operators-lb5rz\" (UID: \"6e0572c2-78f9-4c65-8ea0-7242236d641f\") " pod="openshift-marketplace/redhat-operators-lb5rz"
Mar 13 01:14:37.091932 master-0 kubenswrapper[7110]: I0313 01:14:37.091933 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8sjh\" (UniqueName: \"kubernetes.io/projected/6e0572c2-78f9-4c65-8ea0-7242236d641f-kube-api-access-l8sjh\") pod \"redhat-operators-lb5rz\" (UID: \"6e0572c2-78f9-4c65-8ea0-7242236d641f\") " pod="openshift-marketplace/redhat-operators-lb5rz"
Mar 13 01:14:37.092801 master-0 kubenswrapper[7110]: I0313 01:14:37.092648 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e0572c2-78f9-4c65-8ea0-7242236d641f-catalog-content\") pod \"redhat-operators-lb5rz\" (UID: \"6e0572c2-78f9-4c65-8ea0-7242236d641f\") " pod="openshift-marketplace/redhat-operators-lb5rz"
Mar 13 01:14:37.094905 master-0 kubenswrapper[7110]: I0313 01:14:37.094873 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e0572c2-78f9-4c65-8ea0-7242236d641f-utilities\") pod \"redhat-operators-lb5rz\" (UID: \"6e0572c2-78f9-4c65-8ea0-7242236d641f\") " pod="openshift-marketplace/redhat-operators-lb5rz"
Mar 13 01:14:37.097058 master-0 kubenswrapper[7110]: I0313 01:14:37.096867 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e0572c2-78f9-4c65-8ea0-7242236d641f-utilities\") pod \"redhat-operators-lb5rz\" (UID: \"6e0572c2-78f9-4c65-8ea0-7242236d641f\") " pod="openshift-marketplace/redhat-operators-lb5rz"
Mar 13 01:14:37.123128 master-0 kubenswrapper[7110]: I0313 01:14:37.123071 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8sjh\" (UniqueName: \"kubernetes.io/projected/6e0572c2-78f9-4c65-8ea0-7242236d641f-kube-api-access-l8sjh\") pod \"redhat-operators-lb5rz\" (UID: \"6e0572c2-78f9-4c65-8ea0-7242236d641f\") " pod="openshift-marketplace/redhat-operators-lb5rz"
Mar 13 01:14:37.155704 master-0 kubenswrapper[7110]: I0313 01:14:37.154068 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" podStartSLOduration=11.146915855 podStartE2EDuration="11.146915855s" podCreationTimestamp="2026-03-13 01:14:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:14:37.131778958 +0000 UTC m=+78.416805434" watchObservedRunningTime="2026-03-13 01:14:37.146915855 +0000 UTC m=+78.431942341"
Mar 13 01:14:37.157393 master-0 kubenswrapper[7110]: I0313 01:14:37.157357 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lb5rz"
Mar 13 01:14:37.164953 master-0 kubenswrapper[7110]: I0313 01:14:37.164916 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m"
Mar 13 01:14:37.201646 master-0 kubenswrapper[7110]: I0313 01:14:37.201372 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"]
Mar 13 01:14:37.207136 master-0 kubenswrapper[7110]: I0313 01:14:37.205428 7110 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"]
Mar 13 01:14:37.274778 master-0 kubenswrapper[7110]: I0313 01:14:37.274551 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/revision-pruner-6-master-0"]
Mar 13 01:14:37.277085 master-0 kubenswrapper[7110]: I0313 01:14:37.275268 7110 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-6-master-0" Mar 13 01:14:37.285930 master-0 kubenswrapper[7110]: I0313 01:14:37.283140 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/revision-pruner-6-master-0"] Mar 13 01:14:37.302488 master-0 kubenswrapper[7110]: I0313 01:14:37.300290 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c9c1c81-eae9-4481-9870-b598deb1dcac-kube-api-access\") pod \"revision-pruner-6-master-0\" (UID: \"9c9c1c81-eae9-4481-9870-b598deb1dcac\") " pod="openshift-kube-scheduler/revision-pruner-6-master-0" Mar 13 01:14:37.302488 master-0 kubenswrapper[7110]: I0313 01:14:37.300336 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c9c1c81-eae9-4481-9870-b598deb1dcac-kubelet-dir\") pod \"revision-pruner-6-master-0\" (UID: \"9c9c1c81-eae9-4481-9870-b598deb1dcac\") " pod="openshift-kube-scheduler/revision-pruner-6-master-0" Mar 13 01:14:37.400065 master-0 kubenswrapper[7110]: I0313 01:14:37.398080 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-5-master-0_835bf21e-47fc-49b7-b2cd-72e051f2d601/installer/0.log" Mar 13 01:14:37.400065 master-0 kubenswrapper[7110]: I0313 01:14:37.398707 7110 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Mar 13 01:14:37.401489 master-0 kubenswrapper[7110]: I0313 01:14:37.401453 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c9c1c81-eae9-4481-9870-b598deb1dcac-kube-api-access\") pod \"revision-pruner-6-master-0\" (UID: \"9c9c1c81-eae9-4481-9870-b598deb1dcac\") " pod="openshift-kube-scheduler/revision-pruner-6-master-0" Mar 13 01:14:37.401489 master-0 kubenswrapper[7110]: I0313 01:14:37.401486 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c9c1c81-eae9-4481-9870-b598deb1dcac-kubelet-dir\") pod \"revision-pruner-6-master-0\" (UID: \"9c9c1c81-eae9-4481-9870-b598deb1dcac\") " pod="openshift-kube-scheduler/revision-pruner-6-master-0" Mar 13 01:14:37.401584 master-0 kubenswrapper[7110]: I0313 01:14:37.401555 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c9c1c81-eae9-4481-9870-b598deb1dcac-kubelet-dir\") pod \"revision-pruner-6-master-0\" (UID: \"9c9c1c81-eae9-4481-9870-b598deb1dcac\") " pod="openshift-kube-scheduler/revision-pruner-6-master-0" Mar 13 01:14:37.454512 master-0 kubenswrapper[7110]: I0313 01:14:37.443347 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c9c1c81-eae9-4481-9870-b598deb1dcac-kube-api-access\") pod \"revision-pruner-6-master-0\" (UID: \"9c9c1c81-eae9-4481-9870-b598deb1dcac\") " pod="openshift-kube-scheduler/revision-pruner-6-master-0" Mar 13 01:14:37.502013 master-0 kubenswrapper[7110]: I0313 01:14:37.501972 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/835bf21e-47fc-49b7-b2cd-72e051f2d601-kube-api-access\") pod 
\"835bf21e-47fc-49b7-b2cd-72e051f2d601\" (UID: \"835bf21e-47fc-49b7-b2cd-72e051f2d601\") " Mar 13 01:14:37.502205 master-0 kubenswrapper[7110]: I0313 01:14:37.502102 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/835bf21e-47fc-49b7-b2cd-72e051f2d601-kubelet-dir\") pod \"835bf21e-47fc-49b7-b2cd-72e051f2d601\" (UID: \"835bf21e-47fc-49b7-b2cd-72e051f2d601\") " Mar 13 01:14:37.502205 master-0 kubenswrapper[7110]: I0313 01:14:37.502135 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/835bf21e-47fc-49b7-b2cd-72e051f2d601-var-lock\") pod \"835bf21e-47fc-49b7-b2cd-72e051f2d601\" (UID: \"835bf21e-47fc-49b7-b2cd-72e051f2d601\") " Mar 13 01:14:37.502262 master-0 kubenswrapper[7110]: I0313 01:14:37.502218 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/835bf21e-47fc-49b7-b2cd-72e051f2d601-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "835bf21e-47fc-49b7-b2cd-72e051f2d601" (UID: "835bf21e-47fc-49b7-b2cd-72e051f2d601"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:14:37.502302 master-0 kubenswrapper[7110]: I0313 01:14:37.502283 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/835bf21e-47fc-49b7-b2cd-72e051f2d601-var-lock" (OuterVolumeSpecName: "var-lock") pod "835bf21e-47fc-49b7-b2cd-72e051f2d601" (UID: "835bf21e-47fc-49b7-b2cd-72e051f2d601"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:14:37.505945 master-0 kubenswrapper[7110]: I0313 01:14:37.505899 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/835bf21e-47fc-49b7-b2cd-72e051f2d601-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "835bf21e-47fc-49b7-b2cd-72e051f2d601" (UID: "835bf21e-47fc-49b7-b2cd-72e051f2d601"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:14:37.512214 master-0 kubenswrapper[7110]: I0313 01:14:37.510490 7110 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/835bf21e-47fc-49b7-b2cd-72e051f2d601-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:37.512214 master-0 kubenswrapper[7110]: I0313 01:14:37.510534 7110 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/835bf21e-47fc-49b7-b2cd-72e051f2d601-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:37.527204 master-0 kubenswrapper[7110]: I0313 01:14:37.527122 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-29mns"] Mar 13 01:14:37.534671 master-0 kubenswrapper[7110]: W0313 01:14:37.534154 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19d78757_5081_4711_992e_c8fd7891f9b7.slice/crio-e8a5ab75c1b7f63476173d508b85b9a4a92039b2b0c3f241f48d8c8d79ae1573 WatchSource:0}: Error finding container e8a5ab75c1b7f63476173d508b85b9a4a92039b2b0c3f241f48d8c8d79ae1573: Status 404 returned error can't find the container with id e8a5ab75c1b7f63476173d508b85b9a4a92039b2b0c3f241f48d8c8d79ae1573 Mar 13 01:14:37.592591 master-0 kubenswrapper[7110]: I0313 01:14:37.592534 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-6-master-0" Mar 13 01:14:37.612087 master-0 kubenswrapper[7110]: I0313 01:14:37.611977 7110 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/835bf21e-47fc-49b7-b2cd-72e051f2d601-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:37.686825 master-0 kubenswrapper[7110]: I0313 01:14:37.686771 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-2xfpz"] Mar 13 01:14:37.741314 master-0 kubenswrapper[7110]: I0313 01:14:37.741269 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lb5rz"] Mar 13 01:14:37.878300 master-0 kubenswrapper[7110]: I0313 01:14:37.875609 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-6-master-0"] Mar 13 01:14:37.878300 master-0 kubenswrapper[7110]: E0313 01:14:37.875803 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="835bf21e-47fc-49b7-b2cd-72e051f2d601" containerName="installer" Mar 13 01:14:37.878300 master-0 kubenswrapper[7110]: I0313 01:14:37.875814 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="835bf21e-47fc-49b7-b2cd-72e051f2d601" containerName="installer" Mar 13 01:14:37.878300 master-0 kubenswrapper[7110]: I0313 01:14:37.875907 7110 memory_manager.go:354] "RemoveStaleState removing state" podUID="835bf21e-47fc-49b7-b2cd-72e051f2d601" containerName="installer" Mar 13 01:14:37.878300 master-0 kubenswrapper[7110]: I0313 01:14:37.876220 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-6-master-0" Mar 13 01:14:37.892721 master-0 kubenswrapper[7110]: I0313 01:14:37.890552 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-6-master-0"] Mar 13 01:14:37.916969 master-0 kubenswrapper[7110]: I0313 01:14:37.916328 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0b3a64f4-e94f-4916-8c91-a255d987735d-var-lock\") pod \"installer-6-master-0\" (UID: \"0b3a64f4-e94f-4916-8c91-a255d987735d\") " pod="openshift-kube-scheduler/installer-6-master-0" Mar 13 01:14:37.916969 master-0 kubenswrapper[7110]: I0313 01:14:37.916370 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b3a64f4-e94f-4916-8c91-a255d987735d-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"0b3a64f4-e94f-4916-8c91-a255d987735d\") " pod="openshift-kube-scheduler/installer-6-master-0" Mar 13 01:14:37.916969 master-0 kubenswrapper[7110]: I0313 01:14:37.916401 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b3a64f4-e94f-4916-8c91-a255d987735d-kube-api-access\") pod \"installer-6-master-0\" (UID: \"0b3a64f4-e94f-4916-8c91-a255d987735d\") " pod="openshift-kube-scheduler/installer-6-master-0" Mar 13 01:14:37.971109 master-0 kubenswrapper[7110]: I0313 01:14:37.960764 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/revision-pruner-6-master-0"] Mar 13 01:14:38.017446 master-0 kubenswrapper[7110]: I0313 01:14:38.017406 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0b3a64f4-e94f-4916-8c91-a255d987735d-var-lock\") pod \"installer-6-master-0\" (UID: 
\"0b3a64f4-e94f-4916-8c91-a255d987735d\") " pod="openshift-kube-scheduler/installer-6-master-0" Mar 13 01:14:38.017590 master-0 kubenswrapper[7110]: I0313 01:14:38.017457 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b3a64f4-e94f-4916-8c91-a255d987735d-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"0b3a64f4-e94f-4916-8c91-a255d987735d\") " pod="openshift-kube-scheduler/installer-6-master-0" Mar 13 01:14:38.017590 master-0 kubenswrapper[7110]: I0313 01:14:38.017488 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b3a64f4-e94f-4916-8c91-a255d987735d-kube-api-access\") pod \"installer-6-master-0\" (UID: \"0b3a64f4-e94f-4916-8c91-a255d987735d\") " pod="openshift-kube-scheduler/installer-6-master-0" Mar 13 01:14:38.017590 master-0 kubenswrapper[7110]: I0313 01:14:38.017560 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0b3a64f4-e94f-4916-8c91-a255d987735d-var-lock\") pod \"installer-6-master-0\" (UID: \"0b3a64f4-e94f-4916-8c91-a255d987735d\") " pod="openshift-kube-scheduler/installer-6-master-0" Mar 13 01:14:38.017710 master-0 kubenswrapper[7110]: I0313 01:14:38.017640 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b3a64f4-e94f-4916-8c91-a255d987735d-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"0b3a64f4-e94f-4916-8c91-a255d987735d\") " pod="openshift-kube-scheduler/installer-6-master-0" Mar 13 01:14:38.035237 master-0 kubenswrapper[7110]: I0313 01:14:38.035188 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b3a64f4-e94f-4916-8c91-a255d987735d-kube-api-access\") pod \"installer-6-master-0\" (UID: \"0b3a64f4-e94f-4916-8c91-a255d987735d\") " 
pod="openshift-kube-scheduler/installer-6-master-0" Mar 13 01:14:38.050933 master-0 kubenswrapper[7110]: I0313 01:14:38.050770 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"29e096ea-ca9d-477b-b0aa-1d10244d51d9","Type":"ContainerStarted","Data":"167e9a0418be9c64d38402cc471015911f91f7d101628f86049fb49485d8495a"} Mar 13 01:14:38.052511 master-0 kubenswrapper[7110]: I0313 01:14:38.052182 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/revision-pruner-6-master-0" event={"ID":"9c9c1c81-eae9-4481-9870-b598deb1dcac","Type":"ContainerStarted","Data":"39a55e527445a96c03fa72bdbedc444549a950a2be802dd370d0f86349876a95"} Mar 13 01:14:38.054557 master-0 kubenswrapper[7110]: I0313 01:14:38.054495 7110 generic.go:334] "Generic (PLEG): container finished" podID="6e0572c2-78f9-4c65-8ea0-7242236d641f" containerID="3baee088b822acbc2a39af564e0c18df01758a5b84d81978b575dc62c0f21b11" exitCode=0 Mar 13 01:14:38.054622 master-0 kubenswrapper[7110]: I0313 01:14:38.054555 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lb5rz" event={"ID":"6e0572c2-78f9-4c65-8ea0-7242236d641f","Type":"ContainerDied","Data":"3baee088b822acbc2a39af564e0c18df01758a5b84d81978b575dc62c0f21b11"} Mar 13 01:14:38.054622 master-0 kubenswrapper[7110]: I0313 01:14:38.054579 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lb5rz" event={"ID":"6e0572c2-78f9-4c65-8ea0-7242236d641f","Type":"ContainerStarted","Data":"489575f0a1d8a7c8e97c64f0529347190c5df02ad8ff4559632008aa4dd81545"} Mar 13 01:14:38.061730 master-0 kubenswrapper[7110]: I0313 01:14:38.061682 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-v9pv6" 
event={"ID":"58035e42-37d8-48f6-9861-9b4ce6014119","Type":"ContainerStarted","Data":"ac8bc98fe2e8dc99665ece6e4cbb170176bcf297370531768be4fdddc77674cc"} Mar 13 01:14:38.071373 master-0 kubenswrapper[7110]: I0313 01:14:38.070915 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-3-master-0" podStartSLOduration=3.070898438 podStartE2EDuration="3.070898438s" podCreationTimestamp="2026-03-13 01:14:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:14:38.070096656 +0000 UTC m=+79.355123122" watchObservedRunningTime="2026-03-13 01:14:38.070898438 +0000 UTC m=+79.355924904" Mar 13 01:14:38.093622 master-0 kubenswrapper[7110]: I0313 01:14:38.090468 7110 generic.go:334] "Generic (PLEG): container finished" podID="19d78757-5081-4711-992e-c8fd7891f9b7" containerID="e3fcb49d939f028bc20635235a331a97f2b6dc3a0a1be0538871f8f8226d3dde" exitCode=0 Mar 13 01:14:38.093622 master-0 kubenswrapper[7110]: I0313 01:14:38.091170 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-29mns" event={"ID":"19d78757-5081-4711-992e-c8fd7891f9b7","Type":"ContainerDied","Data":"e3fcb49d939f028bc20635235a331a97f2b6dc3a0a1be0538871f8f8226d3dde"} Mar 13 01:14:38.093622 master-0 kubenswrapper[7110]: I0313 01:14:38.091193 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-29mns" event={"ID":"19d78757-5081-4711-992e-c8fd7891f9b7","Type":"ContainerStarted","Data":"e8a5ab75c1b7f63476173d508b85b9a4a92039b2b0c3f241f48d8c8d79ae1573"} Mar 13 01:14:38.096579 master-0 kubenswrapper[7110]: I0313 01:14:38.095183 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g"] Mar 13 01:14:38.096579 master-0 kubenswrapper[7110]: I0313 01:14:38.095952 7110 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" Mar 13 01:14:38.103418 master-0 kubenswrapper[7110]: I0313 01:14:38.102970 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-sjhm5" Mar 13 01:14:38.103418 master-0 kubenswrapper[7110]: I0313 01:14:38.102995 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 13 01:14:38.107923 master-0 kubenswrapper[7110]: I0313 01:14:38.107183 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8565d84698-sslxh_039acb44-a9b3-4ad6-a091-be4d18edc34f/openshift-controller-manager-operator/0.log" Mar 13 01:14:38.107923 master-0 kubenswrapper[7110]: I0313 01:14:38.107257 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-sslxh" event={"ID":"039acb44-a9b3-4ad6-a091-be4d18edc34f","Type":"ContainerStarted","Data":"3ad76288e23214748b66d20552278056bf77691ab910aa1096214002b1b63ee0"} Mar 13 01:14:38.121026 master-0 kubenswrapper[7110]: I0313 01:14:38.119302 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6frm\" (UniqueName: \"kubernetes.io/projected/2ce47660-f7cc-4669-a00d-83422f0f6d55-kube-api-access-d6frm\") pod \"packageserver-68f6795949-v9w8g\" (UID: \"2ce47660-f7cc-4669-a00d-83422f0f6d55\") " pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" Mar 13 01:14:38.121026 master-0 kubenswrapper[7110]: I0313 01:14:38.119355 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/2ce47660-f7cc-4669-a00d-83422f0f6d55-tmpfs\") pod 
\"packageserver-68f6795949-v9w8g\" (UID: \"2ce47660-f7cc-4669-a00d-83422f0f6d55\") " pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" Mar 13 01:14:38.121026 master-0 kubenswrapper[7110]: I0313 01:14:38.119441 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2ce47660-f7cc-4669-a00d-83422f0f6d55-apiservice-cert\") pod \"packageserver-68f6795949-v9w8g\" (UID: \"2ce47660-f7cc-4669-a00d-83422f0f6d55\") " pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" Mar 13 01:14:38.121026 master-0 kubenswrapper[7110]: I0313 01:14:38.119472 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2ce47660-f7cc-4669-a00d-83422f0f6d55-webhook-cert\") pod \"packageserver-68f6795949-v9w8g\" (UID: \"2ce47660-f7cc-4669-a00d-83422f0f6d55\") " pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" Mar 13 01:14:38.128415 master-0 kubenswrapper[7110]: I0313 01:14:38.128332 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g"] Mar 13 01:14:38.128580 master-0 kubenswrapper[7110]: I0313 01:14:38.128512 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-2xfpz" event={"ID":"ebf338e6-9725-47d9-8c7f-adbf11a44406","Type":"ContainerStarted","Data":"34d6500d42674d3ac28ad1da03d31ad6fc07a588196014c4a73a86965dd9deb9"} Mar 13 01:14:38.139521 master-0 kubenswrapper[7110]: I0313 01:14:38.139496 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-5-master-0_835bf21e-47fc-49b7-b2cd-72e051f2d601/installer/0.log" Mar 13 01:14:38.141302 master-0 kubenswrapper[7110]: I0313 01:14:38.141282 7110 generic.go:334] "Generic (PLEG): container finished" 
podID="835bf21e-47fc-49b7-b2cd-72e051f2d601" containerID="15c8295a7013f090b7f2ef7c2ac7d855db77f246f725cb9e4a64e892a195db74" exitCode=1 Mar 13 01:14:38.141509 master-0 kubenswrapper[7110]: I0313 01:14:38.141497 7110 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Mar 13 01:14:38.141595 master-0 kubenswrapper[7110]: I0313 01:14:38.141546 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"835bf21e-47fc-49b7-b2cd-72e051f2d601","Type":"ContainerDied","Data":"15c8295a7013f090b7f2ef7c2ac7d855db77f246f725cb9e4a64e892a195db74"} Mar 13 01:14:38.141699 master-0 kubenswrapper[7110]: I0313 01:14:38.141614 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"835bf21e-47fc-49b7-b2cd-72e051f2d601","Type":"ContainerDied","Data":"6881d92dd6bb6a6dee05e911ad4a3fe7a1b45827de1efa1e4930a097fdf5483f"} Mar 13 01:14:38.141699 master-0 kubenswrapper[7110]: I0313 01:14:38.141683 7110 scope.go:117] "RemoveContainer" containerID="15c8295a7013f090b7f2ef7c2ac7d855db77f246f725cb9e4a64e892a195db74" Mar 13 01:14:38.179735 master-0 kubenswrapper[7110]: I0313 01:14:38.179706 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-bdc4j" event={"ID":"bfc49699-9428-4bff-804d-da0e60551759","Type":"ContainerStarted","Data":"045cf1f0774d199db7359b9c5ba3caaa20fbbe198d9372303441cd3a9663f259"} Mar 13 01:14:38.207025 master-0 kubenswrapper[7110]: I0313 01:14:38.206985 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-8f89dfddd-6k2t7"] Mar 13 01:14:38.207937 master-0 kubenswrapper[7110]: I0313 01:14:38.207922 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" Mar 13 01:14:38.212986 master-0 kubenswrapper[7110]: I0313 01:14:38.212958 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-7wmj8" Mar 13 01:14:38.213712 master-0 kubenswrapper[7110]: I0313 01:14:38.213680 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert" Mar 13 01:14:38.214394 master-0 kubenswrapper[7110]: I0313 01:14:38.214377 7110 scope.go:117] "RemoveContainer" containerID="15c8295a7013f090b7f2ef7c2ac7d855db77f246f725cb9e4a64e892a195db74" Mar 13 01:14:38.215218 master-0 kubenswrapper[7110]: E0313 01:14:38.215172 7110 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15c8295a7013f090b7f2ef7c2ac7d855db77f246f725cb9e4a64e892a195db74\": container with ID starting with 15c8295a7013f090b7f2ef7c2ac7d855db77f246f725cb9e4a64e892a195db74 not found: ID does not exist" containerID="15c8295a7013f090b7f2ef7c2ac7d855db77f246f725cb9e4a64e892a195db74" Mar 13 01:14:38.215330 master-0 kubenswrapper[7110]: I0313 01:14:38.215307 7110 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15c8295a7013f090b7f2ef7c2ac7d855db77f246f725cb9e4a64e892a195db74"} err="failed to get container status \"15c8295a7013f090b7f2ef7c2ac7d855db77f246f725cb9e4a64e892a195db74\": rpc error: code = NotFound desc = could not find container \"15c8295a7013f090b7f2ef7c2ac7d855db77f246f725cb9e4a64e892a195db74\": container with ID starting with 15c8295a7013f090b7f2ef7c2ac7d855db77f246f725cb9e4a64e892a195db74 not found: ID does not exist" Mar 13 01:14:38.225191 master-0 kubenswrapper[7110]: I0313 01:14:38.224066 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-8f89dfddd-6k2t7"] Mar 13 01:14:38.225337 master-0 kubenswrapper[7110]: I0313 
01:14:38.225272 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6frm\" (UniqueName: \"kubernetes.io/projected/2ce47660-f7cc-4669-a00d-83422f0f6d55-kube-api-access-d6frm\") pod \"packageserver-68f6795949-v9w8g\" (UID: \"2ce47660-f7cc-4669-a00d-83422f0f6d55\") " pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" Mar 13 01:14:38.225398 master-0 kubenswrapper[7110]: I0313 01:14:38.225347 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/2ce47660-f7cc-4669-a00d-83422f0f6d55-tmpfs\") pod \"packageserver-68f6795949-v9w8g\" (UID: \"2ce47660-f7cc-4669-a00d-83422f0f6d55\") " pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" Mar 13 01:14:38.225834 master-0 kubenswrapper[7110]: I0313 01:14:38.225498 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtqjr\" (UniqueName: \"kubernetes.io/projected/77fd9062-0f7d-4255-92ca-7e4325daeddd-kube-api-access-vtqjr\") pod \"insights-operator-8f89dfddd-6k2t7\" (UID: \"77fd9062-0f7d-4255-92ca-7e4325daeddd\") " pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" Mar 13 01:14:38.225834 master-0 kubenswrapper[7110]: I0313 01:14:38.225528 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2ce47660-f7cc-4669-a00d-83422f0f6d55-apiservice-cert\") pod \"packageserver-68f6795949-v9w8g\" (UID: \"2ce47660-f7cc-4669-a00d-83422f0f6d55\") " pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" Mar 13 01:14:38.225834 master-0 kubenswrapper[7110]: I0313 01:14:38.225562 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77fd9062-0f7d-4255-92ca-7e4325daeddd-trusted-ca-bundle\") pod 
\"insights-operator-8f89dfddd-6k2t7\" (UID: \"77fd9062-0f7d-4255-92ca-7e4325daeddd\") " pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" Mar 13 01:14:38.225834 master-0 kubenswrapper[7110]: I0313 01:14:38.225586 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77fd9062-0f7d-4255-92ca-7e4325daeddd-serving-cert\") pod \"insights-operator-8f89dfddd-6k2t7\" (UID: \"77fd9062-0f7d-4255-92ca-7e4325daeddd\") " pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" Mar 13 01:14:38.227074 master-0 kubenswrapper[7110]: I0313 01:14:38.226995 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/2ce47660-f7cc-4669-a00d-83422f0f6d55-tmpfs\") pod \"packageserver-68f6795949-v9w8g\" (UID: \"2ce47660-f7cc-4669-a00d-83422f0f6d55\") " pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" Mar 13 01:14:38.227475 master-0 kubenswrapper[7110]: I0313 01:14:38.227425 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9zvz2"] Mar 13 01:14:38.228745 master-0 kubenswrapper[7110]: I0313 01:14:38.228725 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2ce47660-f7cc-4669-a00d-83422f0f6d55-webhook-cert\") pod \"packageserver-68f6795949-v9w8g\" (UID: \"2ce47660-f7cc-4669-a00d-83422f0f6d55\") " pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" Mar 13 01:14:38.228838 master-0 kubenswrapper[7110]: I0313 01:14:38.228820 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77fd9062-0f7d-4255-92ca-7e4325daeddd-service-ca-bundle\") pod \"insights-operator-8f89dfddd-6k2t7\" (UID: \"77fd9062-0f7d-4255-92ca-7e4325daeddd\") " 
pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" Mar 13 01:14:38.231147 master-0 kubenswrapper[7110]: I0313 01:14:38.231114 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9zvz2" Mar 13 01:14:38.231758 master-0 kubenswrapper[7110]: I0313 01:14:38.231734 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2ce47660-f7cc-4669-a00d-83422f0f6d55-webhook-cert\") pod \"packageserver-68f6795949-v9w8g\" (UID: \"2ce47660-f7cc-4669-a00d-83422f0f6d55\") " pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" Mar 13 01:14:38.231958 master-0 kubenswrapper[7110]: I0313 01:14:38.231925 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/77fd9062-0f7d-4255-92ca-7e4325daeddd-snapshots\") pod \"insights-operator-8f89dfddd-6k2t7\" (UID: \"77fd9062-0f7d-4255-92ca-7e4325daeddd\") " pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" Mar 13 01:14:38.231958 master-0 kubenswrapper[7110]: I0313 01:14:38.231950 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt" Mar 13 01:14:38.232044 master-0 kubenswrapper[7110]: I0313 01:14:38.231994 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Mar 13 01:14:38.232044 master-0 kubenswrapper[7110]: I0313 01:14:38.232011 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle" Mar 13 01:14:38.233095 master-0 kubenswrapper[7110]: I0313 01:14:38.233081 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2ce47660-f7cc-4669-a00d-83422f0f6d55-apiservice-cert\") pod \"packageserver-68f6795949-v9w8g\" (UID: 
\"2ce47660-f7cc-4669-a00d-83422f0f6d55\") " pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" Mar 13 01:14:38.235283 master-0 kubenswrapper[7110]: I0313 01:14:38.233510 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle" Mar 13 01:14:38.238168 master-0 kubenswrapper[7110]: I0313 01:14:38.238083 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-6f5hw" Mar 13 01:14:38.238541 master-0 kubenswrapper[7110]: I0313 01:14:38.238499 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-6-master-0" Mar 13 01:14:38.275199 master-0 kubenswrapper[7110]: I0313 01:14:38.275157 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6frm\" (UniqueName: \"kubernetes.io/projected/2ce47660-f7cc-4669-a00d-83422f0f6d55-kube-api-access-d6frm\") pod \"packageserver-68f6795949-v9w8g\" (UID: \"2ce47660-f7cc-4669-a00d-83422f0f6d55\") " pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" Mar 13 01:14:38.280266 master-0 kubenswrapper[7110]: I0313 01:14:38.279426 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9zvz2"] Mar 13 01:14:38.342662 master-0 kubenswrapper[7110]: I0313 01:14:38.333250 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/77fd9062-0f7d-4255-92ca-7e4325daeddd-snapshots\") pod \"insights-operator-8f89dfddd-6k2t7\" (UID: \"77fd9062-0f7d-4255-92ca-7e4325daeddd\") " pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" Mar 13 01:14:38.342662 master-0 kubenswrapper[7110]: I0313 01:14:38.333297 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/77fd9062-0f7d-4255-92ca-7e4325daeddd-service-ca-bundle\") pod \"insights-operator-8f89dfddd-6k2t7\" (UID: \"77fd9062-0f7d-4255-92ca-7e4325daeddd\") " pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" Mar 13 01:14:38.342662 master-0 kubenswrapper[7110]: I0313 01:14:38.333326 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d23bbaec-b635-4649-b26e-2829f32d21f0-catalog-content\") pod \"certified-operators-9zvz2\" (UID: \"d23bbaec-b635-4649-b26e-2829f32d21f0\") " pod="openshift-marketplace/certified-operators-9zvz2" Mar 13 01:14:38.342662 master-0 kubenswrapper[7110]: I0313 01:14:38.333361 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szx9m\" (UniqueName: \"kubernetes.io/projected/d23bbaec-b635-4649-b26e-2829f32d21f0-kube-api-access-szx9m\") pod \"certified-operators-9zvz2\" (UID: \"d23bbaec-b635-4649-b26e-2829f32d21f0\") " pod="openshift-marketplace/certified-operators-9zvz2" Mar 13 01:14:38.342662 master-0 kubenswrapper[7110]: I0313 01:14:38.333383 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d23bbaec-b635-4649-b26e-2829f32d21f0-utilities\") pod \"certified-operators-9zvz2\" (UID: \"d23bbaec-b635-4649-b26e-2829f32d21f0\") " pod="openshift-marketplace/certified-operators-9zvz2" Mar 13 01:14:38.342662 master-0 kubenswrapper[7110]: I0313 01:14:38.333405 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtqjr\" (UniqueName: \"kubernetes.io/projected/77fd9062-0f7d-4255-92ca-7e4325daeddd-kube-api-access-vtqjr\") pod \"insights-operator-8f89dfddd-6k2t7\" (UID: \"77fd9062-0f7d-4255-92ca-7e4325daeddd\") " pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" Mar 13 01:14:38.342662 master-0 
kubenswrapper[7110]: I0313 01:14:38.333425 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77fd9062-0f7d-4255-92ca-7e4325daeddd-trusted-ca-bundle\") pod \"insights-operator-8f89dfddd-6k2t7\" (UID: \"77fd9062-0f7d-4255-92ca-7e4325daeddd\") " pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" Mar 13 01:14:38.342662 master-0 kubenswrapper[7110]: I0313 01:14:38.333441 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77fd9062-0f7d-4255-92ca-7e4325daeddd-serving-cert\") pod \"insights-operator-8f89dfddd-6k2t7\" (UID: \"77fd9062-0f7d-4255-92ca-7e4325daeddd\") " pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" Mar 13 01:14:38.342662 master-0 kubenswrapper[7110]: I0313 01:14:38.334736 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/77fd9062-0f7d-4255-92ca-7e4325daeddd-snapshots\") pod \"insights-operator-8f89dfddd-6k2t7\" (UID: \"77fd9062-0f7d-4255-92ca-7e4325daeddd\") " pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" Mar 13 01:14:38.342662 master-0 kubenswrapper[7110]: I0313 01:14:38.335403 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77fd9062-0f7d-4255-92ca-7e4325daeddd-service-ca-bundle\") pod \"insights-operator-8f89dfddd-6k2t7\" (UID: \"77fd9062-0f7d-4255-92ca-7e4325daeddd\") " pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" Mar 13 01:14:38.342662 master-0 kubenswrapper[7110]: I0313 01:14:38.335824 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77fd9062-0f7d-4255-92ca-7e4325daeddd-trusted-ca-bundle\") pod \"insights-operator-8f89dfddd-6k2t7\" (UID: \"77fd9062-0f7d-4255-92ca-7e4325daeddd\") " 
pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" Mar 13 01:14:38.351659 master-0 kubenswrapper[7110]: I0313 01:14:38.347906 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77fd9062-0f7d-4255-92ca-7e4325daeddd-serving-cert\") pod \"insights-operator-8f89dfddd-6k2t7\" (UID: \"77fd9062-0f7d-4255-92ca-7e4325daeddd\") " pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" Mar 13 01:14:38.365557 master-0 kubenswrapper[7110]: I0313 01:14:38.365521 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtqjr\" (UniqueName: \"kubernetes.io/projected/77fd9062-0f7d-4255-92ca-7e4325daeddd-kube-api-access-vtqjr\") pod \"insights-operator-8f89dfddd-6k2t7\" (UID: \"77fd9062-0f7d-4255-92ca-7e4325daeddd\") " pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" Mar 13 01:14:38.385415 master-0 kubenswrapper[7110]: I0313 01:14:38.385297 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"] Mar 13 01:14:38.398053 master-0 kubenswrapper[7110]: I0313 01:14:38.396253 7110 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"] Mar 13 01:14:38.437896 master-0 kubenswrapper[7110]: I0313 01:14:38.436007 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d23bbaec-b635-4649-b26e-2829f32d21f0-catalog-content\") pod \"certified-operators-9zvz2\" (UID: \"d23bbaec-b635-4649-b26e-2829f32d21f0\") " pod="openshift-marketplace/certified-operators-9zvz2" Mar 13 01:14:38.437896 master-0 kubenswrapper[7110]: I0313 01:14:38.436089 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szx9m\" (UniqueName: \"kubernetes.io/projected/d23bbaec-b635-4649-b26e-2829f32d21f0-kube-api-access-szx9m\") pod \"certified-operators-9zvz2\" (UID: 
\"d23bbaec-b635-4649-b26e-2829f32d21f0\") " pod="openshift-marketplace/certified-operators-9zvz2" Mar 13 01:14:38.437896 master-0 kubenswrapper[7110]: I0313 01:14:38.436116 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d23bbaec-b635-4649-b26e-2829f32d21f0-utilities\") pod \"certified-operators-9zvz2\" (UID: \"d23bbaec-b635-4649-b26e-2829f32d21f0\") " pod="openshift-marketplace/certified-operators-9zvz2" Mar 13 01:14:38.437896 master-0 kubenswrapper[7110]: I0313 01:14:38.436494 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d23bbaec-b635-4649-b26e-2829f32d21f0-catalog-content\") pod \"certified-operators-9zvz2\" (UID: \"d23bbaec-b635-4649-b26e-2829f32d21f0\") " pod="openshift-marketplace/certified-operators-9zvz2" Mar 13 01:14:38.437896 master-0 kubenswrapper[7110]: I0313 01:14:38.436523 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d23bbaec-b635-4649-b26e-2829f32d21f0-utilities\") pod \"certified-operators-9zvz2\" (UID: \"d23bbaec-b635-4649-b26e-2829f32d21f0\") " pod="openshift-marketplace/certified-operators-9zvz2" Mar 13 01:14:38.438272 master-0 kubenswrapper[7110]: I0313 01:14:38.438069 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb"] Mar 13 01:14:38.439662 master-0 kubenswrapper[7110]: I0313 01:14:38.438972 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb" Mar 13 01:14:38.451661 master-0 kubenswrapper[7110]: I0313 01:14:38.450119 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator" Mar 13 01:14:38.451661 master-0 kubenswrapper[7110]: I0313 01:14:38.450361 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert" Mar 13 01:14:38.451661 master-0 kubenswrapper[7110]: I0313 01:14:38.450895 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-7q6zv" Mar 13 01:14:38.455653 master-0 kubenswrapper[7110]: I0313 01:14:38.455242 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" Mar 13 01:14:38.462121 master-0 kubenswrapper[7110]: I0313 01:14:38.457993 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb"] Mar 13 01:14:38.470301 master-0 kubenswrapper[7110]: I0313 01:14:38.470264 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szx9m\" (UniqueName: \"kubernetes.io/projected/d23bbaec-b635-4649-b26e-2829f32d21f0-kube-api-access-szx9m\") pod \"certified-operators-9zvz2\" (UID: \"d23bbaec-b635-4649-b26e-2829f32d21f0\") " pod="openshift-marketplace/certified-operators-9zvz2" Mar 13 01:14:38.537332 master-0 kubenswrapper[7110]: I0313 01:14:38.536840 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d278ed70-786c-4b6c-9f04-f08ede704569-cert\") pod \"cluster-autoscaler-operator-69576476f7-2q4qb\" (UID: \"d278ed70-786c-4b6c-9f04-f08ede704569\") " 
pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb" Mar 13 01:14:38.537332 master-0 kubenswrapper[7110]: I0313 01:14:38.536890 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f82n\" (UniqueName: \"kubernetes.io/projected/d278ed70-786c-4b6c-9f04-f08ede704569-kube-api-access-7f82n\") pod \"cluster-autoscaler-operator-69576476f7-2q4qb\" (UID: \"d278ed70-786c-4b6c-9f04-f08ede704569\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb" Mar 13 01:14:38.537332 master-0 kubenswrapper[7110]: I0313 01:14:38.536928 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d278ed70-786c-4b6c-9f04-f08ede704569-auth-proxy-config\") pod \"cluster-autoscaler-operator-69576476f7-2q4qb\" (UID: \"d278ed70-786c-4b6c-9f04-f08ede704569\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb" Mar 13 01:14:38.570007 master-0 kubenswrapper[7110]: I0313 01:14:38.568890 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" Mar 13 01:14:38.576588 master-0 kubenswrapper[7110]: I0313 01:14:38.574974 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9zvz2" Mar 13 01:14:38.640536 master-0 kubenswrapper[7110]: I0313 01:14:38.639023 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d278ed70-786c-4b6c-9f04-f08ede704569-cert\") pod \"cluster-autoscaler-operator-69576476f7-2q4qb\" (UID: \"d278ed70-786c-4b6c-9f04-f08ede704569\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb" Mar 13 01:14:38.640536 master-0 kubenswrapper[7110]: I0313 01:14:38.639065 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f82n\" (UniqueName: \"kubernetes.io/projected/d278ed70-786c-4b6c-9f04-f08ede704569-kube-api-access-7f82n\") pod \"cluster-autoscaler-operator-69576476f7-2q4qb\" (UID: \"d278ed70-786c-4b6c-9f04-f08ede704569\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb" Mar 13 01:14:38.640536 master-0 kubenswrapper[7110]: I0313 01:14:38.639099 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d278ed70-786c-4b6c-9f04-f08ede704569-auth-proxy-config\") pod \"cluster-autoscaler-operator-69576476f7-2q4qb\" (UID: \"d278ed70-786c-4b6c-9f04-f08ede704569\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb" Mar 13 01:14:38.640536 master-0 kubenswrapper[7110]: I0313 01:14:38.639933 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d278ed70-786c-4b6c-9f04-f08ede704569-auth-proxy-config\") pod \"cluster-autoscaler-operator-69576476f7-2q4qb\" (UID: \"d278ed70-786c-4b6c-9f04-f08ede704569\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb" Mar 13 01:14:38.643497 master-0 kubenswrapper[7110]: I0313 01:14:38.643468 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"cert\" (UniqueName: \"kubernetes.io/secret/d278ed70-786c-4b6c-9f04-f08ede704569-cert\") pod \"cluster-autoscaler-operator-69576476f7-2q4qb\" (UID: \"d278ed70-786c-4b6c-9f04-f08ede704569\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb" Mar 13 01:14:38.660788 master-0 kubenswrapper[7110]: I0313 01:14:38.660756 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f82n\" (UniqueName: \"kubernetes.io/projected/d278ed70-786c-4b6c-9f04-f08ede704569-kube-api-access-7f82n\") pod \"cluster-autoscaler-operator-69576476f7-2q4qb\" (UID: \"d278ed70-786c-4b6c-9f04-f08ede704569\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb" Mar 13 01:14:38.780753 master-0 kubenswrapper[7110]: I0313 01:14:38.779601 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb" Mar 13 01:14:38.780753 master-0 kubenswrapper[7110]: I0313 01:14:38.780688 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-6-master-0"] Mar 13 01:14:38.922241 master-0 kubenswrapper[7110]: I0313 01:14:38.922194 7110 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1567be70-7890-440e-b5d5-cae3efec8373" path="/var/lib/kubelet/pods/1567be70-7890-440e-b5d5-cae3efec8373/volumes" Mar 13 01:14:38.922679 master-0 kubenswrapper[7110]: I0313 01:14:38.922653 7110 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="835bf21e-47fc-49b7-b2cd-72e051f2d601" path="/var/lib/kubelet/pods/835bf21e-47fc-49b7-b2cd-72e051f2d601/volumes" Mar 13 01:14:39.188807 master-0 kubenswrapper[7110]: I0313 01:14:39.188700 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/revision-pruner-6-master-0" event={"ID":"9c9c1c81-eae9-4481-9870-b598deb1dcac","Type":"ContainerStarted","Data":"90f1b6677182d94c10ac2334ab8747391e467bc270fab2bd3e7e7b1c8a3cd1c7"} Mar 
13 01:14:39.190941 master-0 kubenswrapper[7110]: I0313 01:14:39.190871 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-6-master-0" event={"ID":"0b3a64f4-e94f-4916-8c91-a255d987735d","Type":"ContainerStarted","Data":"555967ac1c8966a10024222c10bd15df837fc12752f180fd00e26584a6a7eadd"} Mar 13 01:14:39.213765 master-0 kubenswrapper[7110]: I0313 01:14:39.213482 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/revision-pruner-6-master-0" podStartSLOduration=2.213452178 podStartE2EDuration="2.213452178s" podCreationTimestamp="2026-03-13 01:14:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:14:39.2061012 +0000 UTC m=+80.491127706" watchObservedRunningTime="2026-03-13 01:14:39.213452178 +0000 UTC m=+80.498478674" Mar 13 01:14:39.219898 master-0 kubenswrapper[7110]: I0313 01:14:39.219846 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cpp59"] Mar 13 01:14:39.222759 master-0 kubenswrapper[7110]: I0313 01:14:39.222727 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cpp59" Mar 13 01:14:39.225667 master-0 kubenswrapper[7110]: I0313 01:14:39.225602 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-qcwkf" Mar 13 01:14:39.237043 master-0 kubenswrapper[7110]: I0313 01:14:39.236951 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cpp59"] Mar 13 01:14:39.250564 master-0 kubenswrapper[7110]: I0313 01:14:39.250522 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3ae16e5-ba77-427f-b85f-5b354e7bfb9d-catalog-content\") pod \"community-operators-cpp59\" (UID: \"c3ae16e5-ba77-427f-b85f-5b354e7bfb9d\") " pod="openshift-marketplace/community-operators-cpp59" Mar 13 01:14:39.250698 master-0 kubenswrapper[7110]: I0313 01:14:39.250569 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2pvt\" (UniqueName: \"kubernetes.io/projected/c3ae16e5-ba77-427f-b85f-5b354e7bfb9d-kube-api-access-r2pvt\") pod \"community-operators-cpp59\" (UID: \"c3ae16e5-ba77-427f-b85f-5b354e7bfb9d\") " pod="openshift-marketplace/community-operators-cpp59" Mar 13 01:14:39.250698 master-0 kubenswrapper[7110]: I0313 01:14:39.250644 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3ae16e5-ba77-427f-b85f-5b354e7bfb9d-utilities\") pod \"community-operators-cpp59\" (UID: \"c3ae16e5-ba77-427f-b85f-5b354e7bfb9d\") " pod="openshift-marketplace/community-operators-cpp59" Mar 13 01:14:39.351947 master-0 kubenswrapper[7110]: I0313 01:14:39.351897 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c3ae16e5-ba77-427f-b85f-5b354e7bfb9d-catalog-content\") pod \"community-operators-cpp59\" (UID: \"c3ae16e5-ba77-427f-b85f-5b354e7bfb9d\") " pod="openshift-marketplace/community-operators-cpp59" Mar 13 01:14:39.351947 master-0 kubenswrapper[7110]: I0313 01:14:39.351948 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2pvt\" (UniqueName: \"kubernetes.io/projected/c3ae16e5-ba77-427f-b85f-5b354e7bfb9d-kube-api-access-r2pvt\") pod \"community-operators-cpp59\" (UID: \"c3ae16e5-ba77-427f-b85f-5b354e7bfb9d\") " pod="openshift-marketplace/community-operators-cpp59" Mar 13 01:14:39.353259 master-0 kubenswrapper[7110]: I0313 01:14:39.351979 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3ae16e5-ba77-427f-b85f-5b354e7bfb9d-utilities\") pod \"community-operators-cpp59\" (UID: \"c3ae16e5-ba77-427f-b85f-5b354e7bfb9d\") " pod="openshift-marketplace/community-operators-cpp59" Mar 13 01:14:39.353259 master-0 kubenswrapper[7110]: I0313 01:14:39.352379 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3ae16e5-ba77-427f-b85f-5b354e7bfb9d-utilities\") pod \"community-operators-cpp59\" (UID: \"c3ae16e5-ba77-427f-b85f-5b354e7bfb9d\") " pod="openshift-marketplace/community-operators-cpp59" Mar 13 01:14:39.353259 master-0 kubenswrapper[7110]: I0313 01:14:39.352598 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3ae16e5-ba77-427f-b85f-5b354e7bfb9d-catalog-content\") pod \"community-operators-cpp59\" (UID: \"c3ae16e5-ba77-427f-b85f-5b354e7bfb9d\") " pod="openshift-marketplace/community-operators-cpp59" Mar 13 01:14:39.371923 master-0 kubenswrapper[7110]: I0313 01:14:39.371808 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-r2pvt\" (UniqueName: \"kubernetes.io/projected/c3ae16e5-ba77-427f-b85f-5b354e7bfb9d-kube-api-access-r2pvt\") pod \"community-operators-cpp59\" (UID: \"c3ae16e5-ba77-427f-b85f-5b354e7bfb9d\") " pod="openshift-marketplace/community-operators-cpp59" Mar 13 01:14:39.544785 master-0 kubenswrapper[7110]: I0313 01:14:39.544389 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cpp59" Mar 13 01:14:40.073490 master-0 kubenswrapper[7110]: I0313 01:14:40.073240 7110 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-etcd/etcd-master-0-master-0"] Mar 13 01:14:40.073490 master-0 kubenswrapper[7110]: I0313 01:14:40.073435 7110 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0-master-0" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcdctl" containerID="cri-o://a9c29d8936371344a1ccbc38b9ff3375977741408746ec28a0f2c19b38ce561a" gracePeriod=30 Mar 13 01:14:40.073739 master-0 kubenswrapper[7110]: I0313 01:14:40.073536 7110 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0-master-0" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcd" containerID="cri-o://68874048256ef5174e01b3364fffcb240e7ca45d30a09a97ed7d81f508369059" gracePeriod=30 Mar 13 01:14:40.087525 master-0 kubenswrapper[7110]: I0313 01:14:40.079977 7110 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-master-0"] Mar 13 01:14:40.087525 master-0 kubenswrapper[7110]: E0313 01:14:40.080220 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcd" Mar 13 01:14:40.087525 master-0 kubenswrapper[7110]: I0313 01:14:40.080232 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcd" Mar 13 01:14:40.087525 master-0 kubenswrapper[7110]: E0313 01:14:40.080260 
7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcdctl" Mar 13 01:14:40.087525 master-0 kubenswrapper[7110]: I0313 01:14:40.080268 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcdctl" Mar 13 01:14:40.087525 master-0 kubenswrapper[7110]: I0313 01:14:40.080368 7110 memory_manager.go:354] "RemoveStaleState removing state" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcdctl" Mar 13 01:14:40.087525 master-0 kubenswrapper[7110]: I0313 01:14:40.080381 7110 memory_manager.go:354] "RemoveStaleState removing state" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcd" Mar 13 01:14:40.087525 master-0 kubenswrapper[7110]: I0313 01:14:40.081881 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 13 01:14:40.167698 master-0 kubenswrapper[7110]: I0313 01:14:40.167431 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 01:14:40.167698 master-0 kubenswrapper[7110]: I0313 01:14:40.167481 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 01:14:40.167698 master-0 kubenswrapper[7110]: I0313 01:14:40.167516 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"etcd-master-0\" (UID: 
\"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 01:14:40.167698 master-0 kubenswrapper[7110]: I0313 01:14:40.167535 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 01:14:40.167698 master-0 kubenswrapper[7110]: I0313 01:14:40.167557 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 01:14:40.167698 master-0 kubenswrapper[7110]: I0313 01:14:40.167589 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 01:14:40.200653 master-0 kubenswrapper[7110]: I0313 01:14:40.200131 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-w6qs7" event={"ID":"cada5bf2-e208-4fd8-bdf5-de8cad31a665","Type":"ContainerStarted","Data":"304e8e0944434514608c776ed75bc07cb5d1c2603e8ab5214e26636517baa5e9"} Mar 13 01:14:40.203317 master-0 kubenswrapper[7110]: I0313 01:14:40.203245 7110 generic.go:334] "Generic (PLEG): container finished" podID="9c9c1c81-eae9-4481-9870-b598deb1dcac" containerID="90f1b6677182d94c10ac2334ab8747391e467bc270fab2bd3e7e7b1c8a3cd1c7" exitCode=0 Mar 13 01:14:40.203317 master-0 kubenswrapper[7110]: I0313 01:14:40.203270 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/revision-pruner-6-master-0" event={"ID":"9c9c1c81-eae9-4481-9870-b598deb1dcac","Type":"ContainerDied","Data":"90f1b6677182d94c10ac2334ab8747391e467bc270fab2bd3e7e7b1c8a3cd1c7"} Mar 13 01:14:40.272086 master-0 kubenswrapper[7110]: I0313 01:14:40.270458 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 01:14:40.272086 master-0 kubenswrapper[7110]: I0313 01:14:40.270559 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 01:14:40.272086 master-0 kubenswrapper[7110]: I0313 01:14:40.270591 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 01:14:40.272086 master-0 kubenswrapper[7110]: I0313 01:14:40.270612 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 01:14:40.272086 master-0 kubenswrapper[7110]: I0313 01:14:40.270657 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " 
pod="openshift-etcd/etcd-master-0" Mar 13 01:14:40.272086 master-0 kubenswrapper[7110]: I0313 01:14:40.270681 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 01:14:40.272086 master-0 kubenswrapper[7110]: I0313 01:14:40.270699 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 01:14:40.272086 master-0 kubenswrapper[7110]: I0313 01:14:40.270749 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 01:14:40.272086 master-0 kubenswrapper[7110]: I0313 01:14:40.270907 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 01:14:40.272086 master-0 kubenswrapper[7110]: I0313 01:14:40.270961 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 01:14:40.272086 master-0 kubenswrapper[7110]: I0313 01:14:40.270994 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: 
\"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 01:14:40.272086 master-0 kubenswrapper[7110]: I0313 01:14:40.271027 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 01:14:41.212692 master-0 kubenswrapper[7110]: I0313 01:14:41.212623 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jvdz8" event={"ID":"23fbbe97-906a-4bce-9ab0-bf633d4f9dd7","Type":"ContainerStarted","Data":"30d64955a2beef5aa415495589d7999948fcd62753769f6a0658edbc22d9fc94"} Mar 13 01:14:41.214288 master-0 kubenswrapper[7110]: I0313 01:14:41.214244 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-6-master-0" event={"ID":"0b3a64f4-e94f-4916-8c91-a255d987735d","Type":"ContainerStarted","Data":"6b91817f5b3f7a0651a092d44f47916346942c3944860ca84cb9f688537c7ce3"} Mar 13 01:14:41.918510 master-0 kubenswrapper[7110]: I0313 01:14:41.918481 7110 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-6-master-0" Mar 13 01:14:42.023625 master-0 kubenswrapper[7110]: I0313 01:14:42.022402 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c9c1c81-eae9-4481-9870-b598deb1dcac-kube-api-access\") pod \"9c9c1c81-eae9-4481-9870-b598deb1dcac\" (UID: \"9c9c1c81-eae9-4481-9870-b598deb1dcac\") " Mar 13 01:14:42.023625 master-0 kubenswrapper[7110]: I0313 01:14:42.022548 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c9c1c81-eae9-4481-9870-b598deb1dcac-kubelet-dir\") pod \"9c9c1c81-eae9-4481-9870-b598deb1dcac\" (UID: \"9c9c1c81-eae9-4481-9870-b598deb1dcac\") " Mar 13 01:14:42.023625 master-0 kubenswrapper[7110]: I0313 01:14:42.022738 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c9c1c81-eae9-4481-9870-b598deb1dcac-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9c9c1c81-eae9-4481-9870-b598deb1dcac" (UID: "9c9c1c81-eae9-4481-9870-b598deb1dcac"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:14:42.023625 master-0 kubenswrapper[7110]: I0313 01:14:42.023143 7110 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c9c1c81-eae9-4481-9870-b598deb1dcac-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:42.026141 master-0 kubenswrapper[7110]: I0313 01:14:42.026104 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c9c1c81-eae9-4481-9870-b598deb1dcac-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9c9c1c81-eae9-4481-9870-b598deb1dcac" (UID: "9c9c1c81-eae9-4481-9870-b598deb1dcac"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:14:42.124434 master-0 kubenswrapper[7110]: I0313 01:14:42.124402 7110 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c9c1c81-eae9-4481-9870-b598deb1dcac-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:42.220829 master-0 kubenswrapper[7110]: I0313 01:14:42.220783 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/revision-pruner-6-master-0" event={"ID":"9c9c1c81-eae9-4481-9870-b598deb1dcac","Type":"ContainerDied","Data":"39a55e527445a96c03fa72bdbedc444549a950a2be802dd370d0f86349876a95"} Mar 13 01:14:42.220829 master-0 kubenswrapper[7110]: I0313 01:14:42.220826 7110 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39a55e527445a96c03fa72bdbedc444549a950a2be802dd370d0f86349876a95" Mar 13 01:14:42.220829 master-0 kubenswrapper[7110]: I0313 01:14:42.220798 7110 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-6-master-0" Mar 13 01:14:42.223174 master-0 kubenswrapper[7110]: I0313 01:14:42.223136 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-2xfpz" event={"ID":"ebf338e6-9725-47d9-8c7f-adbf11a44406","Type":"ContainerStarted","Data":"fc0e1683bdab69600d9345034d7196f0df59bd227fe522be458cd66a46352f2e"} Mar 13 01:14:42.223174 master-0 kubenswrapper[7110]: I0313 01:14:42.223172 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-2xfpz" event={"ID":"ebf338e6-9725-47d9-8c7f-adbf11a44406","Type":"ContainerStarted","Data":"3120df40532f6b363d2622fdff9ab5bcc403e2dfd52885248c3514a1f9c6afff"} Mar 13 01:14:44.767951 master-0 kubenswrapper[7110]: I0313 01:14:44.767877 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:14:44.768482 master-0 kubenswrapper[7110]: I0313 01:14:44.767978 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:14:46.768689 master-0 kubenswrapper[7110]: I0313 01:14:46.768404 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" 
start-of-body= Mar 13 01:14:46.768689 master-0 kubenswrapper[7110]: I0313 01:14:46.768486 7110 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:14:47.767873 master-0 kubenswrapper[7110]: I0313 01:14:47.767825 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:14:47.768161 master-0 kubenswrapper[7110]: I0313 01:14:47.767895 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:14:49.768417 master-0 kubenswrapper[7110]: I0313 01:14:49.768358 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:14:49.768871 master-0 kubenswrapper[7110]: I0313 01:14:49.768444 7110 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial 
tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:14:50.767267 master-0 kubenswrapper[7110]: I0313 01:14:50.767219 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:14:50.767448 master-0 kubenswrapper[7110]: I0313 01:14:50.767279 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:14:50.767448 master-0 kubenswrapper[7110]: I0313 01:14:50.767372 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" Mar 13 01:14:50.767943 master-0 kubenswrapper[7110]: I0313 01:14:50.767884 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:14:50.767995 master-0 kubenswrapper[7110]: I0313 01:14:50.767950 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:14:51.152782 master-0 kubenswrapper[7110]: E0313 01:14:51.152115 7110 controller.go:195] "Failed to 
update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded" Mar 13 01:14:52.631712 master-0 kubenswrapper[7110]: E0313 01:14:52.631470 7110 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T01:14:42Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T01:14:42Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T01:14:42Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T01:14:42Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f121f9d021a9843b9458f9f222c40f292f2c21dcfcf00f05daacaca8a949c0\\\"],\\\"sizeBytes\\\":1637445817},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:381e96959e3c3b08a3e2715e6024697ae14af31bd0378b49f583e984b3b9a192\\\"],\\\"sizeBytes\\\":1238047254},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c9330c756dd6ab107e9a4b671bc52742c90d5be11a8380d8b710e2bd4e0ed43c\\\"],\\\"sizeBytes\\\":992610645},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\\\"],\\\"sizeBytes\\\":943837171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e207c762b7802ee0e54507d21ed1f25b19eddc511a4b824934c16c163193be6a\\\"],\\\"sizeBytes\\\":876146500},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bfcd8017eede3fb66fa3f5b47c27508b787d38455689154461f0e6a5dc303ff\\\"],\\\"sizeBytes\\\":772939850},{\\\"names\\\":[
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9c946fdc5a4cd16ff998c17844780e7efc38f7f38b97a8a40d75cd77b318ddef\\\"],\\\"sizeBytes\\\":687947017},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c03cb25dc6f6a865529ebc979e8d7d08492b28fd3fb93beddf30e1cb06f1245\\\"],\\\"sizeBytes\\\":683169303},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3f34dc492c80a3dee4643cc2291044750ac51e6e919b973de8723fa8b70bde70\\\"],\\\"sizeBytes\\\":677929075},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3\\\"],\\\"sizeBytes\\\":621647686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1ec9d3dbcc6f9817c0f6d09f64c0d98c91b03afbb1fcb3c1e1718aca900754b\\\"],\\\"sizeBytes\\\":589379637},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1575be013a898f153cbf012aeaf28ce720022f934dc05bdffbe479e30999d460\\\"],\\\"sizeBytes\\\":582153879},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eb82e437a701ce83b70e56be8477d987da67578714dda3d9fa6628804b1b56f5\\\"],\\\"sizeBytes\\\":558210153},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:28f33d62fd0b94c5ea0ebcd7a4216848c8dd671a38d901ce98f4c399b700e1c7\\\"],\\\"sizeBytes\\\":548751793},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b\\\"],\\\"sizeBytes\\\":529324693},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ac6f0695d3386e6d601f4ae507940981352fa3ad884b0fed6fb25698c5e6f916\\\"],\\\"sizeBytes\\\":528946249},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6088910bdc1583b275fab261e3234c0b63b4cc16d01bcea697b6a7f6db13bdf3\\\"],\\\"sizeBytes\\\":518384455},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:14bd3c04daa885009785d48f4973e2890751a7ec116c
c14d17627245cda54d7b\\\"],\\\"sizeBytes\\\":517997625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\\\"],\\\"sizeBytes\\\":514980169},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd0b71d620cf0acbfcd1b58797dc30050bd167cb6b7a7f62c8333dd370c76d5\\\"],\\\"sizeBytes\\\":513581866},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd818e37e1f9dbe5393c557b89e81010d68171408e0e4157a3d92ae0ca1c953\\\"],\\\"sizeBytes\\\":513220825},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d601c8437b4d8bbe2da0f3b08f1bd8693f5a4ef6d835377ec029c79d9dca5dab\\\"],\\\"sizeBytes\\\":512273539},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b47d2b146e833bc1612a652136f43afcf1ba30f32cbd0a2f06ca9fc80d969f0\\\"],\\\"sizeBytes\\\":511226810},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:834063dd26fb3d2489e193489198a0d5fbe9c775a0e30173e5fcef6994fbf0f6\\\"],\\\"sizeBytes\\\":511164376},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee46e13e26156c904e5784e2d64511021ed0974a169ccd6476b05bff1c44ec56\\\"],\\\"sizeBytes\\\":508888174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7220d16ea511c0f0410cf45db45aaafcc64847c9cb5732ad1eff39ceb482cdba\\\"],\\\"sizeBytes\\\":508544235},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:526c5c02a8fa86a2fa83a7087d4a5c4b1c4072c0f3906163494cc3b3c1295e9b\\\"],\\\"sizeBytes\\\":507967997},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4010a8f9d932615336227e2fd43325d4fa9025dca4bebe032106efea733fcfc3\\\"],\\\"sizeBytes\\\":506479655},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76b719f5bd541eb1a8bae124d650896b533e7bc3107be536e598b3ab4e135282\\\"],\\\"sizeBytes\\\":506394574},{\\\"names\\\":[\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:5de69354d08184ecd6144facc1461777674674e8304971216d4cf1a5025472b9\\\"],\\\"sizeBytes\\\":505344964},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a324f47cf789c0480fa4bcb0812152abc3cd844318bab193108fe4349eed609\\\"],\\\"sizeBytes\\\":505242594},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d11f13e867f4df046ca6789bb7273da5d0c08895b3dea00949c8a5458f9e22f9\\\"],\\\"sizeBytes\\\":504623546},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76bdc35338c4d0f5e5b9448fb73e3578656f908a962286692e12a0372ec721d5\\\"],\\\"sizeBytes\\\":495994161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff2db11ce277288befab25ddb86177e832842d2edb5607a2da8f252a030e1cfc\\\"],\\\"sizeBytes\\\":495064829},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2fe5144b1f72bdcf5d5a52130f02ed86fbec3875cc4ac108ead00eaac1659e06\\\"],\\\"sizeBytes\\\":487090672},{\\\"names\\\":[],\\\"sizeBytes\\\":484450382},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c54c3f7cffe057ae0bdf26163d5e46744685083ae16fc97112e32beacd2d8955\\\"],\\\"sizeBytes\\\":484175664},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d74fe7cb12c554c120262683d9c4066f33ae4f60a5fad83cba419d851b98c12d\\\"],\\\"sizeBytes\\\":470822665},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9b8bc43bac294be3c7669cde049e388ad9d8751242051ba40f83e1c401eceda\\\"],\\\"sizeBytes\\\":468263999},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\\\"],\\\"sizeBytes\\\":465086330},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a85dab5856916220df6f05ce9d6aa10cd4fa0234093b55355246690bba05ad1\\\"],\\\"sizeBytes\\\":463700811},{\\\"names\\\":[],\\\"sizeBytes\\\":458126424},{\\\"names\\\":[\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:c680fcc9fd6b66099ca4c0f512521b6f8e0bc29273ddb9405730bc54bacb6783\\\"],\\\"sizeBytes\\\":448041621},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cf9670d0f269f8d49fd9ef4981999be195f6624a4146aa93d9201eb8acc81053\\\"],\\\"sizeBytes\\\":443271011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8ceca1efee55b9fd5089428476bbc401fe73db7c0b0f5e16d4ad28ed0f0f9d43\\\"],\\\"sizeBytes\\\":438654375},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ace4dcd008420277d915fe983b07bbb50fb3ab0673f28d0166424a75bc2137e7\\\"],\\\"sizeBytes\\\":411585608},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8f0fda36e9a2040dbe0537361dcd73658df4e669d846f8101a8f9f29f0be9a7\\\"],\\\"sizeBytes\\\":407347126},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1d605384f31a8085f78a96145c2c3dc51afe22721144196140a2699b7c07ebe3\\\"],\\\"sizeBytes\\\":396521759}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 01:14:52.768861 master-0 kubenswrapper[7110]: I0313 01:14:52.768799 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:14:52.769068 master-0 kubenswrapper[7110]: I0313 01:14:52.768897 7110 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 
13 01:14:52.769068 master-0 kubenswrapper[7110]: I0313 01:14:52.768967 7110 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" Mar 13 01:14:52.769877 master-0 kubenswrapper[7110]: I0313 01:14:52.769842 7110 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="openshift-config-operator" containerStatusID={"Type":"cri-o","ID":"61dfbb39eadee51c94c1da7b2c82616d9472cbfa81dbb07d96ebc8cbcec88cf7"} pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" containerMessage="Container openshift-config-operator failed liveness probe, will be restarted" Mar 13 01:14:52.769938 master-0 kubenswrapper[7110]: I0313 01:14:52.769857 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:14:52.770045 master-0 kubenswrapper[7110]: I0313 01:14:52.769899 7110 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" containerID="cri-o://61dfbb39eadee51c94c1da7b2c82616d9472cbfa81dbb07d96ebc8cbcec88cf7" gracePeriod=30 Mar 13 01:14:52.770243 master-0 kubenswrapper[7110]: I0313 01:14:52.769993 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:14:53.149275 master-0 kubenswrapper[7110]: E0313 01:14:53.149235 7110 
kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 13 01:14:53.149595 master-0 kubenswrapper[7110]: I0313 01:14:53.149579 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 13 01:14:53.767724 master-0 kubenswrapper[7110]: I0313 01:14:53.767606 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:14:53.767724 master-0 kubenswrapper[7110]: I0313 01:14:53.767718 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:14:55.661257 master-0 kubenswrapper[7110]: W0313 01:14:55.661200 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e52bef89f4b50e4590a1719bcc5d7e5.slice/crio-e850e3b221564f84a8bf622e78e98b3c458e66820fad70083408ddbec82a1cd3 WatchSource:0}: Error finding container e850e3b221564f84a8bf622e78e98b3c458e66820fad70083408ddbec82a1cd3: Status 404 returned error can't find the container with id e850e3b221564f84a8bf622e78e98b3c458e66820fad70083408ddbec82a1cd3 Mar 13 01:14:56.008804 master-0 kubenswrapper[7110]: I0313 01:14:56.008748 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_415ee541-898b-41d1-98b0-c5e622776590/installer/0.log" Mar 13 01:14:56.008925 master-0 
kubenswrapper[7110]: I0313 01:14:56.008880 7110 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 13 01:14:56.139555 master-0 kubenswrapper[7110]: I0313 01:14:56.139137 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/415ee541-898b-41d1-98b0-c5e622776590-kube-api-access\") pod \"415ee541-898b-41d1-98b0-c5e622776590\" (UID: \"415ee541-898b-41d1-98b0-c5e622776590\") " Mar 13 01:14:56.139555 master-0 kubenswrapper[7110]: I0313 01:14:56.139354 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/415ee541-898b-41d1-98b0-c5e622776590-var-lock\") pod \"415ee541-898b-41d1-98b0-c5e622776590\" (UID: \"415ee541-898b-41d1-98b0-c5e622776590\") " Mar 13 01:14:56.139555 master-0 kubenswrapper[7110]: I0313 01:14:56.139401 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/415ee541-898b-41d1-98b0-c5e622776590-kubelet-dir\") pod \"415ee541-898b-41d1-98b0-c5e622776590\" (UID: \"415ee541-898b-41d1-98b0-c5e622776590\") " Mar 13 01:14:56.139555 master-0 kubenswrapper[7110]: I0313 01:14:56.139423 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/415ee541-898b-41d1-98b0-c5e622776590-var-lock" (OuterVolumeSpecName: "var-lock") pod "415ee541-898b-41d1-98b0-c5e622776590" (UID: "415ee541-898b-41d1-98b0-c5e622776590"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:14:56.139555 master-0 kubenswrapper[7110]: I0313 01:14:56.139527 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/415ee541-898b-41d1-98b0-c5e622776590-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "415ee541-898b-41d1-98b0-c5e622776590" (UID: "415ee541-898b-41d1-98b0-c5e622776590"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:14:56.139869 master-0 kubenswrapper[7110]: I0313 01:14:56.139827 7110 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/415ee541-898b-41d1-98b0-c5e622776590-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:56.139913 master-0 kubenswrapper[7110]: I0313 01:14:56.139865 7110 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/415ee541-898b-41d1-98b0-c5e622776590-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:56.141818 master-0 kubenswrapper[7110]: I0313 01:14:56.141782 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/415ee541-898b-41d1-98b0-c5e622776590-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "415ee541-898b-41d1-98b0-c5e622776590" (UID: "415ee541-898b-41d1-98b0-c5e622776590"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:14:56.240860 master-0 kubenswrapper[7110]: I0313 01:14:56.240800 7110 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/415ee541-898b-41d1-98b0-c5e622776590-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:56.302442 master-0 kubenswrapper[7110]: I0313 01:14:56.302382 7110 generic.go:334] "Generic (PLEG): container finished" podID="36b2d6ee-3ae7-444b-b327-f024a8a06ab7" containerID="e8004c606441f404a88824ad0f391a1736bc2bfc8d968e181bc7750c3498d909" exitCode=0 Mar 13 01:14:56.302740 master-0 kubenswrapper[7110]: I0313 01:14:56.302458 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"36b2d6ee-3ae7-444b-b327-f024a8a06ab7","Type":"ContainerDied","Data":"e8004c606441f404a88824ad0f391a1736bc2bfc8d968e181bc7750c3498d909"} Mar 13 01:14:56.305767 master-0 kubenswrapper[7110]: I0313 01:14:56.305739 7110 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="578253e69f6e09bf433efeaf51fe4123dcb489a1bcfe88e292f1fbad219b25ee" exitCode=1 Mar 13 01:14:56.305884 master-0 kubenswrapper[7110]: I0313 01:14:56.305791 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerDied","Data":"578253e69f6e09bf433efeaf51fe4123dcb489a1bcfe88e292f1fbad219b25ee"} Mar 13 01:14:56.305884 master-0 kubenswrapper[7110]: I0313 01:14:56.305819 7110 scope.go:117] "RemoveContainer" containerID="3b9e2f7b212305e97c078f4756a44dc73d31954cbfa09820c5689a8f4a927568" Mar 13 01:14:56.306441 master-0 kubenswrapper[7110]: I0313 01:14:56.306356 7110 scope.go:117] "RemoveContainer" containerID="578253e69f6e09bf433efeaf51fe4123dcb489a1bcfe88e292f1fbad219b25ee" Mar 13 01:14:56.308574 master-0 kubenswrapper[7110]: I0313 01:14:56.308537 7110 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lb5rz" event={"ID":"6e0572c2-78f9-4c65-8ea0-7242236d641f","Type":"ContainerStarted","Data":"cad7c2bbdec63fd94a815f3178983b29b92e2c8c43de29f46fbd59efd546835d"} Mar 13 01:14:56.311270 master-0 kubenswrapper[7110]: I0313 01:14:56.311034 7110 generic.go:334] "Generic (PLEG): container finished" podID="19d78757-5081-4711-992e-c8fd7891f9b7" containerID="925a29571691633af51df81c409f158e86cc5df954d25c8014b22843fdfaa4d3" exitCode=0 Mar 13 01:14:56.311270 master-0 kubenswrapper[7110]: I0313 01:14:56.311116 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-29mns" event={"ID":"19d78757-5081-4711-992e-c8fd7891f9b7","Type":"ContainerDied","Data":"925a29571691633af51df81c409f158e86cc5df954d25c8014b22843fdfaa4d3"} Mar 13 01:14:56.315907 master-0 kubenswrapper[7110]: I0313 01:14:56.315874 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qslvf" event={"ID":"90e6e63d-3cf2-4bb5-883f-6219a0b52c3a","Type":"ContainerStarted","Data":"7064752da8b9ab2c25c9d26191f5ff198db26d98b3ebd28b1a794ad9c42435a8"} Mar 13 01:14:56.318727 master-0 kubenswrapper[7110]: I0313 01:14:56.318683 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_415ee541-898b-41d1-98b0-c5e622776590/installer/0.log" Mar 13 01:14:56.318788 master-0 kubenswrapper[7110]: I0313 01:14:56.318734 7110 generic.go:334] "Generic (PLEG): container finished" podID="415ee541-898b-41d1-98b0-c5e622776590" containerID="a98f3e2926a06fea2ad89c2df3c7c9f0cbd15416036c3ec44c6cff4245768ce7" exitCode=1 Mar 13 01:14:56.318838 master-0 kubenswrapper[7110]: I0313 01:14:56.318791 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" 
event={"ID":"415ee541-898b-41d1-98b0-c5e622776590","Type":"ContainerDied","Data":"a98f3e2926a06fea2ad89c2df3c7c9f0cbd15416036c3ec44c6cff4245768ce7"} Mar 13 01:14:56.318838 master-0 kubenswrapper[7110]: I0313 01:14:56.318811 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"415ee541-898b-41d1-98b0-c5e622776590","Type":"ContainerDied","Data":"c045eff5fd3303798547b4a3da418ad7c494e1e5ecfe29042e3a182ce13215d4"} Mar 13 01:14:56.318916 master-0 kubenswrapper[7110]: I0313 01:14:56.318877 7110 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 13 01:14:56.321363 master-0 kubenswrapper[7110]: I0313 01:14:56.321293 7110 generic.go:334] "Generic (PLEG): container finished" podID="a1a56802af72ce1aac6b5077f1695ac0" containerID="d17962d23b6425584e2488789adc4521ea90d5d79405cef63024d09b7df17fd9" exitCode=1 Mar 13 01:14:56.321440 master-0 kubenswrapper[7110]: I0313 01:14:56.321355 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerDied","Data":"d17962d23b6425584e2488789adc4521ea90d5d79405cef63024d09b7df17fd9"} Mar 13 01:14:56.322204 master-0 kubenswrapper[7110]: I0313 01:14:56.322154 7110 scope.go:117] "RemoveContainer" containerID="d17962d23b6425584e2488789adc4521ea90d5d79405cef63024d09b7df17fd9" Mar 13 01:14:56.327176 master-0 kubenswrapper[7110]: I0313 01:14:56.323209 7110 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="3acba3e036ccbe49f5868539abac9a81fe12130cf6faa99400b25eb27b4d707a" exitCode=0 Mar 13 01:14:56.327176 master-0 kubenswrapper[7110]: I0313 01:14:56.323237 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" 
event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerDied","Data":"3acba3e036ccbe49f5868539abac9a81fe12130cf6faa99400b25eb27b4d707a"} Mar 13 01:14:56.327176 master-0 kubenswrapper[7110]: I0313 01:14:56.323265 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"e850e3b221564f84a8bf622e78e98b3c458e66820fad70083408ddbec82a1cd3"} Mar 13 01:14:56.355762 master-0 kubenswrapper[7110]: I0313 01:14:56.355720 7110 scope.go:117] "RemoveContainer" containerID="a98f3e2926a06fea2ad89c2df3c7c9f0cbd15416036c3ec44c6cff4245768ce7" Mar 13 01:14:56.394676 master-0 kubenswrapper[7110]: I0313 01:14:56.392723 7110 scope.go:117] "RemoveContainer" containerID="a98f3e2926a06fea2ad89c2df3c7c9f0cbd15416036c3ec44c6cff4245768ce7" Mar 13 01:14:56.394676 master-0 kubenswrapper[7110]: E0313 01:14:56.393712 7110 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a98f3e2926a06fea2ad89c2df3c7c9f0cbd15416036c3ec44c6cff4245768ce7\": container with ID starting with a98f3e2926a06fea2ad89c2df3c7c9f0cbd15416036c3ec44c6cff4245768ce7 not found: ID does not exist" containerID="a98f3e2926a06fea2ad89c2df3c7c9f0cbd15416036c3ec44c6cff4245768ce7" Mar 13 01:14:56.394676 master-0 kubenswrapper[7110]: I0313 01:14:56.393769 7110 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a98f3e2926a06fea2ad89c2df3c7c9f0cbd15416036c3ec44c6cff4245768ce7"} err="failed to get container status \"a98f3e2926a06fea2ad89c2df3c7c9f0cbd15416036c3ec44c6cff4245768ce7\": rpc error: code = NotFound desc = could not find container \"a98f3e2926a06fea2ad89c2df3c7c9f0cbd15416036c3ec44c6cff4245768ce7\": container with ID starting with a98f3e2926a06fea2ad89c2df3c7c9f0cbd15416036c3ec44c6cff4245768ce7 not found: ID does not exist" Mar 13 01:14:56.466364 master-0 kubenswrapper[7110]: I0313 01:14:56.466310 
7110 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:14:56.768562 master-0 kubenswrapper[7110]: I0313 01:14:56.768508 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:14:56.769000 master-0 kubenswrapper[7110]: I0313 01:14:56.768589 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:14:57.267113 master-0 kubenswrapper[7110]: I0313 01:14:57.267050 7110 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:14:57.335165 master-0 kubenswrapper[7110]: I0313 01:14:57.335113 7110 generic.go:334] "Generic (PLEG): container finished" podID="6e0572c2-78f9-4c65-8ea0-7242236d641f" containerID="cad7c2bbdec63fd94a815f3178983b29b92e2c8c43de29f46fbd59efd546835d" exitCode=0 Mar 13 01:14:57.335409 master-0 kubenswrapper[7110]: I0313 01:14:57.335205 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lb5rz" event={"ID":"6e0572c2-78f9-4c65-8ea0-7242236d641f","Type":"ContainerDied","Data":"cad7c2bbdec63fd94a815f3178983b29b92e2c8c43de29f46fbd59efd546835d"} Mar 13 01:14:57.341477 master-0 kubenswrapper[7110]: I0313 01:14:57.340405 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" 
event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"d71b1cc6761a45082d16174057a8ba98300f0e56574d0c45301dacf3ba713ba9"} Mar 13 01:14:57.344564 master-0 kubenswrapper[7110]: I0313 01:14:57.344501 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"029d789d642119db799ad9f74a4764d435ebea90c5fcd58eb9670acf9f27e006"} Mar 13 01:14:57.592227 master-0 kubenswrapper[7110]: I0313 01:14:57.592179 7110 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 13 01:14:57.761270 master-0 kubenswrapper[7110]: I0313 01:14:57.761200 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/36b2d6ee-3ae7-444b-b327-f024a8a06ab7-var-lock\") pod \"36b2d6ee-3ae7-444b-b327-f024a8a06ab7\" (UID: \"36b2d6ee-3ae7-444b-b327-f024a8a06ab7\") " Mar 13 01:14:57.761510 master-0 kubenswrapper[7110]: I0313 01:14:57.761341 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36b2d6ee-3ae7-444b-b327-f024a8a06ab7-kube-api-access\") pod \"36b2d6ee-3ae7-444b-b327-f024a8a06ab7\" (UID: \"36b2d6ee-3ae7-444b-b327-f024a8a06ab7\") " Mar 13 01:14:57.761510 master-0 kubenswrapper[7110]: I0313 01:14:57.761369 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/36b2d6ee-3ae7-444b-b327-f024a8a06ab7-kubelet-dir\") pod \"36b2d6ee-3ae7-444b-b327-f024a8a06ab7\" (UID: \"36b2d6ee-3ae7-444b-b327-f024a8a06ab7\") " Mar 13 01:14:57.761510 master-0 kubenswrapper[7110]: I0313 01:14:57.761403 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/36b2d6ee-3ae7-444b-b327-f024a8a06ab7-var-lock" 
(OuterVolumeSpecName: "var-lock") pod "36b2d6ee-3ae7-444b-b327-f024a8a06ab7" (UID: "36b2d6ee-3ae7-444b-b327-f024a8a06ab7"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:14:57.761738 master-0 kubenswrapper[7110]: I0313 01:14:57.761615 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/36b2d6ee-3ae7-444b-b327-f024a8a06ab7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "36b2d6ee-3ae7-444b-b327-f024a8a06ab7" (UID: "36b2d6ee-3ae7-444b-b327-f024a8a06ab7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:14:57.761850 master-0 kubenswrapper[7110]: I0313 01:14:57.761811 7110 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/36b2d6ee-3ae7-444b-b327-f024a8a06ab7-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:57.766460 master-0 kubenswrapper[7110]: I0313 01:14:57.766395 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36b2d6ee-3ae7-444b-b327-f024a8a06ab7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "36b2d6ee-3ae7-444b-b327-f024a8a06ab7" (UID: "36b2d6ee-3ae7-444b-b327-f024a8a06ab7"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:14:57.863700 master-0 kubenswrapper[7110]: I0313 01:14:57.863542 7110 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36b2d6ee-3ae7-444b-b327-f024a8a06ab7-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:57.863700 master-0 kubenswrapper[7110]: I0313 01:14:57.863586 7110 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/36b2d6ee-3ae7-444b-b327-f024a8a06ab7-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:14:58.356142 master-0 kubenswrapper[7110]: I0313 01:14:58.356017 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"36b2d6ee-3ae7-444b-b327-f024a8a06ab7","Type":"ContainerDied","Data":"ecaa68379bdd11e9423cf73d3d2c095307d5c8cb6abca3a8566ead3a1f24f111"} Mar 13 01:14:58.356142 master-0 kubenswrapper[7110]: I0313 01:14:58.356077 7110 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecaa68379bdd11e9423cf73d3d2c095307d5c8cb6abca3a8566ead3a1f24f111" Mar 13 01:14:58.356142 master-0 kubenswrapper[7110]: I0313 01:14:58.356049 7110 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 13 01:14:59.768416 master-0 kubenswrapper[7110]: I0313 01:14:59.768334 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:14:59.769068 master-0 kubenswrapper[7110]: I0313 01:14:59.768419 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:15:00.277078 master-0 kubenswrapper[7110]: I0313 01:15:00.276998 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:15:00.965273 master-0 kubenswrapper[7110]: I0313 01:15:00.965185 7110 patch_prober.go:28] interesting pod/authentication-operator-7c6989d6c4-bxqp2 container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.13:8443/healthz\": dial tcp 10.128.0.13:8443: connect: connection refused" start-of-body= Mar 13 01:15:00.965273 master-0 kubenswrapper[7110]: I0313 01:15:00.965250 7110 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" podUID="f9b713fb-64ce-4a01-951c-1f31df62e1ae" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.13:8443/healthz\": dial tcp 10.128.0.13:8443: connect: connection refused" Mar 13 01:15:01.152974 master-0 kubenswrapper[7110]: E0313 01:15:01.152717 7110 controller.go:195] "Failed 
to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 01:15:02.632265 master-0 kubenswrapper[7110]: E0313 01:15:02.632219 7110 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 01:15:02.768285 master-0 kubenswrapper[7110]: I0313 01:15:02.768218 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:15:02.768475 master-0 kubenswrapper[7110]: I0313 01:15:02.768326 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:15:04.393101 master-0 kubenswrapper[7110]: I0313 01:15:04.393010 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lb5rz" event={"ID":"6e0572c2-78f9-4c65-8ea0-7242236d641f","Type":"ContainerStarted","Data":"be31ba4eddead7d8e3b0b3bdd7d2ad0690b19683156059f65f839f77219f2b2c"} Mar 13 01:15:04.395538 master-0 kubenswrapper[7110]: I0313 01:15:04.395476 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-29mns" 
event={"ID":"19d78757-5081-4711-992e-c8fd7891f9b7","Type":"ContainerStarted","Data":"6b5a8fa25db81892bded208f2b2aac20234244be88b5c809ea3ce651bf11e2b3"} Mar 13 01:15:05.767798 master-0 kubenswrapper[7110]: I0313 01:15:05.767715 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:15:05.767798 master-0 kubenswrapper[7110]: I0313 01:15:05.767793 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:15:06.466873 master-0 kubenswrapper[7110]: I0313 01:15:06.466802 7110 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:15:06.983349 master-0 kubenswrapper[7110]: I0313 01:15:06.983246 7110 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-29mns" Mar 13 01:15:06.984384 master-0 kubenswrapper[7110]: I0313 01:15:06.983389 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-29mns" Mar 13 01:15:07.053678 master-0 kubenswrapper[7110]: I0313 01:15:07.053541 7110 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-29mns" Mar 13 01:15:07.159017 master-0 kubenswrapper[7110]: I0313 01:15:07.158932 7110 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lb5rz" Mar 13 01:15:07.159280 
master-0 kubenswrapper[7110]: I0313 01:15:07.159043 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lb5rz" Mar 13 01:15:08.219491 master-0 kubenswrapper[7110]: I0313 01:15:08.219385 7110 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lb5rz" podUID="6e0572c2-78f9-4c65-8ea0-7242236d641f" containerName="registry-server" probeResult="failure" output=< Mar 13 01:15:08.219491 master-0 kubenswrapper[7110]: timeout: failed to connect service ":50051" within 1s Mar 13 01:15:08.219491 master-0 kubenswrapper[7110]: > Mar 13 01:15:08.422464 master-0 kubenswrapper[7110]: I0313 01:15:08.422322 7110 generic.go:334] "Generic (PLEG): container finished" podID="354f29997baa583b6238f7de9108ee10" containerID="68874048256ef5174e01b3364fffcb240e7ca45d30a09a97ed7d81f508369059" exitCode=0 Mar 13 01:15:08.768304 master-0 kubenswrapper[7110]: I0313 01:15:08.768247 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:15:08.768502 master-0 kubenswrapper[7110]: I0313 01:15:08.768326 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:15:09.328397 master-0 kubenswrapper[7110]: E0313 01:15:09.328327 7110 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 13 01:15:09.468884 master-0 
kubenswrapper[7110]: I0313 01:15:09.467832 7110 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 01:15:10.360119 master-0 kubenswrapper[7110]: I0313 01:15:10.352851 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_354f29997baa583b6238f7de9108ee10/etcdctl/0.log" Mar 13 01:15:10.360119 master-0 kubenswrapper[7110]: I0313 01:15:10.352944 7110 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Mar 13 01:15:10.435670 master-0 kubenswrapper[7110]: I0313 01:15:10.435581 7110 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="77c3585c80eaa5d7f0873a20062967927c8ada28f17899fd783cafcf3b80dbee" exitCode=0 Mar 13 01:15:10.435670 master-0 kubenswrapper[7110]: I0313 01:15:10.435669 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerDied","Data":"77c3585c80eaa5d7f0873a20062967927c8ada28f17899fd783cafcf3b80dbee"} Mar 13 01:15:10.437845 master-0 kubenswrapper[7110]: I0313 01:15:10.437806 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_354f29997baa583b6238f7de9108ee10/etcdctl/0.log" Mar 13 01:15:10.437965 master-0 kubenswrapper[7110]: I0313 01:15:10.437868 7110 generic.go:334] "Generic (PLEG): container finished" podID="354f29997baa583b6238f7de9108ee10" containerID="a9c29d8936371344a1ccbc38b9ff3375977741408746ec28a0f2c19b38ce561a" exitCode=137 Mar 13 01:15:10.437965 master-0 kubenswrapper[7110]: I0313 01:15:10.437931 7110 
scope.go:117] "RemoveContainer" containerID="68874048256ef5174e01b3364fffcb240e7ca45d30a09a97ed7d81f508369059" Mar 13 01:15:10.437965 master-0 kubenswrapper[7110]: I0313 01:15:10.437939 7110 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Mar 13 01:15:10.462321 master-0 kubenswrapper[7110]: I0313 01:15:10.460403 7110 scope.go:117] "RemoveContainer" containerID="a9c29d8936371344a1ccbc38b9ff3375977741408746ec28a0f2c19b38ce561a" Mar 13 01:15:10.482657 master-0 kubenswrapper[7110]: I0313 01:15:10.482576 7110 scope.go:117] "RemoveContainer" containerID="68874048256ef5174e01b3364fffcb240e7ca45d30a09a97ed7d81f508369059" Mar 13 01:15:10.483117 master-0 kubenswrapper[7110]: E0313 01:15:10.483070 7110 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68874048256ef5174e01b3364fffcb240e7ca45d30a09a97ed7d81f508369059\": container with ID starting with 68874048256ef5174e01b3364fffcb240e7ca45d30a09a97ed7d81f508369059 not found: ID does not exist" containerID="68874048256ef5174e01b3364fffcb240e7ca45d30a09a97ed7d81f508369059" Mar 13 01:15:10.483222 master-0 kubenswrapper[7110]: I0313 01:15:10.483118 7110 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68874048256ef5174e01b3364fffcb240e7ca45d30a09a97ed7d81f508369059"} err="failed to get container status \"68874048256ef5174e01b3364fffcb240e7ca45d30a09a97ed7d81f508369059\": rpc error: code = NotFound desc = could not find container \"68874048256ef5174e01b3364fffcb240e7ca45d30a09a97ed7d81f508369059\": container with ID starting with 68874048256ef5174e01b3364fffcb240e7ca45d30a09a97ed7d81f508369059 not found: ID does not exist" Mar 13 01:15:10.483222 master-0 kubenswrapper[7110]: I0313 01:15:10.483147 7110 scope.go:117] "RemoveContainer" containerID="a9c29d8936371344a1ccbc38b9ff3375977741408746ec28a0f2c19b38ce561a" Mar 13 01:15:10.483729 master-0 
kubenswrapper[7110]: E0313 01:15:10.483663 7110 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9c29d8936371344a1ccbc38b9ff3375977741408746ec28a0f2c19b38ce561a\": container with ID starting with a9c29d8936371344a1ccbc38b9ff3375977741408746ec28a0f2c19b38ce561a not found: ID does not exist" containerID="a9c29d8936371344a1ccbc38b9ff3375977741408746ec28a0f2c19b38ce561a" Mar 13 01:15:10.483861 master-0 kubenswrapper[7110]: I0313 01:15:10.483749 7110 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9c29d8936371344a1ccbc38b9ff3375977741408746ec28a0f2c19b38ce561a"} err="failed to get container status \"a9c29d8936371344a1ccbc38b9ff3375977741408746ec28a0f2c19b38ce561a\": rpc error: code = NotFound desc = could not find container \"a9c29d8936371344a1ccbc38b9ff3375977741408746ec28a0f2c19b38ce561a\": container with ID starting with a9c29d8936371344a1ccbc38b9ff3375977741408746ec28a0f2c19b38ce561a not found: ID does not exist" Mar 13 01:15:10.545625 master-0 kubenswrapper[7110]: I0313 01:15:10.545540 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"354f29997baa583b6238f7de9108ee10\" (UID: \"354f29997baa583b6238f7de9108ee10\") " Mar 13 01:15:10.546019 master-0 kubenswrapper[7110]: I0313 01:15:10.545665 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"354f29997baa583b6238f7de9108ee10\" (UID: \"354f29997baa583b6238f7de9108ee10\") " Mar 13 01:15:10.546162 master-0 kubenswrapper[7110]: I0313 01:15:10.546093 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs" (OuterVolumeSpecName: "certs") pod 
"354f29997baa583b6238f7de9108ee10" (UID: "354f29997baa583b6238f7de9108ee10"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:15:10.546162 master-0 kubenswrapper[7110]: I0313 01:15:10.546155 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir" (OuterVolumeSpecName: "data-dir") pod "354f29997baa583b6238f7de9108ee10" (UID: "354f29997baa583b6238f7de9108ee10"). InnerVolumeSpecName "data-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:15:10.648172 master-0 kubenswrapper[7110]: I0313 01:15:10.648011 7110 reconciler_common.go:293] "Volume detached for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:15:10.648172 master-0 kubenswrapper[7110]: I0313 01:15:10.648081 7110 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") on node \"master-0\" DevicePath \"\"" Mar 13 01:15:10.919974 master-0 kubenswrapper[7110]: I0313 01:15:10.919874 7110 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="354f29997baa583b6238f7de9108ee10" path="/var/lib/kubelet/pods/354f29997baa583b6238f7de9108ee10/volumes" Mar 13 01:15:10.920308 master-0 kubenswrapper[7110]: I0313 01:15:10.920269 7110 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Mar 13 01:15:10.965337 master-0 kubenswrapper[7110]: I0313 01:15:10.965255 7110 patch_prober.go:28] interesting pod/authentication-operator-7c6989d6c4-bxqp2 container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.13:8443/healthz\": dial tcp 10.128.0.13:8443: connect: connection refused" start-of-body= Mar 13 01:15:10.965573 master-0 kubenswrapper[7110]: I0313 
01:15:10.965373 7110 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" podUID="f9b713fb-64ce-4a01-951c-1f31df62e1ae" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.13:8443/healthz\": dial tcp 10.128.0.13:8443: connect: connection refused" Mar 13 01:15:11.153558 master-0 kubenswrapper[7110]: E0313 01:15:11.153457 7110 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 01:15:11.768598 master-0 kubenswrapper[7110]: I0313 01:15:11.768487 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:15:11.768598 master-0 kubenswrapper[7110]: I0313 01:15:11.768583 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:15:12.632612 master-0 kubenswrapper[7110]: E0313 01:15:12.632511 7110 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 01:15:14.094114 master-0 kubenswrapper[7110]: E0313 01:15:14.093948 7110 event.go:359] "Server rejected event (will not retry!)" 
err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0-master-0.189c41995414032b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Killing,Message:Stopping container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:14:40.073532203 +0000 UTC m=+81.358558669,LastTimestamp:2026-03-13 01:14:40.073532203 +0000 UTC m=+81.358558669,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:15:14.767805 master-0 kubenswrapper[7110]: I0313 01:15:14.767715 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:15:14.768096 master-0 kubenswrapper[7110]: I0313 01:15:14.767821 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:15:17.768506 master-0 kubenswrapper[7110]: I0313 01:15:17.768422 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" 
start-of-body= Mar 13 01:15:17.769279 master-0 kubenswrapper[7110]: I0313 01:15:17.768517 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:15:18.494290 master-0 kubenswrapper[7110]: I0313 01:15:18.494239 7110 generic.go:334] "Generic (PLEG): container finished" podID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerID="61dfbb39eadee51c94c1da7b2c82616d9472cbfa81dbb07d96ebc8cbcec88cf7" exitCode=0 Mar 13 01:15:19.468267 master-0 kubenswrapper[7110]: I0313 01:15:19.467550 7110 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 01:15:20.768807 master-0 kubenswrapper[7110]: I0313 01:15:20.768735 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:15:20.769591 master-0 kubenswrapper[7110]: I0313 01:15:20.768811 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:15:20.964671 master-0 
kubenswrapper[7110]: I0313 01:15:20.964580 7110 patch_prober.go:28] interesting pod/authentication-operator-7c6989d6c4-bxqp2 container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.13:8443/healthz\": dial tcp 10.128.0.13:8443: connect: connection refused" start-of-body= Mar 13 01:15:20.964858 master-0 kubenswrapper[7110]: I0313 01:15:20.964696 7110 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" podUID="f9b713fb-64ce-4a01-951c-1f31df62e1ae" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.13:8443/healthz\": dial tcp 10.128.0.13:8443: connect: connection refused" Mar 13 01:15:21.153991 master-0 kubenswrapper[7110]: E0313 01:15:21.153839 7110 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 01:15:21.517641 master-0 kubenswrapper[7110]: I0313 01:15:21.517603 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-3-master-0_29e096ea-ca9d-477b-b0aa-1d10244d51d9/installer/0.log" Mar 13 01:15:21.517788 master-0 kubenswrapper[7110]: I0313 01:15:21.517674 7110 generic.go:334] "Generic (PLEG): container finished" podID="29e096ea-ca9d-477b-b0aa-1d10244d51d9" containerID="167e9a0418be9c64d38402cc471015911f91f7d101628f86049fb49485d8495a" exitCode=1 Mar 13 01:15:21.519503 master-0 kubenswrapper[7110]: I0313 01:15:21.519477 7110 generic.go:334] "Generic (PLEG): container finished" podID="95d4e785-6663-417d-b380-6905773613c8" containerID="4f1d391f9ccf9712ce599023f3ef26e7463c6ad87dcaaba9b59f13a56ea3cd24" exitCode=0 Mar 13 01:15:21.521808 master-0 kubenswrapper[7110]: I0313 01:15:21.521785 7110 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_47806631-9d60-4658-832d-f160f93f42ea/installer/0.log" Mar 13 01:15:21.521808 master-0 kubenswrapper[7110]: I0313 01:15:21.521816 7110 generic.go:334] "Generic (PLEG): container finished" podID="47806631-9d60-4658-832d-f160f93f42ea" containerID="71b98806c78a21853872bf216fdc04280da7bf4d8777bb06b2a922047a6a9e8c" exitCode=1 Mar 13 01:15:22.633499 master-0 kubenswrapper[7110]: E0313 01:15:22.633430 7110 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 01:15:23.445053 master-0 kubenswrapper[7110]: E0313 01:15:23.444954 7110 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 13 01:15:23.543031 master-0 kubenswrapper[7110]: I0313 01:15:23.542884 7110 generic.go:334] "Generic (PLEG): container finished" podID="916d9fc9-388b-4506-a17c-36a7f626356a" containerID="a41bcaf653995a95790b4be685f8a8f91dff8546aa69d956c2d939af740c0286" exitCode=0 Mar 13 01:15:23.768458 master-0 kubenswrapper[7110]: I0313 01:15:23.768372 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:15:23.769259 master-0 kubenswrapper[7110]: I0313 01:15:23.768463 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" 
output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:15:24.552781 master-0 kubenswrapper[7110]: I0313 01:15:24.552725 7110 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="b828b9875ee5a4c86417590e983cd65cc09701fdf10c008a5d18f75844d13abc" exitCode=0 Mar 13 01:15:26.768378 master-0 kubenswrapper[7110]: I0313 01:15:26.768284 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:15:26.769239 master-0 kubenswrapper[7110]: I0313 01:15:26.768378 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:15:28.577022 master-0 kubenswrapper[7110]: I0313 01:15:28.576934 7110 generic.go:334] "Generic (PLEG): container finished" podID="ca2fa86b-a966-49dc-8577-d2b54b111d14" containerID="18882e60a1d7cca045d564f7abc68da51216b8e9104fca3062ca7eec99d17c5e" exitCode=0 Mar 13 01:15:29.467742 master-0 kubenswrapper[7110]: I0313 01:15:29.467521 7110 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 01:15:29.584853 master-0 kubenswrapper[7110]: I0313 01:15:29.584758 7110 generic.go:334] 
"Generic (PLEG): container finished" podID="f9b713fb-64ce-4a01-951c-1f31df62e1ae" containerID="c60b1887494d08fb5df2490e135e1d701bdbe7b6a6e136c3d75f17211fbf551b" exitCode=0 Mar 13 01:15:29.768024 master-0 kubenswrapper[7110]: I0313 01:15:29.767805 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:15:29.768024 master-0 kubenswrapper[7110]: I0313 01:15:29.767915 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:15:30.593524 master-0 kubenswrapper[7110]: I0313 01:15:30.593428 7110 generic.go:334] "Generic (PLEG): container finished" podID="22587300-2448-4862-9fd8-68197d17a9f2" containerID="0e6fdad2e1926f784b1c498cd01186eeb32850cc4a0f69925bc0668ef060c2a8" exitCode=0 Mar 13 01:15:30.595575 master-0 kubenswrapper[7110]: I0313 01:15:30.595521 7110 generic.go:334] "Generic (PLEG): container finished" podID="486c7e33-3dd8-4a98-87e3-8216ee2e05ef" containerID="451217a595b413aec4246a9b014bde1e3a621bb8bc794b9a2470a8f43c1c8d3b" exitCode=0 Mar 13 01:15:31.155185 master-0 kubenswrapper[7110]: E0313 01:15:31.155086 7110 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 01:15:31.155494 master-0 kubenswrapper[7110]: I0313 01:15:31.155195 7110 controller.go:115] "failed to update lease using latest lease, 
fallback to ensure lease" err="failed 5 attempts to update lease" Mar 13 01:15:32.634751 master-0 kubenswrapper[7110]: E0313 01:15:32.634676 7110 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 01:15:32.634751 master-0 kubenswrapper[7110]: E0313 01:15:32.634719 7110 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 01:15:32.767878 master-0 kubenswrapper[7110]: I0313 01:15:32.767762 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:15:32.767878 master-0 kubenswrapper[7110]: I0313 01:15:32.767836 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:15:34.621335 master-0 kubenswrapper[7110]: I0313 01:15:34.621269 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-znqwc_d1153bb3-30dd-458f-b0a4-c05358a8b3f8/approver/0.log" Mar 13 01:15:34.622231 master-0 kubenswrapper[7110]: I0313 01:15:34.621898 7110 generic.go:334] "Generic (PLEG): container finished" podID="d1153bb3-30dd-458f-b0a4-c05358a8b3f8" containerID="8a645f3b183df337e2bb471f99ef18a88ef8e03f78dccc126ecc0415b40abdab" exitCode=1 Mar 13 01:15:35.767903 master-0 kubenswrapper[7110]: I0313 01:15:35.767788 
7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:15:35.767903 master-0 kubenswrapper[7110]: I0313 01:15:35.767876 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:15:38.658134 master-0 kubenswrapper[7110]: I0313 01:15:38.657963 7110 generic.go:334] "Generic (PLEG): container finished" podID="21cbea73-f779-43e4-b5ba-d6fa06275d34" containerID="d3c0f89339f815b6350fac22cc030760b6b90e8219eb2eb8f1fd3b1e19f0b649" exitCode=0 Mar 13 01:15:38.768104 master-0 kubenswrapper[7110]: I0313 01:15:38.767963 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:15:38.768342 master-0 kubenswrapper[7110]: I0313 01:15:38.768100 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:15:40.201821 master-0 kubenswrapper[7110]: I0313 01:15:40.201754 7110 status_manager.go:851] "Failed to get status for pod" podUID="cada5bf2-e208-4fd8-bdf5-de8cad31a665" 
pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-w6qs7" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods control-plane-machine-set-operator-6686554ddc-w6qs7)" Mar 13 01:15:40.540927 master-0 kubenswrapper[7110]: E0313 01:15:40.540476 7110 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 13 01:15:40.540927 master-0 kubenswrapper[7110]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_packageserver-68f6795949-v9w8g_openshift-operator-lifecycle-manager_2ce47660-f7cc-4669-a00d-83422f0f6d55_0(8a528b2fe74eac92f2052525ed13b81b615fc81fd742c6db7340fb12042b39d0): error adding pod openshift-operator-lifecycle-manager_packageserver-68f6795949-v9w8g to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"8a528b2fe74eac92f2052525ed13b81b615fc81fd742c6db7340fb12042b39d0" Netns:"/var/run/netns/419f4548-4297-47ce-9599-9149084052f1" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-operator-lifecycle-manager;K8S_POD_NAME=packageserver-68f6795949-v9w8g;K8S_POD_INFRA_CONTAINER_ID=8a528b2fe74eac92f2052525ed13b81b615fc81fd742c6db7340fb12042b39d0;K8S_POD_UID=2ce47660-f7cc-4669-a00d-83422f0f6d55" Path:"" ERRORED: error configuring pod [openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g] networking: Multus: [openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g/2ce47660-f7cc-4669-a00d-83422f0f6d55]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod packageserver-68f6795949-v9w8g in out of cluster comm: SetNetworkStatus: failed to update the pod packageserver-68f6795949-v9w8g in out of cluster comm: status update failed for pod /: Get 
"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/pods/packageserver-68f6795949-v9w8g?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 13 01:15:40.540927 master-0 kubenswrapper[7110]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 01:15:40.540927 master-0 kubenswrapper[7110]: > Mar 13 01:15:40.540927 master-0 kubenswrapper[7110]: E0313 01:15:40.540602 7110 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 13 01:15:40.540927 master-0 kubenswrapper[7110]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_packageserver-68f6795949-v9w8g_openshift-operator-lifecycle-manager_2ce47660-f7cc-4669-a00d-83422f0f6d55_0(8a528b2fe74eac92f2052525ed13b81b615fc81fd742c6db7340fb12042b39d0): error adding pod openshift-operator-lifecycle-manager_packageserver-68f6795949-v9w8g to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"8a528b2fe74eac92f2052525ed13b81b615fc81fd742c6db7340fb12042b39d0" Netns:"/var/run/netns/419f4548-4297-47ce-9599-9149084052f1" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-operator-lifecycle-manager;K8S_POD_NAME=packageserver-68f6795949-v9w8g;K8S_POD_INFRA_CONTAINER_ID=8a528b2fe74eac92f2052525ed13b81b615fc81fd742c6db7340fb12042b39d0;K8S_POD_UID=2ce47660-f7cc-4669-a00d-83422f0f6d55" Path:"" ERRORED: error configuring pod [openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g] networking: Multus: 
[openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g/2ce47660-f7cc-4669-a00d-83422f0f6d55]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod packageserver-68f6795949-v9w8g in out of cluster comm: SetNetworkStatus: failed to update the pod packageserver-68f6795949-v9w8g in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/pods/packageserver-68f6795949-v9w8g?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 13 01:15:40.540927 master-0 kubenswrapper[7110]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 01:15:40.540927 master-0 kubenswrapper[7110]: > pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" Mar 13 01:15:40.540927 master-0 kubenswrapper[7110]: E0313 01:15:40.540670 7110 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 13 01:15:40.540927 master-0 kubenswrapper[7110]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_packageserver-68f6795949-v9w8g_openshift-operator-lifecycle-manager_2ce47660-f7cc-4669-a00d-83422f0f6d55_0(8a528b2fe74eac92f2052525ed13b81b615fc81fd742c6db7340fb12042b39d0): error adding pod openshift-operator-lifecycle-manager_packageserver-68f6795949-v9w8g to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"8a528b2fe74eac92f2052525ed13b81b615fc81fd742c6db7340fb12042b39d0" 
Netns:"/var/run/netns/419f4548-4297-47ce-9599-9149084052f1" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-operator-lifecycle-manager;K8S_POD_NAME=packageserver-68f6795949-v9w8g;K8S_POD_INFRA_CONTAINER_ID=8a528b2fe74eac92f2052525ed13b81b615fc81fd742c6db7340fb12042b39d0;K8S_POD_UID=2ce47660-f7cc-4669-a00d-83422f0f6d55" Path:"" ERRORED: error configuring pod [openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g] networking: Multus: [openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g/2ce47660-f7cc-4669-a00d-83422f0f6d55]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod packageserver-68f6795949-v9w8g in out of cluster comm: SetNetworkStatus: failed to update the pod packageserver-68f6795949-v9w8g in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/pods/packageserver-68f6795949-v9w8g?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 13 01:15:40.540927 master-0 kubenswrapper[7110]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 01:15:40.540927 master-0 kubenswrapper[7110]: > pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" Mar 13 01:15:40.540927 master-0 kubenswrapper[7110]: E0313 01:15:40.540803 7110 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"packageserver-68f6795949-v9w8g_openshift-operator-lifecycle-manager(2ce47660-f7cc-4669-a00d-83422f0f6d55)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"packageserver-68f6795949-v9w8g_openshift-operator-lifecycle-manager(2ce47660-f7cc-4669-a00d-83422f0f6d55)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_packageserver-68f6795949-v9w8g_openshift-operator-lifecycle-manager_2ce47660-f7cc-4669-a00d-83422f0f6d55_0(8a528b2fe74eac92f2052525ed13b81b615fc81fd742c6db7340fb12042b39d0): error adding pod openshift-operator-lifecycle-manager_packageserver-68f6795949-v9w8g to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"8a528b2fe74eac92f2052525ed13b81b615fc81fd742c6db7340fb12042b39d0\\\" Netns:\\\"/var/run/netns/419f4548-4297-47ce-9599-9149084052f1\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-operator-lifecycle-manager;K8S_POD_NAME=packageserver-68f6795949-v9w8g;K8S_POD_INFRA_CONTAINER_ID=8a528b2fe74eac92f2052525ed13b81b615fc81fd742c6db7340fb12042b39d0;K8S_POD_UID=2ce47660-f7cc-4669-a00d-83422f0f6d55\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g] networking: Multus: [openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g/2ce47660-f7cc-4669-a00d-83422f0f6d55]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod packageserver-68f6795949-v9w8g in out of cluster comm: SetNetworkStatus: failed to update the pod packageserver-68f6795949-v9w8g in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/pods/packageserver-68f6795949-v9w8g?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" podUID="2ce47660-f7cc-4669-a00d-83422f0f6d55" Mar 13 01:15:40.697241 master-0 kubenswrapper[7110]: I0313 01:15:40.697179 7110 generic.go:334] "Generic (PLEG): container finished" podID="61fb4b86-f978-4ae1-80bc-18d2f386cbc2" containerID="85752463126f89fa0e5e1418516974da87fce8b92150573ae7e0d2915937dc43" exitCode=0 Mar 13 01:15:40.697487 master-0 kubenswrapper[7110]: I0313 01:15:40.697261 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" Mar 13 01:15:40.697683 master-0 kubenswrapper[7110]: I0313 01:15:40.697662 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" Mar 13 01:15:40.793267 master-0 kubenswrapper[7110]: E0313 01:15:40.793207 7110 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 13 01:15:40.793267 master-0 kubenswrapper[7110]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_insights-operator-8f89dfddd-6k2t7_openshift-insights_77fd9062-0f7d-4255-92ca-7e4325daeddd_0(35b2eb3c289bb8a4e56a921922cac56ee3a7a6f537017b97e0ce40370b85caf8): error adding pod openshift-insights_insights-operator-8f89dfddd-6k2t7 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"35b2eb3c289bb8a4e56a921922cac56ee3a7a6f537017b97e0ce40370b85caf8" Netns:"/var/run/netns/32daf096-8594-48bc-b4d6-8f3215f7654a" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-insights;K8S_POD_NAME=insights-operator-8f89dfddd-6k2t7;K8S_POD_INFRA_CONTAINER_ID=35b2eb3c289bb8a4e56a921922cac56ee3a7a6f537017b97e0ce40370b85caf8;K8S_POD_UID=77fd9062-0f7d-4255-92ca-7e4325daeddd" Path:"" ERRORED: error configuring pod [openshift-insights/insights-operator-8f89dfddd-6k2t7] networking: Multus: [openshift-insights/insights-operator-8f89dfddd-6k2t7/77fd9062-0f7d-4255-92ca-7e4325daeddd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod insights-operator-8f89dfddd-6k2t7 in out of cluster comm: SetNetworkStatus: failed to update the pod insights-operator-8f89dfddd-6k2t7 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-insights/pods/insights-operator-8f89dfddd-6k2t7?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 13 01:15:40.793267 master-0 kubenswrapper[7110]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 01:15:40.793267 master-0 kubenswrapper[7110]: > Mar 13 01:15:40.793511 master-0 kubenswrapper[7110]: E0313 01:15:40.793290 7110 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 13 01:15:40.793511 master-0 kubenswrapper[7110]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_insights-operator-8f89dfddd-6k2t7_openshift-insights_77fd9062-0f7d-4255-92ca-7e4325daeddd_0(35b2eb3c289bb8a4e56a921922cac56ee3a7a6f537017b97e0ce40370b85caf8): error adding pod openshift-insights_insights-operator-8f89dfddd-6k2t7 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"35b2eb3c289bb8a4e56a921922cac56ee3a7a6f537017b97e0ce40370b85caf8" Netns:"/var/run/netns/32daf096-8594-48bc-b4d6-8f3215f7654a" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-insights;K8S_POD_NAME=insights-operator-8f89dfddd-6k2t7;K8S_POD_INFRA_CONTAINER_ID=35b2eb3c289bb8a4e56a921922cac56ee3a7a6f537017b97e0ce40370b85caf8;K8S_POD_UID=77fd9062-0f7d-4255-92ca-7e4325daeddd" Path:"" ERRORED: error configuring pod [openshift-insights/insights-operator-8f89dfddd-6k2t7] networking: Multus: [openshift-insights/insights-operator-8f89dfddd-6k2t7/77fd9062-0f7d-4255-92ca-7e4325daeddd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod insights-operator-8f89dfddd-6k2t7 in out of cluster comm: SetNetworkStatus: failed to update the pod insights-operator-8f89dfddd-6k2t7 in out of cluster comm: status update failed for pod /: Get 
"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-insights/pods/insights-operator-8f89dfddd-6k2t7?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 13 01:15:40.793511 master-0 kubenswrapper[7110]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 01:15:40.793511 master-0 kubenswrapper[7110]: > pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" Mar 13 01:15:40.793511 master-0 kubenswrapper[7110]: E0313 01:15:40.793319 7110 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 13 01:15:40.793511 master-0 kubenswrapper[7110]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_insights-operator-8f89dfddd-6k2t7_openshift-insights_77fd9062-0f7d-4255-92ca-7e4325daeddd_0(35b2eb3c289bb8a4e56a921922cac56ee3a7a6f537017b97e0ce40370b85caf8): error adding pod openshift-insights_insights-operator-8f89dfddd-6k2t7 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"35b2eb3c289bb8a4e56a921922cac56ee3a7a6f537017b97e0ce40370b85caf8" Netns:"/var/run/netns/32daf096-8594-48bc-b4d6-8f3215f7654a" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-insights;K8S_POD_NAME=insights-operator-8f89dfddd-6k2t7;K8S_POD_INFRA_CONTAINER_ID=35b2eb3c289bb8a4e56a921922cac56ee3a7a6f537017b97e0ce40370b85caf8;K8S_POD_UID=77fd9062-0f7d-4255-92ca-7e4325daeddd" Path:"" ERRORED: error configuring pod [openshift-insights/insights-operator-8f89dfddd-6k2t7] networking: Multus: 
[openshift-insights/insights-operator-8f89dfddd-6k2t7/77fd9062-0f7d-4255-92ca-7e4325daeddd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod insights-operator-8f89dfddd-6k2t7 in out of cluster comm: SetNetworkStatus: failed to update the pod insights-operator-8f89dfddd-6k2t7 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-insights/pods/insights-operator-8f89dfddd-6k2t7?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 13 01:15:40.793511 master-0 kubenswrapper[7110]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 01:15:40.793511 master-0 kubenswrapper[7110]: > pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" Mar 13 01:15:40.793511 master-0 kubenswrapper[7110]: E0313 01:15:40.793392 7110 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"insights-operator-8f89dfddd-6k2t7_openshift-insights(77fd9062-0f7d-4255-92ca-7e4325daeddd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"insights-operator-8f89dfddd-6k2t7_openshift-insights(77fd9062-0f7d-4255-92ca-7e4325daeddd)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_insights-operator-8f89dfddd-6k2t7_openshift-insights_77fd9062-0f7d-4255-92ca-7e4325daeddd_0(35b2eb3c289bb8a4e56a921922cac56ee3a7a6f537017b97e0ce40370b85caf8): error adding pod openshift-insights_insights-operator-8f89dfddd-6k2t7 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd 
(shim): CNI request failed with status 400: 'ContainerID:\\\"35b2eb3c289bb8a4e56a921922cac56ee3a7a6f537017b97e0ce40370b85caf8\\\" Netns:\\\"/var/run/netns/32daf096-8594-48bc-b4d6-8f3215f7654a\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-insights;K8S_POD_NAME=insights-operator-8f89dfddd-6k2t7;K8S_POD_INFRA_CONTAINER_ID=35b2eb3c289bb8a4e56a921922cac56ee3a7a6f537017b97e0ce40370b85caf8;K8S_POD_UID=77fd9062-0f7d-4255-92ca-7e4325daeddd\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-insights/insights-operator-8f89dfddd-6k2t7] networking: Multus: [openshift-insights/insights-operator-8f89dfddd-6k2t7/77fd9062-0f7d-4255-92ca-7e4325daeddd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod insights-operator-8f89dfddd-6k2t7 in out of cluster comm: SetNetworkStatus: failed to update the pod insights-operator-8f89dfddd-6k2t7 in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-insights/pods/insights-operator-8f89dfddd-6k2t7?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" podUID="77fd9062-0f7d-4255-92ca-7e4325daeddd" Mar 13 01:15:40.891158 master-0 kubenswrapper[7110]: E0313 01:15:40.891122 7110 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 13 01:15:40.891158 master-0 kubenswrapper[7110]: rpc error: code = 
Unknown desc = failed to create pod network sandbox k8s_certified-operators-9zvz2_openshift-marketplace_d23bbaec-b635-4649-b26e-2829f32d21f0_0(8d57d2e142f364814d0e5e6c071f3fcb2ed76cad8a88cf82021d6da2ab6ff706): error adding pod openshift-marketplace_certified-operators-9zvz2 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"8d57d2e142f364814d0e5e6c071f3fcb2ed76cad8a88cf82021d6da2ab6ff706" Netns:"/var/run/netns/363ff4de-17bc-499b-acc5-192dcb300068" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=certified-operators-9zvz2;K8S_POD_INFRA_CONTAINER_ID=8d57d2e142f364814d0e5e6c071f3fcb2ed76cad8a88cf82021d6da2ab6ff706;K8S_POD_UID=d23bbaec-b635-4649-b26e-2829f32d21f0" Path:"" ERRORED: error configuring pod [openshift-marketplace/certified-operators-9zvz2] networking: Multus: [openshift-marketplace/certified-operators-9zvz2/d23bbaec-b635-4649-b26e-2829f32d21f0]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod certified-operators-9zvz2 in out of cluster comm: SetNetworkStatus: failed to update the pod certified-operators-9zvz2 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9zvz2?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 13 01:15:40.891158 master-0 kubenswrapper[7110]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 01:15:40.891158 master-0 kubenswrapper[7110]: > Mar 13 
01:15:40.891380 master-0 kubenswrapper[7110]: E0313 01:15:40.891194 7110 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 13 01:15:40.891380 master-0 kubenswrapper[7110]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-9zvz2_openshift-marketplace_d23bbaec-b635-4649-b26e-2829f32d21f0_0(8d57d2e142f364814d0e5e6c071f3fcb2ed76cad8a88cf82021d6da2ab6ff706): error adding pod openshift-marketplace_certified-operators-9zvz2 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"8d57d2e142f364814d0e5e6c071f3fcb2ed76cad8a88cf82021d6da2ab6ff706" Netns:"/var/run/netns/363ff4de-17bc-499b-acc5-192dcb300068" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=certified-operators-9zvz2;K8S_POD_INFRA_CONTAINER_ID=8d57d2e142f364814d0e5e6c071f3fcb2ed76cad8a88cf82021d6da2ab6ff706;K8S_POD_UID=d23bbaec-b635-4649-b26e-2829f32d21f0" Path:"" ERRORED: error configuring pod [openshift-marketplace/certified-operators-9zvz2] networking: Multus: [openshift-marketplace/certified-operators-9zvz2/d23bbaec-b635-4649-b26e-2829f32d21f0]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod certified-operators-9zvz2 in out of cluster comm: SetNetworkStatus: failed to update the pod certified-operators-9zvz2 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9zvz2?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 13 01:15:40.891380 master-0 kubenswrapper[7110]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 01:15:40.891380 master-0 kubenswrapper[7110]: > pod="openshift-marketplace/certified-operators-9zvz2" Mar 13 01:15:40.891380 master-0 kubenswrapper[7110]: E0313 01:15:40.891249 7110 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 13 01:15:40.891380 master-0 kubenswrapper[7110]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-9zvz2_openshift-marketplace_d23bbaec-b635-4649-b26e-2829f32d21f0_0(8d57d2e142f364814d0e5e6c071f3fcb2ed76cad8a88cf82021d6da2ab6ff706): error adding pod openshift-marketplace_certified-operators-9zvz2 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"8d57d2e142f364814d0e5e6c071f3fcb2ed76cad8a88cf82021d6da2ab6ff706" Netns:"/var/run/netns/363ff4de-17bc-499b-acc5-192dcb300068" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=certified-operators-9zvz2;K8S_POD_INFRA_CONTAINER_ID=8d57d2e142f364814d0e5e6c071f3fcb2ed76cad8a88cf82021d6da2ab6ff706;K8S_POD_UID=d23bbaec-b635-4649-b26e-2829f32d21f0" Path:"" ERRORED: error configuring pod [openshift-marketplace/certified-operators-9zvz2] networking: Multus: [openshift-marketplace/certified-operators-9zvz2/d23bbaec-b635-4649-b26e-2829f32d21f0]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod certified-operators-9zvz2 in out of cluster comm: SetNetworkStatus: failed to update the pod certified-operators-9zvz2 in out of cluster comm: status update failed for pod /: Get 
"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9zvz2?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 13 01:15:40.891380 master-0 kubenswrapper[7110]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 01:15:40.891380 master-0 kubenswrapper[7110]: > pod="openshift-marketplace/certified-operators-9zvz2" Mar 13 01:15:40.891380 master-0 kubenswrapper[7110]: E0313 01:15:40.891316 7110 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"certified-operators-9zvz2_openshift-marketplace(d23bbaec-b635-4649-b26e-2829f32d21f0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"certified-operators-9zvz2_openshift-marketplace(d23bbaec-b635-4649-b26e-2829f32d21f0)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-9zvz2_openshift-marketplace_d23bbaec-b635-4649-b26e-2829f32d21f0_0(8d57d2e142f364814d0e5e6c071f3fcb2ed76cad8a88cf82021d6da2ab6ff706): error adding pod openshift-marketplace_certified-operators-9zvz2 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"8d57d2e142f364814d0e5e6c071f3fcb2ed76cad8a88cf82021d6da2ab6ff706\\\" Netns:\\\"/var/run/netns/363ff4de-17bc-499b-acc5-192dcb300068\\\" IfName:\\\"eth0\\\" 
Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=certified-operators-9zvz2;K8S_POD_INFRA_CONTAINER_ID=8d57d2e142f364814d0e5e6c071f3fcb2ed76cad8a88cf82021d6da2ab6ff706;K8S_POD_UID=d23bbaec-b635-4649-b26e-2829f32d21f0\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-marketplace/certified-operators-9zvz2] networking: Multus: [openshift-marketplace/certified-operators-9zvz2/d23bbaec-b635-4649-b26e-2829f32d21f0]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod certified-operators-9zvz2 in out of cluster comm: SetNetworkStatus: failed to update the pod certified-operators-9zvz2 in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9zvz2?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-marketplace/certified-operators-9zvz2" podUID="d23bbaec-b635-4649-b26e-2829f32d21f0" Mar 13 01:15:40.898966 master-0 kubenswrapper[7110]: E0313 01:15:40.898939 7110 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 13 01:15:40.898966 master-0 kubenswrapper[7110]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-cpp59_openshift-marketplace_c3ae16e5-ba77-427f-b85f-5b354e7bfb9d_0(aa0cd3189a9e9439f3543cd0201f8c9a671b9de3b144f98020e3a9650145029d): error adding pod 
openshift-marketplace_community-operators-cpp59 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"aa0cd3189a9e9439f3543cd0201f8c9a671b9de3b144f98020e3a9650145029d" Netns:"/var/run/netns/128a4604-607e-4022-9067-fa757062cd1a" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=community-operators-cpp59;K8S_POD_INFRA_CONTAINER_ID=aa0cd3189a9e9439f3543cd0201f8c9a671b9de3b144f98020e3a9650145029d;K8S_POD_UID=c3ae16e5-ba77-427f-b85f-5b354e7bfb9d" Path:"" ERRORED: error configuring pod [openshift-marketplace/community-operators-cpp59] networking: Multus: [openshift-marketplace/community-operators-cpp59/c3ae16e5-ba77-427f-b85f-5b354e7bfb9d]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod community-operators-cpp59 in out of cluster comm: SetNetworkStatus: failed to update the pod community-operators-cpp59 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-cpp59?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 13 01:15:40.898966 master-0 kubenswrapper[7110]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 01:15:40.898966 master-0 kubenswrapper[7110]: > Mar 13 01:15:40.899133 master-0 kubenswrapper[7110]: E0313 01:15:40.898984 7110 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 13 01:15:40.899133 master-0 kubenswrapper[7110]: rpc error: code = Unknown desc = failed 
to create pod network sandbox k8s_community-operators-cpp59_openshift-marketplace_c3ae16e5-ba77-427f-b85f-5b354e7bfb9d_0(aa0cd3189a9e9439f3543cd0201f8c9a671b9de3b144f98020e3a9650145029d): error adding pod openshift-marketplace_community-operators-cpp59 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"aa0cd3189a9e9439f3543cd0201f8c9a671b9de3b144f98020e3a9650145029d" Netns:"/var/run/netns/128a4604-607e-4022-9067-fa757062cd1a" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=community-operators-cpp59;K8S_POD_INFRA_CONTAINER_ID=aa0cd3189a9e9439f3543cd0201f8c9a671b9de3b144f98020e3a9650145029d;K8S_POD_UID=c3ae16e5-ba77-427f-b85f-5b354e7bfb9d" Path:"" ERRORED: error configuring pod [openshift-marketplace/community-operators-cpp59] networking: Multus: [openshift-marketplace/community-operators-cpp59/c3ae16e5-ba77-427f-b85f-5b354e7bfb9d]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod community-operators-cpp59 in out of cluster comm: SetNetworkStatus: failed to update the pod community-operators-cpp59 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-cpp59?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 13 01:15:40.899133 master-0 kubenswrapper[7110]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 01:15:40.899133 master-0 kubenswrapper[7110]: > 
pod="openshift-marketplace/community-operators-cpp59" Mar 13 01:15:40.899133 master-0 kubenswrapper[7110]: E0313 01:15:40.899005 7110 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 13 01:15:40.899133 master-0 kubenswrapper[7110]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-cpp59_openshift-marketplace_c3ae16e5-ba77-427f-b85f-5b354e7bfb9d_0(aa0cd3189a9e9439f3543cd0201f8c9a671b9de3b144f98020e3a9650145029d): error adding pod openshift-marketplace_community-operators-cpp59 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"aa0cd3189a9e9439f3543cd0201f8c9a671b9de3b144f98020e3a9650145029d" Netns:"/var/run/netns/128a4604-607e-4022-9067-fa757062cd1a" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=community-operators-cpp59;K8S_POD_INFRA_CONTAINER_ID=aa0cd3189a9e9439f3543cd0201f8c9a671b9de3b144f98020e3a9650145029d;K8S_POD_UID=c3ae16e5-ba77-427f-b85f-5b354e7bfb9d" Path:"" ERRORED: error configuring pod [openshift-marketplace/community-operators-cpp59] networking: Multus: [openshift-marketplace/community-operators-cpp59/c3ae16e5-ba77-427f-b85f-5b354e7bfb9d]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod community-operators-cpp59 in out of cluster comm: SetNetworkStatus: failed to update the pod community-operators-cpp59 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-cpp59?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 13 01:15:40.899133 master-0 kubenswrapper[7110]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 01:15:40.899133 master-0 kubenswrapper[7110]: > pod="openshift-marketplace/community-operators-cpp59" Mar 13 01:15:40.899133 master-0 kubenswrapper[7110]: E0313 01:15:40.899059 7110 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"community-operators-cpp59_openshift-marketplace(c3ae16e5-ba77-427f-b85f-5b354e7bfb9d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"community-operators-cpp59_openshift-marketplace(c3ae16e5-ba77-427f-b85f-5b354e7bfb9d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-cpp59_openshift-marketplace_c3ae16e5-ba77-427f-b85f-5b354e7bfb9d_0(aa0cd3189a9e9439f3543cd0201f8c9a671b9de3b144f98020e3a9650145029d): error adding pod openshift-marketplace_community-operators-cpp59 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"aa0cd3189a9e9439f3543cd0201f8c9a671b9de3b144f98020e3a9650145029d\\\" Netns:\\\"/var/run/netns/128a4604-607e-4022-9067-fa757062cd1a\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=community-operators-cpp59;K8S_POD_INFRA_CONTAINER_ID=aa0cd3189a9e9439f3543cd0201f8c9a671b9de3b144f98020e3a9650145029d;K8S_POD_UID=c3ae16e5-ba77-427f-b85f-5b354e7bfb9d\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-marketplace/community-operators-cpp59] networking: Multus: [openshift-marketplace/community-operators-cpp59/c3ae16e5-ba77-427f-b85f-5b354e7bfb9d]: error setting 
the networks status: SetPodNetworkStatusAnnotation: failed to update the pod community-operators-cpp59 in out of cluster comm: SetNetworkStatus: failed to update the pod community-operators-cpp59 in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-cpp59?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-marketplace/community-operators-cpp59" podUID="c3ae16e5-ba77-427f-b85f-5b354e7bfb9d" Mar 13 01:15:40.914000 master-0 kubenswrapper[7110]: E0313 01:15:40.910847 7110 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 13 01:15:40.914000 master-0 kubenswrapper[7110]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cluster-autoscaler-operator-69576476f7-2q4qb_openshift-machine-api_d278ed70-786c-4b6c-9f04-f08ede704569_0(60187c25e4e16464ae34b3090edfd02e68d1304701b069bf6e190b8103302662): error adding pod openshift-machine-api_cluster-autoscaler-operator-69576476f7-2q4qb to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"60187c25e4e16464ae34b3090edfd02e68d1304701b069bf6e190b8103302662" Netns:"/var/run/netns/cf2887f3-a006-49fc-895f-ae73b85943e6" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-machine-api;K8S_POD_NAME=cluster-autoscaler-operator-69576476f7-2q4qb;K8S_POD_INFRA_CONTAINER_ID=60187c25e4e16464ae34b3090edfd02e68d1304701b069bf6e190b8103302662;K8S_POD_UID=d278ed70-786c-4b6c-9f04-f08ede704569" Path:"" ERRORED: error configuring pod [openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb] networking: Multus: [openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb/d278ed70-786c-4b6c-9f04-f08ede704569]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod cluster-autoscaler-operator-69576476f7-2q4qb in out of cluster comm: SetNetworkStatus: failed to update the pod cluster-autoscaler-operator-69576476f7-2q4qb in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/cluster-autoscaler-operator-69576476f7-2q4qb?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 13 01:15:40.914000 master-0 kubenswrapper[7110]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 01:15:40.914000 master-0 kubenswrapper[7110]: > Mar 13 01:15:40.914000 master-0 kubenswrapper[7110]: E0313 01:15:40.910897 7110 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 13 01:15:40.914000 master-0 kubenswrapper[7110]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cluster-autoscaler-operator-69576476f7-2q4qb_openshift-machine-api_d278ed70-786c-4b6c-9f04-f08ede704569_0(60187c25e4e16464ae34b3090edfd02e68d1304701b069bf6e190b8103302662): error adding pod 
openshift-machine-api_cluster-autoscaler-operator-69576476f7-2q4qb to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"60187c25e4e16464ae34b3090edfd02e68d1304701b069bf6e190b8103302662" Netns:"/var/run/netns/cf2887f3-a006-49fc-895f-ae73b85943e6" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-machine-api;K8S_POD_NAME=cluster-autoscaler-operator-69576476f7-2q4qb;K8S_POD_INFRA_CONTAINER_ID=60187c25e4e16464ae34b3090edfd02e68d1304701b069bf6e190b8103302662;K8S_POD_UID=d278ed70-786c-4b6c-9f04-f08ede704569" Path:"" ERRORED: error configuring pod [openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb] networking: Multus: [openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb/d278ed70-786c-4b6c-9f04-f08ede704569]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod cluster-autoscaler-operator-69576476f7-2q4qb in out of cluster comm: SetNetworkStatus: failed to update the pod cluster-autoscaler-operator-69576476f7-2q4qb in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/cluster-autoscaler-operator-69576476f7-2q4qb?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 13 01:15:40.914000 master-0 kubenswrapper[7110]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 01:15:40.914000 master-0 kubenswrapper[7110]: > pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb" Mar 13 01:15:40.914000 master-0 
kubenswrapper[7110]: E0313 01:15:40.910916 7110 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 13 01:15:40.914000 master-0 kubenswrapper[7110]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cluster-autoscaler-operator-69576476f7-2q4qb_openshift-machine-api_d278ed70-786c-4b6c-9f04-f08ede704569_0(60187c25e4e16464ae34b3090edfd02e68d1304701b069bf6e190b8103302662): error adding pod openshift-machine-api_cluster-autoscaler-operator-69576476f7-2q4qb to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"60187c25e4e16464ae34b3090edfd02e68d1304701b069bf6e190b8103302662" Netns:"/var/run/netns/cf2887f3-a006-49fc-895f-ae73b85943e6" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-machine-api;K8S_POD_NAME=cluster-autoscaler-operator-69576476f7-2q4qb;K8S_POD_INFRA_CONTAINER_ID=60187c25e4e16464ae34b3090edfd02e68d1304701b069bf6e190b8103302662;K8S_POD_UID=d278ed70-786c-4b6c-9f04-f08ede704569" Path:"" ERRORED: error configuring pod [openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb] networking: Multus: [openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb/d278ed70-786c-4b6c-9f04-f08ede704569]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod cluster-autoscaler-operator-69576476f7-2q4qb in out of cluster comm: SetNetworkStatus: failed to update the pod cluster-autoscaler-operator-69576476f7-2q4qb in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/cluster-autoscaler-operator-69576476f7-2q4qb?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 13 01:15:40.914000 master-0 kubenswrapper[7110]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 01:15:40.914000 master-0 kubenswrapper[7110]: > pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb" Mar 13 01:15:40.914000 master-0 kubenswrapper[7110]: E0313 01:15:40.910978 7110 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"cluster-autoscaler-operator-69576476f7-2q4qb_openshift-machine-api(d278ed70-786c-4b6c-9f04-f08ede704569)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"cluster-autoscaler-operator-69576476f7-2q4qb_openshift-machine-api(d278ed70-786c-4b6c-9f04-f08ede704569)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cluster-autoscaler-operator-69576476f7-2q4qb_openshift-machine-api_d278ed70-786c-4b6c-9f04-f08ede704569_0(60187c25e4e16464ae34b3090edfd02e68d1304701b069bf6e190b8103302662): error adding pod openshift-machine-api_cluster-autoscaler-operator-69576476f7-2q4qb to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"60187c25e4e16464ae34b3090edfd02e68d1304701b069bf6e190b8103302662\\\" Netns:\\\"/var/run/netns/cf2887f3-a006-49fc-895f-ae73b85943e6\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-machine-api;K8S_POD_NAME=cluster-autoscaler-operator-69576476f7-2q4qb;K8S_POD_INFRA_CONTAINER_ID=60187c25e4e16464ae34b3090edfd02e68d1304701b069bf6e190b8103302662;K8S_POD_UID=d278ed70-786c-4b6c-9f04-f08ede704569\\\" Path:\\\"\\\" ERRORED: error configuring pod 
[openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb] networking: Multus: [openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb/d278ed70-786c-4b6c-9f04-f08ede704569]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod cluster-autoscaler-operator-69576476f7-2q4qb in out of cluster comm: SetNetworkStatus: failed to update the pod cluster-autoscaler-operator-69576476f7-2q4qb in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/cluster-autoscaler-operator-69576476f7-2q4qb?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb" podUID="d278ed70-786c-4b6c-9f04-f08ede704569" Mar 13 01:15:41.155981 master-0 kubenswrapper[7110]: E0313 01:15:41.155801 7110 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="200ms" Mar 13 01:15:41.704020 master-0 kubenswrapper[7110]: I0313 01:15:41.703845 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb" Mar 13 01:15:41.704020 master-0 kubenswrapper[7110]: I0313 01:15:41.703894 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" Mar 13 01:15:41.704878 master-0 kubenswrapper[7110]: I0313 01:15:41.703845 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9zvz2" Mar 13 01:15:41.704878 master-0 kubenswrapper[7110]: I0313 01:15:41.704186 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cpp59" Mar 13 01:15:41.704878 master-0 kubenswrapper[7110]: I0313 01:15:41.704752 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb" Mar 13 01:15:41.704878 master-0 kubenswrapper[7110]: I0313 01:15:41.704814 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9zvz2" Mar 13 01:15:41.705250 master-0 kubenswrapper[7110]: I0313 01:15:41.705060 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cpp59" Mar 13 01:15:41.708441 master-0 kubenswrapper[7110]: I0313 01:15:41.705568 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" Mar 13 01:15:41.769163 master-0 kubenswrapper[7110]: I0313 01:15:41.768336 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:15:41.769163 master-0 kubenswrapper[7110]: I0313 01:15:41.768433 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:15:44.768097 master-0 kubenswrapper[7110]: I0313 01:15:44.768011 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:15:44.768902 master-0 kubenswrapper[7110]: I0313 01:15:44.768103 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:15:44.922763 master-0 kubenswrapper[7110]: E0313 01:15:44.922678 7110 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Mar 13 01:15:44.923015 master-0 
kubenswrapper[7110]: E0313 01:15:44.922896 7110 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.016s" Mar 13 01:15:44.923092 master-0 kubenswrapper[7110]: I0313 01:15:44.923005 7110 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lb5rz" Mar 13 01:15:44.925110 master-0 kubenswrapper[7110]: I0313 01:15:44.925069 7110 scope.go:117] "RemoveContainer" containerID="8a645f3b183df337e2bb471f99ef18a88ef8e03f78dccc126ecc0415b40abdab" Mar 13 01:15:44.932739 master-0 kubenswrapper[7110]: I0313 01:15:44.932672 7110 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Mar 13 01:15:45.733592 master-0 kubenswrapper[7110]: I0313 01:15:45.733495 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-znqwc_d1153bb3-30dd-458f-b0a4-c05358a8b3f8/approver/0.log" Mar 13 01:15:47.768260 master-0 kubenswrapper[7110]: I0313 01:15:47.768180 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:15:47.768260 master-0 kubenswrapper[7110]: I0313 01:15:47.768252 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:15:48.096886 master-0 kubenswrapper[7110]: E0313 01:15:48.096583 7110 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout 
- context deadline exceeded" event="&Event{ObjectMeta:{machine-approver-955fcfb87-jvdz8.189c41995d1b9ce8 openshift-cluster-machine-approver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-cluster-machine-approver,Name:machine-approver-955fcfb87-jvdz8,UID:23fbbe97-906a-4bce-9ab0-bf633d4f9dd7,APIVersion:v1,ResourceVersion:9128,FieldPath:spec.containers{machine-approver-controller},},Reason:Created,Message:Created container: machine-approver-controller,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:14:40.225025256 +0000 UTC m=+81.510051722,LastTimestamp:2026-03-13 01:14:40.225025256 +0000 UTC m=+81.510051722,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:15:50.767689 master-0 kubenswrapper[7110]: I0313 01:15:50.767567 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:15:50.768693 master-0 kubenswrapper[7110]: I0313 01:15:50.767750 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:15:51.357187 master-0 kubenswrapper[7110]: E0313 01:15:51.357067 7110 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout 
exceeded while awaiting headers)" interval="400ms" Mar 13 01:15:53.012347 master-0 kubenswrapper[7110]: E0313 01:15:53.012120 7110 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T01:15:43Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T01:15:43Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T01:15:43Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T01:15:43Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:1295a1f0e74ae87f51a733e28b64c6fdb6b9a5b069a6897b3870fe52cc1c3b0b\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:505eeaa3f051e9f4ea6a622aca92e5c4eae07078ca185d9fecfe8cc9b6dfc899\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1739173859},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f121f9d021a9843b9458f9f222c40f292f2c21dcfcf00f05daacaca8a949c0\\\"],\\\"sizeBytes\\\":1637445817},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:381e96959e3c3b08a3e2715e6024697ae14af31bd0378b49f583e984b3b9a192\\\"],\\\"sizeBytes\\\":1238047254},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c9330c756dd6ab107e9a4
b671bc52742c90d5be11a8380d8b710e2bd4e0ed43c\\\"],\\\"sizeBytes\\\":992610645},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\\\"],\\\"sizeBytes\\\":943837171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff40e33e63d6c1f4e4393d5506e38def25ba20582d980fec8b81f81c867ceeec\\\"],\\\"sizeBytes\\\":918278686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:042e6a37747405da54cf91543d44408c9531327a2cce653c41ca851aa7c896d8\\\"],\\\"sizeBytes\\\":880378279},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e207c762b7802ee0e54507d21ed1f25b19eddc511a4b824934c16c163193be6a\\\"],\\\"sizeBytes\\\":876146500},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:41dbd66e9a886c1fd7a99752f358c6125a209e83c0dd37b35730baae58d82ee8\\\"],\\\"sizeBytes\\\":862633255},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bfcd8017eede3fb66fa3f5b47c27508b787d38455689154461f0e6a5dc303ff\\\"],\\\"sizeBytes\\\":772939850},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9c946fdc5a4cd16ff998c17844780e7efc38f7f38b97a8a40d75cd77b318ddef\\\"],\\\"sizeBytes\\\":687947017},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c03cb25dc6f6a865529ebc979e8d7d08492b28fd3fb93beddf30e1cb06f1245\\\"],\\\"sizeBytes\\\":683169303},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3f34dc492c80a3dee4643cc2291044750ac51e6e919b973de8723fa8b70bde70\\\"],\\\"sizeBytes\\\":677929075},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3\\\"],\\\"sizeBytes\\\":621647686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1ec9d3dbcc6f9817c0f6d09f64c0d98c91b03afbb1fcb3c1e1718aca900754b\\\"],\\\"sizeBytes\\\":589379637},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1575be013a898f153cbf012aeaf28ce720022f934dc05bdffbe479e30999d460\\\"],\\\"sizeBytes\\\":582153879},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eb82e437a701ce83b70e56be8477d987da67578714dda3d9fa6628804b1b56f5\\\"],\\\"sizeBytes\\\":558210153},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:28f33d62fd0b94c5ea0ebcd7a4216848c8dd671a38d901ce98f4c399b700e1c7\\\"],\\\"sizeBytes\\\":548751793},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b\\\"],\\\"sizeBytes\\\":529324693},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ac6f0695d3386e6d601f4ae507940981352fa3ad884b0fed6fb25698c5e6f916\\\"],\\\"sizeBytes\\\":528946249},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6088910bdc1583b275fab261e3234c0b63b4cc16d01bcea697b6a7f6db13bdf3\\\"],\\\"sizeBytes\\\":518384455},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:14bd3c04daa885009785d48f4973e2890751a7ec116cc14d17627245cda54d7b\\\"],\\\"sizeBytes\\\":517997625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\\\"],\\\"sizeBytes\\\":514980169},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd0b71d620cf0acbfcd1b58797dc30050bd167cb6b7a7f62c8333dd370c76d5\\\"],\\\"sizeBytes\\\":513581866},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd818e37e1f9dbe5393c557b89e81010d68171408e0e4157a3d92ae0ca1c953\\\"],\\\"sizeBytes\\\":513220825},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d601c8437b4d8bbe2da0f3b08f1bd8693f5a4ef6d835377ec029c79d9dca5dab\\\"],\\\"sizeBytes\\\":512273539},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b47d2b146e833bc1612a652136f43afcf1ba30f32cbd0a
2f06ca9fc80d969f0\\\"],\\\"sizeBytes\\\":511226810},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:834063dd26fb3d2489e193489198a0d5fbe9c775a0e30173e5fcef6994fbf0f6\\\"],\\\"sizeBytes\\\":511164376},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee46e13e26156c904e5784e2d64511021ed0974a169ccd6476b05bff1c44ec56\\\"],\\\"sizeBytes\\\":508888174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7220d16ea511c0f0410cf45db45aaafcc64847c9cb5732ad1eff39ceb482cdba\\\"],\\\"sizeBytes\\\":508544235},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:526c5c02a8fa86a2fa83a7087d4a5c4b1c4072c0f3906163494cc3b3c1295e9b\\\"],\\\"sizeBytes\\\":507967997},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4010a8f9d932615336227e2fd43325d4fa9025dca4bebe032106efea733fcfc3\\\"],\\\"sizeBytes\\\":506479655},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76b719f5bd541eb1a8bae124d650896b533e7bc3107be536e598b3ab4e135282\\\"],\\\"sizeBytes\\\":506394574},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5de69354d08184ecd6144facc1461777674674e8304971216d4cf1a5025472b9\\\"],\\\"sizeBytes\\\":505344964},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a324f47cf789c0480fa4bcb0812152abc3cd844318bab193108fe4349eed609\\\"],\\\"sizeBytes\\\":505242594},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d11f13e867f4df046ca6789bb7273da5d0c08895b3dea00949c8a5458f9e22f9\\\"],\\\"sizeBytes\\\":504623546},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76bdc35338c4d0f5e5b9448fb73e3578656f908a962286692e12a0372ec721d5\\\"],\\\"sizeBytes\\\":495994161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff2db11ce277288befab25ddb86177e832842d2edb5607a2da8f252a030e1cfc\\\"],\\\"sizeBytes\\\":495064829},{\\\"names\\\":[\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:2fe5144b1f72bdcf5d5a52130f02ed86fbec3875cc4ac108ead00eaac1659e06\\\"],\\\"sizeBytes\\\":487090672},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4a4c3e6ca0cd26f7eb5270cfafbcf423cf2986d152bf5b9fc6469d40599e104e\\\"],\\\"sizeBytes\\\":484450382},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c54c3f7cffe057ae0bdf26163d5e46744685083ae16fc97112e32beacd2d8955\\\"],\\\"sizeBytes\\\":484175664},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d74fe7cb12c554c120262683d9c4066f33ae4f60a5fad83cba419d851b98c12d\\\"],\\\"sizeBytes\\\":470822665},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e9ee63a30a9b95b5801afa36e09fc583ec2cda3c5cb3c8676e478fea016abfa1\\\"],\\\"sizeBytes\\\":470680779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9b8bc43bac294be3c7669cde049e388ad9d8751242051ba40f83e1c401eceda\\\"],\\\"sizeBytes\\\":468263999},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ebee49810f493f9b566740bd61256fd40b897cc51423f1efa01a02bb57ce177d\\\"],\\\"sizeBytes\\\":467234714},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\\\"],\\\"sizeBytes\\\":465086330},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a85dab5856916220df6f05ce9d6aa10cd4fa0234093b55355246690bba05ad1\\\"],\\\"sizeBytes\\\":463700811},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b714a7ada1e295b599b432f32e1fd5b74c8cdbe6fe51e95306322b25cb873914\\\"],\\\"sizeBytes\\\":458126424},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5230462066ab36e3025524e948dd33fa6f51ee29a4f91fa469bfc268568b5fd9\\\"],\\\"sizeBytes\\\":456575686}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": context deadline 
exceeded" Mar 13 01:15:53.767978 master-0 kubenswrapper[7110]: I0313 01:15:53.767865 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:15:53.768325 master-0 kubenswrapper[7110]: I0313 01:15:53.767968 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:15:56.767949 master-0 kubenswrapper[7110]: I0313 01:15:56.767878 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:15:56.768897 master-0 kubenswrapper[7110]: I0313 01:15:56.767955 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:15:57.819860 master-0 kubenswrapper[7110]: I0313 01:15:57.819724 7110 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="029d789d642119db799ad9f74a4764d435ebea90c5fcd58eb9670acf9f27e006" exitCode=1 Mar 13 01:15:59.277502 master-0 kubenswrapper[7110]: I0313 01:15:59.277319 7110 patch_prober.go:28] interesting 
pod/etcd-operator-5884b9cd56-h4kkj container/etcd-operator namespace/openshift-etcd-operator: Liveness probe status=failure output="Get \"https://10.128.0.23:8443/healthz\": dial tcp 10.128.0.23:8443: connect: connection refused" start-of-body= Mar 13 01:15:59.277502 master-0 kubenswrapper[7110]: I0313 01:15:59.277397 7110 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj" podUID="21cbea73-f779-43e4-b5ba-d6fa06275d34" containerName="etcd-operator" probeResult="failure" output="Get \"https://10.128.0.23:8443/healthz\": dial tcp 10.128.0.23:8443: connect: connection refused" Mar 13 01:15:59.768675 master-0 kubenswrapper[7110]: I0313 01:15:59.768507 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:15:59.769004 master-0 kubenswrapper[7110]: I0313 01:15:59.768749 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:16:01.759215 master-0 kubenswrapper[7110]: E0313 01:16:01.759068 7110 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="800ms" Mar 13 01:16:01.850052 master-0 kubenswrapper[7110]: I0313 01:16:01.849966 7110 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-kdn2l_70c097a1-90d9-4344-b0ae-5a59ec2ad8ad/ingress-operator/0.log" Mar 13 01:16:01.850052 master-0 kubenswrapper[7110]: I0313 01:16:01.850040 7110 generic.go:334] "Generic (PLEG): container finished" podID="70c097a1-90d9-4344-b0ae-5a59ec2ad8ad" containerID="bb00bc21a2b9f11b41d1750186297e3a5ca651c2efe8531d5b69fd560b0ba268" exitCode=1 Mar 13 01:16:02.768236 master-0 kubenswrapper[7110]: I0313 01:16:02.768135 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:16:02.771091 master-0 kubenswrapper[7110]: I0313 01:16:02.768283 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:16:03.013709 master-0 kubenswrapper[7110]: E0313 01:16:03.013529 7110 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": the server was unable to return a response in the time allotted, but may still be processing the request (get nodes master-0)" Mar 13 01:16:05.768016 master-0 kubenswrapper[7110]: I0313 01:16:05.767935 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:16:05.768883 master-0 kubenswrapper[7110]: I0313 01:16:05.768021 7110 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:16:08.768391 master-0 kubenswrapper[7110]: I0313 01:16:08.768338 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:16:08.769433 master-0 kubenswrapper[7110]: I0313 01:16:08.769384 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:16:11.768669 master-0 kubenswrapper[7110]: I0313 01:16:11.768547 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:16:11.769420 master-0 kubenswrapper[7110]: I0313 01:16:11.768690 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:16:12.560292 master-0 
kubenswrapper[7110]: E0313 01:16:12.560148 7110 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="1.6s" Mar 13 01:16:13.014855 master-0 kubenswrapper[7110]: E0313 01:16:13.014753 7110 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 01:16:14.768002 master-0 kubenswrapper[7110]: I0313 01:16:14.767936 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:16:14.768825 master-0 kubenswrapper[7110]: I0313 01:16:14.768020 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:16:17.768134 master-0 kubenswrapper[7110]: I0313 01:16:17.768080 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:16:17.768886 master-0 kubenswrapper[7110]: I0313 01:16:17.768172 7110 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:16:18.936201 master-0 kubenswrapper[7110]: E0313 01:16:18.936135 7110 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Mar 13 01:16:18.937086 master-0 kubenswrapper[7110]: E0313 01:16:18.936351 7110 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.013s" Mar 13 01:16:18.937086 master-0 kubenswrapper[7110]: I0313 01:16:18.936383 7110 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" Mar 13 01:16:18.937086 master-0 kubenswrapper[7110]: I0313 01:16:18.936450 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj" Mar 13 01:16:18.937086 master-0 kubenswrapper[7110]: I0313 01:16:18.936474 7110 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:16:18.937086 master-0 kubenswrapper[7110]: I0313 01:16:18.936815 7110 scope.go:117] "RemoveContainer" containerID="c60b1887494d08fb5df2490e135e1d701bdbe7b6a6e136c3d75f17211fbf551b" Mar 13 01:16:18.938493 master-0 kubenswrapper[7110]: I0313 01:16:18.938453 7110 scope.go:117] "RemoveContainer" containerID="0e6fdad2e1926f784b1c498cd01186eeb32850cc4a0f69925bc0668ef060c2a8" Mar 13 01:16:18.938714 master-0 kubenswrapper[7110]: I0313 01:16:18.938666 7110 scope.go:117] "RemoveContainer" 
containerID="d3c0f89339f815b6350fac22cc030760b6b90e8219eb2eb8f1fd3b1e19f0b649" Mar 13 01:16:18.941568 master-0 kubenswrapper[7110]: I0313 01:16:18.939374 7110 scope.go:117] "RemoveContainer" containerID="4f1d391f9ccf9712ce599023f3ef26e7463c6ad87dcaaba9b59f13a56ea3cd24" Mar 13 01:16:18.941568 master-0 kubenswrapper[7110]: I0313 01:16:18.939684 7110 scope.go:117] "RemoveContainer" containerID="18882e60a1d7cca045d564f7abc68da51216b8e9104fca3062ca7eec99d17c5e" Mar 13 01:16:18.941568 master-0 kubenswrapper[7110]: I0313 01:16:18.940084 7110 scope.go:117] "RemoveContainer" containerID="a41bcaf653995a95790b4be685f8a8f91dff8546aa69d956c2d939af740c0286" Mar 13 01:16:18.949361 master-0 kubenswrapper[7110]: I0313 01:16:18.942729 7110 scope.go:117] "RemoveContainer" containerID="451217a595b413aec4246a9b014bde1e3a621bb8bc794b9a2470a8f43c1c8d3b" Mar 13 01:16:18.949361 master-0 kubenswrapper[7110]: I0313 01:16:18.944988 7110 scope.go:117] "RemoveContainer" containerID="029d789d642119db799ad9f74a4764d435ebea90c5fcd58eb9670acf9f27e006" Mar 13 01:16:18.951582 master-0 kubenswrapper[7110]: I0313 01:16:18.951521 7110 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Mar 13 01:16:19.465108 master-0 kubenswrapper[7110]: I0313 01:16:19.464975 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-3-master-0_29e096ea-ca9d-477b-b0aa-1d10244d51d9/installer/0.log" Mar 13 01:16:19.465108 master-0 kubenswrapper[7110]: I0313 01:16:19.465088 7110 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Mar 13 01:16:19.498880 master-0 kubenswrapper[7110]: I0313 01:16:19.498846 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_47806631-9d60-4658-832d-f160f93f42ea/installer/0.log" Mar 13 01:16:19.499010 master-0 kubenswrapper[7110]: I0313 01:16:19.498925 7110 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 13 01:16:19.544343 master-0 kubenswrapper[7110]: I0313 01:16:19.544283 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/29e096ea-ca9d-477b-b0aa-1d10244d51d9-var-lock\") pod \"29e096ea-ca9d-477b-b0aa-1d10244d51d9\" (UID: \"29e096ea-ca9d-477b-b0aa-1d10244d51d9\") " Mar 13 01:16:19.544343 master-0 kubenswrapper[7110]: I0313 01:16:19.544335 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/29e096ea-ca9d-477b-b0aa-1d10244d51d9-kubelet-dir\") pod \"29e096ea-ca9d-477b-b0aa-1d10244d51d9\" (UID: \"29e096ea-ca9d-477b-b0aa-1d10244d51d9\") " Mar 13 01:16:19.544576 master-0 kubenswrapper[7110]: I0313 01:16:19.544384 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/47806631-9d60-4658-832d-f160f93f42ea-kubelet-dir\") pod \"47806631-9d60-4658-832d-f160f93f42ea\" (UID: \"47806631-9d60-4658-832d-f160f93f42ea\") " Mar 13 01:16:19.544576 master-0 kubenswrapper[7110]: I0313 01:16:19.544423 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/47806631-9d60-4658-832d-f160f93f42ea-var-lock\") pod \"47806631-9d60-4658-832d-f160f93f42ea\" (UID: \"47806631-9d60-4658-832d-f160f93f42ea\") " Mar 13 01:16:19.544576 master-0 
kubenswrapper[7110]: I0313 01:16:19.544454 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/47806631-9d60-4658-832d-f160f93f42ea-kube-api-access\") pod \"47806631-9d60-4658-832d-f160f93f42ea\" (UID: \"47806631-9d60-4658-832d-f160f93f42ea\") " Mar 13 01:16:19.544576 master-0 kubenswrapper[7110]: I0313 01:16:19.544446 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29e096ea-ca9d-477b-b0aa-1d10244d51d9-var-lock" (OuterVolumeSpecName: "var-lock") pod "29e096ea-ca9d-477b-b0aa-1d10244d51d9" (UID: "29e096ea-ca9d-477b-b0aa-1d10244d51d9"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:16:19.544576 master-0 kubenswrapper[7110]: I0313 01:16:19.544497 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29e096ea-ca9d-477b-b0aa-1d10244d51d9-kube-api-access\") pod \"29e096ea-ca9d-477b-b0aa-1d10244d51d9\" (UID: \"29e096ea-ca9d-477b-b0aa-1d10244d51d9\") " Mar 13 01:16:19.544576 master-0 kubenswrapper[7110]: I0313 01:16:19.544533 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47806631-9d60-4658-832d-f160f93f42ea-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "47806631-9d60-4658-832d-f160f93f42ea" (UID: "47806631-9d60-4658-832d-f160f93f42ea"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:16:19.544576 master-0 kubenswrapper[7110]: I0313 01:16:19.544564 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29e096ea-ca9d-477b-b0aa-1d10244d51d9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "29e096ea-ca9d-477b-b0aa-1d10244d51d9" (UID: "29e096ea-ca9d-477b-b0aa-1d10244d51d9"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:16:19.544822 master-0 kubenswrapper[7110]: I0313 01:16:19.544589 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47806631-9d60-4658-832d-f160f93f42ea-var-lock" (OuterVolumeSpecName: "var-lock") pod "47806631-9d60-4658-832d-f160f93f42ea" (UID: "47806631-9d60-4658-832d-f160f93f42ea"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:16:19.544822 master-0 kubenswrapper[7110]: I0313 01:16:19.544706 7110 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/29e096ea-ca9d-477b-b0aa-1d10244d51d9-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 01:16:19.544822 master-0 kubenswrapper[7110]: I0313 01:16:19.544725 7110 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/29e096ea-ca9d-477b-b0aa-1d10244d51d9-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:16:19.544822 master-0 kubenswrapper[7110]: I0313 01:16:19.544738 7110 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/47806631-9d60-4658-832d-f160f93f42ea-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:16:19.544822 master-0 kubenswrapper[7110]: I0313 01:16:19.544751 7110 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/47806631-9d60-4658-832d-f160f93f42ea-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 01:16:19.547557 master-0 kubenswrapper[7110]: I0313 01:16:19.547519 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47806631-9d60-4658-832d-f160f93f42ea-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "47806631-9d60-4658-832d-f160f93f42ea" (UID: "47806631-9d60-4658-832d-f160f93f42ea"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:16:19.547985 master-0 kubenswrapper[7110]: I0313 01:16:19.547933 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29e096ea-ca9d-477b-b0aa-1d10244d51d9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "29e096ea-ca9d-477b-b0aa-1d10244d51d9" (UID: "29e096ea-ca9d-477b-b0aa-1d10244d51d9"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:16:19.646097 master-0 kubenswrapper[7110]: I0313 01:16:19.646034 7110 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/47806631-9d60-4658-832d-f160f93f42ea-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 13 01:16:19.646097 master-0 kubenswrapper[7110]: I0313 01:16:19.646080 7110 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29e096ea-ca9d-477b-b0aa-1d10244d51d9-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 13 01:16:19.970374 master-0 kubenswrapper[7110]: I0313 01:16:19.970326 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-3-master-0_29e096ea-ca9d-477b-b0aa-1d10244d51d9/installer/0.log" Mar 13 01:16:19.971124 master-0 kubenswrapper[7110]: I0313 01:16:19.970490 7110 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Mar 13 01:16:19.978486 master-0 kubenswrapper[7110]: I0313 01:16:19.978455 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_47806631-9d60-4658-832d-f160f93f42ea/installer/0.log" Mar 13 01:16:19.978737 master-0 kubenswrapper[7110]: I0313 01:16:19.978701 7110 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 13 01:16:21.768712 master-0 kubenswrapper[7110]: I0313 01:16:21.768600 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 01:16:21.769468 master-0 kubenswrapper[7110]: I0313 01:16:21.768750 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 01:16:22.100908 master-0 kubenswrapper[7110]: E0313 01:16:22.100580 7110 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{machine-approver-955fcfb87-jvdz8.189c41995dc92a08 openshift-cluster-machine-approver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-cluster-machine-approver,Name:machine-approver-955fcfb87-jvdz8,UID:23fbbe97-906a-4bce-9ab0-bf633d4f9dd7,APIVersion:v1,ResourceVersion:9128,FieldPath:spec.containers{machine-approver-controller},},Reason:Started,Message:Started container machine-approver-controller,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:14:40.236399112 +0000 UTC m=+81.521425578,LastTimestamp:2026-03-13 01:14:40.236399112 +0000 UTC m=+81.521425578,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:16:23.015988 master-0 kubenswrapper[7110]: E0313 01:16:23.015880 7110 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 01:16:24.162382 master-0 kubenswrapper[7110]: E0313 01:16:24.162259 7110 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Mar 13 01:16:24.768253 master-0 kubenswrapper[7110]: I0313 01:16:24.768108 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 01:16:24.768687 master-0 kubenswrapper[7110]: I0313 01:16:24.768293 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 01:16:27.767518 master-0 kubenswrapper[7110]: I0313 01:16:27.767423 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get 
\"https://10.128.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 01:16:27.768477 master-0 kubenswrapper[7110]: I0313 01:16:27.767510 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 01:16:28.034234 master-0 kubenswrapper[7110]: I0313 01:16:28.034106 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-2slj5_3d2e7338-a6d6-4872-ab72-a4e631075ab3/snapshot-controller/0.log" Mar 13 01:16:28.034234 master-0 kubenswrapper[7110]: I0313 01:16:28.034185 7110 generic.go:334] "Generic (PLEG): container finished" podID="3d2e7338-a6d6-4872-ab72-a4e631075ab3" containerID="e6af91761c3978d4ee9c9c1a1feccd940ab86be07c53b40abdaf52c178494ea6" exitCode=1 Mar 13 01:16:30.768274 master-0 kubenswrapper[7110]: I0313 01:16:30.768139 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 01:16:30.769222 master-0 kubenswrapper[7110]: I0313 01:16:30.768274 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": net/http: request 
canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 01:16:33.016594 master-0 kubenswrapper[7110]: E0313 01:16:33.016415 7110 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 01:16:33.016594 master-0 kubenswrapper[7110]: E0313 01:16:33.016499 7110 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 01:16:33.768190 master-0 kubenswrapper[7110]: I0313 01:16:33.768074 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 01:16:33.768190 master-0 kubenswrapper[7110]: I0313 01:16:33.768181 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 01:16:36.769020 master-0 kubenswrapper[7110]: I0313 01:16:36.768912 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 01:16:36.769861 master-0 
kubenswrapper[7110]: I0313 01:16:36.769021 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 01:16:37.364266 master-0 kubenswrapper[7110]: E0313 01:16:37.364128 7110 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="6.4s" Mar 13 01:16:39.768788 master-0 kubenswrapper[7110]: I0313 01:16:39.768682 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 01:16:39.769774 master-0 kubenswrapper[7110]: I0313 01:16:39.768790 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 01:16:40.205834 master-0 kubenswrapper[7110]: I0313 01:16:40.205737 7110 status_manager.go:851] "Failed to get status for pod" podUID="a1a56802af72ce1aac6b5077f1695ac0" pod="kube-system/bootstrap-kube-scheduler-master-0" err="the server was unable to return a response in 
the time allotted, but may still be processing the request (get pods bootstrap-kube-scheduler-master-0)" Mar 13 01:16:41.128370 master-0 kubenswrapper[7110]: I0313 01:16:41.128218 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-6-master-0_0b3a64f4-e94f-4916-8c91-a255d987735d/installer/0.log" Mar 13 01:16:41.128370 master-0 kubenswrapper[7110]: I0313 01:16:41.128294 7110 generic.go:334] "Generic (PLEG): container finished" podID="0b3a64f4-e94f-4916-8c91-a255d987735d" containerID="6b91817f5b3f7a0651a092d44f47916346942c3944860ca84cb9f688537c7ce3" exitCode=1 Mar 13 01:16:41.131119 master-0 kubenswrapper[7110]: I0313 01:16:41.130289 7110 generic.go:334] "Generic (PLEG): container finished" podID="78d2cd80-23b9-426d-a7ac-1daa27668a47" containerID="a7777dcfd1f119b7ebee2758548cf5f87c3fa74673968a6a77a0056cee905b05" exitCode=0 Mar 13 01:16:41.133719 master-0 kubenswrapper[7110]: I0313 01:16:41.133619 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-6598bfb6c4-2wh5w_30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3/manager/0.log" Mar 13 01:16:41.133820 master-0 kubenswrapper[7110]: I0313 01:16:41.133768 7110 generic.go:334] "Generic (PLEG): container finished" podID="30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3" containerID="5c71af40ce92aac85f827794e9d207c4ed4bc300599f6601dc56da9d0a8b0d8b" exitCode=1 Mar 13 01:16:41.520281 master-0 kubenswrapper[7110]: E0313 01:16:41.520238 7110 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 13 01:16:41.520281 master-0 kubenswrapper[7110]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_packageserver-68f6795949-v9w8g_openshift-operator-lifecycle-manager_2ce47660-f7cc-4669-a00d-83422f0f6d55_0(1932b318046cc2e98b5dbbb88f750937661f0160a1466991c8c7cc862089bb20): error adding pod openshift-operator-lifecycle-manager_packageserver-68f6795949-v9w8g to CNI network "multus-cni-network": plugin 
type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"1932b318046cc2e98b5dbbb88f750937661f0160a1466991c8c7cc862089bb20" Netns:"/var/run/netns/7ab4bd3d-3d2a-4137-9aee-9bb261de53d7" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-operator-lifecycle-manager;K8S_POD_NAME=packageserver-68f6795949-v9w8g;K8S_POD_INFRA_CONTAINER_ID=1932b318046cc2e98b5dbbb88f750937661f0160a1466991c8c7cc862089bb20;K8S_POD_UID=2ce47660-f7cc-4669-a00d-83422f0f6d55" Path:"" ERRORED: error configuring pod [openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g] networking: Multus: [openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g/2ce47660-f7cc-4669-a00d-83422f0f6d55]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod packageserver-68f6795949-v9w8g in out of cluster comm: SetNetworkStatus: failed to update the pod packageserver-68f6795949-v9w8g in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/pods/packageserver-68f6795949-v9w8g?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 13 01:16:41.520281 master-0 kubenswrapper[7110]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 01:16:41.520281 master-0 kubenswrapper[7110]: > Mar 13 01:16:41.520533 master-0 kubenswrapper[7110]: E0313 01:16:41.520303 7110 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 13 01:16:41.520533 master-0 kubenswrapper[7110]: rpc error: code = Unknown desc = failed to 
create pod network sandbox k8s_packageserver-68f6795949-v9w8g_openshift-operator-lifecycle-manager_2ce47660-f7cc-4669-a00d-83422f0f6d55_0(1932b318046cc2e98b5dbbb88f750937661f0160a1466991c8c7cc862089bb20): error adding pod openshift-operator-lifecycle-manager_packageserver-68f6795949-v9w8g to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"1932b318046cc2e98b5dbbb88f750937661f0160a1466991c8c7cc862089bb20" Netns:"/var/run/netns/7ab4bd3d-3d2a-4137-9aee-9bb261de53d7" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-operator-lifecycle-manager;K8S_POD_NAME=packageserver-68f6795949-v9w8g;K8S_POD_INFRA_CONTAINER_ID=1932b318046cc2e98b5dbbb88f750937661f0160a1466991c8c7cc862089bb20;K8S_POD_UID=2ce47660-f7cc-4669-a00d-83422f0f6d55" Path:"" ERRORED: error configuring pod [openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g] networking: Multus: [openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g/2ce47660-f7cc-4669-a00d-83422f0f6d55]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod packageserver-68f6795949-v9w8g in out of cluster comm: SetNetworkStatus: failed to update the pod packageserver-68f6795949-v9w8g in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/pods/packageserver-68f6795949-v9w8g?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 13 01:16:41.520533 master-0 kubenswrapper[7110]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 01:16:41.520533 master-0 kubenswrapper[7110]: > pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" Mar 13 01:16:41.520533 master-0 kubenswrapper[7110]: E0313 01:16:41.520323 7110 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 13 01:16:41.520533 master-0 kubenswrapper[7110]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_packageserver-68f6795949-v9w8g_openshift-operator-lifecycle-manager_2ce47660-f7cc-4669-a00d-83422f0f6d55_0(1932b318046cc2e98b5dbbb88f750937661f0160a1466991c8c7cc862089bb20): error adding pod openshift-operator-lifecycle-manager_packageserver-68f6795949-v9w8g to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"1932b318046cc2e98b5dbbb88f750937661f0160a1466991c8c7cc862089bb20" Netns:"/var/run/netns/7ab4bd3d-3d2a-4137-9aee-9bb261de53d7" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-operator-lifecycle-manager;K8S_POD_NAME=packageserver-68f6795949-v9w8g;K8S_POD_INFRA_CONTAINER_ID=1932b318046cc2e98b5dbbb88f750937661f0160a1466991c8c7cc862089bb20;K8S_POD_UID=2ce47660-f7cc-4669-a00d-83422f0f6d55" Path:"" ERRORED: error configuring pod [openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g] networking: Multus: [openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g/2ce47660-f7cc-4669-a00d-83422f0f6d55]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod packageserver-68f6795949-v9w8g in out of cluster comm: 
SetNetworkStatus: failed to update the pod packageserver-68f6795949-v9w8g in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/pods/packageserver-68f6795949-v9w8g?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 13 01:16:41.520533 master-0 kubenswrapper[7110]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 01:16:41.520533 master-0 kubenswrapper[7110]: > pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" Mar 13 01:16:41.520533 master-0 kubenswrapper[7110]: E0313 01:16:41.520381 7110 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"packageserver-68f6795949-v9w8g_openshift-operator-lifecycle-manager(2ce47660-f7cc-4669-a00d-83422f0f6d55)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"packageserver-68f6795949-v9w8g_openshift-operator-lifecycle-manager(2ce47660-f7cc-4669-a00d-83422f0f6d55)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_packageserver-68f6795949-v9w8g_openshift-operator-lifecycle-manager_2ce47660-f7cc-4669-a00d-83422f0f6d55_0(1932b318046cc2e98b5dbbb88f750937661f0160a1466991c8c7cc862089bb20): error adding pod openshift-operator-lifecycle-manager_packageserver-68f6795949-v9w8g to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"1932b318046cc2e98b5dbbb88f750937661f0160a1466991c8c7cc862089bb20\\\" 
Netns:\\\"/var/run/netns/7ab4bd3d-3d2a-4137-9aee-9bb261de53d7\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-operator-lifecycle-manager;K8S_POD_NAME=packageserver-68f6795949-v9w8g;K8S_POD_INFRA_CONTAINER_ID=1932b318046cc2e98b5dbbb88f750937661f0160a1466991c8c7cc862089bb20;K8S_POD_UID=2ce47660-f7cc-4669-a00d-83422f0f6d55\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g] networking: Multus: [openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g/2ce47660-f7cc-4669-a00d-83422f0f6d55]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod packageserver-68f6795949-v9w8g in out of cluster comm: SetNetworkStatus: failed to update the pod packageserver-68f6795949-v9w8g in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/pods/packageserver-68f6795949-v9w8g?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" podUID="2ce47660-f7cc-4669-a00d-83422f0f6d55" Mar 13 01:16:42.139221 master-0 kubenswrapper[7110]: I0313 01:16:42.139082 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" Mar 13 01:16:42.140127 master-0 kubenswrapper[7110]: I0313 01:16:42.139590 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" Mar 13 01:16:42.661163 master-0 kubenswrapper[7110]: E0313 01:16:42.661112 7110 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 13 01:16:42.661163 master-0 kubenswrapper[7110]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_insights-operator-8f89dfddd-6k2t7_openshift-insights_77fd9062-0f7d-4255-92ca-7e4325daeddd_0(bde451c5e6f566903bbd584d02dc6d8b5f7474876a5ac451b28780c2732021d6): error adding pod openshift-insights_insights-operator-8f89dfddd-6k2t7 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"bde451c5e6f566903bbd584d02dc6d8b5f7474876a5ac451b28780c2732021d6" Netns:"/var/run/netns/aac74c3a-4095-447a-9307-a3f08aa837e3" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-insights;K8S_POD_NAME=insights-operator-8f89dfddd-6k2t7;K8S_POD_INFRA_CONTAINER_ID=bde451c5e6f566903bbd584d02dc6d8b5f7474876a5ac451b28780c2732021d6;K8S_POD_UID=77fd9062-0f7d-4255-92ca-7e4325daeddd" Path:"" ERRORED: error configuring pod [openshift-insights/insights-operator-8f89dfddd-6k2t7] networking: Multus: [openshift-insights/insights-operator-8f89dfddd-6k2t7/77fd9062-0f7d-4255-92ca-7e4325daeddd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod insights-operator-8f89dfddd-6k2t7 in out of cluster comm: SetNetworkStatus: failed to update the pod insights-operator-8f89dfddd-6k2t7 in out of cluster comm: status update failed for pod /: Get 
"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-insights/pods/insights-operator-8f89dfddd-6k2t7?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 13 01:16:42.661163 master-0 kubenswrapper[7110]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 01:16:42.661163 master-0 kubenswrapper[7110]: > Mar 13 01:16:42.661360 master-0 kubenswrapper[7110]: E0313 01:16:42.661180 7110 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 13 01:16:42.661360 master-0 kubenswrapper[7110]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_insights-operator-8f89dfddd-6k2t7_openshift-insights_77fd9062-0f7d-4255-92ca-7e4325daeddd_0(bde451c5e6f566903bbd584d02dc6d8b5f7474876a5ac451b28780c2732021d6): error adding pod openshift-insights_insights-operator-8f89dfddd-6k2t7 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"bde451c5e6f566903bbd584d02dc6d8b5f7474876a5ac451b28780c2732021d6" Netns:"/var/run/netns/aac74c3a-4095-447a-9307-a3f08aa837e3" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-insights;K8S_POD_NAME=insights-operator-8f89dfddd-6k2t7;K8S_POD_INFRA_CONTAINER_ID=bde451c5e6f566903bbd584d02dc6d8b5f7474876a5ac451b28780c2732021d6;K8S_POD_UID=77fd9062-0f7d-4255-92ca-7e4325daeddd" Path:"" ERRORED: error configuring pod [openshift-insights/insights-operator-8f89dfddd-6k2t7] networking: Multus: [openshift-insights/insights-operator-8f89dfddd-6k2t7/77fd9062-0f7d-4255-92ca-7e4325daeddd]: error setting the networks status: 
SetPodNetworkStatusAnnotation: failed to update the pod insights-operator-8f89dfddd-6k2t7 in out of cluster comm: SetNetworkStatus: failed to update the pod insights-operator-8f89dfddd-6k2t7 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-insights/pods/insights-operator-8f89dfddd-6k2t7?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 13 01:16:42.661360 master-0 kubenswrapper[7110]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 01:16:42.661360 master-0 kubenswrapper[7110]: > pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" Mar 13 01:16:42.661360 master-0 kubenswrapper[7110]: E0313 01:16:42.661201 7110 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 13 01:16:42.661360 master-0 kubenswrapper[7110]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_insights-operator-8f89dfddd-6k2t7_openshift-insights_77fd9062-0f7d-4255-92ca-7e4325daeddd_0(bde451c5e6f566903bbd584d02dc6d8b5f7474876a5ac451b28780c2732021d6): error adding pod openshift-insights_insights-operator-8f89dfddd-6k2t7 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"bde451c5e6f566903bbd584d02dc6d8b5f7474876a5ac451b28780c2732021d6" Netns:"/var/run/netns/aac74c3a-4095-447a-9307-a3f08aa837e3" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-insights;K8S_POD_NAME=insights-operator-8f89dfddd-6k2t7;K8S_POD_INFRA_CONTAINER_ID=bde451c5e6f566903bbd584d02dc6d8b5f7474876a5ac451b28780c2732021d6;K8S_POD_UID=77fd9062-0f7d-4255-92ca-7e4325daeddd" Path:"" ERRORED: error configuring pod [openshift-insights/insights-operator-8f89dfddd-6k2t7] networking: Multus: [openshift-insights/insights-operator-8f89dfddd-6k2t7/77fd9062-0f7d-4255-92ca-7e4325daeddd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod insights-operator-8f89dfddd-6k2t7 in out of cluster comm: SetNetworkStatus: failed to update the pod insights-operator-8f89dfddd-6k2t7 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-insights/pods/insights-operator-8f89dfddd-6k2t7?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 13 01:16:42.661360 master-0 kubenswrapper[7110]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 01:16:42.661360 master-0 kubenswrapper[7110]: > pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" Mar 13 01:16:42.661360 master-0 kubenswrapper[7110]: E0313 01:16:42.661264 7110 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"insights-operator-8f89dfddd-6k2t7_openshift-insights(77fd9062-0f7d-4255-92ca-7e4325daeddd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"insights-operator-8f89dfddd-6k2t7_openshift-insights(77fd9062-0f7d-4255-92ca-7e4325daeddd)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_insights-operator-8f89dfddd-6k2t7_openshift-insights_77fd9062-0f7d-4255-92ca-7e4325daeddd_0(bde451c5e6f566903bbd584d02dc6d8b5f7474876a5ac451b28780c2732021d6): error adding pod openshift-insights_insights-operator-8f89dfddd-6k2t7 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"bde451c5e6f566903bbd584d02dc6d8b5f7474876a5ac451b28780c2732021d6\\\" Netns:\\\"/var/run/netns/aac74c3a-4095-447a-9307-a3f08aa837e3\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-insights;K8S_POD_NAME=insights-operator-8f89dfddd-6k2t7;K8S_POD_INFRA_CONTAINER_ID=bde451c5e6f566903bbd584d02dc6d8b5f7474876a5ac451b28780c2732021d6;K8S_POD_UID=77fd9062-0f7d-4255-92ca-7e4325daeddd\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-insights/insights-operator-8f89dfddd-6k2t7] networking: Multus: [openshift-insights/insights-operator-8f89dfddd-6k2t7/77fd9062-0f7d-4255-92ca-7e4325daeddd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod insights-operator-8f89dfddd-6k2t7 in out of cluster comm: SetNetworkStatus: failed to update the pod insights-operator-8f89dfddd-6k2t7 in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-insights/pods/insights-operator-8f89dfddd-6k2t7?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" podUID="77fd9062-0f7d-4255-92ca-7e4325daeddd" Mar 13 01:16:42.747251 master-0 kubenswrapper[7110]: E0313 01:16:42.747186 7110 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 13 01:16:42.747251 master-0 kubenswrapper[7110]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cluster-autoscaler-operator-69576476f7-2q4qb_openshift-machine-api_d278ed70-786c-4b6c-9f04-f08ede704569_0(18da12c9e3e7024082299762be3734cb8dcb2be9b756bb3c695481369bee2d02): error adding pod openshift-machine-api_cluster-autoscaler-operator-69576476f7-2q4qb to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"18da12c9e3e7024082299762be3734cb8dcb2be9b756bb3c695481369bee2d02" Netns:"/var/run/netns/16176860-1f22-48ce-8af2-6d4bc2a413ca" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-machine-api;K8S_POD_NAME=cluster-autoscaler-operator-69576476f7-2q4qb;K8S_POD_INFRA_CONTAINER_ID=18da12c9e3e7024082299762be3734cb8dcb2be9b756bb3c695481369bee2d02;K8S_POD_UID=d278ed70-786c-4b6c-9f04-f08ede704569" Path:"" ERRORED: error configuring pod [openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb] networking: Multus: [openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb/d278ed70-786c-4b6c-9f04-f08ede704569]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the 
pod cluster-autoscaler-operator-69576476f7-2q4qb in out of cluster comm: SetNetworkStatus: failed to update the pod cluster-autoscaler-operator-69576476f7-2q4qb in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/cluster-autoscaler-operator-69576476f7-2q4qb?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 13 01:16:42.747251 master-0 kubenswrapper[7110]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 01:16:42.747251 master-0 kubenswrapper[7110]: > Mar 13 01:16:42.747401 master-0 kubenswrapper[7110]: E0313 01:16:42.747297 7110 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 13 01:16:42.747401 master-0 kubenswrapper[7110]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cluster-autoscaler-operator-69576476f7-2q4qb_openshift-machine-api_d278ed70-786c-4b6c-9f04-f08ede704569_0(18da12c9e3e7024082299762be3734cb8dcb2be9b756bb3c695481369bee2d02): error adding pod openshift-machine-api_cluster-autoscaler-operator-69576476f7-2q4qb to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"18da12c9e3e7024082299762be3734cb8dcb2be9b756bb3c695481369bee2d02" Netns:"/var/run/netns/16176860-1f22-48ce-8af2-6d4bc2a413ca" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-machine-api;K8S_POD_NAME=cluster-autoscaler-operator-69576476f7-2q4qb;K8S_POD_INFRA_CONTAINER_ID=18da12c9e3e7024082299762be3734cb8dcb2be9b756bb3c695481369bee2d02;K8S_POD_UID=d278ed70-786c-4b6c-9f04-f08ede704569" Path:"" ERRORED: error configuring pod [openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb] networking: Multus: [openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb/d278ed70-786c-4b6c-9f04-f08ede704569]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod cluster-autoscaler-operator-69576476f7-2q4qb in out of cluster comm: SetNetworkStatus: failed to update the pod cluster-autoscaler-operator-69576476f7-2q4qb in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/cluster-autoscaler-operator-69576476f7-2q4qb?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 13 01:16:42.747401 master-0 kubenswrapper[7110]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 01:16:42.747401 master-0 kubenswrapper[7110]: > pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb" Mar 13 01:16:42.747401 master-0 kubenswrapper[7110]: E0313 01:16:42.747327 7110 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 13 01:16:42.747401 master-0 kubenswrapper[7110]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_cluster-autoscaler-operator-69576476f7-2q4qb_openshift-machine-api_d278ed70-786c-4b6c-9f04-f08ede704569_0(18da12c9e3e7024082299762be3734cb8dcb2be9b756bb3c695481369bee2d02): error adding pod openshift-machine-api_cluster-autoscaler-operator-69576476f7-2q4qb to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"18da12c9e3e7024082299762be3734cb8dcb2be9b756bb3c695481369bee2d02" Netns:"/var/run/netns/16176860-1f22-48ce-8af2-6d4bc2a413ca" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-machine-api;K8S_POD_NAME=cluster-autoscaler-operator-69576476f7-2q4qb;K8S_POD_INFRA_CONTAINER_ID=18da12c9e3e7024082299762be3734cb8dcb2be9b756bb3c695481369bee2d02;K8S_POD_UID=d278ed70-786c-4b6c-9f04-f08ede704569" Path:"" ERRORED: error configuring pod [openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb] networking: Multus: [openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb/d278ed70-786c-4b6c-9f04-f08ede704569]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod cluster-autoscaler-operator-69576476f7-2q4qb in out of cluster comm: SetNetworkStatus: failed to update the pod cluster-autoscaler-operator-69576476f7-2q4qb in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/cluster-autoscaler-operator-69576476f7-2q4qb?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 13 01:16:42.747401 master-0 kubenswrapper[7110]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 01:16:42.747401 master-0 kubenswrapper[7110]: > pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb" Mar 13 01:16:42.747858 master-0 kubenswrapper[7110]: E0313 01:16:42.747427 7110 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"cluster-autoscaler-operator-69576476f7-2q4qb_openshift-machine-api(d278ed70-786c-4b6c-9f04-f08ede704569)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"cluster-autoscaler-operator-69576476f7-2q4qb_openshift-machine-api(d278ed70-786c-4b6c-9f04-f08ede704569)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cluster-autoscaler-operator-69576476f7-2q4qb_openshift-machine-api_d278ed70-786c-4b6c-9f04-f08ede704569_0(18da12c9e3e7024082299762be3734cb8dcb2be9b756bb3c695481369bee2d02): error adding pod openshift-machine-api_cluster-autoscaler-operator-69576476f7-2q4qb to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"18da12c9e3e7024082299762be3734cb8dcb2be9b756bb3c695481369bee2d02\\\" Netns:\\\"/var/run/netns/16176860-1f22-48ce-8af2-6d4bc2a413ca\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-machine-api;K8S_POD_NAME=cluster-autoscaler-operator-69576476f7-2q4qb;K8S_POD_INFRA_CONTAINER_ID=18da12c9e3e7024082299762be3734cb8dcb2be9b756bb3c695481369bee2d02;K8S_POD_UID=d278ed70-786c-4b6c-9f04-f08ede704569\\\" Path:\\\"\\\" ERRORED: error configuring pod 
[openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb] networking: Multus: [openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb/d278ed70-786c-4b6c-9f04-f08ede704569]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod cluster-autoscaler-operator-69576476f7-2q4qb in out of cluster comm: SetNetworkStatus: failed to update the pod cluster-autoscaler-operator-69576476f7-2q4qb in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/cluster-autoscaler-operator-69576476f7-2q4qb?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb" podUID="d278ed70-786c-4b6c-9f04-f08ede704569" Mar 13 01:16:42.753697 master-0 kubenswrapper[7110]: E0313 01:16:42.753643 7110 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 13 01:16:42.753697 master-0 kubenswrapper[7110]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-cpp59_openshift-marketplace_c3ae16e5-ba77-427f-b85f-5b354e7bfb9d_0(df65e310326b5cb9baa98331b328f9e59d2ac5b27ae9dabb79a85611af61baa4): error adding pod openshift-marketplace_community-operators-cpp59 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 
'ContainerID:"df65e310326b5cb9baa98331b328f9e59d2ac5b27ae9dabb79a85611af61baa4" Netns:"/var/run/netns/626fa865-ab46-466b-af4e-e7792794db3b" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=community-operators-cpp59;K8S_POD_INFRA_CONTAINER_ID=df65e310326b5cb9baa98331b328f9e59d2ac5b27ae9dabb79a85611af61baa4;K8S_POD_UID=c3ae16e5-ba77-427f-b85f-5b354e7bfb9d" Path:"" ERRORED: error configuring pod [openshift-marketplace/community-operators-cpp59] networking: Multus: [openshift-marketplace/community-operators-cpp59/c3ae16e5-ba77-427f-b85f-5b354e7bfb9d]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod community-operators-cpp59 in out of cluster comm: SetNetworkStatus: failed to update the pod community-operators-cpp59 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-cpp59?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 13 01:16:42.753697 master-0 kubenswrapper[7110]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 01:16:42.753697 master-0 kubenswrapper[7110]: > Mar 13 01:16:42.753915 master-0 kubenswrapper[7110]: E0313 01:16:42.753723 7110 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 13 01:16:42.753915 master-0 kubenswrapper[7110]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-cpp59_openshift-marketplace_c3ae16e5-ba77-427f-b85f-5b354e7bfb9d_0(df65e310326b5cb9baa98331b328f9e59d2ac5b27ae9dabb79a85611af61baa4): error adding 
pod openshift-marketplace_community-operators-cpp59 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"df65e310326b5cb9baa98331b328f9e59d2ac5b27ae9dabb79a85611af61baa4" Netns:"/var/run/netns/626fa865-ab46-466b-af4e-e7792794db3b" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=community-operators-cpp59;K8S_POD_INFRA_CONTAINER_ID=df65e310326b5cb9baa98331b328f9e59d2ac5b27ae9dabb79a85611af61baa4;K8S_POD_UID=c3ae16e5-ba77-427f-b85f-5b354e7bfb9d" Path:"" ERRORED: error configuring pod [openshift-marketplace/community-operators-cpp59] networking: Multus: [openshift-marketplace/community-operators-cpp59/c3ae16e5-ba77-427f-b85f-5b354e7bfb9d]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod community-operators-cpp59 in out of cluster comm: SetNetworkStatus: failed to update the pod community-operators-cpp59 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-cpp59?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 13 01:16:42.753915 master-0 kubenswrapper[7110]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 01:16:42.753915 master-0 kubenswrapper[7110]: > pod="openshift-marketplace/community-operators-cpp59" Mar 13 01:16:42.753915 master-0 kubenswrapper[7110]: E0313 01:16:42.753751 7110 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 13 01:16:42.753915 master-0 
kubenswrapper[7110]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-cpp59_openshift-marketplace_c3ae16e5-ba77-427f-b85f-5b354e7bfb9d_0(df65e310326b5cb9baa98331b328f9e59d2ac5b27ae9dabb79a85611af61baa4): error adding pod openshift-marketplace_community-operators-cpp59 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"df65e310326b5cb9baa98331b328f9e59d2ac5b27ae9dabb79a85611af61baa4" Netns:"/var/run/netns/626fa865-ab46-466b-af4e-e7792794db3b" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=community-operators-cpp59;K8S_POD_INFRA_CONTAINER_ID=df65e310326b5cb9baa98331b328f9e59d2ac5b27ae9dabb79a85611af61baa4;K8S_POD_UID=c3ae16e5-ba77-427f-b85f-5b354e7bfb9d" Path:"" ERRORED: error configuring pod [openshift-marketplace/community-operators-cpp59] networking: Multus: [openshift-marketplace/community-operators-cpp59/c3ae16e5-ba77-427f-b85f-5b354e7bfb9d]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod community-operators-cpp59 in out of cluster comm: SetNetworkStatus: failed to update the pod community-operators-cpp59 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-cpp59?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 13 01:16:42.753915 master-0 kubenswrapper[7110]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 01:16:42.753915 master-0 
kubenswrapper[7110]: > pod="openshift-marketplace/community-operators-cpp59" Mar 13 01:16:42.753915 master-0 kubenswrapper[7110]: E0313 01:16:42.753827 7110 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"community-operators-cpp59_openshift-marketplace(c3ae16e5-ba77-427f-b85f-5b354e7bfb9d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"community-operators-cpp59_openshift-marketplace(c3ae16e5-ba77-427f-b85f-5b354e7bfb9d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-cpp59_openshift-marketplace_c3ae16e5-ba77-427f-b85f-5b354e7bfb9d_0(df65e310326b5cb9baa98331b328f9e59d2ac5b27ae9dabb79a85611af61baa4): error adding pod openshift-marketplace_community-operators-cpp59 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"df65e310326b5cb9baa98331b328f9e59d2ac5b27ae9dabb79a85611af61baa4\\\" Netns:\\\"/var/run/netns/626fa865-ab46-466b-af4e-e7792794db3b\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=community-operators-cpp59;K8S_POD_INFRA_CONTAINER_ID=df65e310326b5cb9baa98331b328f9e59d2ac5b27ae9dabb79a85611af61baa4;K8S_POD_UID=c3ae16e5-ba77-427f-b85f-5b354e7bfb9d\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-marketplace/community-operators-cpp59] networking: Multus: [openshift-marketplace/community-operators-cpp59/c3ae16e5-ba77-427f-b85f-5b354e7bfb9d]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod community-operators-cpp59 in out of cluster comm: SetNetworkStatus: failed to update the pod community-operators-cpp59 in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-cpp59?timeout=1m0s\\\": net/http: 
request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-marketplace/community-operators-cpp59" podUID="c3ae16e5-ba77-427f-b85f-5b354e7bfb9d" Mar 13 01:16:42.762988 master-0 kubenswrapper[7110]: E0313 01:16:42.762921 7110 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 13 01:16:42.762988 master-0 kubenswrapper[7110]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-9zvz2_openshift-marketplace_d23bbaec-b635-4649-b26e-2829f32d21f0_0(9b229e13cf12a5658c559db48deae653fe95f3b9e9594456754225dae6a8b515): error adding pod openshift-marketplace_certified-operators-9zvz2 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"9b229e13cf12a5658c559db48deae653fe95f3b9e9594456754225dae6a8b515" Netns:"/var/run/netns/bf3e4265-5eb7-46d3-a567-f33f085d4230" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=certified-operators-9zvz2;K8S_POD_INFRA_CONTAINER_ID=9b229e13cf12a5658c559db48deae653fe95f3b9e9594456754225dae6a8b515;K8S_POD_UID=d23bbaec-b635-4649-b26e-2829f32d21f0" Path:"" ERRORED: error configuring pod [openshift-marketplace/certified-operators-9zvz2] networking: Multus: [openshift-marketplace/certified-operators-9zvz2/d23bbaec-b635-4649-b26e-2829f32d21f0]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod 
certified-operators-9zvz2 in out of cluster comm: SetNetworkStatus: failed to update the pod certified-operators-9zvz2 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9zvz2?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 13 01:16:42.762988 master-0 kubenswrapper[7110]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 01:16:42.762988 master-0 kubenswrapper[7110]: > Mar 13 01:16:42.763109 master-0 kubenswrapper[7110]: E0313 01:16:42.763025 7110 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 13 01:16:42.763109 master-0 kubenswrapper[7110]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-9zvz2_openshift-marketplace_d23bbaec-b635-4649-b26e-2829f32d21f0_0(9b229e13cf12a5658c559db48deae653fe95f3b9e9594456754225dae6a8b515): error adding pod openshift-marketplace_certified-operators-9zvz2 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"9b229e13cf12a5658c559db48deae653fe95f3b9e9594456754225dae6a8b515" Netns:"/var/run/netns/bf3e4265-5eb7-46d3-a567-f33f085d4230" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=certified-operators-9zvz2;K8S_POD_INFRA_CONTAINER_ID=9b229e13cf12a5658c559db48deae653fe95f3b9e9594456754225dae6a8b515;K8S_POD_UID=d23bbaec-b635-4649-b26e-2829f32d21f0" Path:"" ERRORED: error configuring pod 
[openshift-marketplace/certified-operators-9zvz2] networking: Multus: [openshift-marketplace/certified-operators-9zvz2/d23bbaec-b635-4649-b26e-2829f32d21f0]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod certified-operators-9zvz2 in out of cluster comm: SetNetworkStatus: failed to update the pod certified-operators-9zvz2 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9zvz2?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 13 01:16:42.763109 master-0 kubenswrapper[7110]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 01:16:42.763109 master-0 kubenswrapper[7110]: > pod="openshift-marketplace/certified-operators-9zvz2" Mar 13 01:16:42.763109 master-0 kubenswrapper[7110]: E0313 01:16:42.763062 7110 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 13 01:16:42.763109 master-0 kubenswrapper[7110]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-9zvz2_openshift-marketplace_d23bbaec-b635-4649-b26e-2829f32d21f0_0(9b229e13cf12a5658c559db48deae653fe95f3b9e9594456754225dae6a8b515): error adding pod openshift-marketplace_certified-operators-9zvz2 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"9b229e13cf12a5658c559db48deae653fe95f3b9e9594456754225dae6a8b515" Netns:"/var/run/netns/bf3e4265-5eb7-46d3-a567-f33f085d4230" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=certified-operators-9zvz2;K8S_POD_INFRA_CONTAINER_ID=9b229e13cf12a5658c559db48deae653fe95f3b9e9594456754225dae6a8b515;K8S_POD_UID=d23bbaec-b635-4649-b26e-2829f32d21f0" Path:"" ERRORED: error configuring pod [openshift-marketplace/certified-operators-9zvz2] networking: Multus: [openshift-marketplace/certified-operators-9zvz2/d23bbaec-b635-4649-b26e-2829f32d21f0]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod certified-operators-9zvz2 in out of cluster comm: SetNetworkStatus: failed to update the pod certified-operators-9zvz2 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9zvz2?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 13 01:16:42.763109 master-0 kubenswrapper[7110]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 01:16:42.763109 master-0 kubenswrapper[7110]: > pod="openshift-marketplace/certified-operators-9zvz2" Mar 13 01:16:42.763306 master-0 kubenswrapper[7110]: E0313 01:16:42.763196 7110 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"certified-operators-9zvz2_openshift-marketplace(d23bbaec-b635-4649-b26e-2829f32d21f0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"certified-operators-9zvz2_openshift-marketplace(d23bbaec-b635-4649-b26e-2829f32d21f0)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_certified-operators-9zvz2_openshift-marketplace_d23bbaec-b635-4649-b26e-2829f32d21f0_0(9b229e13cf12a5658c559db48deae653fe95f3b9e9594456754225dae6a8b515): error adding pod openshift-marketplace_certified-operators-9zvz2 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"9b229e13cf12a5658c559db48deae653fe95f3b9e9594456754225dae6a8b515\\\" Netns:\\\"/var/run/netns/bf3e4265-5eb7-46d3-a567-f33f085d4230\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=certified-operators-9zvz2;K8S_POD_INFRA_CONTAINER_ID=9b229e13cf12a5658c559db48deae653fe95f3b9e9594456754225dae6a8b515;K8S_POD_UID=d23bbaec-b635-4649-b26e-2829f32d21f0\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-marketplace/certified-operators-9zvz2] networking: Multus: [openshift-marketplace/certified-operators-9zvz2/d23bbaec-b635-4649-b26e-2829f32d21f0]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod certified-operators-9zvz2 in out of cluster comm: SetNetworkStatus: failed to update the pod certified-operators-9zvz2 in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9zvz2?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" 
pod="openshift-marketplace/certified-operators-9zvz2" podUID="d23bbaec-b635-4649-b26e-2829f32d21f0" Mar 13 01:16:42.768166 master-0 kubenswrapper[7110]: I0313 01:16:42.768113 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 01:16:42.768229 master-0 kubenswrapper[7110]: I0313 01:16:42.768180 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 01:16:43.146526 master-0 kubenswrapper[7110]: I0313 01:16:43.146446 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cpp59" Mar 13 01:16:43.146526 master-0 kubenswrapper[7110]: I0313 01:16:43.146516 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9zvz2" Mar 13 01:16:43.147339 master-0 kubenswrapper[7110]: I0313 01:16:43.146523 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb" Mar 13 01:16:43.147339 master-0 kubenswrapper[7110]: I0313 01:16:43.146469 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" Mar 13 01:16:43.147339 master-0 kubenswrapper[7110]: I0313 01:16:43.147235 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cpp59" Mar 13 01:16:43.147571 master-0 kubenswrapper[7110]: I0313 01:16:43.147529 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" Mar 13 01:16:43.147755 master-0 kubenswrapper[7110]: I0313 01:16:43.147724 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9zvz2" Mar 13 01:16:43.147999 master-0 kubenswrapper[7110]: I0313 01:16:43.147958 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb" Mar 13 01:16:44.066666 master-0 kubenswrapper[7110]: I0313 01:16:44.062539 7110 patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-dszg5 container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: connection refused" start-of-body= Mar 13 01:16:44.066666 master-0 kubenswrapper[7110]: I0313 01:16:44.062662 7110 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" podUID="78d2cd80-23b9-426d-a7ac-1daa27668a47" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: connection refused" Mar 13 01:16:44.066666 master-0 kubenswrapper[7110]: I0313 01:16:44.062317 7110 patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-dszg5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: connection refused" start-of-body= Mar 13 01:16:44.066666 master-0 kubenswrapper[7110]: I0313 01:16:44.065718 7110 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" podUID="78d2cd80-23b9-426d-a7ac-1daa27668a47" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: connection refused" Mar 13 01:16:44.155823 master-0 kubenswrapper[7110]: I0313 01:16:44.155774 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7f8b8b6f4c-7fc8j_2c4c579b-0643-47ac-a729-017c326b0ecc/manager/0.log" Mar 13 01:16:44.156787 master-0 kubenswrapper[7110]: I0313 01:16:44.156292 7110 generic.go:334] "Generic (PLEG): container finished" podID="2c4c579b-0643-47ac-a729-017c326b0ecc" containerID="123227ec95c803688ea3b73355951521f6c2e9af5b64fd2cfda9372aa3bb75d9" exitCode=1 Mar 13 01:16:45.768813 master-0 kubenswrapper[7110]: I0313 01:16:45.768743 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 01:16:45.769870 master-0 kubenswrapper[7110]: I0313 01:16:45.768849 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 01:16:46.626327 master-0 kubenswrapper[7110]: I0313 01:16:46.626237 7110 patch_prober.go:28] interesting pod/operator-controller-controller-manager-6598bfb6c4-2wh5w container/manager namespace/openshift-operator-controller: Readiness probe status=failure output="Get 
\"http://10.128.0.42:8081/readyz\": dial tcp 10.128.0.42:8081: connect: connection refused" start-of-body= Mar 13 01:16:46.626616 master-0 kubenswrapper[7110]: I0313 01:16:46.626327 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" podUID="30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.42:8081/readyz\": dial tcp 10.128.0.42:8081: connect: connection refused" Mar 13 01:16:48.767758 master-0 kubenswrapper[7110]: I0313 01:16:48.767628 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 01:16:48.767758 master-0 kubenswrapper[7110]: I0313 01:16:48.767758 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 01:16:49.193935 master-0 kubenswrapper[7110]: I0313 01:16:49.193792 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-64488f9d78-bqmmf_d56480e0-0885-41e5-a1fc-931a068fbadb/openshift-config-operator/1.log" Mar 13 01:16:49.194929 master-0 kubenswrapper[7110]: I0313 01:16:49.194888 7110 generic.go:334] "Generic (PLEG): container finished" podID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerID="4c1d61286014ac7f71bd88f3fc7c396a7af1471d4d9a420f8c93c8833b3c7efd" exitCode=255 Mar 13 
01:16:50.768090 master-0 kubenswrapper[7110]: I0313 01:16:50.767979 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:16:50.768090 master-0 kubenswrapper[7110]: I0313 01:16:50.768080 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:16:50.966327 master-0 kubenswrapper[7110]: I0313 01:16:50.966243 7110 patch_prober.go:28] interesting pod/authentication-operator-7c6989d6c4-bxqp2 container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.13:8443/healthz\": dial tcp 10.128.0.13:8443: connect: connection refused" start-of-body= Mar 13 01:16:50.966594 master-0 kubenswrapper[7110]: I0313 01:16:50.966324 7110 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" podUID="f9b713fb-64ce-4a01-951c-1f31df62e1ae" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.13:8443/healthz\": dial tcp 10.128.0.13:8443: connect: connection refused" Mar 13 01:16:52.533563 master-0 kubenswrapper[7110]: I0313 01:16:52.533477 7110 patch_prober.go:28] interesting pod/catalogd-controller-manager-7f8b8b6f4c-7fc8j container/manager namespace/openshift-catalogd: Readiness probe status=failure output="Get \"http://10.128.0.41:8081/readyz\": dial tcp 10.128.0.41:8081: connect: connection refused" start-of-body= Mar 13 01:16:52.534385 
master-0 kubenswrapper[7110]: I0313 01:16:52.533576 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" podUID="2c4c579b-0643-47ac-a729-017c326b0ecc" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.41:8081/readyz\": dial tcp 10.128.0.41:8081: connect: connection refused" Mar 13 01:16:52.955341 master-0 kubenswrapper[7110]: E0313 01:16:52.955230 7110 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Mar 13 01:16:52.955625 master-0 kubenswrapper[7110]: E0313 01:16:52.955555 7110 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.019s" Mar 13 01:16:52.955625 master-0 kubenswrapper[7110]: I0313 01:16:52.955604 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-29mns" Mar 13 01:16:52.955876 master-0 kubenswrapper[7110]: I0313 01:16:52.955746 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lb5rz" Mar 13 01:16:52.958768 master-0 kubenswrapper[7110]: I0313 01:16:52.958108 7110 scope.go:117] "RemoveContainer" containerID="4c1d61286014ac7f71bd88f3fc7c396a7af1471d4d9a420f8c93c8833b3c7efd" Mar 13 01:16:52.958768 master-0 kubenswrapper[7110]: I0313 01:16:52.958363 7110 scope.go:117] "RemoveContainer" containerID="e6af91761c3978d4ee9c9c1a1feccd940ab86be07c53b40abdaf52c178494ea6" Mar 13 01:16:52.959002 master-0 kubenswrapper[7110]: I0313 01:16:52.958796 7110 scope.go:117] "RemoveContainer" containerID="bb00bc21a2b9f11b41d1750186297e3a5ca651c2efe8531d5b69fd560b0ba268" Mar 13 01:16:52.960483 master-0 kubenswrapper[7110]: I0313 01:16:52.959121 7110 scope.go:117] "RemoveContainer" 
containerID="5c71af40ce92aac85f827794e9d207c4ed4bc300599f6601dc56da9d0a8b0d8b" Mar 13 01:16:52.960483 master-0 kubenswrapper[7110]: I0313 01:16:52.960246 7110 scope.go:117] "RemoveContainer" containerID="85752463126f89fa0e5e1418516974da87fce8b92150573ae7e0d2915937dc43" Mar 13 01:16:52.972105 master-0 kubenswrapper[7110]: I0313 01:16:52.972039 7110 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Mar 13 01:16:53.223709 master-0 kubenswrapper[7110]: I0313 01:16:53.223664 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-kdn2l_70c097a1-90d9-4344-b0ae-5a59ec2ad8ad/ingress-operator/0.log" Mar 13 01:16:53.364386 master-0 kubenswrapper[7110]: E0313 01:16:53.364162 7110 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T01:16:43Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T01:16:43Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T01:16:43Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T01:16:43Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:1295a1f0e74ae87f51a733e28b64c6fdb6b9a5b069a6897b3870fe52cc1c3b0b\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:505eeaa3f051e9f4ea6a622aca92e5c4eae07078ca185d9fecfe8cc9b6dfc899\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1739173859},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f121f9d021a9843b9458f9f222c40f292f2c21dcfcf00f05daacaca8a949c0\\\"],\\\"sizeBytes
\\\":1637445817},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:381e96959e3c3b08a3e2715e6024697ae14af31bd0378b49f583e984b3b9a192\\\"],\\\"sizeBytes\\\":1238047254},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c9330c756dd6ab107e9a4b671bc52742c90d5be11a8380d8b710e2bd4e0ed43c\\\"],\\\"sizeBytes\\\":992610645},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\\\"],\\\"sizeBytes\\\":943837171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff40e33e63d6c1f4e4393d5506e38def25ba20582d980fec8b81f81c867ceeec\\\"],\\\"sizeBytes\\\":918278686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:042e6a37747405da54cf91543d44408c9531327a2cce653c41ca851aa7c896d8\\\"],\\\"sizeBytes\\\":880378279},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e207c762b7802ee0e54507d21ed1f25b19eddc511a4b824934c16c163193be6a\\\"],\\\"sizeBytes\\\":876146500},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:41dbd66e9a886c1fd7a99752f358c6125a209e83c0dd37b35730baae58d82ee8\\\"],\\\"sizeBytes\\\":862633255},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bfcd8017eede3fb66fa3f5b47c27508b787d38455689154461f0e6a5dc303ff\\\"],\\\"sizeBytes\\\":772939850},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9c946fdc5a4cd16ff998c17844780e7efc38f7f38b97a8a40d75cd77b318ddef\\\"],\\\"sizeBytes\\\":687947017},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:0c03cb25dc6f6a865529ebc979e8d7d08492b28fd3fb93beddf30e1cb06f1245\\\"],\\\"sizeBytes\\\":683169303},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3f34dc492c80a3dee4643cc2291044750ac51e6e919b973de8723fa8b70bde70\\\"],\\\"sizeBytes\\\":677929075},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3\\\"],\\\"sizeBytes\\\":621647686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1ec9d3dbcc6f9817c0f6d09f64c0d98c91b03afbb1fcb3c1e1718aca900754b\\\"],\\\"sizeBytes\\\":589379637},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1575be013a898f153cbf012aeaf28ce720022f934dc05bdffbe479e30999d460\\\"],\\\"sizeBytes\\\":582153879},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eb82e437a701ce83b70e56be8477d987da67578714dda3d9fa6628804b1b56f5\\\"],\\\"sizeBytes\\\":558210153},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:28f33d62fd0b94c5ea0ebcd7a4216848c8dd671a38d901ce98f4c399b700e1c7\\\"],\\\"sizeBytes\\\":548751793},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b\\\"],\\\"sizeBytes\\\":529324693},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ac6f0695d3386e6d601f4ae507940981352fa3ad884b0fed6fb25698c5e6f916\\\"],\\\"sizeBytes\\\":528946249},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6088910bdc1583b275fab261e3234c0b63b4cc16d01bcea697b6a7f6db13bdf3\\\"],\\\"sizeBytes\\\":518384455},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:14bd3c04daa885009785d48f4973e2890751a7ec116cc14d17627245cda54d7b\\\"],\\\"sizeBytes\\\":517997625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\\\"],\\\"sizeB
ytes\\\":514980169},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd0b71d620cf0acbfcd1b58797dc30050bd167cb6b7a7f62c8333dd370c76d5\\\"],\\\"sizeBytes\\\":513581866},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd818e37e1f9dbe5393c557b89e81010d68171408e0e4157a3d92ae0ca1c953\\\"],\\\"sizeBytes\\\":513220825},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d601c8437b4d8bbe2da0f3b08f1bd8693f5a4ef6d835377ec029c79d9dca5dab\\\"],\\\"sizeBytes\\\":512273539},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b47d2b146e833bc1612a652136f43afcf1ba30f32cbd0a2f06ca9fc80d969f0\\\"],\\\"sizeBytes\\\":511226810},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:834063dd26fb3d2489e193489198a0d5fbe9c775a0e30173e5fcef6994fbf0f6\\\"],\\\"sizeBytes\\\":511164376},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee46e13e26156c904e5784e2d64511021ed0974a169ccd6476b05bff1c44ec56\\\"],\\\"sizeBytes\\\":508888174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7220d16ea511c0f0410cf45db45aaafcc64847c9cb5732ad1eff39ceb482cdba\\\"],\\\"sizeBytes\\\":508544235},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:526c5c02a8fa86a2fa83a7087d4a5c4b1c4072c0f3906163494cc3b3c1295e9b\\\"],\\\"sizeBytes\\\":507967997},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4010a8f9d932615336227e2fd43325d4fa9025dca4bebe032106efea733fcfc3\\\"],\\\"sizeBytes\\\":506479655},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76b719f5bd541eb1a8bae124d650896b533e7bc3107be536e598b3ab4e135282\\\"],\\\"sizeBytes\\\":506394574},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5de69354d08184ecd6144facc1461777674674e8304971216d4cf1a5025472b9\\\"],\\\"sizeBytes\\\":505344964},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a3
24f47cf789c0480fa4bcb0812152abc3cd844318bab193108fe4349eed609\\\"],\\\"sizeBytes\\\":505242594},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d11f13e867f4df046ca6789bb7273da5d0c08895b3dea00949c8a5458f9e22f9\\\"],\\\"sizeBytes\\\":504623546},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76bdc35338c4d0f5e5b9448fb73e3578656f908a962286692e12a0372ec721d5\\\"],\\\"sizeBytes\\\":495994161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff2db11ce277288befab25ddb86177e832842d2edb5607a2da8f252a030e1cfc\\\"],\\\"sizeBytes\\\":495064829},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2fe5144b1f72bdcf5d5a52130f02ed86fbec3875cc4ac108ead00eaac1659e06\\\"],\\\"sizeBytes\\\":487090672},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4a4c3e6ca0cd26f7eb5270cfafbcf423cf2986d152bf5b9fc6469d40599e104e\\\"],\\\"sizeBytes\\\":484450382},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c54c3f7cffe057ae0bdf26163d5e46744685083ae16fc97112e32beacd2d8955\\\"],\\\"sizeBytes\\\":484175664},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d74fe7cb12c554c120262683d9c4066f33ae4f60a5fad83cba419d851b98c12d\\\"],\\\"sizeBytes\\\":470822665},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e9ee63a30a9b95b5801afa36e09fc583ec2cda3c5cb3c8676e478fea016abfa1\\\"],\\\"sizeBytes\\\":470680779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9b8bc43bac294be3c7669cde049e388ad9d8751242051ba40f83e1c401eceda\\\"],\\\"sizeBytes\\\":468263999},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ebee49810f493f9b566740bd61256fd40b897cc51423f1efa01a02bb57ce177d\\\"],\\\"sizeBytes\\\":467234714},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\\\"],\\\"sizeBytes\\\":465086330},{
\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a85dab5856916220df6f05ce9d6aa10cd4fa0234093b55355246690bba05ad1\\\"],\\\"sizeBytes\\\":463700811},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b714a7ada1e295b599b432f32e1fd5b74c8cdbe6fe51e95306322b25cb873914\\\"],\\\"sizeBytes\\\":458126424},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5230462066ab36e3025524e948dd33fa6f51ee29a4f91fa469bfc268568b5fd9\\\"],\\\"sizeBytes\\\":456575686}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 01:16:53.766034 master-0 kubenswrapper[7110]: E0313 01:16:53.765966 7110 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 13 01:16:54.062042 master-0 kubenswrapper[7110]: I0313 01:16:54.061827 7110 patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-dszg5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: connection refused" start-of-body= Mar 13 01:16:54.062042 master-0 kubenswrapper[7110]: I0313 01:16:54.061918 7110 patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-dszg5 container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: connection refused" start-of-body= Mar 13 01:16:54.062042 master-0 kubenswrapper[7110]: I0313 01:16:54.061923 7110 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" podUID="78d2cd80-23b9-426d-a7ac-1daa27668a47" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: connection refused" Mar 13 01:16:54.062042 master-0 kubenswrapper[7110]: I0313 01:16:54.061969 7110 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" podUID="78d2cd80-23b9-426d-a7ac-1daa27668a47" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: connection refused" Mar 13 01:16:54.234694 master-0 kubenswrapper[7110]: I0313 01:16:54.234620 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-2slj5_3d2e7338-a6d6-4872-ab72-a4e631075ab3/snapshot-controller/0.log" Mar 13 01:16:54.243567 master-0 kubenswrapper[7110]: I0313 01:16:54.243490 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-6598bfb6c4-2wh5w_30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3/manager/0.log" Mar 13 01:16:54.247002 master-0 kubenswrapper[7110]: I0313 01:16:54.246935 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-64488f9d78-bqmmf_d56480e0-0885-41e5-a1fc-931a068fbadb/openshift-config-operator/1.log" Mar 13 01:16:56.103596 master-0 kubenswrapper[7110]: E0313 01:16:56.103329 7110 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{installer-6-master-0.189c41995ef21f05 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:installer-6-master-0,UID:0b3a64f4-e94f-4916-8c91-a255d987735d,APIVersion:v1,ResourceVersion:9580,FieldPath:spec.containers{installer},},Reason:Created,Message:Created container: installer,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:14:40.255860485 +0000 UTC m=+81.540886951,LastTimestamp:2026-03-13 01:14:40.255860485 +0000 UTC m=+81.540886951,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:17:00.965354 master-0 kubenswrapper[7110]: I0313 01:17:00.965304 7110 patch_prober.go:28] interesting pod/authentication-operator-7c6989d6c4-bxqp2 container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.13:8443/healthz\": dial tcp 10.128.0.13:8443: connect: connection refused" start-of-body= Mar 13 01:17:00.965912 master-0 kubenswrapper[7110]: I0313 01:17:00.965360 7110 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" podUID="f9b713fb-64ce-4a01-951c-1f31df62e1ae" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.13:8443/healthz\": dial tcp 10.128.0.13:8443: connect: connection refused" Mar 13 01:17:02.532704 master-0 kubenswrapper[7110]: I0313 01:17:02.532572 7110 patch_prober.go:28] interesting pod/catalogd-controller-manager-7f8b8b6f4c-7fc8j container/manager namespace/openshift-catalogd: Liveness probe status=failure output="Get \"http://10.128.0.41:8081/healthz\": dial tcp 10.128.0.41:8081: connect: connection refused" start-of-body= Mar 13 01:17:02.533868 master-0 kubenswrapper[7110]: I0313 01:17:02.532744 7110 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" 
podUID="2c4c579b-0643-47ac-a729-017c326b0ecc" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.41:8081/healthz\": dial tcp 10.128.0.41:8081: connect: connection refused" Mar 13 01:17:02.533868 master-0 kubenswrapper[7110]: I0313 01:17:02.532577 7110 patch_prober.go:28] interesting pod/catalogd-controller-manager-7f8b8b6f4c-7fc8j container/manager namespace/openshift-catalogd: Readiness probe status=failure output="Get \"http://10.128.0.41:8081/readyz\": dial tcp 10.128.0.41:8081: connect: connection refused" start-of-body= Mar 13 01:17:02.533868 master-0 kubenswrapper[7110]: I0313 01:17:02.532880 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" podUID="2c4c579b-0643-47ac-a729-017c326b0ecc" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.41:8081/readyz\": dial tcp 10.128.0.41:8081: connect: connection refused" Mar 13 01:17:03.365473 master-0 kubenswrapper[7110]: E0313 01:17:03.365372 7110 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 01:17:04.061652 master-0 kubenswrapper[7110]: I0313 01:17:04.061572 7110 patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-dszg5 container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: connection refused" start-of-body= Mar 13 01:17:04.062468 master-0 kubenswrapper[7110]: I0313 01:17:04.061735 7110 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" podUID="78d2cd80-23b9-426d-a7ac-1daa27668a47" containerName="marketplace-operator" probeResult="failure" output="Get 
\"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: connection refused" Mar 13 01:17:04.062677 master-0 kubenswrapper[7110]: I0313 01:17:04.062565 7110 patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-dszg5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: connection refused" start-of-body= Mar 13 01:17:04.062775 master-0 kubenswrapper[7110]: I0313 01:17:04.062701 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" podUID="78d2cd80-23b9-426d-a7ac-1daa27668a47" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: connection refused" Mar 13 01:17:05.967567 master-0 kubenswrapper[7110]: E0313 01:17:05.967500 7110 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 13 01:17:10.767827 master-0 kubenswrapper[7110]: E0313 01:17:10.767742 7110 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 13 01:17:10.965222 master-0 kubenswrapper[7110]: I0313 01:17:10.965149 7110 patch_prober.go:28] interesting pod/authentication-operator-7c6989d6c4-bxqp2 container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.13:8443/healthz\": dial tcp 10.128.0.13:8443: connect: connection refused" start-of-body= Mar 13 01:17:10.965517 master-0 kubenswrapper[7110]: I0313 01:17:10.965233 7110 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" podUID="f9b713fb-64ce-4a01-951c-1f31df62e1ae" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.13:8443/healthz\": dial tcp 10.128.0.13:8443: connect: connection refused" Mar 13 01:17:12.533335 master-0 kubenswrapper[7110]: I0313 01:17:12.533280 7110 patch_prober.go:28] interesting pod/catalogd-controller-manager-7f8b8b6f4c-7fc8j container/manager namespace/openshift-catalogd: Readiness probe status=failure output="Get \"http://10.128.0.41:8081/readyz\": dial tcp 10.128.0.41:8081: connect: connection refused" start-of-body= Mar 13 01:17:12.534360 master-0 kubenswrapper[7110]: I0313 01:17:12.534059 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" podUID="2c4c579b-0643-47ac-a729-017c326b0ecc" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.41:8081/readyz\": dial tcp 10.128.0.41:8081: connect: connection refused" Mar 13 01:17:13.366308 master-0 kubenswrapper[7110]: E0313 01:17:13.366196 7110 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 01:17:14.062558 master-0 kubenswrapper[7110]: I0313 01:17:14.062493 7110 patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-dszg5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: connection refused" start-of-body= Mar 13 01:17:14.063412 master-0 kubenswrapper[7110]: I0313 01:17:14.062591 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" 
podUID="78d2cd80-23b9-426d-a7ac-1daa27668a47" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: connection refused" Mar 13 01:17:14.403717 master-0 kubenswrapper[7110]: I0313 01:17:14.403570 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-47sjr_8c6bf2d5-1881-4b63-b247-7e7426707fa1/cluster-baremetal-operator/0.log" Mar 13 01:17:14.403717 master-0 kubenswrapper[7110]: I0313 01:17:14.403619 7110 generic.go:334] "Generic (PLEG): container finished" podID="8c6bf2d5-1881-4b63-b247-7e7426707fa1" containerID="4f6218cc31120287103af84277d2f84084c2bfa51936dccb9cf930aa141d948d" exitCode=1 Mar 13 01:17:20.444012 master-0 kubenswrapper[7110]: I0313 01:17:20.443944 7110 generic.go:334] "Generic (PLEG): container finished" podID="1308fba1-a50d-48b3-b272-7bef44727b7f" containerID="0743267ed603ff9ac6856c0cefcbee7aeda537c3eb2e7c92d9ca09c61963bcad" exitCode=0 Mar 13 01:17:21.455318 master-0 kubenswrapper[7110]: I0313 01:17:21.455252 7110 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="f90ce908443d9a8a4935d9c526f464812bacbaf55d563a98ad6570bafafa4d36" exitCode=1 Mar 13 01:17:22.533425 master-0 kubenswrapper[7110]: I0313 01:17:22.533333 7110 patch_prober.go:28] interesting pod/catalogd-controller-manager-7f8b8b6f4c-7fc8j container/manager namespace/openshift-catalogd: Readiness probe status=failure output="Get \"http://10.128.0.41:8081/readyz\": dial tcp 10.128.0.41:8081: connect: connection refused" start-of-body= Mar 13 01:17:22.534371 master-0 kubenswrapper[7110]: I0313 01:17:22.533445 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" podUID="2c4c579b-0643-47ac-a729-017c326b0ecc" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.41:8081/readyz\": dial tcp 
10.128.0.41:8081: connect: connection refused" Mar 13 01:17:22.534371 master-0 kubenswrapper[7110]: I0313 01:17:22.533457 7110 patch_prober.go:28] interesting pod/catalogd-controller-manager-7f8b8b6f4c-7fc8j container/manager namespace/openshift-catalogd: Liveness probe status=failure output="Get \"http://10.128.0.41:8081/healthz\": dial tcp 10.128.0.41:8081: connect: connection refused" start-of-body= Mar 13 01:17:22.534371 master-0 kubenswrapper[7110]: I0313 01:17:22.533586 7110 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" podUID="2c4c579b-0643-47ac-a729-017c326b0ecc" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.41:8081/healthz\": dial tcp 10.128.0.41:8081: connect: connection refused" Mar 13 01:17:23.367768 master-0 kubenswrapper[7110]: E0313 01:17:23.367696 7110 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 01:17:23.472375 master-0 kubenswrapper[7110]: I0313 01:17:23.472332 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-2slj5_3d2e7338-a6d6-4872-ab72-a4e631075ab3/snapshot-controller/1.log" Mar 13 01:17:23.473346 master-0 kubenswrapper[7110]: I0313 01:17:23.473285 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-2slj5_3d2e7338-a6d6-4872-ab72-a4e631075ab3/snapshot-controller/0.log" Mar 13 01:17:23.473495 master-0 kubenswrapper[7110]: I0313 01:17:23.473362 7110 generic.go:334] "Generic (PLEG): container finished" podID="3d2e7338-a6d6-4872-ab72-a4e631075ab3" containerID="63acc12abb985dff806c0ddcd9c27446b325df44a94150a2e781b1a28ec52869" exitCode=1 Mar 13 01:17:23.475913 
master-0 kubenswrapper[7110]: I0313 01:17:23.475867 7110 generic.go:334] "Generic (PLEG): container finished" podID="631f5719-2083-4c99-92cb-2ddc04022d86" containerID="06ac9b51646581f1c80a1fe6c8f1464c3cd508ac9e8d30abbb2fa8a9cb8c167d" exitCode=0 Mar 13 01:17:23.703332 master-0 kubenswrapper[7110]: I0313 01:17:23.703219 7110 patch_prober.go:28] interesting pod/controller-manager-757fb68448-cj9p5 container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.128.0.58:8443/healthz\": dial tcp 10.128.0.58:8443: connect: connection refused" start-of-body= Mar 13 01:17:23.704157 master-0 kubenswrapper[7110]: I0313 01:17:23.703330 7110 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" podUID="631f5719-2083-4c99-92cb-2ddc04022d86" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.58:8443/healthz\": dial tcp 10.128.0.58:8443: connect: connection refused" Mar 13 01:17:23.704157 master-0 kubenswrapper[7110]: I0313 01:17:23.703346 7110 patch_prober.go:28] interesting pod/controller-manager-757fb68448-cj9p5 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.58:8443/healthz\": dial tcp 10.128.0.58:8443: connect: connection refused" start-of-body= Mar 13 01:17:23.704157 master-0 kubenswrapper[7110]: I0313 01:17:23.703434 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" podUID="631f5719-2083-4c99-92cb-2ddc04022d86" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.58:8443/healthz\": dial tcp 10.128.0.58:8443: connect: connection refused" Mar 13 01:17:24.061983 master-0 kubenswrapper[7110]: I0313 01:17:24.061814 7110 patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-dszg5 container/marketplace-operator 
namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: connection refused" start-of-body= Mar 13 01:17:24.061983 master-0 kubenswrapper[7110]: I0313 01:17:24.061895 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" podUID="78d2cd80-23b9-426d-a7ac-1daa27668a47" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: connection refused" Mar 13 01:17:26.975026 master-0 kubenswrapper[7110]: E0313 01:17:26.974909 7110 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Mar 13 01:17:26.976284 master-0 kubenswrapper[7110]: E0313 01:17:26.975182 7110 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.019s" Mar 13 01:17:26.976284 master-0 kubenswrapper[7110]: I0313 01:17:26.975221 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" event={"ID":"d56480e0-0885-41e5-a1fc-931a068fbadb","Type":"ContainerDied","Data":"61dfbb39eadee51c94c1da7b2c82616d9472cbfa81dbb07d96ebc8cbcec88cf7"} Mar 13 01:17:26.976284 master-0 kubenswrapper[7110]: I0313 01:17:26.975332 7110 scope.go:117] "RemoveContainer" containerID="61dfbb39eadee51c94c1da7b2c82616d9472cbfa81dbb07d96ebc8cbcec88cf7" Mar 13 01:17:26.980336 master-0 kubenswrapper[7110]: I0313 01:17:26.979795 7110 scope.go:117] "RemoveContainer" containerID="123227ec95c803688ea3b73355951521f6c2e9af5b64fd2cfda9372aa3bb75d9" Mar 13 01:17:26.980336 master-0 kubenswrapper[7110]: I0313 01:17:26.980116 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator 
namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:17:26.980336 master-0 kubenswrapper[7110]: I0313 01:17:26.980255 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:17:26.992333 master-0 kubenswrapper[7110]: I0313 01:17:26.991587 7110 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Mar 13 01:17:27.508747 master-0 kubenswrapper[7110]: I0313 01:17:27.508571 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7f8b8b6f4c-7fc8j_2c4c579b-0643-47ac-a729-017c326b0ecc/manager/0.log" Mar 13 01:17:27.511972 master-0 kubenswrapper[7110]: I0313 01:17:27.511916 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-64488f9d78-bqmmf_d56480e0-0885-41e5-a1fc-931a068fbadb/openshift-config-operator/1.log" Mar 13 01:17:27.769090 master-0 kubenswrapper[7110]: E0313 01:17:27.768759 7110 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 13 01:17:28.768494 master-0 kubenswrapper[7110]: I0313 01:17:28.768399 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get 
\"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:17:28.769366 master-0 kubenswrapper[7110]: I0313 01:17:28.768517 7110 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:17:29.768611 master-0 kubenswrapper[7110]: I0313 01:17:29.768526 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:17:29.768611 master-0 kubenswrapper[7110]: I0313 01:17:29.768608 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:17:30.107423 master-0 kubenswrapper[7110]: E0313 01:17:30.107081 7110 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{installer-6-master-0.189c4199605ad469 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:installer-6-master-0,UID:0b3a64f4-e94f-4916-8c91-a255d987735d,APIVersion:v1,ResourceVersion:9580,FieldPath:spec.containers{installer},},Reason:Started,Message:Started container 
installer,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:14:40.279499881 +0000 UTC m=+81.564526347,LastTimestamp:2026-03-13 01:14:40.279499881 +0000 UTC m=+81.564526347,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:17:31.768395 master-0 kubenswrapper[7110]: I0313 01:17:31.768294 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:17:31.768395 master-0 kubenswrapper[7110]: I0313 01:17:31.768374 7110 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:17:32.768729 master-0 kubenswrapper[7110]: I0313 01:17:32.768600 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:17:32.769706 master-0 kubenswrapper[7110]: I0313 01:17:32.768729 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:17:33.368834 master-0 
kubenswrapper[7110]: E0313 01:17:33.368730 7110 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 01:17:33.368834 master-0 kubenswrapper[7110]: E0313 01:17:33.368780 7110 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 01:17:33.703400 master-0 kubenswrapper[7110]: I0313 01:17:33.703322 7110 patch_prober.go:28] interesting pod/controller-manager-757fb68448-cj9p5 container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.128.0.58:8443/healthz\": dial tcp 10.128.0.58:8443: connect: connection refused" start-of-body= Mar 13 01:17:33.703677 master-0 kubenswrapper[7110]: I0313 01:17:33.703409 7110 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" podUID="631f5719-2083-4c99-92cb-2ddc04022d86" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.58:8443/healthz\": dial tcp 10.128.0.58:8443: connect: connection refused" Mar 13 01:17:33.703677 master-0 kubenswrapper[7110]: I0313 01:17:33.703481 7110 patch_prober.go:28] interesting pod/controller-manager-757fb68448-cj9p5 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.58:8443/healthz\": dial tcp 10.128.0.58:8443: connect: connection refused" start-of-body= Mar 13 01:17:33.703836 master-0 kubenswrapper[7110]: I0313 01:17:33.703613 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" podUID="631f5719-2083-4c99-92cb-2ddc04022d86" containerName="controller-manager" probeResult="failure" output="Get 
\"https://10.128.0.58:8443/healthz\": dial tcp 10.128.0.58:8443: connect: connection refused" Mar 13 01:17:34.061604 master-0 kubenswrapper[7110]: I0313 01:17:34.061417 7110 patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-dszg5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: connection refused" start-of-body= Mar 13 01:17:34.061604 master-0 kubenswrapper[7110]: I0313 01:17:34.061513 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" podUID="78d2cd80-23b9-426d-a7ac-1daa27668a47" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.12:8080/healthz\": dial tcp 10.128.0.12:8080: connect: connection refused" Mar 13 01:17:34.768457 master-0 kubenswrapper[7110]: I0313 01:17:34.768360 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:17:34.768457 master-0 kubenswrapper[7110]: I0313 01:17:34.768449 7110 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:17:35.768182 master-0 kubenswrapper[7110]: I0313 01:17:35.768068 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 
10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:17:35.768182 master-0 kubenswrapper[7110]: I0313 01:17:35.768162 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:17:38.341921 master-0 kubenswrapper[7110]: W0313 01:17:38.341843 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd278ed70_786c_4b6c_9f04_f08ede704569.slice/crio-9fa79744aaaa4051964ccef7d38d65ee7510ee60bd1d8d59df2b9df1c6c707da WatchSource:0}: Error finding container 9fa79744aaaa4051964ccef7d38d65ee7510ee60bd1d8d59df2b9df1c6c707da: Status 404 returned error can't find the container with id 9fa79744aaaa4051964ccef7d38d65ee7510ee60bd1d8d59df2b9df1c6c707da Mar 13 01:17:38.352617 master-0 kubenswrapper[7110]: E0313 01:17:38.352555 7110 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="11.377s" Mar 13 01:17:38.352809 master-0 kubenswrapper[7110]: I0313 01:17:38.352626 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/etcd-master-0-master-0"] Mar 13 01:17:38.352809 master-0 kubenswrapper[7110]: I0313 01:17:38.352702 7110 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-etcd/etcd-master-0-master-0" mirrorPodUID="1ccd6ccc-58cf-4081-9031-8ebcc9e5cb31" Mar 13 01:17:38.352809 master-0 kubenswrapper[7110]: I0313 01:17:38.352738 7110 status_manager.go:317] "Container readiness changed for unknown container" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" containerID="cri-o://4c1d61286014ac7f71bd88f3fc7c396a7af1471d4d9a420f8c93c8833b3c7efd" Mar 13 01:17:38.352809 
master-0 kubenswrapper[7110]: I0313 01:17:38.352755 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" Mar 13 01:17:38.352809 master-0 kubenswrapper[7110]: I0313 01:17:38.352783 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" Mar 13 01:17:38.355893 master-0 kubenswrapper[7110]: I0313 01:17:38.355852 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:17:38.356173 master-0 kubenswrapper[7110]: I0313 01:17:38.356138 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 01:17:38.356303 master-0 kubenswrapper[7110]: I0313 01:17:38.356240 7110 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="openshift-config-operator" containerStatusID={"Type":"cri-o","ID":"09062ce5a8bfb21fa147439adb4aa83615da8c3397079fc199132224f09f0600"} pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" containerMessage="Container openshift-config-operator failed liveness probe, will be restarted" Mar 13 01:17:38.356372 master-0 kubenswrapper[7110]: I0313 01:17:38.356340 7110 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" 
containerName="openshift-config-operator" containerID="cri-o://09062ce5a8bfb21fa147439adb4aa83615da8c3397079fc199132224f09f0600" gracePeriod=30 Mar 13 01:17:38.356625 master-0 kubenswrapper[7110]: I0313 01:17:38.356606 7110 scope.go:117] "RemoveContainer" containerID="a7777dcfd1f119b7ebee2758548cf5f87c3fa74673968a6a77a0056cee905b05" Mar 13 01:17:38.356835 master-0 kubenswrapper[7110]: I0313 01:17:38.356793 7110 scope.go:117] "RemoveContainer" containerID="f90ce908443d9a8a4935d9c526f464812bacbaf55d563a98ad6570bafafa4d36" Mar 13 01:17:38.357554 master-0 kubenswrapper[7110]: E0313 01:17:38.357519 7110 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 13 01:17:38.362290 master-0 kubenswrapper[7110]: I0313 01:17:38.362185 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-6-master-0" podStartSLOduration=181.362161329 podStartE2EDuration="3m1.362161329s" podCreationTimestamp="2026-03-13 01:14:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:17:38.351148879 +0000 UTC m=+259.636175395" watchObservedRunningTime="2026-03-13 01:17:38.362161329 +0000 UTC m=+259.647187805" Mar 13 01:17:38.370409 master-0 kubenswrapper[7110]: I0313 01:17:38.370374 7110 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Mar 13 01:17:38.380986 master-0 kubenswrapper[7110]: I0313 01:17:38.380891 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb"] Mar 13 
01:17:38.380986 master-0 kubenswrapper[7110]: I0313 01:17:38.380942 7110 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-etcd/etcd-master-0-master-0"] Mar 13 01:17:38.380986 master-0 kubenswrapper[7110]: I0313 01:17:38.380954 7110 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-etcd/etcd-master-0-master-0" mirrorPodUID="1ccd6ccc-58cf-4081-9031-8ebcc9e5cb31" Mar 13 01:17:38.380986 master-0 kubenswrapper[7110]: I0313 01:17:38.380977 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" Mar 13 01:17:38.381168 master-0 kubenswrapper[7110]: I0313 01:17:38.380993 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" event={"ID":"d56480e0-0885-41e5-a1fc-931a068fbadb","Type":"ContainerStarted","Data":"4c1d61286014ac7f71bd88f3fc7c396a7af1471d4d9a420f8c93c8833b3c7efd"} Mar 13 01:17:38.381168 master-0 kubenswrapper[7110]: I0313 01:17:38.381014 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:17:38.381168 master-0 kubenswrapper[7110]: I0313 01:17:38.381032 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g"] Mar 13 01:17:38.381168 master-0 kubenswrapper[7110]: I0313 01:17:38.381059 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-8f89dfddd-6k2t7"] Mar 13 01:17:38.381168 master-0 kubenswrapper[7110]: I0313 01:17:38.381084 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" Mar 13 01:17:38.381168 master-0 kubenswrapper[7110]: I0313 01:17:38.381103 7110 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:17:38.381168 master-0 kubenswrapper[7110]: I0313 01:17:38.381120 7110 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" Mar 13 01:17:38.381168 master-0 kubenswrapper[7110]: I0313 01:17:38.381130 7110 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" Mar 13 01:17:38.381168 master-0 kubenswrapper[7110]: I0313 01:17:38.381143 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"29e096ea-ca9d-477b-b0aa-1d10244d51d9","Type":"ContainerDied","Data":"167e9a0418be9c64d38402cc471015911f91f7d101628f86049fb49485d8495a"} Mar 13 01:17:38.381168 master-0 kubenswrapper[7110]: I0313 01:17:38.381160 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-2v42g" event={"ID":"95d4e785-6663-417d-b380-6905773613c8","Type":"ContainerDied","Data":"4f1d391f9ccf9712ce599023f3ef26e7463c6ad87dcaaba9b59f13a56ea3cd24"} Mar 13 01:17:38.381168 master-0 kubenswrapper[7110]: I0313 01:17:38.381177 7110 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj" Mar 13 01:17:38.381574 master-0 kubenswrapper[7110]: I0313 01:17:38.381189 7110 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" Mar 13 01:17:38.381574 master-0 kubenswrapper[7110]: I0313 01:17:38.381200 7110 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" Mar 13 01:17:38.381574 master-0 kubenswrapper[7110]: I0313 01:17:38.381211 7110 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" Mar 13 01:17:38.381574 master-0 kubenswrapper[7110]: I0313 01:17:38.381221 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"47806631-9d60-4658-832d-f160f93f42ea","Type":"ContainerDied","Data":"71b98806c78a21853872bf216fdc04280da7bf4d8777bb06b2a922047a6a9e8c"} Mar 13 01:17:38.381574 master-0 kubenswrapper[7110]: I0313 01:17:38.381234 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-v9nfg" event={"ID":"916d9fc9-388b-4506-a17c-36a7f626356a","Type":"ContainerDied","Data":"a41bcaf653995a95790b4be685f8a8f91dff8546aa69d956c2d939af740c0286"} Mar 13 01:17:38.381574 master-0 kubenswrapper[7110]: I0313 01:17:38.381247 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerDied","Data":"b828b9875ee5a4c86417590e983cd65cc09701fdf10c008a5d18f75844d13abc"} Mar 13 01:17:38.381574 master-0 kubenswrapper[7110]: I0313 01:17:38.381262 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-c2xl8" event={"ID":"ca2fa86b-a966-49dc-8577-d2b54b111d14","Type":"ContainerDied","Data":"18882e60a1d7cca045d564f7abc68da51216b8e9104fca3062ca7eec99d17c5e"} Mar 13 01:17:38.381574 master-0 kubenswrapper[7110]: I0313 01:17:38.381275 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" event={"ID":"f9b713fb-64ce-4a01-951c-1f31df62e1ae","Type":"ContainerDied","Data":"c60b1887494d08fb5df2490e135e1d701bdbe7b6a6e136c3d75f17211fbf551b"} Mar 13 01:17:38.381574 master-0 kubenswrapper[7110]: I0313 01:17:38.381288 7110 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-wbmqn" event={"ID":"22587300-2448-4862-9fd8-68197d17a9f2","Type":"ContainerDied","Data":"0e6fdad2e1926f784b1c498cd01186eeb32850cc4a0f69925bc0668ef060c2a8"} Mar 13 01:17:38.381574 master-0 kubenswrapper[7110]: I0313 01:17:38.381312 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-mvmt2" event={"ID":"486c7e33-3dd8-4a98-87e3-8216ee2e05ef","Type":"ContainerDied","Data":"451217a595b413aec4246a9b014bde1e3a621bb8bc794b9a2470a8f43c1c8d3b"} Mar 13 01:17:38.381574 master-0 kubenswrapper[7110]: I0313 01:17:38.381338 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-znqwc" event={"ID":"d1153bb3-30dd-458f-b0a4-c05358a8b3f8","Type":"ContainerDied","Data":"8a645f3b183df337e2bb471f99ef18a88ef8e03f78dccc126ecc0415b40abdab"} Mar 13 01:17:38.381574 master-0 kubenswrapper[7110]: I0313 01:17:38.381356 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj" event={"ID":"21cbea73-f779-43e4-b5ba-d6fa06275d34","Type":"ContainerDied","Data":"d3c0f89339f815b6350fac22cc030760b6b90e8219eb2eb8f1fd3b1e19f0b649"} Mar 13 01:17:38.381574 master-0 kubenswrapper[7110]: I0313 01:17:38.381378 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-ck7rt" event={"ID":"61fb4b86-f978-4ae1-80bc-18d2f386cbc2","Type":"ContainerDied","Data":"85752463126f89fa0e5e1418516974da87fce8b92150573ae7e0d2915937dc43"} Mar 13 01:17:38.381574 master-0 kubenswrapper[7110]: I0313 01:17:38.381395 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-znqwc" event={"ID":"d1153bb3-30dd-458f-b0a4-c05358a8b3f8","Type":"ContainerStarted","Data":"99dc9aacbe53f2463fbb1d6c45782c44f72e7b13c67642bb7d0b4839b16638fe"} Mar 13 
01:17:38.381574 master-0 kubenswrapper[7110]: I0313 01:17:38.381410 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerDied","Data":"029d789d642119db799ad9f74a4764d435ebea90c5fcd58eb9670acf9f27e006"} Mar 13 01:17:38.381574 master-0 kubenswrapper[7110]: I0313 01:17:38.381430 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l" event={"ID":"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad","Type":"ContainerDied","Data":"bb00bc21a2b9f11b41d1750186297e3a5ca651c2efe8531d5b69fd560b0ba268"} Mar 13 01:17:38.381574 master-0 kubenswrapper[7110]: I0313 01:17:38.381448 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj" event={"ID":"21cbea73-f779-43e4-b5ba-d6fa06275d34","Type":"ContainerStarted","Data":"62d2b92f5805220707a2be14d18659481c24419c7f112e9e794398a7182f05dd"} Mar 13 01:17:38.381574 master-0 kubenswrapper[7110]: I0313 01:17:38.381462 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-v9nfg" event={"ID":"916d9fc9-388b-4506-a17c-36a7f626356a","Type":"ContainerStarted","Data":"4f15414965bd82c8ca70ef81dc14cd39c996705ef15d04f56d1a5564c3ede50b"} Mar 13 01:17:38.381574 master-0 kubenswrapper[7110]: I0313 01:17:38.381476 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"f90ce908443d9a8a4935d9c526f464812bacbaf55d563a98ad6570bafafa4d36"} Mar 13 01:17:38.381574 master-0 kubenswrapper[7110]: I0313 01:17:38.381492 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-2v42g" 
event={"ID":"95d4e785-6663-417d-b380-6905773613c8","Type":"ContainerStarted","Data":"bc64b6329645b722c1cc45a7bfce3843288d247124cc2d19dde983135ddcc23b"} Mar 13 01:17:38.381574 master-0 kubenswrapper[7110]: I0313 01:17:38.381505 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"29e096ea-ca9d-477b-b0aa-1d10244d51d9","Type":"ContainerDied","Data":"6b3d1f96b7eda0842ce0b60c494ed28d5b1988f57c59ae6dc2d45944467711cc"} Mar 13 01:17:38.381574 master-0 kubenswrapper[7110]: I0313 01:17:38.381522 7110 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b3d1f96b7eda0842ce0b60c494ed28d5b1988f57c59ae6dc2d45944467711cc" Mar 13 01:17:38.381574 master-0 kubenswrapper[7110]: I0313 01:17:38.381538 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-c2xl8" event={"ID":"ca2fa86b-a966-49dc-8577-d2b54b111d14","Type":"ContainerStarted","Data":"eeb920b84acc3688525f08752f9228e88cce15d298099682713138fc0275698d"} Mar 13 01:17:38.381574 master-0 kubenswrapper[7110]: I0313 01:17:38.381552 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-wbmqn" event={"ID":"22587300-2448-4862-9fd8-68197d17a9f2","Type":"ContainerStarted","Data":"291d7f234c91da53973dfeea878525a20e6b8e9491a00bb5aa5e9bac339b437f"} Mar 13 01:17:38.381574 master-0 kubenswrapper[7110]: I0313 01:17:38.381565 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-mvmt2" event={"ID":"486c7e33-3dd8-4a98-87e3-8216ee2e05ef","Type":"ContainerStarted","Data":"3386a3e179b760a480294045f0ff532d50506e45395937cbdf6059ad9ea50ed9"} Mar 13 01:17:38.381574 master-0 kubenswrapper[7110]: I0313 01:17:38.381579 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"47806631-9d60-4658-832d-f160f93f42ea","Type":"ContainerDied","Data":"13bf43dd31255e64913f1edd8b9b049ea7f9baf74595bd3516213a0e530b536c"} Mar 13 01:17:38.381574 master-0 kubenswrapper[7110]: I0313 01:17:38.381592 7110 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13bf43dd31255e64913f1edd8b9b049ea7f9baf74595bd3516213a0e530b536c" Mar 13 01:17:38.381574 master-0 kubenswrapper[7110]: I0313 01:17:38.381604 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" event={"ID":"f9b713fb-64ce-4a01-951c-1f31df62e1ae","Type":"ContainerStarted","Data":"70de036e29496f4383b5254fadb7f8de5e3a8b0ccfbf1cbf424f334a6aaf519d"} Mar 13 01:17:38.382702 master-0 kubenswrapper[7110]: I0313 01:17:38.381619 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-2slj5" event={"ID":"3d2e7338-a6d6-4872-ab72-a4e631075ab3","Type":"ContainerDied","Data":"e6af91761c3978d4ee9c9c1a1feccd940ab86be07c53b40abdaf52c178494ea6"} Mar 13 01:17:38.382702 master-0 kubenswrapper[7110]: I0313 01:17:38.381664 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-6-master-0" event={"ID":"0b3a64f4-e94f-4916-8c91-a255d987735d","Type":"ContainerDied","Data":"6b91817f5b3f7a0651a092d44f47916346942c3944860ca84cb9f688537c7ce3"} Mar 13 01:17:38.382702 master-0 kubenswrapper[7110]: I0313 01:17:38.381687 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" event={"ID":"78d2cd80-23b9-426d-a7ac-1daa27668a47","Type":"ContainerDied","Data":"a7777dcfd1f119b7ebee2758548cf5f87c3fa74673968a6a77a0056cee905b05"} Mar 13 01:17:38.382702 master-0 kubenswrapper[7110]: I0313 01:17:38.381702 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" event={"ID":"30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3","Type":"ContainerDied","Data":"5c71af40ce92aac85f827794e9d207c4ed4bc300599f6601dc56da9d0a8b0d8b"} Mar 13 01:17:38.382702 master-0 kubenswrapper[7110]: I0313 01:17:38.381720 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" event={"ID":"2c4c579b-0643-47ac-a729-017c326b0ecc","Type":"ContainerDied","Data":"123227ec95c803688ea3b73355951521f6c2e9af5b64fd2cfda9372aa3bb75d9"} Mar 13 01:17:38.382702 master-0 kubenswrapper[7110]: I0313 01:17:38.381739 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" event={"ID":"d56480e0-0885-41e5-a1fc-931a068fbadb","Type":"ContainerDied","Data":"4c1d61286014ac7f71bd88f3fc7c396a7af1471d4d9a420f8c93c8833b3c7efd"} Mar 13 01:17:38.382702 master-0 kubenswrapper[7110]: I0313 01:17:38.381771 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l" event={"ID":"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad","Type":"ContainerStarted","Data":"71f9d1850bad5c508f54de1c0bfe33c2b618025214e8583b970eda19de8409dc"} Mar 13 01:17:38.382702 master-0 kubenswrapper[7110]: I0313 01:17:38.381788 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-2slj5" event={"ID":"3d2e7338-a6d6-4872-ab72-a4e631075ab3","Type":"ContainerStarted","Data":"63acc12abb985dff806c0ddcd9c27446b325df44a94150a2e781b1a28ec52869"} Mar 13 01:17:38.382702 master-0 kubenswrapper[7110]: I0313 01:17:38.381805 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-ck7rt" 
event={"ID":"61fb4b86-f978-4ae1-80bc-18d2f386cbc2","Type":"ContainerStarted","Data":"d3bb469eb3f63fc3e5e3d196d0736a8219372d03c2b49b9f26be6e3281573d4c"} Mar 13 01:17:38.382702 master-0 kubenswrapper[7110]: I0313 01:17:38.382566 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" event={"ID":"30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3","Type":"ContainerStarted","Data":"7ef67c4dbd8426a3e3af7aa349a5cbaefed2fde80e4c7f48ba81fe002ea31f34"} Mar 13 01:17:38.382702 master-0 kubenswrapper[7110]: I0313 01:17:38.382592 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" event={"ID":"d56480e0-0885-41e5-a1fc-931a068fbadb","Type":"ContainerStarted","Data":"09062ce5a8bfb21fa147439adb4aa83615da8c3397079fc199132224f09f0600"} Mar 13 01:17:38.382702 master-0 kubenswrapper[7110]: I0313 01:17:38.382605 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"094f4a86fe59b920ac10166d9e8ac01032d74617f321831a2a2f931f8b6e5719"} Mar 13 01:17:38.382702 master-0 kubenswrapper[7110]: I0313 01:17:38.382618 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"5686b7e065921e4a505694fd8d39d3cf4e50087c1b65d84bd736ed81d6e1a8bd"} Mar 13 01:17:38.382702 master-0 kubenswrapper[7110]: I0313 01:17:38.382685 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"74e7eddcdd11f244988198476d048b74bb9bbeb97aa14b3f52b59d34eb6ac111"} Mar 13 01:17:38.382702 master-0 kubenswrapper[7110]: I0313 01:17:38.382698 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" 
event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"102131ace48d9559e7bd10df51cf1337602d058abc022e98bf79c9ff37b7ad8e"} Mar 13 01:17:38.382702 master-0 kubenswrapper[7110]: I0313 01:17:38.382710 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"945467a15df4a8d4dc173e105fd4e0e21428d428ca18f15c847be9161b3b730b"} Mar 13 01:17:38.392356 master-0 kubenswrapper[7110]: I0313 01:17:38.392304 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" event={"ID":"8c6bf2d5-1881-4b63-b247-7e7426707fa1","Type":"ContainerDied","Data":"4f6218cc31120287103af84277d2f84084c2bfa51936dccb9cf930aa141d948d"} Mar 13 01:17:38.392454 master-0 kubenswrapper[7110]: I0313 01:17:38.383958 7110 scope.go:117] "RemoveContainer" containerID="029d789d642119db799ad9f74a4764d435ebea90c5fcd58eb9670acf9f27e006" Mar 13 01:17:38.392454 master-0 kubenswrapper[7110]: I0313 01:17:38.392402 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd" event={"ID":"1308fba1-a50d-48b3-b272-7bef44727b7f","Type":"ContainerDied","Data":"0743267ed603ff9ac6856c0cefcbee7aeda537c3eb2e7c92d9ca09c61963bcad"} Mar 13 01:17:38.392454 master-0 kubenswrapper[7110]: I0313 01:17:38.387130 7110 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="authentication-operator" containerStatusID={"Type":"cri-o","ID":"70de036e29496f4383b5254fadb7f8de5e3a8b0ccfbf1cbf424f334a6aaf519d"} pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" containerMessage="Container authentication-operator failed liveness probe, will be restarted" Mar 13 01:17:38.392652 master-0 kubenswrapper[7110]: I0313 01:17:38.392486 7110 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" podUID="f9b713fb-64ce-4a01-951c-1f31df62e1ae" containerName="authentication-operator" containerID="cri-o://70de036e29496f4383b5254fadb7f8de5e3a8b0ccfbf1cbf424f334a6aaf519d" gracePeriod=30 Mar 13 01:17:38.392652 master-0 kubenswrapper[7110]: I0313 01:17:38.392430 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerDied","Data":"f90ce908443d9a8a4935d9c526f464812bacbaf55d563a98ad6570bafafa4d36"} Mar 13 01:17:38.392652 master-0 kubenswrapper[7110]: I0313 01:17:38.392617 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-2slj5" event={"ID":"3d2e7338-a6d6-4872-ab72-a4e631075ab3","Type":"ContainerDied","Data":"63acc12abb985dff806c0ddcd9c27446b325df44a94150a2e781b1a28ec52869"} Mar 13 01:17:38.392848 master-0 kubenswrapper[7110]: I0313 01:17:38.392661 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" event={"ID":"631f5719-2083-4c99-92cb-2ddc04022d86","Type":"ContainerDied","Data":"06ac9b51646581f1c80a1fe6c8f1464c3cd508ac9e8d30abbb2fa8a9cb8c167d"} Mar 13 01:17:38.392848 master-0 kubenswrapper[7110]: I0313 01:17:38.392683 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" event={"ID":"2c4c579b-0643-47ac-a729-017c326b0ecc","Type":"ContainerStarted","Data":"dbffdb32298050e3d786bea05b0e0e1b7922cd3d84a8dd8e9be8f2f907195c49"} Mar 13 01:17:38.393016 master-0 kubenswrapper[7110]: W0313 01:17:38.387148 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd23bbaec_b635_4649_b26e_2829f32d21f0.slice/crio-2b8d1e7059727645025fa018d52de63bdf9a901809577c6a277333b99385dad9 
WatchSource:0}: Error finding container 2b8d1e7059727645025fa018d52de63bdf9a901809577c6a277333b99385dad9: Status 404 returned error can't find the container with id 2b8d1e7059727645025fa018d52de63bdf9a901809577c6a277333b99385dad9 Mar 13 01:17:38.393117 master-0 kubenswrapper[7110]: I0313 01:17:38.393022 7110 scope.go:117] "RemoveContainer" containerID="06ac9b51646581f1c80a1fe6c8f1464c3cd508ac9e8d30abbb2fa8a9cb8c167d" Mar 13 01:17:38.393579 master-0 kubenswrapper[7110]: I0313 01:17:38.393564 7110 scope.go:117] "RemoveContainer" containerID="0743267ed603ff9ac6856c0cefcbee7aeda537c3eb2e7c92d9ca09c61963bcad" Mar 13 01:17:38.393950 master-0 kubenswrapper[7110]: I0313 01:17:38.393933 7110 scope.go:117] "RemoveContainer" containerID="63acc12abb985dff806c0ddcd9c27446b325df44a94150a2e781b1a28ec52869" Mar 13 01:17:38.394302 master-0 kubenswrapper[7110]: I0313 01:17:38.394288 7110 scope.go:117] "RemoveContainer" containerID="4f6218cc31120287103af84277d2f84084c2bfa51936dccb9cf930aa141d948d" Mar 13 01:17:38.394968 master-0 kubenswrapper[7110]: I0313 01:17:38.394933 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cpp59"] Mar 13 01:17:38.399233 master-0 kubenswrapper[7110]: I0313 01:17:38.399178 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9zvz2"] Mar 13 01:17:38.399978 master-0 kubenswrapper[7110]: I0313 01:17:38.399932 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qslvf" podStartSLOduration=164.309002014 podStartE2EDuration="3m3.399923518s" podCreationTimestamp="2026-03-13 01:14:35 +0000 UTC" firstStartedPulling="2026-03-13 01:14:36.499994031 +0000 UTC m=+77.785020497" lastFinishedPulling="2026-03-13 01:14:55.590915505 +0000 UTC m=+96.875942001" observedRunningTime="2026-03-13 01:17:38.395065526 +0000 UTC m=+259.680092022" watchObservedRunningTime="2026-03-13 
01:17:38.399923518 +0000 UTC m=+259.684949984" Mar 13 01:17:38.411801 master-0 kubenswrapper[7110]: I0313 01:17:38.411755 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0"] Mar 13 01:17:38.433049 master-0 kubenswrapper[7110]: I0313 01:17:38.432398 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-29mns" podStartSLOduration=156.645980531 podStartE2EDuration="3m2.432375539s" podCreationTimestamp="2026-03-13 01:14:36 +0000 UTC" firstStartedPulling="2026-03-13 01:14:38.09218958 +0000 UTC m=+79.377216036" lastFinishedPulling="2026-03-13 01:15:03.878584538 +0000 UTC m=+105.163611044" observedRunningTime="2026-03-13 01:17:38.425319518 +0000 UTC m=+259.710345994" watchObservedRunningTime="2026-03-13 01:17:38.432375539 +0000 UTC m=+259.717402005" Mar 13 01:17:38.475261 master-0 kubenswrapper[7110]: I0313 01:17:38.474872 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-w6qs7" podStartSLOduration=183.166750818 podStartE2EDuration="3m7.474855621s" podCreationTimestamp="2026-03-13 01:14:31 +0000 UTC" firstStartedPulling="2026-03-13 01:14:35.522341784 +0000 UTC m=+76.807368250" lastFinishedPulling="2026-03-13 01:14:39.830446587 +0000 UTC m=+81.115473053" observedRunningTime="2026-03-13 01:17:38.447749187 +0000 UTC m=+259.732775673" watchObservedRunningTime="2026-03-13 01:17:38.474855621 +0000 UTC m=+259.759882087" Mar 13 01:17:38.497900 master-0 kubenswrapper[7110]: I0313 01:17:38.492252 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jvdz8" podStartSLOduration=180.105483875 podStartE2EDuration="3m4.492223882s" podCreationTimestamp="2026-03-13 01:14:34 +0000 UTC" firstStartedPulling="2026-03-13 01:14:35.429648912 +0000 UTC m=+76.714675378" lastFinishedPulling="2026-03-13 01:14:39.816388919 
+0000 UTC m=+81.101415385" observedRunningTime="2026-03-13 01:17:38.487860183 +0000 UTC m=+259.772886659" watchObservedRunningTime="2026-03-13 01:17:38.492223882 +0000 UTC m=+259.777250348" Mar 13 01:17:38.501014 master-0 kubenswrapper[7110]: I0313 01:17:38.500910 7110 scope.go:117] "RemoveContainer" containerID="578253e69f6e09bf433efeaf51fe4123dcb489a1bcfe88e292f1fbad219b25ee" Mar 13 01:17:38.508187 master-0 kubenswrapper[7110]: I0313 01:17:38.508126 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-2xfpz" podStartSLOduration=178.497110527 podStartE2EDuration="3m2.508108352s" podCreationTimestamp="2026-03-13 01:14:36 +0000 UTC" firstStartedPulling="2026-03-13 01:14:37.933673558 +0000 UTC m=+79.218700014" lastFinishedPulling="2026-03-13 01:14:41.944671373 +0000 UTC m=+83.229697839" observedRunningTime="2026-03-13 01:17:38.506583825 +0000 UTC m=+259.791610291" watchObservedRunningTime="2026-03-13 01:17:38.508108352 +0000 UTC m=+259.793134818" Mar 13 01:17:38.530139 master-0 kubenswrapper[7110]: I0313 01:17:38.530071 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lb5rz" podStartSLOduration=156.649600161 podStartE2EDuration="3m2.530053027s" podCreationTimestamp="2026-03-13 01:14:36 +0000 UTC" firstStartedPulling="2026-03-13 01:14:38.055676659 +0000 UTC m=+79.340703125" lastFinishedPulling="2026-03-13 01:15:03.936129515 +0000 UTC m=+105.221155991" observedRunningTime="2026-03-13 01:17:38.526862212 +0000 UTC m=+259.811888688" watchObservedRunningTime="2026-03-13 01:17:38.530053027 +0000 UTC m=+259.815079493" Mar 13 01:17:38.567439 master-0 kubenswrapper[7110]: I0313 01:17:38.567293 7110 scope.go:117] "RemoveContainer" containerID="e6af91761c3978d4ee9c9c1a1feccd940ab86be07c53b40abdaf52c178494ea6" Mar 13 01:17:38.597851 master-0 kubenswrapper[7110]: I0313 01:17:38.597792 7110 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"] Mar 13 01:17:38.606490 master-0 kubenswrapper[7110]: I0313 01:17:38.606226 7110 scope.go:117] "RemoveContainer" containerID="f90ce908443d9a8a4935d9c526f464812bacbaf55d563a98ad6570bafafa4d36" Mar 13 01:17:38.606490 master-0 kubenswrapper[7110]: E0313 01:17:38.606411 7110 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 13 01:17:38.608591 master-0 kubenswrapper[7110]: I0313 01:17:38.608067 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9zvz2" event={"ID":"d23bbaec-b635-4649-b26e-2829f32d21f0","Type":"ContainerStarted","Data":"2b8d1e7059727645025fa018d52de63bdf9a901809577c6a277333b99385dad9"} Mar 13 01:17:38.610596 master-0 kubenswrapper[7110]: I0313 01:17:38.608865 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" event={"ID":"2ce47660-f7cc-4669-a00d-83422f0f6d55","Type":"ContainerStarted","Data":"50afa0bafdfdadd430cb50b2aa81b0c11200da9c802e7cb966b1902e4941db5a"} Mar 13 01:17:38.610596 master-0 kubenswrapper[7110]: I0313 01:17:38.609404 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb" event={"ID":"d278ed70-786c-4b6c-9f04-f08ede704569","Type":"ContainerStarted","Data":"9fa79744aaaa4051964ccef7d38d65ee7510ee60bd1d8d59df2b9df1c6c707da"} Mar 13 01:17:38.610596 master-0 kubenswrapper[7110]: I0313 01:17:38.610060 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" 
event={"ID":"77fd9062-0f7d-4255-92ca-7e4325daeddd","Type":"ContainerStarted","Data":"9abb90df1fb36f7d743ddb849ea400a46f15eae6ffadde3a44f5e1ad0528227b"} Mar 13 01:17:38.611264 master-0 kubenswrapper[7110]: I0313 01:17:38.611230 7110 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"] Mar 13 01:17:38.611449 master-0 kubenswrapper[7110]: I0313 01:17:38.611410 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7c6989d6c4-bxqp2_f9b713fb-64ce-4a01-951c-1f31df62e1ae/authentication-operator/1.log" Mar 13 01:17:38.611913 master-0 kubenswrapper[7110]: I0313 01:17:38.611874 7110 generic.go:334] "Generic (PLEG): container finished" podID="f9b713fb-64ce-4a01-951c-1f31df62e1ae" containerID="70de036e29496f4383b5254fadb7f8de5e3a8b0ccfbf1cbf424f334a6aaf519d" exitCode=255 Mar 13 01:17:38.611979 master-0 kubenswrapper[7110]: I0313 01:17:38.611952 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" event={"ID":"f9b713fb-64ce-4a01-951c-1f31df62e1ae","Type":"ContainerDied","Data":"70de036e29496f4383b5254fadb7f8de5e3a8b0ccfbf1cbf424f334a6aaf519d"} Mar 13 01:17:38.613671 master-0 kubenswrapper[7110]: I0313 01:17:38.613615 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cpp59" event={"ID":"c3ae16e5-ba77-427f-b85f-5b354e7bfb9d","Type":"ContainerStarted","Data":"59133a9d647f486a8926b8884bea8f34510b68fe3f879499fe545b7adae05ea4"} Mar 13 01:17:38.614237 master-0 kubenswrapper[7110]: I0313 01:17:38.614177 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:17:38.615327 master-0 kubenswrapper[7110]: I0313 01:17:38.614440 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" Mar 13 01:17:38.616296 master-0 kubenswrapper[7110]: I0313 01:17:38.615914 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" Mar 13 01:17:38.616296 master-0 kubenswrapper[7110]: I0313 01:17:38.616004 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:17:38.671656 master-0 kubenswrapper[7110]: I0313 01:17:38.671558 7110 scope.go:117] "RemoveContainer" containerID="029d789d642119db799ad9f74a4764d435ebea90c5fcd58eb9670acf9f27e006" Mar 13 01:17:38.673159 master-0 kubenswrapper[7110]: E0313 01:17:38.673060 7110 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"029d789d642119db799ad9f74a4764d435ebea90c5fcd58eb9670acf9f27e006\": container with ID starting with 029d789d642119db799ad9f74a4764d435ebea90c5fcd58eb9670acf9f27e006 not found: ID does not exist" containerID="029d789d642119db799ad9f74a4764d435ebea90c5fcd58eb9670acf9f27e006" Mar 13 01:17:38.673159 master-0 kubenswrapper[7110]: I0313 01:17:38.673090 7110 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"029d789d642119db799ad9f74a4764d435ebea90c5fcd58eb9670acf9f27e006"} err="failed to get container status \"029d789d642119db799ad9f74a4764d435ebea90c5fcd58eb9670acf9f27e006\": rpc error: code = NotFound desc = could not find container \"029d789d642119db799ad9f74a4764d435ebea90c5fcd58eb9670acf9f27e006\": container with ID starting with 029d789d642119db799ad9f74a4764d435ebea90c5fcd58eb9670acf9f27e006 not found: ID does not exist" Mar 13 01:17:38.673159 master-0 kubenswrapper[7110]: I0313 01:17:38.673112 7110 scope.go:117] "RemoveContainer" 
containerID="578253e69f6e09bf433efeaf51fe4123dcb489a1bcfe88e292f1fbad219b25ee" Mar 13 01:17:38.673894 master-0 kubenswrapper[7110]: E0313 01:17:38.673709 7110 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"578253e69f6e09bf433efeaf51fe4123dcb489a1bcfe88e292f1fbad219b25ee\": container with ID starting with 578253e69f6e09bf433efeaf51fe4123dcb489a1bcfe88e292f1fbad219b25ee not found: ID does not exist" containerID="578253e69f6e09bf433efeaf51fe4123dcb489a1bcfe88e292f1fbad219b25ee" Mar 13 01:17:38.673894 master-0 kubenswrapper[7110]: I0313 01:17:38.673727 7110 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"578253e69f6e09bf433efeaf51fe4123dcb489a1bcfe88e292f1fbad219b25ee"} err="failed to get container status \"578253e69f6e09bf433efeaf51fe4123dcb489a1bcfe88e292f1fbad219b25ee\": rpc error: code = NotFound desc = could not find container \"578253e69f6e09bf433efeaf51fe4123dcb489a1bcfe88e292f1fbad219b25ee\": container with ID starting with 578253e69f6e09bf433efeaf51fe4123dcb489a1bcfe88e292f1fbad219b25ee not found: ID does not exist" Mar 13 01:17:38.673894 master-0 kubenswrapper[7110]: I0313 01:17:38.673740 7110 scope.go:117] "RemoveContainer" containerID="e6af91761c3978d4ee9c9c1a1feccd940ab86be07c53b40abdaf52c178494ea6" Mar 13 01:17:38.674250 master-0 kubenswrapper[7110]: E0313 01:17:38.674048 7110 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6af91761c3978d4ee9c9c1a1feccd940ab86be07c53b40abdaf52c178494ea6\": container with ID starting with e6af91761c3978d4ee9c9c1a1feccd940ab86be07c53b40abdaf52c178494ea6 not found: ID does not exist" containerID="e6af91761c3978d4ee9c9c1a1feccd940ab86be07c53b40abdaf52c178494ea6" Mar 13 01:17:38.674250 master-0 kubenswrapper[7110]: I0313 01:17:38.674067 7110 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e6af91761c3978d4ee9c9c1a1feccd940ab86be07c53b40abdaf52c178494ea6"} err="failed to get container status \"e6af91761c3978d4ee9c9c1a1feccd940ab86be07c53b40abdaf52c178494ea6\": rpc error: code = NotFound desc = could not find container \"e6af91761c3978d4ee9c9c1a1feccd940ab86be07c53b40abdaf52c178494ea6\": container with ID starting with e6af91761c3978d4ee9c9c1a1feccd940ab86be07c53b40abdaf52c178494ea6 not found: ID does not exist" Mar 13 01:17:38.674250 master-0 kubenswrapper[7110]: I0313 01:17:38.674079 7110 scope.go:117] "RemoveContainer" containerID="c60b1887494d08fb5df2490e135e1d701bdbe7b6a6e136c3d75f17211fbf551b" Mar 13 01:17:38.753608 master-0 kubenswrapper[7110]: I0313 01:17:38.753558 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0" podStartSLOduration=0.753543171 podStartE2EDuration="753.543171ms" podCreationTimestamp="2026-03-13 01:17:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:17:38.752320321 +0000 UTC m=+260.037346787" watchObservedRunningTime="2026-03-13 01:17:38.753543171 +0000 UTC m=+260.038569637" Mar 13 01:17:38.767956 master-0 kubenswrapper[7110]: I0313 01:17:38.767920 7110 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 01:17:38.768089 master-0 kubenswrapper[7110]: I0313 01:17:38.767982 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 
10.128.0.19:8443: connect: connection refused" Mar 13 01:17:38.915577 master-0 kubenswrapper[7110]: I0313 01:17:38.915548 7110 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="415ee541-898b-41d1-98b0-c5e622776590" path="/var/lib/kubelet/pods/415ee541-898b-41d1-98b0-c5e622776590/volumes" Mar 13 01:17:38.950586 master-0 kubenswrapper[7110]: I0313 01:17:38.950539 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-6-master-0_0b3a64f4-e94f-4916-8c91-a255d987735d/installer/0.log" Mar 13 01:17:38.950710 master-0 kubenswrapper[7110]: I0313 01:17:38.950661 7110 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-6-master-0" Mar 13 01:17:38.965922 master-0 kubenswrapper[7110]: I0313 01:17:38.965891 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b3a64f4-e94f-4916-8c91-a255d987735d-kube-api-access\") pod \"0b3a64f4-e94f-4916-8c91-a255d987735d\" (UID: \"0b3a64f4-e94f-4916-8c91-a255d987735d\") " Mar 13 01:17:38.966384 master-0 kubenswrapper[7110]: I0313 01:17:38.965964 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0b3a64f4-e94f-4916-8c91-a255d987735d-var-lock\") pod \"0b3a64f4-e94f-4916-8c91-a255d987735d\" (UID: \"0b3a64f4-e94f-4916-8c91-a255d987735d\") " Mar 13 01:17:38.966384 master-0 kubenswrapper[7110]: I0313 01:17:38.965989 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b3a64f4-e94f-4916-8c91-a255d987735d-kubelet-dir\") pod \"0b3a64f4-e94f-4916-8c91-a255d987735d\" (UID: \"0b3a64f4-e94f-4916-8c91-a255d987735d\") " Mar 13 01:17:38.966384 master-0 kubenswrapper[7110]: I0313 01:17:38.966347 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/0b3a64f4-e94f-4916-8c91-a255d987735d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0b3a64f4-e94f-4916-8c91-a255d987735d" (UID: "0b3a64f4-e94f-4916-8c91-a255d987735d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:17:38.966384 master-0 kubenswrapper[7110]: I0313 01:17:38.966379 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0b3a64f4-e94f-4916-8c91-a255d987735d-var-lock" (OuterVolumeSpecName: "var-lock") pod "0b3a64f4-e94f-4916-8c91-a255d987735d" (UID: "0b3a64f4-e94f-4916-8c91-a255d987735d"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:17:38.971260 master-0 kubenswrapper[7110]: I0313 01:17:38.971225 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b3a64f4-e94f-4916-8c91-a255d987735d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b3a64f4-e94f-4916-8c91-a255d987735d" (UID: "0b3a64f4-e94f-4916-8c91-a255d987735d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:17:39.067315 master-0 kubenswrapper[7110]: I0313 01:17:39.067266 7110 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b3a64f4-e94f-4916-8c91-a255d987735d-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 13 01:17:39.068809 master-0 kubenswrapper[7110]: I0313 01:17:39.067582 7110 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0b3a64f4-e94f-4916-8c91-a255d987735d-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 01:17:39.068809 master-0 kubenswrapper[7110]: I0313 01:17:39.067620 7110 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b3a64f4-e94f-4916-8c91-a255d987735d-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:17:39.621915 master-0 kubenswrapper[7110]: I0313 01:17:39.621777 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" event={"ID":"2ce47660-f7cc-4669-a00d-83422f0f6d55","Type":"ContainerStarted","Data":"4b6a4d1e81b6f153adb00d6cd286fed3da85b58d04c7b67cb99e1e1a37cd143a"} Mar 13 01:17:39.622599 master-0 kubenswrapper[7110]: I0313 01:17:39.622415 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" Mar 13 01:17:39.623647 master-0 kubenswrapper[7110]: I0313 01:17:39.623604 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-6-master-0_0b3a64f4-e94f-4916-8c91-a255d987735d/installer/0.log" Mar 13 01:17:39.623759 master-0 kubenswrapper[7110]: I0313 01:17:39.623731 7110 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-6-master-0" Mar 13 01:17:39.625508 master-0 kubenswrapper[7110]: I0313 01:17:39.624602 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-6-master-0" event={"ID":"0b3a64f4-e94f-4916-8c91-a255d987735d","Type":"ContainerDied","Data":"555967ac1c8966a10024222c10bd15df837fc12752f180fd00e26584a6a7eadd"} Mar 13 01:17:39.625508 master-0 kubenswrapper[7110]: I0313 01:17:39.624649 7110 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="555967ac1c8966a10024222c10bd15df837fc12752f180fd00e26584a6a7eadd" Mar 13 01:17:39.633387 master-0 kubenswrapper[7110]: I0313 01:17:39.630057 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" Mar 13 01:17:39.633387 master-0 kubenswrapper[7110]: I0313 01:17:39.632388 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" event={"ID":"78d2cd80-23b9-426d-a7ac-1daa27668a47","Type":"ContainerStarted","Data":"7d069f7cf40ce00e10c1f0f6baa994ec7a0d37d154f8f16c691fae327fe2644d"} Mar 13 01:17:39.633387 master-0 kubenswrapper[7110]: I0313 01:17:39.632668 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" Mar 13 01:17:39.634521 master-0 kubenswrapper[7110]: I0313 01:17:39.634487 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb" event={"ID":"d278ed70-786c-4b6c-9f04-f08ede704569","Type":"ContainerStarted","Data":"e1abc99b61310b8078d12894917e6a71003139ee52ec277667d925e2d84f6589"} Mar 13 01:17:39.638431 master-0 kubenswrapper[7110]: I0313 01:17:39.638022 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" Mar 13 
01:17:39.638589 master-0 kubenswrapper[7110]: I0313 01:17:39.638551 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7c6989d6c4-bxqp2_f9b713fb-64ce-4a01-951c-1f31df62e1ae/authentication-operator/1.log" Mar 13 01:17:39.638712 master-0 kubenswrapper[7110]: I0313 01:17:39.638686 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" event={"ID":"f9b713fb-64ce-4a01-951c-1f31df62e1ae","Type":"ContainerStarted","Data":"a45dea6369e1284e4d6cd6145c197a33c24d0024d01b01a27e517f9663b1c0a5"} Mar 13 01:17:39.641095 master-0 kubenswrapper[7110]: I0313 01:17:39.641057 7110 generic.go:334] "Generic (PLEG): container finished" podID="d23bbaec-b635-4649-b26e-2829f32d21f0" containerID="682be9d920ceec9bf69789866cf8eedebb71157fd9c01901ddaedd2fde2be709" exitCode=0 Mar 13 01:17:39.641235 master-0 kubenswrapper[7110]: I0313 01:17:39.641133 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9zvz2" event={"ID":"d23bbaec-b635-4649-b26e-2829f32d21f0","Type":"ContainerDied","Data":"682be9d920ceec9bf69789866cf8eedebb71157fd9c01901ddaedd2fde2be709"} Mar 13 01:17:39.643715 master-0 kubenswrapper[7110]: I0313 01:17:39.643604 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-64488f9d78-bqmmf_d56480e0-0885-41e5-a1fc-931a068fbadb/openshift-config-operator/2.log" Mar 13 01:17:39.644455 master-0 kubenswrapper[7110]: I0313 01:17:39.644421 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-64488f9d78-bqmmf_d56480e0-0885-41e5-a1fc-931a068fbadb/openshift-config-operator/1.log" Mar 13 01:17:39.644925 master-0 kubenswrapper[7110]: I0313 01:17:39.644892 7110 generic.go:334] "Generic (PLEG): container finished" podID="d56480e0-0885-41e5-a1fc-931a068fbadb" 
containerID="09062ce5a8bfb21fa147439adb4aa83615da8c3397079fc199132224f09f0600" exitCode=255 Mar 13 01:17:39.645030 master-0 kubenswrapper[7110]: I0313 01:17:39.644947 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" event={"ID":"d56480e0-0885-41e5-a1fc-931a068fbadb","Type":"ContainerDied","Data":"09062ce5a8bfb21fa147439adb4aa83615da8c3397079fc199132224f09f0600"} Mar 13 01:17:39.645030 master-0 kubenswrapper[7110]: I0313 01:17:39.644995 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" event={"ID":"d56480e0-0885-41e5-a1fc-931a068fbadb","Type":"ContainerStarted","Data":"d06826ec4775f3b8e7bc8c8c50364ebde1bf5d008653ef1eefd79f82a03cb948"} Mar 13 01:17:39.645030 master-0 kubenswrapper[7110]: I0313 01:17:39.645014 7110 scope.go:117] "RemoveContainer" containerID="4c1d61286014ac7f71bd88f3fc7c396a7af1471d4d9a420f8c93c8833b3c7efd" Mar 13 01:17:39.645203 master-0 kubenswrapper[7110]: I0313 01:17:39.645118 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" Mar 13 01:17:39.647253 master-0 kubenswrapper[7110]: I0313 01:17:39.647219 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" event={"ID":"631f5719-2083-4c99-92cb-2ddc04022d86","Type":"ContainerStarted","Data":"e14a5c9268dcf9e208f9ac3a76c232808e5f1b8857d2cf23ad9b91be0a6dacf8"} Mar 13 01:17:39.650629 master-0 kubenswrapper[7110]: I0313 01:17:39.650583 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" Mar 13 01:17:39.652342 master-0 kubenswrapper[7110]: I0313 01:17:39.652274 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" podStartSLOduration=181.652253399 podStartE2EDuration="3m1.652253399s" podCreationTimestamp="2026-03-13 01:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:17:39.649208853 +0000 UTC m=+260.934235329" watchObservedRunningTime="2026-03-13 01:17:39.652253399 +0000 UTC m=+260.937279885" Mar 13 01:17:39.656130 master-0 kubenswrapper[7110]: I0313 01:17:39.656095 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-47sjr_8c6bf2d5-1881-4b63-b247-7e7426707fa1/cluster-baremetal-operator/0.log" Mar 13 01:17:39.656268 master-0 kubenswrapper[7110]: I0313 01:17:39.656165 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" event={"ID":"8c6bf2d5-1881-4b63-b247-7e7426707fa1","Type":"ContainerStarted","Data":"3ae103f8a6f884755c35a6a16f2094f5be91f7ff8edc4b19322c844eaa733963"} Mar 13 01:17:39.656515 master-0 kubenswrapper[7110]: I0313 01:17:39.656482 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" Mar 13 01:17:39.659740 master-0 kubenswrapper[7110]: I0313 01:17:39.659696 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd" event={"ID":"1308fba1-a50d-48b3-b272-7bef44727b7f","Type":"ContainerStarted","Data":"f3373c04dfe5b06ce5689672c5fa9716a2e2ff1f88c17517721cb216726a9cc3"} Mar 13 01:17:39.662538 master-0 kubenswrapper[7110]: I0313 01:17:39.662491 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-2slj5_3d2e7338-a6d6-4872-ab72-a4e631075ab3/snapshot-controller/1.log" Mar 13 01:17:39.662782 master-0 kubenswrapper[7110]: 
I0313 01:17:39.662747 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-2slj5" event={"ID":"3d2e7338-a6d6-4872-ab72-a4e631075ab3","Type":"ContainerStarted","Data":"9c9bcd7c0bc2c0a8bdfb783091d79282cb414b9c8007f3f521bcaba6d62d5459"} Mar 13 01:17:39.664798 master-0 kubenswrapper[7110]: I0313 01:17:39.664760 7110 generic.go:334] "Generic (PLEG): container finished" podID="c3ae16e5-ba77-427f-b85f-5b354e7bfb9d" containerID="663503c7edf1dab914be09ff7f72b1010d95be798f9f62497044e238bbcd049f" exitCode=0 Mar 13 01:17:39.664958 master-0 kubenswrapper[7110]: I0313 01:17:39.664918 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cpp59" event={"ID":"c3ae16e5-ba77-427f-b85f-5b354e7bfb9d","Type":"ContainerDied","Data":"663503c7edf1dab914be09ff7f72b1010d95be798f9f62497044e238bbcd049f"} Mar 13 01:17:39.665502 master-0 kubenswrapper[7110]: I0313 01:17:39.665454 7110 scope.go:117] "RemoveContainer" containerID="f90ce908443d9a8a4935d9c526f464812bacbaf55d563a98ad6570bafafa4d36" Mar 13 01:17:39.665814 master-0 kubenswrapper[7110]: E0313 01:17:39.665770 7110 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 13 01:17:40.276921 master-0 kubenswrapper[7110]: I0313 01:17:40.276861 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:17:40.672870 master-0 kubenswrapper[7110]: I0313 01:17:40.672839 7110 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-config-operator_openshift-config-operator-64488f9d78-bqmmf_d56480e0-0885-41e5-a1fc-931a068fbadb/openshift-config-operator/2.log" Mar 13 01:17:40.674900 master-0 kubenswrapper[7110]: I0313 01:17:40.674772 7110 scope.go:117] "RemoveContainer" containerID="f90ce908443d9a8a4935d9c526f464812bacbaf55d563a98ad6570bafafa4d36" Mar 13 01:17:40.675175 master-0 kubenswrapper[7110]: E0313 01:17:40.675137 7110 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(f78c05e1499b533b83f091333d61f045)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" Mar 13 01:17:41.682908 master-0 kubenswrapper[7110]: I0313 01:17:41.682843 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb" event={"ID":"d278ed70-786c-4b6c-9f04-f08ede704569","Type":"ContainerStarted","Data":"85a02a7779e2a01415e0d12cd3a306e827d6428398eea489dff1c9d2909a65c4"} Mar 13 01:17:41.685109 master-0 kubenswrapper[7110]: I0313 01:17:41.685059 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" event={"ID":"77fd9062-0f7d-4255-92ca-7e4325daeddd","Type":"ContainerStarted","Data":"56e23dc047d0c9af7251a6f497704d60dfa2828d26b6a71ca2f42af20d7203ee"} Mar 13 01:17:41.711463 master-0 kubenswrapper[7110]: I0313 01:17:41.711379 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb" podStartSLOduration=181.525209693 podStartE2EDuration="3m3.711356416s" podCreationTimestamp="2026-03-13 01:14:38 +0000 UTC" firstStartedPulling="2026-03-13 01:17:38.678407003 +0000 UTC m=+259.963433469" lastFinishedPulling="2026-03-13 
01:17:40.864553696 +0000 UTC m=+262.149580192" observedRunningTime="2026-03-13 01:17:41.708228606 +0000 UTC m=+262.993255102" watchObservedRunningTime="2026-03-13 01:17:41.711356416 +0000 UTC m=+262.996382922" Mar 13 01:17:41.729338 master-0 kubenswrapper[7110]: I0313 01:17:41.729241 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" podStartSLOduration=181.227724577 podStartE2EDuration="3m3.729215751s" podCreationTimestamp="2026-03-13 01:14:38 +0000 UTC" firstStartedPulling="2026-03-13 01:17:38.3882144 +0000 UTC m=+259.673240886" lastFinishedPulling="2026-03-13 01:17:40.889705564 +0000 UTC m=+262.174732060" observedRunningTime="2026-03-13 01:17:41.727284273 +0000 UTC m=+263.012310749" watchObservedRunningTime="2026-03-13 01:17:41.729215751 +0000 UTC m=+263.014242227" Mar 13 01:17:43.151048 master-0 kubenswrapper[7110]: I0313 01:17:43.150231 7110 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Mar 13 01:17:43.151048 master-0 kubenswrapper[7110]: I0313 01:17:43.150274 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0" Mar 13 01:17:43.171178 master-0 kubenswrapper[7110]: I0313 01:17:43.170886 7110 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0" Mar 13 01:17:43.721058 master-0 kubenswrapper[7110]: I0313 01:17:43.720927 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0" Mar 13 01:17:44.771732 master-0 kubenswrapper[7110]: I0313 01:17:44.770868 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" Mar 13 01:17:45.711476 master-0 kubenswrapper[7110]: I0313 01:17:45.711387 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-cpp59" event={"ID":"c3ae16e5-ba77-427f-b85f-5b354e7bfb9d","Type":"ContainerStarted","Data":"57b5861649076d4e56a649be9254ede90a5119605bc661100afb7d41cc654377"} Mar 13 01:17:45.715289 master-0 kubenswrapper[7110]: I0313 01:17:45.715250 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9zvz2" event={"ID":"d23bbaec-b635-4649-b26e-2829f32d21f0","Type":"ContainerStarted","Data":"2a303738597489cb37ad3f267fff17f1f61cb2e88b4598e4de81e45d9cdb8d55"} Mar 13 01:17:45.742996 master-0 kubenswrapper[7110]: I0313 01:17:45.742929 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cpp59"] Mar 13 01:17:45.754652 master-0 kubenswrapper[7110]: I0313 01:17:45.754585 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-29mns"] Mar 13 01:17:45.755000 master-0 kubenswrapper[7110]: I0313 01:17:45.754962 7110 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-29mns" podUID="19d78757-5081-4711-992e-c8fd7891f9b7" containerName="registry-server" containerID="cri-o://6b5a8fa25db81892bded208f2b2aac20234244be88b5c809ea3ce651bf11e2b3" gracePeriod=2 Mar 13 01:17:46.239321 master-0 kubenswrapper[7110]: I0313 01:17:46.239220 7110 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-29mns" Mar 13 01:17:46.265888 master-0 kubenswrapper[7110]: I0313 01:17:46.265818 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19d78757-5081-4711-992e-c8fd7891f9b7-utilities\") pod \"19d78757-5081-4711-992e-c8fd7891f9b7\" (UID: \"19d78757-5081-4711-992e-c8fd7891f9b7\") " Mar 13 01:17:46.266053 master-0 kubenswrapper[7110]: I0313 01:17:46.266006 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19d78757-5081-4711-992e-c8fd7891f9b7-catalog-content\") pod \"19d78757-5081-4711-992e-c8fd7891f9b7\" (UID: \"19d78757-5081-4711-992e-c8fd7891f9b7\") " Mar 13 01:17:46.266149 master-0 kubenswrapper[7110]: I0313 01:17:46.266088 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdpfx\" (UniqueName: \"kubernetes.io/projected/19d78757-5081-4711-992e-c8fd7891f9b7-kube-api-access-jdpfx\") pod \"19d78757-5081-4711-992e-c8fd7891f9b7\" (UID: \"19d78757-5081-4711-992e-c8fd7891f9b7\") " Mar 13 01:17:46.267507 master-0 kubenswrapper[7110]: I0313 01:17:46.267444 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19d78757-5081-4711-992e-c8fd7891f9b7-utilities" (OuterVolumeSpecName: "utilities") pod "19d78757-5081-4711-992e-c8fd7891f9b7" (UID: "19d78757-5081-4711-992e-c8fd7891f9b7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 01:17:46.272519 master-0 kubenswrapper[7110]: I0313 01:17:46.272453 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19d78757-5081-4711-992e-c8fd7891f9b7-kube-api-access-jdpfx" (OuterVolumeSpecName: "kube-api-access-jdpfx") pod "19d78757-5081-4711-992e-c8fd7891f9b7" (UID: "19d78757-5081-4711-992e-c8fd7891f9b7"). 
InnerVolumeSpecName "kube-api-access-jdpfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:17:46.317656 master-0 kubenswrapper[7110]: I0313 01:17:46.317575 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19d78757-5081-4711-992e-c8fd7891f9b7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "19d78757-5081-4711-992e-c8fd7891f9b7" (UID: "19d78757-5081-4711-992e-c8fd7891f9b7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 01:17:46.368111 master-0 kubenswrapper[7110]: I0313 01:17:46.368048 7110 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19d78757-5081-4711-992e-c8fd7891f9b7-catalog-content\") on node \"master-0\" DevicePath \"\"" Mar 13 01:17:46.368111 master-0 kubenswrapper[7110]: I0313 01:17:46.368100 7110 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdpfx\" (UniqueName: \"kubernetes.io/projected/19d78757-5081-4711-992e-c8fd7891f9b7-kube-api-access-jdpfx\") on node \"master-0\" DevicePath \"\"" Mar 13 01:17:46.368111 master-0 kubenswrapper[7110]: I0313 01:17:46.368122 7110 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19d78757-5081-4711-992e-c8fd7891f9b7-utilities\") on node \"master-0\" DevicePath \"\"" Mar 13 01:17:46.467070 master-0 kubenswrapper[7110]: I0313 01:17:46.467019 7110 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:17:46.467208 master-0 kubenswrapper[7110]: I0313 01:17:46.467088 7110 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:17:46.467858 master-0 kubenswrapper[7110]: I0313 01:17:46.467801 7110 scope.go:117] "RemoveContainer" 
containerID="f90ce908443d9a8a4935d9c526f464812bacbaf55d563a98ad6570bafafa4d36" Mar 13 01:17:46.731717 master-0 kubenswrapper[7110]: I0313 01:17:46.731612 7110 generic.go:334] "Generic (PLEG): container finished" podID="c3ae16e5-ba77-427f-b85f-5b354e7bfb9d" containerID="57b5861649076d4e56a649be9254ede90a5119605bc661100afb7d41cc654377" exitCode=0 Mar 13 01:17:46.731885 master-0 kubenswrapper[7110]: I0313 01:17:46.731766 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cpp59" event={"ID":"c3ae16e5-ba77-427f-b85f-5b354e7bfb9d","Type":"ContainerDied","Data":"57b5861649076d4e56a649be9254ede90a5119605bc661100afb7d41cc654377"} Mar 13 01:17:46.736675 master-0 kubenswrapper[7110]: I0313 01:17:46.736573 7110 generic.go:334] "Generic (PLEG): container finished" podID="d23bbaec-b635-4649-b26e-2829f32d21f0" containerID="2a303738597489cb37ad3f267fff17f1f61cb2e88b4598e4de81e45d9cdb8d55" exitCode=0 Mar 13 01:17:46.736811 master-0 kubenswrapper[7110]: I0313 01:17:46.736731 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9zvz2" event={"ID":"d23bbaec-b635-4649-b26e-2829f32d21f0","Type":"ContainerDied","Data":"2a303738597489cb37ad3f267fff17f1f61cb2e88b4598e4de81e45d9cdb8d55"} Mar 13 01:17:46.746005 master-0 kubenswrapper[7110]: I0313 01:17:46.743943 7110 generic.go:334] "Generic (PLEG): container finished" podID="19d78757-5081-4711-992e-c8fd7891f9b7" containerID="6b5a8fa25db81892bded208f2b2aac20234244be88b5c809ea3ce651bf11e2b3" exitCode=0 Mar 13 01:17:46.746005 master-0 kubenswrapper[7110]: I0313 01:17:46.744136 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-29mns" event={"ID":"19d78757-5081-4711-992e-c8fd7891f9b7","Type":"ContainerDied","Data":"6b5a8fa25db81892bded208f2b2aac20234244be88b5c809ea3ce651bf11e2b3"} Mar 13 01:17:46.746005 master-0 kubenswrapper[7110]: I0313 01:17:46.744258 7110 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-29mns" event={"ID":"19d78757-5081-4711-992e-c8fd7891f9b7","Type":"ContainerDied","Data":"e8a5ab75c1b7f63476173d508b85b9a4a92039b2b0c3f241f48d8c8d79ae1573"} Mar 13 01:17:46.746005 master-0 kubenswrapper[7110]: I0313 01:17:46.744308 7110 scope.go:117] "RemoveContainer" containerID="6b5a8fa25db81892bded208f2b2aac20234244be88b5c809ea3ce651bf11e2b3" Mar 13 01:17:46.746005 master-0 kubenswrapper[7110]: I0313 01:17:46.744150 7110 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-29mns" Mar 13 01:17:46.768613 master-0 kubenswrapper[7110]: I0313 01:17:46.768559 7110 scope.go:117] "RemoveContainer" containerID="925a29571691633af51df81c409f158e86cc5df954d25c8014b22843fdfaa4d3" Mar 13 01:17:46.790359 master-0 kubenswrapper[7110]: I0313 01:17:46.789978 7110 scope.go:117] "RemoveContainer" containerID="e3fcb49d939f028bc20635235a331a97f2b6dc3a0a1be0538871f8f8226d3dde" Mar 13 01:17:46.817483 master-0 kubenswrapper[7110]: I0313 01:17:46.817432 7110 scope.go:117] "RemoveContainer" containerID="6b5a8fa25db81892bded208f2b2aac20234244be88b5c809ea3ce651bf11e2b3" Mar 13 01:17:46.821208 master-0 kubenswrapper[7110]: E0313 01:17:46.819102 7110 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b5a8fa25db81892bded208f2b2aac20234244be88b5c809ea3ce651bf11e2b3\": container with ID starting with 6b5a8fa25db81892bded208f2b2aac20234244be88b5c809ea3ce651bf11e2b3 not found: ID does not exist" containerID="6b5a8fa25db81892bded208f2b2aac20234244be88b5c809ea3ce651bf11e2b3" Mar 13 01:17:46.821208 master-0 kubenswrapper[7110]: I0313 01:17:46.819213 7110 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b5a8fa25db81892bded208f2b2aac20234244be88b5c809ea3ce651bf11e2b3"} err="failed to get container status 
\"6b5a8fa25db81892bded208f2b2aac20234244be88b5c809ea3ce651bf11e2b3\": rpc error: code = NotFound desc = could not find container \"6b5a8fa25db81892bded208f2b2aac20234244be88b5c809ea3ce651bf11e2b3\": container with ID starting with 6b5a8fa25db81892bded208f2b2aac20234244be88b5c809ea3ce651bf11e2b3 not found: ID does not exist" Mar 13 01:17:46.821208 master-0 kubenswrapper[7110]: I0313 01:17:46.819264 7110 scope.go:117] "RemoveContainer" containerID="925a29571691633af51df81c409f158e86cc5df954d25c8014b22843fdfaa4d3" Mar 13 01:17:46.821208 master-0 kubenswrapper[7110]: E0313 01:17:46.819710 7110 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"925a29571691633af51df81c409f158e86cc5df954d25c8014b22843fdfaa4d3\": container with ID starting with 925a29571691633af51df81c409f158e86cc5df954d25c8014b22843fdfaa4d3 not found: ID does not exist" containerID="925a29571691633af51df81c409f158e86cc5df954d25c8014b22843fdfaa4d3" Mar 13 01:17:46.821208 master-0 kubenswrapper[7110]: I0313 01:17:46.819748 7110 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"925a29571691633af51df81c409f158e86cc5df954d25c8014b22843fdfaa4d3"} err="failed to get container status \"925a29571691633af51df81c409f158e86cc5df954d25c8014b22843fdfaa4d3\": rpc error: code = NotFound desc = could not find container \"925a29571691633af51df81c409f158e86cc5df954d25c8014b22843fdfaa4d3\": container with ID starting with 925a29571691633af51df81c409f158e86cc5df954d25c8014b22843fdfaa4d3 not found: ID does not exist" Mar 13 01:17:46.821208 master-0 kubenswrapper[7110]: I0313 01:17:46.819775 7110 scope.go:117] "RemoveContainer" containerID="e3fcb49d939f028bc20635235a331a97f2b6dc3a0a1be0538871f8f8226d3dde" Mar 13 01:17:46.832751 master-0 kubenswrapper[7110]: I0313 01:17:46.832679 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-29mns"] Mar 13 01:17:46.837831 
master-0 kubenswrapper[7110]: E0313 01:17:46.837759 7110 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3fcb49d939f028bc20635235a331a97f2b6dc3a0a1be0538871f8f8226d3dde\": container with ID starting with e3fcb49d939f028bc20635235a331a97f2b6dc3a0a1be0538871f8f8226d3dde not found: ID does not exist" containerID="e3fcb49d939f028bc20635235a331a97f2b6dc3a0a1be0538871f8f8226d3dde" Mar 13 01:17:46.838067 master-0 kubenswrapper[7110]: I0313 01:17:46.837851 7110 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3fcb49d939f028bc20635235a331a97f2b6dc3a0a1be0538871f8f8226d3dde"} err="failed to get container status \"e3fcb49d939f028bc20635235a331a97f2b6dc3a0a1be0538871f8f8226d3dde\": rpc error: code = NotFound desc = could not find container \"e3fcb49d939f028bc20635235a331a97f2b6dc3a0a1be0538871f8f8226d3dde\": container with ID starting with e3fcb49d939f028bc20635235a331a97f2b6dc3a0a1be0538871f8f8226d3dde not found: ID does not exist" Mar 13 01:17:46.838220 master-0 kubenswrapper[7110]: I0313 01:17:46.838151 7110 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-29mns"] Mar 13 01:17:46.919995 master-0 kubenswrapper[7110]: I0313 01:17:46.919922 7110 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19d78757-5081-4711-992e-c8fd7891f9b7" path="/var/lib/kubelet/pods/19d78757-5081-4711-992e-c8fd7891f9b7/volumes" Mar 13 01:17:47.267137 master-0 kubenswrapper[7110]: I0313 01:17:47.267094 7110 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:17:47.760104 master-0 kubenswrapper[7110]: I0313 01:17:47.760052 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cpp59" 
event={"ID":"c3ae16e5-ba77-427f-b85f-5b354e7bfb9d","Type":"ContainerStarted","Data":"1545fdf5af39a2944f3e7cd0579f8b1db9b5985bad7e37ca2f621f8c78b0ec1c"} Mar 13 01:17:47.760354 master-0 kubenswrapper[7110]: I0313 01:17:47.760233 7110 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cpp59" podUID="c3ae16e5-ba77-427f-b85f-5b354e7bfb9d" containerName="registry-server" containerID="cri-o://1545fdf5af39a2944f3e7cd0579f8b1db9b5985bad7e37ca2f621f8c78b0ec1c" gracePeriod=2 Mar 13 01:17:47.766899 master-0 kubenswrapper[7110]: I0313 01:17:47.766185 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"ae1c74ac713339ebe951cea485ddb317986dccb644eb4d3021ce0d21c709fe41"} Mar 13 01:17:47.770682 master-0 kubenswrapper[7110]: I0313 01:17:47.770132 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9zvz2" event={"ID":"d23bbaec-b635-4649-b26e-2829f32d21f0","Type":"ContainerStarted","Data":"63e66c703747bc64e98b5e738eda042e23712aeb8b127f226dc9a93942823bdc"} Mar 13 01:17:47.785759 master-0 kubenswrapper[7110]: I0313 01:17:47.784715 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cpp59" podStartSLOduration=181.274519155 podStartE2EDuration="3m8.784695425s" podCreationTimestamp="2026-03-13 01:14:39 +0000 UTC" firstStartedPulling="2026-03-13 01:17:39.666674997 +0000 UTC m=+260.951701473" lastFinishedPulling="2026-03-13 01:17:47.176851247 +0000 UTC m=+268.461877743" observedRunningTime="2026-03-13 01:17:47.783441102 +0000 UTC m=+269.068467598" watchObservedRunningTime="2026-03-13 01:17:47.784695425 +0000 UTC m=+269.069721911" Mar 13 01:17:47.834824 master-0 kubenswrapper[7110]: I0313 01:17:47.834693 7110 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/certified-operators-9zvz2" podStartSLOduration=182.288826099 podStartE2EDuration="3m9.834665152s" podCreationTimestamp="2026-03-13 01:14:38 +0000 UTC" firstStartedPulling="2026-03-13 01:17:39.642527275 +0000 UTC m=+260.927553751" lastFinishedPulling="2026-03-13 01:17:47.188366298 +0000 UTC m=+268.473392804" observedRunningTime="2026-03-13 01:17:47.833547292 +0000 UTC m=+269.118573778" watchObservedRunningTime="2026-03-13 01:17:47.834665152 +0000 UTC m=+269.119691648" Mar 13 01:17:48.122270 master-0 kubenswrapper[7110]: I0313 01:17:48.119295 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lb5rz"] Mar 13 01:17:48.122270 master-0 kubenswrapper[7110]: I0313 01:17:48.119718 7110 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lb5rz" podUID="6e0572c2-78f9-4c65-8ea0-7242236d641f" containerName="registry-server" containerID="cri-o://be31ba4eddead7d8e3b0b3bdd7d2ad0690b19683156059f65f839f77219f2b2c" gracePeriod=2 Mar 13 01:17:48.273529 master-0 kubenswrapper[7110]: I0313 01:17:48.273489 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cpp59_c3ae16e5-ba77-427f-b85f-5b354e7bfb9d/registry-server/0.log" Mar 13 01:17:48.274984 master-0 kubenswrapper[7110]: I0313 01:17:48.274954 7110 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cpp59" Mar 13 01:17:48.292742 master-0 kubenswrapper[7110]: I0313 01:17:48.292683 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3ae16e5-ba77-427f-b85f-5b354e7bfb9d-utilities\") pod \"c3ae16e5-ba77-427f-b85f-5b354e7bfb9d\" (UID: \"c3ae16e5-ba77-427f-b85f-5b354e7bfb9d\") " Mar 13 01:17:48.292884 master-0 kubenswrapper[7110]: I0313 01:17:48.292775 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3ae16e5-ba77-427f-b85f-5b354e7bfb9d-catalog-content\") pod \"c3ae16e5-ba77-427f-b85f-5b354e7bfb9d\" (UID: \"c3ae16e5-ba77-427f-b85f-5b354e7bfb9d\") " Mar 13 01:17:48.292884 master-0 kubenswrapper[7110]: I0313 01:17:48.292815 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2pvt\" (UniqueName: \"kubernetes.io/projected/c3ae16e5-ba77-427f-b85f-5b354e7bfb9d-kube-api-access-r2pvt\") pod \"c3ae16e5-ba77-427f-b85f-5b354e7bfb9d\" (UID: \"c3ae16e5-ba77-427f-b85f-5b354e7bfb9d\") " Mar 13 01:17:48.296368 master-0 kubenswrapper[7110]: I0313 01:17:48.294535 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3ae16e5-ba77-427f-b85f-5b354e7bfb9d-utilities" (OuterVolumeSpecName: "utilities") pod "c3ae16e5-ba77-427f-b85f-5b354e7bfb9d" (UID: "c3ae16e5-ba77-427f-b85f-5b354e7bfb9d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 01:17:48.298692 master-0 kubenswrapper[7110]: I0313 01:17:48.298429 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3ae16e5-ba77-427f-b85f-5b354e7bfb9d-kube-api-access-r2pvt" (OuterVolumeSpecName: "kube-api-access-r2pvt") pod "c3ae16e5-ba77-427f-b85f-5b354e7bfb9d" (UID: "c3ae16e5-ba77-427f-b85f-5b354e7bfb9d"). 
InnerVolumeSpecName "kube-api-access-r2pvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:17:48.390785 master-0 kubenswrapper[7110]: I0313 01:17:48.384472 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3ae16e5-ba77-427f-b85f-5b354e7bfb9d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c3ae16e5-ba77-427f-b85f-5b354e7bfb9d" (UID: "c3ae16e5-ba77-427f-b85f-5b354e7bfb9d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 01:17:48.393899 master-0 kubenswrapper[7110]: I0313 01:17:48.393794 7110 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2pvt\" (UniqueName: \"kubernetes.io/projected/c3ae16e5-ba77-427f-b85f-5b354e7bfb9d-kube-api-access-r2pvt\") on node \"master-0\" DevicePath \"\"" Mar 13 01:17:48.393899 master-0 kubenswrapper[7110]: I0313 01:17:48.393855 7110 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3ae16e5-ba77-427f-b85f-5b354e7bfb9d-utilities\") on node \"master-0\" DevicePath \"\"" Mar 13 01:17:48.393899 master-0 kubenswrapper[7110]: I0313 01:17:48.393884 7110 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3ae16e5-ba77-427f-b85f-5b354e7bfb9d-catalog-content\") on node \"master-0\" DevicePath \"\"" Mar 13 01:17:48.576030 master-0 kubenswrapper[7110]: I0313 01:17:48.575970 7110 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9zvz2" Mar 13 01:17:48.576222 master-0 kubenswrapper[7110]: I0313 01:17:48.576045 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9zvz2" Mar 13 01:17:48.600420 master-0 kubenswrapper[7110]: I0313 01:17:48.600360 7110 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lb5rz" Mar 13 01:17:48.697331 master-0 kubenswrapper[7110]: I0313 01:17:48.697259 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e0572c2-78f9-4c65-8ea0-7242236d641f-utilities\") pod \"6e0572c2-78f9-4c65-8ea0-7242236d641f\" (UID: \"6e0572c2-78f9-4c65-8ea0-7242236d641f\") " Mar 13 01:17:48.697331 master-0 kubenswrapper[7110]: I0313 01:17:48.697318 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e0572c2-78f9-4c65-8ea0-7242236d641f-catalog-content\") pod \"6e0572c2-78f9-4c65-8ea0-7242236d641f\" (UID: \"6e0572c2-78f9-4c65-8ea0-7242236d641f\") " Mar 13 01:17:48.697616 master-0 kubenswrapper[7110]: I0313 01:17:48.697417 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8sjh\" (UniqueName: \"kubernetes.io/projected/6e0572c2-78f9-4c65-8ea0-7242236d641f-kube-api-access-l8sjh\") pod \"6e0572c2-78f9-4c65-8ea0-7242236d641f\" (UID: \"6e0572c2-78f9-4c65-8ea0-7242236d641f\") " Mar 13 01:17:48.698734 master-0 kubenswrapper[7110]: I0313 01:17:48.698674 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e0572c2-78f9-4c65-8ea0-7242236d641f-utilities" (OuterVolumeSpecName: "utilities") pod "6e0572c2-78f9-4c65-8ea0-7242236d641f" (UID: "6e0572c2-78f9-4c65-8ea0-7242236d641f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 01:17:48.700743 master-0 kubenswrapper[7110]: I0313 01:17:48.700688 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e0572c2-78f9-4c65-8ea0-7242236d641f-kube-api-access-l8sjh" (OuterVolumeSpecName: "kube-api-access-l8sjh") pod "6e0572c2-78f9-4c65-8ea0-7242236d641f" (UID: "6e0572c2-78f9-4c65-8ea0-7242236d641f"). 
InnerVolumeSpecName "kube-api-access-l8sjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:17:48.781039 master-0 kubenswrapper[7110]: I0313 01:17:48.780949 7110 generic.go:334] "Generic (PLEG): container finished" podID="6e0572c2-78f9-4c65-8ea0-7242236d641f" containerID="be31ba4eddead7d8e3b0b3bdd7d2ad0690b19683156059f65f839f77219f2b2c" exitCode=0 Mar 13 01:17:48.781039 master-0 kubenswrapper[7110]: I0313 01:17:48.780998 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lb5rz" event={"ID":"6e0572c2-78f9-4c65-8ea0-7242236d641f","Type":"ContainerDied","Data":"be31ba4eddead7d8e3b0b3bdd7d2ad0690b19683156059f65f839f77219f2b2c"} Mar 13 01:17:48.781285 master-0 kubenswrapper[7110]: I0313 01:17:48.781075 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lb5rz" event={"ID":"6e0572c2-78f9-4c65-8ea0-7242236d641f","Type":"ContainerDied","Data":"489575f0a1d8a7c8e97c64f0529347190c5df02ad8ff4559632008aa4dd81545"} Mar 13 01:17:48.781285 master-0 kubenswrapper[7110]: I0313 01:17:48.781080 7110 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lb5rz" Mar 13 01:17:48.781285 master-0 kubenswrapper[7110]: I0313 01:17:48.781112 7110 scope.go:117] "RemoveContainer" containerID="be31ba4eddead7d8e3b0b3bdd7d2ad0690b19683156059f65f839f77219f2b2c" Mar 13 01:17:48.784835 master-0 kubenswrapper[7110]: I0313 01:17:48.783130 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cpp59_c3ae16e5-ba77-427f-b85f-5b354e7bfb9d/registry-server/0.log" Mar 13 01:17:48.786123 master-0 kubenswrapper[7110]: I0313 01:17:48.786079 7110 generic.go:334] "Generic (PLEG): container finished" podID="c3ae16e5-ba77-427f-b85f-5b354e7bfb9d" containerID="1545fdf5af39a2944f3e7cd0579f8b1db9b5985bad7e37ca2f621f8c78b0ec1c" exitCode=1 Mar 13 01:17:48.786237 master-0 kubenswrapper[7110]: I0313 01:17:48.786155 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cpp59" event={"ID":"c3ae16e5-ba77-427f-b85f-5b354e7bfb9d","Type":"ContainerDied","Data":"1545fdf5af39a2944f3e7cd0579f8b1db9b5985bad7e37ca2f621f8c78b0ec1c"} Mar 13 01:17:48.786237 master-0 kubenswrapper[7110]: I0313 01:17:48.786191 7110 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cpp59" Mar 13 01:17:48.786237 master-0 kubenswrapper[7110]: I0313 01:17:48.786222 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cpp59" event={"ID":"c3ae16e5-ba77-427f-b85f-5b354e7bfb9d","Type":"ContainerDied","Data":"59133a9d647f486a8926b8884bea8f34510b68fe3f879499fe545b7adae05ea4"} Mar 13 01:17:48.798784 master-0 kubenswrapper[7110]: I0313 01:17:48.798730 7110 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e0572c2-78f9-4c65-8ea0-7242236d641f-utilities\") on node \"master-0\" DevicePath \"\"" Mar 13 01:17:48.798784 master-0 kubenswrapper[7110]: I0313 01:17:48.798760 7110 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8sjh\" (UniqueName: \"kubernetes.io/projected/6e0572c2-78f9-4c65-8ea0-7242236d641f-kube-api-access-l8sjh\") on node \"master-0\" DevicePath \"\"" Mar 13 01:17:48.805307 master-0 kubenswrapper[7110]: I0313 01:17:48.805281 7110 scope.go:117] "RemoveContainer" containerID="cad7c2bbdec63fd94a815f3178983b29b92e2c8c43de29f46fbd59efd546835d" Mar 13 01:17:48.865318 master-0 kubenswrapper[7110]: I0313 01:17:48.865174 7110 scope.go:117] "RemoveContainer" containerID="3baee088b822acbc2a39af564e0c18df01758a5b84d81978b575dc62c0f21b11" Mar 13 01:17:48.886964 master-0 kubenswrapper[7110]: I0313 01:17:48.886864 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e0572c2-78f9-4c65-8ea0-7242236d641f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6e0572c2-78f9-4c65-8ea0-7242236d641f" (UID: "6e0572c2-78f9-4c65-8ea0-7242236d641f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 01:17:48.892780 master-0 kubenswrapper[7110]: I0313 01:17:48.892739 7110 scope.go:117] "RemoveContainer" containerID="be31ba4eddead7d8e3b0b3bdd7d2ad0690b19683156059f65f839f77219f2b2c" Mar 13 01:17:48.893345 master-0 kubenswrapper[7110]: E0313 01:17:48.893268 7110 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be31ba4eddead7d8e3b0b3bdd7d2ad0690b19683156059f65f839f77219f2b2c\": container with ID starting with be31ba4eddead7d8e3b0b3bdd7d2ad0690b19683156059f65f839f77219f2b2c not found: ID does not exist" containerID="be31ba4eddead7d8e3b0b3bdd7d2ad0690b19683156059f65f839f77219f2b2c" Mar 13 01:17:48.893523 master-0 kubenswrapper[7110]: I0313 01:17:48.893343 7110 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be31ba4eddead7d8e3b0b3bdd7d2ad0690b19683156059f65f839f77219f2b2c"} err="failed to get container status \"be31ba4eddead7d8e3b0b3bdd7d2ad0690b19683156059f65f839f77219f2b2c\": rpc error: code = NotFound desc = could not find container \"be31ba4eddead7d8e3b0b3bdd7d2ad0690b19683156059f65f839f77219f2b2c\": container with ID starting with be31ba4eddead7d8e3b0b3bdd7d2ad0690b19683156059f65f839f77219f2b2c not found: ID does not exist" Mar 13 01:17:48.893523 master-0 kubenswrapper[7110]: I0313 01:17:48.893389 7110 scope.go:117] "RemoveContainer" containerID="cad7c2bbdec63fd94a815f3178983b29b92e2c8c43de29f46fbd59efd546835d" Mar 13 01:17:48.894165 master-0 kubenswrapper[7110]: E0313 01:17:48.894077 7110 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cad7c2bbdec63fd94a815f3178983b29b92e2c8c43de29f46fbd59efd546835d\": container with ID starting with cad7c2bbdec63fd94a815f3178983b29b92e2c8c43de29f46fbd59efd546835d not found: ID does not exist" containerID="cad7c2bbdec63fd94a815f3178983b29b92e2c8c43de29f46fbd59efd546835d" 
Mar 13 01:17:48.894925 master-0 kubenswrapper[7110]: I0313 01:17:48.894841 7110 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cad7c2bbdec63fd94a815f3178983b29b92e2c8c43de29f46fbd59efd546835d"} err="failed to get container status \"cad7c2bbdec63fd94a815f3178983b29b92e2c8c43de29f46fbd59efd546835d\": rpc error: code = NotFound desc = could not find container \"cad7c2bbdec63fd94a815f3178983b29b92e2c8c43de29f46fbd59efd546835d\": container with ID starting with cad7c2bbdec63fd94a815f3178983b29b92e2c8c43de29f46fbd59efd546835d not found: ID does not exist" Mar 13 01:17:48.895028 master-0 kubenswrapper[7110]: I0313 01:17:48.894935 7110 scope.go:117] "RemoveContainer" containerID="3baee088b822acbc2a39af564e0c18df01758a5b84d81978b575dc62c0f21b11" Mar 13 01:17:48.895809 master-0 kubenswrapper[7110]: E0313 01:17:48.895756 7110 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3baee088b822acbc2a39af564e0c18df01758a5b84d81978b575dc62c0f21b11\": container with ID starting with 3baee088b822acbc2a39af564e0c18df01758a5b84d81978b575dc62c0f21b11 not found: ID does not exist" containerID="3baee088b822acbc2a39af564e0c18df01758a5b84d81978b575dc62c0f21b11" Mar 13 01:17:48.895809 master-0 kubenswrapper[7110]: I0313 01:17:48.895790 7110 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3baee088b822acbc2a39af564e0c18df01758a5b84d81978b575dc62c0f21b11"} err="failed to get container status \"3baee088b822acbc2a39af564e0c18df01758a5b84d81978b575dc62c0f21b11\": rpc error: code = NotFound desc = could not find container \"3baee088b822acbc2a39af564e0c18df01758a5b84d81978b575dc62c0f21b11\": container with ID starting with 3baee088b822acbc2a39af564e0c18df01758a5b84d81978b575dc62c0f21b11 not found: ID does not exist" Mar 13 01:17:48.895809 master-0 kubenswrapper[7110]: I0313 01:17:48.895806 7110 scope.go:117] "RemoveContainer" 
containerID="1545fdf5af39a2944f3e7cd0579f8b1db9b5985bad7e37ca2f621f8c78b0ec1c" Mar 13 01:17:48.900564 master-0 kubenswrapper[7110]: I0313 01:17:48.900501 7110 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e0572c2-78f9-4c65-8ea0-7242236d641f-catalog-content\") on node \"master-0\" DevicePath \"\"" Mar 13 01:17:48.912904 master-0 kubenswrapper[7110]: I0313 01:17:48.912852 7110 scope.go:117] "RemoveContainer" containerID="57b5861649076d4e56a649be9254ede90a5119605bc661100afb7d41cc654377" Mar 13 01:17:48.932941 master-0 kubenswrapper[7110]: I0313 01:17:48.932895 7110 scope.go:117] "RemoveContainer" containerID="663503c7edf1dab914be09ff7f72b1010d95be798f9f62497044e238bbcd049f" Mar 13 01:17:48.958439 master-0 kubenswrapper[7110]: I0313 01:17:48.958284 7110 scope.go:117] "RemoveContainer" containerID="1545fdf5af39a2944f3e7cd0579f8b1db9b5985bad7e37ca2f621f8c78b0ec1c" Mar 13 01:17:48.959048 master-0 kubenswrapper[7110]: E0313 01:17:48.958998 7110 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1545fdf5af39a2944f3e7cd0579f8b1db9b5985bad7e37ca2f621f8c78b0ec1c\": container with ID starting with 1545fdf5af39a2944f3e7cd0579f8b1db9b5985bad7e37ca2f621f8c78b0ec1c not found: ID does not exist" containerID="1545fdf5af39a2944f3e7cd0579f8b1db9b5985bad7e37ca2f621f8c78b0ec1c" Mar 13 01:17:48.959113 master-0 kubenswrapper[7110]: I0313 01:17:48.959046 7110 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1545fdf5af39a2944f3e7cd0579f8b1db9b5985bad7e37ca2f621f8c78b0ec1c"} err="failed to get container status \"1545fdf5af39a2944f3e7cd0579f8b1db9b5985bad7e37ca2f621f8c78b0ec1c\": rpc error: code = NotFound desc = could not find container \"1545fdf5af39a2944f3e7cd0579f8b1db9b5985bad7e37ca2f621f8c78b0ec1c\": container with ID starting with 1545fdf5af39a2944f3e7cd0579f8b1db9b5985bad7e37ca2f621f8c78b0ec1c not 
found: ID does not exist" Mar 13 01:17:48.959113 master-0 kubenswrapper[7110]: I0313 01:17:48.959094 7110 scope.go:117] "RemoveContainer" containerID="57b5861649076d4e56a649be9254ede90a5119605bc661100afb7d41cc654377" Mar 13 01:17:48.959581 master-0 kubenswrapper[7110]: E0313 01:17:48.959539 7110 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57b5861649076d4e56a649be9254ede90a5119605bc661100afb7d41cc654377\": container with ID starting with 57b5861649076d4e56a649be9254ede90a5119605bc661100afb7d41cc654377 not found: ID does not exist" containerID="57b5861649076d4e56a649be9254ede90a5119605bc661100afb7d41cc654377" Mar 13 01:17:48.959650 master-0 kubenswrapper[7110]: I0313 01:17:48.959583 7110 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57b5861649076d4e56a649be9254ede90a5119605bc661100afb7d41cc654377"} err="failed to get container status \"57b5861649076d4e56a649be9254ede90a5119605bc661100afb7d41cc654377\": rpc error: code = NotFound desc = could not find container \"57b5861649076d4e56a649be9254ede90a5119605bc661100afb7d41cc654377\": container with ID starting with 57b5861649076d4e56a649be9254ede90a5119605bc661100afb7d41cc654377 not found: ID does not exist" Mar 13 01:17:48.959650 master-0 kubenswrapper[7110]: I0313 01:17:48.959615 7110 scope.go:117] "RemoveContainer" containerID="663503c7edf1dab914be09ff7f72b1010d95be798f9f62497044e238bbcd049f" Mar 13 01:17:48.960068 master-0 kubenswrapper[7110]: E0313 01:17:48.960030 7110 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"663503c7edf1dab914be09ff7f72b1010d95be798f9f62497044e238bbcd049f\": container with ID starting with 663503c7edf1dab914be09ff7f72b1010d95be798f9f62497044e238bbcd049f not found: ID does not exist" containerID="663503c7edf1dab914be09ff7f72b1010d95be798f9f62497044e238bbcd049f" Mar 13 01:17:48.960068 master-0 
kubenswrapper[7110]: I0313 01:17:48.960061 7110 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"663503c7edf1dab914be09ff7f72b1010d95be798f9f62497044e238bbcd049f"} err="failed to get container status \"663503c7edf1dab914be09ff7f72b1010d95be798f9f62497044e238bbcd049f\": rpc error: code = NotFound desc = could not find container \"663503c7edf1dab914be09ff7f72b1010d95be798f9f62497044e238bbcd049f\": container with ID starting with 663503c7edf1dab914be09ff7f72b1010d95be798f9f62497044e238bbcd049f not found: ID does not exist" Mar 13 01:17:48.963716 master-0 kubenswrapper[7110]: I0313 01:17:48.963592 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cpp59"] Mar 13 01:17:49.199303 master-0 kubenswrapper[7110]: I0313 01:17:49.198994 7110 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cpp59"] Mar 13 01:17:49.347783 master-0 kubenswrapper[7110]: I0313 01:17:49.347606 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lb5rz"] Mar 13 01:17:49.551352 master-0 kubenswrapper[7110]: I0313 01:17:49.551281 7110 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lb5rz"] Mar 13 01:17:49.637880 master-0 kubenswrapper[7110]: I0313 01:17:49.637725 7110 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-9zvz2" podUID="d23bbaec-b635-4649-b26e-2829f32d21f0" containerName="registry-server" probeResult="failure" output=< Mar 13 01:17:49.637880 master-0 kubenswrapper[7110]: timeout: failed to connect service ":50051" within 1s Mar 13 01:17:49.637880 master-0 kubenswrapper[7110]: > Mar 13 01:17:50.276998 master-0 kubenswrapper[7110]: I0313 01:17:50.276918 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:17:50.919883 master-0 
kubenswrapper[7110]: I0313 01:17:50.919775 7110 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e0572c2-78f9-4c65-8ea0-7242236d641f" path="/var/lib/kubelet/pods/6e0572c2-78f9-4c65-8ea0-7242236d641f/volumes" Mar 13 01:17:50.921216 master-0 kubenswrapper[7110]: I0313 01:17:50.921166 7110 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3ae16e5-ba77-427f-b85f-5b354e7bfb9d" path="/var/lib/kubelet/pods/c3ae16e5-ba77-427f-b85f-5b354e7bfb9d/volumes" Mar 13 01:17:55.408426 master-0 kubenswrapper[7110]: I0313 01:17:55.408322 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-3-retry-1-master-0"] Mar 13 01:17:55.409625 master-0 kubenswrapper[7110]: E0313 01:17:55.408703 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36b2d6ee-3ae7-444b-b327-f024a8a06ab7" containerName="installer" Mar 13 01:17:55.409625 master-0 kubenswrapper[7110]: I0313 01:17:55.408735 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="36b2d6ee-3ae7-444b-b327-f024a8a06ab7" containerName="installer" Mar 13 01:17:55.409625 master-0 kubenswrapper[7110]: E0313 01:17:55.408768 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19d78757-5081-4711-992e-c8fd7891f9b7" containerName="extract-content" Mar 13 01:17:55.409625 master-0 kubenswrapper[7110]: I0313 01:17:55.408785 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="19d78757-5081-4711-992e-c8fd7891f9b7" containerName="extract-content" Mar 13 01:17:55.409625 master-0 kubenswrapper[7110]: E0313 01:17:55.408811 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e0572c2-78f9-4c65-8ea0-7242236d641f" containerName="registry-server" Mar 13 01:17:55.409625 master-0 kubenswrapper[7110]: I0313 01:17:55.408826 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e0572c2-78f9-4c65-8ea0-7242236d641f" containerName="registry-server" Mar 13 01:17:55.409625 master-0 kubenswrapper[7110]: E0313 
01:17:55.408850 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19d78757-5081-4711-992e-c8fd7891f9b7" containerName="extract-utilities" Mar 13 01:17:55.409625 master-0 kubenswrapper[7110]: I0313 01:17:55.408866 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="19d78757-5081-4711-992e-c8fd7891f9b7" containerName="extract-utilities" Mar 13 01:17:55.409625 master-0 kubenswrapper[7110]: E0313 01:17:55.408885 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c9c1c81-eae9-4481-9870-b598deb1dcac" containerName="pruner" Mar 13 01:17:55.409625 master-0 kubenswrapper[7110]: I0313 01:17:55.408902 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c9c1c81-eae9-4481-9870-b598deb1dcac" containerName="pruner" Mar 13 01:17:55.409625 master-0 kubenswrapper[7110]: E0313 01:17:55.408921 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3ae16e5-ba77-427f-b85f-5b354e7bfb9d" containerName="extract-content" Mar 13 01:17:55.409625 master-0 kubenswrapper[7110]: I0313 01:17:55.408937 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3ae16e5-ba77-427f-b85f-5b354e7bfb9d" containerName="extract-content" Mar 13 01:17:55.409625 master-0 kubenswrapper[7110]: E0313 01:17:55.408955 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e0572c2-78f9-4c65-8ea0-7242236d641f" containerName="extract-utilities" Mar 13 01:17:55.409625 master-0 kubenswrapper[7110]: I0313 01:17:55.408970 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e0572c2-78f9-4c65-8ea0-7242236d641f" containerName="extract-utilities" Mar 13 01:17:55.409625 master-0 kubenswrapper[7110]: E0313 01:17:55.408992 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47806631-9d60-4658-832d-f160f93f42ea" containerName="installer" Mar 13 01:17:55.409625 master-0 kubenswrapper[7110]: I0313 01:17:55.409007 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="47806631-9d60-4658-832d-f160f93f42ea" 
containerName="installer" Mar 13 01:17:55.409625 master-0 kubenswrapper[7110]: E0313 01:17:55.409031 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29e096ea-ca9d-477b-b0aa-1d10244d51d9" containerName="installer" Mar 13 01:17:55.409625 master-0 kubenswrapper[7110]: I0313 01:17:55.409047 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="29e096ea-ca9d-477b-b0aa-1d10244d51d9" containerName="installer" Mar 13 01:17:55.409625 master-0 kubenswrapper[7110]: E0313 01:17:55.409071 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3ae16e5-ba77-427f-b85f-5b354e7bfb9d" containerName="registry-server" Mar 13 01:17:55.409625 master-0 kubenswrapper[7110]: I0313 01:17:55.409087 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3ae16e5-ba77-427f-b85f-5b354e7bfb9d" containerName="registry-server" Mar 13 01:17:55.409625 master-0 kubenswrapper[7110]: E0313 01:17:55.409113 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3ae16e5-ba77-427f-b85f-5b354e7bfb9d" containerName="extract-utilities" Mar 13 01:17:55.409625 master-0 kubenswrapper[7110]: I0313 01:17:55.409130 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3ae16e5-ba77-427f-b85f-5b354e7bfb9d" containerName="extract-utilities" Mar 13 01:17:55.409625 master-0 kubenswrapper[7110]: E0313 01:17:55.409152 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e0572c2-78f9-4c65-8ea0-7242236d641f" containerName="extract-content" Mar 13 01:17:55.409625 master-0 kubenswrapper[7110]: I0313 01:17:55.409167 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e0572c2-78f9-4c65-8ea0-7242236d641f" containerName="extract-content" Mar 13 01:17:55.409625 master-0 kubenswrapper[7110]: E0313 01:17:55.409188 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19d78757-5081-4711-992e-c8fd7891f9b7" containerName="registry-server" Mar 13 01:17:55.409625 master-0 kubenswrapper[7110]: I0313 01:17:55.409205 7110 
state_mem.go:107] "Deleted CPUSet assignment" podUID="19d78757-5081-4711-992e-c8fd7891f9b7" containerName="registry-server" Mar 13 01:17:55.409625 master-0 kubenswrapper[7110]: E0313 01:17:55.409223 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="415ee541-898b-41d1-98b0-c5e622776590" containerName="installer" Mar 13 01:17:55.409625 master-0 kubenswrapper[7110]: I0313 01:17:55.409238 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="415ee541-898b-41d1-98b0-c5e622776590" containerName="installer" Mar 13 01:17:55.409625 master-0 kubenswrapper[7110]: E0313 01:17:55.409259 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b3a64f4-e94f-4916-8c91-a255d987735d" containerName="installer" Mar 13 01:17:55.409625 master-0 kubenswrapper[7110]: I0313 01:17:55.409274 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b3a64f4-e94f-4916-8c91-a255d987735d" containerName="installer" Mar 13 01:17:55.409625 master-0 kubenswrapper[7110]: I0313 01:17:55.409492 7110 memory_manager.go:354] "RemoveStaleState removing state" podUID="29e096ea-ca9d-477b-b0aa-1d10244d51d9" containerName="installer" Mar 13 01:17:55.409625 master-0 kubenswrapper[7110]: I0313 01:17:55.409527 7110 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c9c1c81-eae9-4481-9870-b598deb1dcac" containerName="pruner" Mar 13 01:17:55.409625 master-0 kubenswrapper[7110]: I0313 01:17:55.409552 7110 memory_manager.go:354] "RemoveStaleState removing state" podUID="19d78757-5081-4711-992e-c8fd7891f9b7" containerName="registry-server" Mar 13 01:17:55.409625 master-0 kubenswrapper[7110]: I0313 01:17:55.409575 7110 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e0572c2-78f9-4c65-8ea0-7242236d641f" containerName="registry-server" Mar 13 01:17:55.409625 master-0 kubenswrapper[7110]: I0313 01:17:55.409598 7110 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b3a64f4-e94f-4916-8c91-a255d987735d" containerName="installer" Mar 
13 01:17:55.409625 master-0 kubenswrapper[7110]: I0313 01:17:55.409619 7110 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3ae16e5-ba77-427f-b85f-5b354e7bfb9d" containerName="registry-server" Mar 13 01:17:55.409625 master-0 kubenswrapper[7110]: I0313 01:17:55.409676 7110 memory_manager.go:354] "RemoveStaleState removing state" podUID="36b2d6ee-3ae7-444b-b327-f024a8a06ab7" containerName="installer" Mar 13 01:17:55.409625 master-0 kubenswrapper[7110]: I0313 01:17:55.409705 7110 memory_manager.go:354] "RemoveStaleState removing state" podUID="415ee541-898b-41d1-98b0-c5e622776590" containerName="installer" Mar 13 01:17:55.412863 master-0 kubenswrapper[7110]: I0313 01:17:55.409730 7110 memory_manager.go:354] "RemoveStaleState removing state" podUID="47806631-9d60-4658-832d-f160f93f42ea" containerName="installer" Mar 13 01:17:55.412863 master-0 kubenswrapper[7110]: I0313 01:17:55.410440 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-retry-1-master-0" Mar 13 01:17:55.413818 master-0 kubenswrapper[7110]: I0313 01:17:55.413760 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 13 01:17:55.415241 master-0 kubenswrapper[7110]: I0313 01:17:55.415198 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-ctg9b" Mar 13 01:17:55.427133 master-0 kubenswrapper[7110]: I0313 01:17:55.427050 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-3-retry-1-master-0"] Mar 13 01:17:55.501772 master-0 kubenswrapper[7110]: I0313 01:17:55.500822 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/728435e4-9fdb-4fea-9f5b-eb5ff5444da0-kubelet-dir\") pod \"installer-3-retry-1-master-0\" (UID: 
\"728435e4-9fdb-4fea-9f5b-eb5ff5444da0\") " pod="openshift-kube-controller-manager/installer-3-retry-1-master-0" Mar 13 01:17:55.501772 master-0 kubenswrapper[7110]: I0313 01:17:55.500872 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/728435e4-9fdb-4fea-9f5b-eb5ff5444da0-kube-api-access\") pod \"installer-3-retry-1-master-0\" (UID: \"728435e4-9fdb-4fea-9f5b-eb5ff5444da0\") " pod="openshift-kube-controller-manager/installer-3-retry-1-master-0" Mar 13 01:17:55.501772 master-0 kubenswrapper[7110]: I0313 01:17:55.500926 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/728435e4-9fdb-4fea-9f5b-eb5ff5444da0-var-lock\") pod \"installer-3-retry-1-master-0\" (UID: \"728435e4-9fdb-4fea-9f5b-eb5ff5444da0\") " pod="openshift-kube-controller-manager/installer-3-retry-1-master-0" Mar 13 01:17:55.602775 master-0 kubenswrapper[7110]: I0313 01:17:55.602708 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/728435e4-9fdb-4fea-9f5b-eb5ff5444da0-kubelet-dir\") pod \"installer-3-retry-1-master-0\" (UID: \"728435e4-9fdb-4fea-9f5b-eb5ff5444da0\") " pod="openshift-kube-controller-manager/installer-3-retry-1-master-0" Mar 13 01:17:55.602963 master-0 kubenswrapper[7110]: I0313 01:17:55.602787 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/728435e4-9fdb-4fea-9f5b-eb5ff5444da0-kube-api-access\") pod \"installer-3-retry-1-master-0\" (UID: \"728435e4-9fdb-4fea-9f5b-eb5ff5444da0\") " pod="openshift-kube-controller-manager/installer-3-retry-1-master-0" Mar 13 01:17:55.602963 master-0 kubenswrapper[7110]: I0313 01:17:55.602859 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-lock\" (UniqueName: \"kubernetes.io/host-path/728435e4-9fdb-4fea-9f5b-eb5ff5444da0-var-lock\") pod \"installer-3-retry-1-master-0\" (UID: \"728435e4-9fdb-4fea-9f5b-eb5ff5444da0\") " pod="openshift-kube-controller-manager/installer-3-retry-1-master-0" Mar 13 01:17:55.603028 master-0 kubenswrapper[7110]: I0313 01:17:55.602999 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/728435e4-9fdb-4fea-9f5b-eb5ff5444da0-var-lock\") pod \"installer-3-retry-1-master-0\" (UID: \"728435e4-9fdb-4fea-9f5b-eb5ff5444da0\") " pod="openshift-kube-controller-manager/installer-3-retry-1-master-0" Mar 13 01:17:55.604752 master-0 kubenswrapper[7110]: I0313 01:17:55.604719 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/728435e4-9fdb-4fea-9f5b-eb5ff5444da0-kubelet-dir\") pod \"installer-3-retry-1-master-0\" (UID: \"728435e4-9fdb-4fea-9f5b-eb5ff5444da0\") " pod="openshift-kube-controller-manager/installer-3-retry-1-master-0" Mar 13 01:17:55.624130 master-0 kubenswrapper[7110]: I0313 01:17:55.624087 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/728435e4-9fdb-4fea-9f5b-eb5ff5444da0-kube-api-access\") pod \"installer-3-retry-1-master-0\" (UID: \"728435e4-9fdb-4fea-9f5b-eb5ff5444da0\") " pod="openshift-kube-controller-manager/installer-3-retry-1-master-0" Mar 13 01:17:55.745448 master-0 kubenswrapper[7110]: I0313 01:17:55.745368 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-3-retry-1-master-0" Mar 13 01:17:56.363870 master-0 kubenswrapper[7110]: I0313 01:17:56.363811 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-3-retry-1-master-0"] Mar 13 01:17:56.466756 master-0 kubenswrapper[7110]: I0313 01:17:56.466045 7110 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:17:56.473122 master-0 kubenswrapper[7110]: I0313 01:17:56.473088 7110 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:17:56.856342 master-0 kubenswrapper[7110]: I0313 01:17:56.856279 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-retry-1-master-0" event={"ID":"728435e4-9fdb-4fea-9f5b-eb5ff5444da0","Type":"ContainerStarted","Data":"8803ea2eb582c5693311e889e291d05e3059cc337f89e85079fab8e693f3beb8"} Mar 13 01:17:56.856342 master-0 kubenswrapper[7110]: I0313 01:17:56.856345 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-retry-1-master-0" event={"ID":"728435e4-9fdb-4fea-9f5b-eb5ff5444da0","Type":"ContainerStarted","Data":"2a77f8e58ee6b4b9d8e8d1a5c1202e86b111b4dbd37bf30068295cac4daecf86"} Mar 13 01:17:56.863620 master-0 kubenswrapper[7110]: I0313 01:17:56.863585 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:17:56.879407 master-0 kubenswrapper[7110]: I0313 01:17:56.879286 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-3-retry-1-master-0" podStartSLOduration=1.879264528 podStartE2EDuration="1.879264528s" podCreationTimestamp="2026-03-13 01:17:55 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:17:56.878165129 +0000 UTC m=+278.163191615" watchObservedRunningTime="2026-03-13 01:17:56.879264528 +0000 UTC m=+278.164290984" Mar 13 01:17:58.628565 master-0 kubenswrapper[7110]: I0313 01:17:58.628512 7110 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9zvz2" Mar 13 01:17:58.694411 master-0 kubenswrapper[7110]: I0313 01:17:58.694351 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9zvz2" Mar 13 01:17:59.554410 master-0 kubenswrapper[7110]: I0313 01:17:59.554337 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k52lh"] Mar 13 01:17:59.556210 master-0 kubenswrapper[7110]: I0313 01:17:59.556153 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k52lh" Mar 13 01:17:59.558091 master-0 kubenswrapper[7110]: I0313 01:17:59.558042 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-z254g"] Mar 13 01:17:59.561259 master-0 kubenswrapper[7110]: I0313 01:17:59.560303 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z254g" Mar 13 01:17:59.561259 master-0 kubenswrapper[7110]: I0313 01:17:59.560379 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f9728b4-4e1e-4165-a276-3daa00e95839-catalog-content\") pod \"redhat-operators-k52lh\" (UID: \"3f9728b4-4e1e-4165-a276-3daa00e95839\") " pod="openshift-marketplace/redhat-operators-k52lh" Mar 13 01:17:59.561259 master-0 kubenswrapper[7110]: I0313 01:17:59.560497 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f9728b4-4e1e-4165-a276-3daa00e95839-utilities\") pod \"redhat-operators-k52lh\" (UID: \"3f9728b4-4e1e-4165-a276-3daa00e95839\") " pod="openshift-marketplace/redhat-operators-k52lh" Mar 13 01:17:59.561259 master-0 kubenswrapper[7110]: I0313 01:17:59.560552 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr6vn\" (UniqueName: \"kubernetes.io/projected/3f9728b4-4e1e-4165-a276-3daa00e95839-kube-api-access-xr6vn\") pod \"redhat-operators-k52lh\" (UID: \"3f9728b4-4e1e-4165-a276-3daa00e95839\") " pod="openshift-marketplace/redhat-operators-k52lh" Mar 13 01:17:59.561953 master-0 kubenswrapper[7110]: I0313 01:17:59.561919 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-56ljs" Mar 13 01:17:59.564419 master-0 kubenswrapper[7110]: I0313 01:17:59.564315 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bbptx"] Mar 13 01:17:59.565279 master-0 kubenswrapper[7110]: I0313 01:17:59.565239 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-5fw2q" Mar 13 01:17:59.566358 master-0 kubenswrapper[7110]: I0313 
01:17:59.566302 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bbptx" Mar 13 01:17:59.571514 master-0 kubenswrapper[7110]: I0313 01:17:59.571460 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k52lh"] Mar 13 01:17:59.571514 master-0 kubenswrapper[7110]: I0313 01:17:59.571471 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-qcwkf" Mar 13 01:17:59.577732 master-0 kubenswrapper[7110]: I0313 01:17:59.577691 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z254g"] Mar 13 01:17:59.580816 master-0 kubenswrapper[7110]: I0313 01:17:59.580775 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bbptx"] Mar 13 01:17:59.662615 master-0 kubenswrapper[7110]: I0313 01:17:59.661701 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9c7c6a4-4f5b-4807-932c-1b0f53ceed22-catalog-content\") pod \"redhat-marketplace-z254g\" (UID: \"a9c7c6a4-4f5b-4807-932c-1b0f53ceed22\") " pod="openshift-marketplace/redhat-marketplace-z254g" Mar 13 01:17:59.662615 master-0 kubenswrapper[7110]: I0313 01:17:59.661830 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a561a1d1-b20f-45fd-9e0c-ee4399a1d31b-utilities\") pod \"community-operators-bbptx\" (UID: \"a561a1d1-b20f-45fd-9e0c-ee4399a1d31b\") " pod="openshift-marketplace/community-operators-bbptx" Mar 13 01:17:59.662615 master-0 kubenswrapper[7110]: I0313 01:17:59.661879 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gmkr\" (UniqueName: 
\"kubernetes.io/projected/a561a1d1-b20f-45fd-9e0c-ee4399a1d31b-kube-api-access-7gmkr\") pod \"community-operators-bbptx\" (UID: \"a561a1d1-b20f-45fd-9e0c-ee4399a1d31b\") " pod="openshift-marketplace/community-operators-bbptx" Mar 13 01:17:59.662615 master-0 kubenswrapper[7110]: I0313 01:17:59.661977 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwnml\" (UniqueName: \"kubernetes.io/projected/a9c7c6a4-4f5b-4807-932c-1b0f53ceed22-kube-api-access-wwnml\") pod \"redhat-marketplace-z254g\" (UID: \"a9c7c6a4-4f5b-4807-932c-1b0f53ceed22\") " pod="openshift-marketplace/redhat-marketplace-z254g" Mar 13 01:17:59.662615 master-0 kubenswrapper[7110]: I0313 01:17:59.662037 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f9728b4-4e1e-4165-a276-3daa00e95839-catalog-content\") pod \"redhat-operators-k52lh\" (UID: \"3f9728b4-4e1e-4165-a276-3daa00e95839\") " pod="openshift-marketplace/redhat-operators-k52lh" Mar 13 01:17:59.662615 master-0 kubenswrapper[7110]: I0313 01:17:59.662088 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a561a1d1-b20f-45fd-9e0c-ee4399a1d31b-catalog-content\") pod \"community-operators-bbptx\" (UID: \"a561a1d1-b20f-45fd-9e0c-ee4399a1d31b\") " pod="openshift-marketplace/community-operators-bbptx" Mar 13 01:17:59.662615 master-0 kubenswrapper[7110]: I0313 01:17:59.662142 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9c7c6a4-4f5b-4807-932c-1b0f53ceed22-utilities\") pod \"redhat-marketplace-z254g\" (UID: \"a9c7c6a4-4f5b-4807-932c-1b0f53ceed22\") " pod="openshift-marketplace/redhat-marketplace-z254g" Mar 13 01:17:59.662615 master-0 kubenswrapper[7110]: I0313 01:17:59.662225 7110 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f9728b4-4e1e-4165-a276-3daa00e95839-utilities\") pod \"redhat-operators-k52lh\" (UID: \"3f9728b4-4e1e-4165-a276-3daa00e95839\") " pod="openshift-marketplace/redhat-operators-k52lh" Mar 13 01:17:59.662615 master-0 kubenswrapper[7110]: I0313 01:17:59.662276 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xr6vn\" (UniqueName: \"kubernetes.io/projected/3f9728b4-4e1e-4165-a276-3daa00e95839-kube-api-access-xr6vn\") pod \"redhat-operators-k52lh\" (UID: \"3f9728b4-4e1e-4165-a276-3daa00e95839\") " pod="openshift-marketplace/redhat-operators-k52lh" Mar 13 01:17:59.672075 master-0 kubenswrapper[7110]: I0313 01:17:59.663980 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f9728b4-4e1e-4165-a276-3daa00e95839-catalog-content\") pod \"redhat-operators-k52lh\" (UID: \"3f9728b4-4e1e-4165-a276-3daa00e95839\") " pod="openshift-marketplace/redhat-operators-k52lh" Mar 13 01:17:59.672075 master-0 kubenswrapper[7110]: I0313 01:17:59.664654 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f9728b4-4e1e-4165-a276-3daa00e95839-utilities\") pod \"redhat-operators-k52lh\" (UID: \"3f9728b4-4e1e-4165-a276-3daa00e95839\") " pod="openshift-marketplace/redhat-operators-k52lh" Mar 13 01:17:59.693453 master-0 kubenswrapper[7110]: I0313 01:17:59.693376 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr6vn\" (UniqueName: \"kubernetes.io/projected/3f9728b4-4e1e-4165-a276-3daa00e95839-kube-api-access-xr6vn\") pod \"redhat-operators-k52lh\" (UID: \"3f9728b4-4e1e-4165-a276-3daa00e95839\") " pod="openshift-marketplace/redhat-operators-k52lh" Mar 13 01:17:59.763500 master-0 kubenswrapper[7110]: I0313 01:17:59.763427 7110 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9c7c6a4-4f5b-4807-932c-1b0f53ceed22-catalog-content\") pod \"redhat-marketplace-z254g\" (UID: \"a9c7c6a4-4f5b-4807-932c-1b0f53ceed22\") " pod="openshift-marketplace/redhat-marketplace-z254g" Mar 13 01:17:59.763728 master-0 kubenswrapper[7110]: I0313 01:17:59.763684 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a561a1d1-b20f-45fd-9e0c-ee4399a1d31b-utilities\") pod \"community-operators-bbptx\" (UID: \"a561a1d1-b20f-45fd-9e0c-ee4399a1d31b\") " pod="openshift-marketplace/community-operators-bbptx" Mar 13 01:17:59.763799 master-0 kubenswrapper[7110]: I0313 01:17:59.763725 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gmkr\" (UniqueName: \"kubernetes.io/projected/a561a1d1-b20f-45fd-9e0c-ee4399a1d31b-kube-api-access-7gmkr\") pod \"community-operators-bbptx\" (UID: \"a561a1d1-b20f-45fd-9e0c-ee4399a1d31b\") " pod="openshift-marketplace/community-operators-bbptx" Mar 13 01:17:59.764025 master-0 kubenswrapper[7110]: I0313 01:17:59.763973 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9c7c6a4-4f5b-4807-932c-1b0f53ceed22-catalog-content\") pod \"redhat-marketplace-z254g\" (UID: \"a9c7c6a4-4f5b-4807-932c-1b0f53ceed22\") " pod="openshift-marketplace/redhat-marketplace-z254g" Mar 13 01:17:59.764025 master-0 kubenswrapper[7110]: I0313 01:17:59.763992 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwnml\" (UniqueName: \"kubernetes.io/projected/a9c7c6a4-4f5b-4807-932c-1b0f53ceed22-kube-api-access-wwnml\") pod \"redhat-marketplace-z254g\" (UID: \"a9c7c6a4-4f5b-4807-932c-1b0f53ceed22\") " pod="openshift-marketplace/redhat-marketplace-z254g" Mar 13 01:17:59.764123 master-0 
kubenswrapper[7110]: I0313 01:17:59.764079 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a561a1d1-b20f-45fd-9e0c-ee4399a1d31b-catalog-content\") pod \"community-operators-bbptx\" (UID: \"a561a1d1-b20f-45fd-9e0c-ee4399a1d31b\") " pod="openshift-marketplace/community-operators-bbptx" Mar 13 01:17:59.764123 master-0 kubenswrapper[7110]: I0313 01:17:59.764118 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9c7c6a4-4f5b-4807-932c-1b0f53ceed22-utilities\") pod \"redhat-marketplace-z254g\" (UID: \"a9c7c6a4-4f5b-4807-932c-1b0f53ceed22\") " pod="openshift-marketplace/redhat-marketplace-z254g" Mar 13 01:17:59.764498 master-0 kubenswrapper[7110]: I0313 01:17:59.764457 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a561a1d1-b20f-45fd-9e0c-ee4399a1d31b-utilities\") pod \"community-operators-bbptx\" (UID: \"a561a1d1-b20f-45fd-9e0c-ee4399a1d31b\") " pod="openshift-marketplace/community-operators-bbptx" Mar 13 01:17:59.764728 master-0 kubenswrapper[7110]: I0313 01:17:59.764702 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9c7c6a4-4f5b-4807-932c-1b0f53ceed22-utilities\") pod \"redhat-marketplace-z254g\" (UID: \"a9c7c6a4-4f5b-4807-932c-1b0f53ceed22\") " pod="openshift-marketplace/redhat-marketplace-z254g" Mar 13 01:17:59.764800 master-0 kubenswrapper[7110]: I0313 01:17:59.764759 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a561a1d1-b20f-45fd-9e0c-ee4399a1d31b-catalog-content\") pod \"community-operators-bbptx\" (UID: \"a561a1d1-b20f-45fd-9e0c-ee4399a1d31b\") " pod="openshift-marketplace/community-operators-bbptx" Mar 13 01:17:59.781488 master-0 
kubenswrapper[7110]: I0313 01:17:59.781440 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gmkr\" (UniqueName: \"kubernetes.io/projected/a561a1d1-b20f-45fd-9e0c-ee4399a1d31b-kube-api-access-7gmkr\") pod \"community-operators-bbptx\" (UID: \"a561a1d1-b20f-45fd-9e0c-ee4399a1d31b\") " pod="openshift-marketplace/community-operators-bbptx" Mar 13 01:17:59.788452 master-0 kubenswrapper[7110]: I0313 01:17:59.788403 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwnml\" (UniqueName: \"kubernetes.io/projected/a9c7c6a4-4f5b-4807-932c-1b0f53ceed22-kube-api-access-wwnml\") pod \"redhat-marketplace-z254g\" (UID: \"a9c7c6a4-4f5b-4807-932c-1b0f53ceed22\") " pod="openshift-marketplace/redhat-marketplace-z254g" Mar 13 01:17:59.948919 master-0 kubenswrapper[7110]: I0313 01:17:59.948854 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k52lh" Mar 13 01:17:59.979108 master-0 kubenswrapper[7110]: I0313 01:17:59.979032 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z254g" Mar 13 01:18:00.018855 master-0 kubenswrapper[7110]: I0313 01:18:00.018744 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bbptx" Mar 13 01:18:00.417357 master-0 kubenswrapper[7110]: I0313 01:18:00.417251 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k52lh"] Mar 13 01:18:00.420417 master-0 kubenswrapper[7110]: W0313 01:18:00.420361 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f9728b4_4e1e_4165_a276_3daa00e95839.slice/crio-4a7eca1172ea3bfa17610dd111962352ef959bb3d01e721ed6e19fcbca116334 WatchSource:0}: Error finding container 4a7eca1172ea3bfa17610dd111962352ef959bb3d01e721ed6e19fcbca116334: Status 404 returned error can't find the container with id 4a7eca1172ea3bfa17610dd111962352ef959bb3d01e721ed6e19fcbca116334 Mar 13 01:18:00.499790 master-0 kubenswrapper[7110]: I0313 01:18:00.499715 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z254g"] Mar 13 01:18:00.502819 master-0 kubenswrapper[7110]: I0313 01:18:00.502763 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bbptx"] Mar 13 01:18:00.507363 master-0 kubenswrapper[7110]: W0313 01:18:00.507296 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9c7c6a4_4f5b_4807_932c_1b0f53ceed22.slice/crio-cb94d42a599d6d70b76d1a519031e7bc4f9c33d2aa5f5a649a375c7a7bded9ac WatchSource:0}: Error finding container cb94d42a599d6d70b76d1a519031e7bc4f9c33d2aa5f5a649a375c7a7bded9ac: Status 404 returned error can't find the container with id cb94d42a599d6d70b76d1a519031e7bc4f9c33d2aa5f5a649a375c7a7bded9ac Mar 13 01:18:00.883416 master-0 kubenswrapper[7110]: I0313 01:18:00.883322 7110 generic.go:334] "Generic (PLEG): container finished" podID="a9c7c6a4-4f5b-4807-932c-1b0f53ceed22" containerID="76c90bf296df85a0e3ed051135157af3e8cd81617b8acdff6c18242b0b74f386" exitCode=0 
Mar 13 01:18:00.885191 master-0 kubenswrapper[7110]: I0313 01:18:00.883818 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z254g" event={"ID":"a9c7c6a4-4f5b-4807-932c-1b0f53ceed22","Type":"ContainerDied","Data":"76c90bf296df85a0e3ed051135157af3e8cd81617b8acdff6c18242b0b74f386"} Mar 13 01:18:00.885191 master-0 kubenswrapper[7110]: I0313 01:18:00.883869 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z254g" event={"ID":"a9c7c6a4-4f5b-4807-932c-1b0f53ceed22","Type":"ContainerStarted","Data":"cb94d42a599d6d70b76d1a519031e7bc4f9c33d2aa5f5a649a375c7a7bded9ac"} Mar 13 01:18:00.888608 master-0 kubenswrapper[7110]: I0313 01:18:00.888514 7110 generic.go:334] "Generic (PLEG): container finished" podID="3f9728b4-4e1e-4165-a276-3daa00e95839" containerID="aa38a63f384b7da874941350b67401ef32b738eda3d617175799f0520f0661d5" exitCode=0 Mar 13 01:18:00.890306 master-0 kubenswrapper[7110]: I0313 01:18:00.888723 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k52lh" event={"ID":"3f9728b4-4e1e-4165-a276-3daa00e95839","Type":"ContainerDied","Data":"aa38a63f384b7da874941350b67401ef32b738eda3d617175799f0520f0661d5"} Mar 13 01:18:00.890306 master-0 kubenswrapper[7110]: I0313 01:18:00.888791 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k52lh" event={"ID":"3f9728b4-4e1e-4165-a276-3daa00e95839","Type":"ContainerStarted","Data":"4a7eca1172ea3bfa17610dd111962352ef959bb3d01e721ed6e19fcbca116334"} Mar 13 01:18:00.892047 master-0 kubenswrapper[7110]: I0313 01:18:00.891281 7110 generic.go:334] "Generic (PLEG): container finished" podID="a561a1d1-b20f-45fd-9e0c-ee4399a1d31b" containerID="1516e86a1559523ee925b4efda9712253b30e34e1c1002f5e1e5d03874fe6d41" exitCode=0 Mar 13 01:18:00.892047 master-0 kubenswrapper[7110]: I0313 01:18:00.891327 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-bbptx" event={"ID":"a561a1d1-b20f-45fd-9e0c-ee4399a1d31b","Type":"ContainerDied","Data":"1516e86a1559523ee925b4efda9712253b30e34e1c1002f5e1e5d03874fe6d41"} Mar 13 01:18:00.892047 master-0 kubenswrapper[7110]: I0313 01:18:00.891356 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bbptx" event={"ID":"a561a1d1-b20f-45fd-9e0c-ee4399a1d31b","Type":"ContainerStarted","Data":"94e09a1dd75367c606e9c0f6209f6e945683271c1483d15ae30d37382e33a6c7"} Mar 13 01:18:01.903040 master-0 kubenswrapper[7110]: I0313 01:18:01.902947 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k52lh" event={"ID":"3f9728b4-4e1e-4165-a276-3daa00e95839","Type":"ContainerStarted","Data":"e632cd30baff456f3e4dd542f0b7173c07418706f4cc94eff880efa35261e3c0"} Mar 13 01:18:02.690688 master-0 kubenswrapper[7110]: E0313 01:18:02.690590 7110 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f9728b4_4e1e_4165_a276_3daa00e95839.slice/crio-conmon-e632cd30baff456f3e4dd542f0b7173c07418706f4cc94eff880efa35261e3c0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f9728b4_4e1e_4165_a276_3daa00e95839.slice/crio-e632cd30baff456f3e4dd542f0b7173c07418706f4cc94eff880efa35261e3c0.scope\": RecentStats: unable to find data in memory cache]" Mar 13 01:18:02.918828 master-0 kubenswrapper[7110]: I0313 01:18:02.918725 7110 generic.go:334] "Generic (PLEG): container finished" podID="a9c7c6a4-4f5b-4807-932c-1b0f53ceed22" containerID="c90d5f8b2e62149395ea03f276ff599cb9a6c656f64c9d31908bac0077615d31" exitCode=0 Mar 13 01:18:02.919609 master-0 kubenswrapper[7110]: I0313 01:18:02.919051 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z254g" 
event={"ID":"a9c7c6a4-4f5b-4807-932c-1b0f53ceed22","Type":"ContainerDied","Data":"c90d5f8b2e62149395ea03f276ff599cb9a6c656f64c9d31908bac0077615d31"} Mar 13 01:18:02.923940 master-0 kubenswrapper[7110]: I0313 01:18:02.923878 7110 generic.go:334] "Generic (PLEG): container finished" podID="3f9728b4-4e1e-4165-a276-3daa00e95839" containerID="e632cd30baff456f3e4dd542f0b7173c07418706f4cc94eff880efa35261e3c0" exitCode=0 Mar 13 01:18:02.924092 master-0 kubenswrapper[7110]: I0313 01:18:02.924010 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k52lh" event={"ID":"3f9728b4-4e1e-4165-a276-3daa00e95839","Type":"ContainerDied","Data":"e632cd30baff456f3e4dd542f0b7173c07418706f4cc94eff880efa35261e3c0"} Mar 13 01:18:02.928019 master-0 kubenswrapper[7110]: I0313 01:18:02.927933 7110 generic.go:334] "Generic (PLEG): container finished" podID="a561a1d1-b20f-45fd-9e0c-ee4399a1d31b" containerID="28ba9aa037bbe7233c4f1b03cbf6e53b0a8d15d86593c5a55e6b347f9655f21f" exitCode=0 Mar 13 01:18:02.928143 master-0 kubenswrapper[7110]: I0313 01:18:02.928013 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bbptx" event={"ID":"a561a1d1-b20f-45fd-9e0c-ee4399a1d31b","Type":"ContainerDied","Data":"28ba9aa037bbe7233c4f1b03cbf6e53b0a8d15d86593c5a55e6b347f9655f21f"} Mar 13 01:18:03.938304 master-0 kubenswrapper[7110]: I0313 01:18:03.938247 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z254g" event={"ID":"a9c7c6a4-4f5b-4807-932c-1b0f53ceed22","Type":"ContainerStarted","Data":"7b9a1c53b30b114fafac1d461dd3e21daf6901eba384382a18dce7f6c90a33b2"} Mar 13 01:18:03.941404 master-0 kubenswrapper[7110]: I0313 01:18:03.941362 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k52lh" 
event={"ID":"3f9728b4-4e1e-4165-a276-3daa00e95839","Type":"ContainerStarted","Data":"3119424a083f353a4b6183c0dad15c0796902de158ccd0a6a3f2774dc5ffa101"} Mar 13 01:18:03.944426 master-0 kubenswrapper[7110]: I0313 01:18:03.943885 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bbptx" event={"ID":"a561a1d1-b20f-45fd-9e0c-ee4399a1d31b","Type":"ContainerStarted","Data":"b93822c35ff1bf1be734ea3687c5cb02996a6c6f05c19e51ce529ef4bb707376"} Mar 13 01:18:03.984168 master-0 kubenswrapper[7110]: I0313 01:18:03.984096 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-z254g" podStartSLOduration=16.230966549 podStartE2EDuration="18.984074248s" podCreationTimestamp="2026-03-13 01:17:45 +0000 UTC" firstStartedPulling="2026-03-13 01:18:00.886875459 +0000 UTC m=+282.171901955" lastFinishedPulling="2026-03-13 01:18:03.639983168 +0000 UTC m=+284.925009654" observedRunningTime="2026-03-13 01:18:03.962427452 +0000 UTC m=+285.247453928" watchObservedRunningTime="2026-03-13 01:18:03.984074248 +0000 UTC m=+285.269100734" Mar 13 01:18:03.993064 master-0 kubenswrapper[7110]: I0313 01:18:03.992933 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bbptx" podStartSLOduration=16.568795836 podStartE2EDuration="18.992905579s" podCreationTimestamp="2026-03-13 01:17:45 +0000 UTC" firstStartedPulling="2026-03-13 01:18:00.897708653 +0000 UTC m=+282.182735169" lastFinishedPulling="2026-03-13 01:18:03.321818416 +0000 UTC m=+284.606844912" observedRunningTime="2026-03-13 01:18:03.982600809 +0000 UTC m=+285.267627285" watchObservedRunningTime="2026-03-13 01:18:03.992905579 +0000 UTC m=+285.277932115" Mar 13 01:18:04.025683 master-0 kubenswrapper[7110]: I0313 01:18:04.023865 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k52lh" 
podStartSLOduration=13.440045198 podStartE2EDuration="16.023838468s" podCreationTimestamp="2026-03-13 01:17:48 +0000 UTC" firstStartedPulling="2026-03-13 01:18:00.895887805 +0000 UTC m=+282.180914311" lastFinishedPulling="2026-03-13 01:18:03.479681105 +0000 UTC m=+284.764707581" observedRunningTime="2026-03-13 01:18:04.02048461 +0000 UTC m=+285.305511076" watchObservedRunningTime="2026-03-13 01:18:04.023838468 +0000 UTC m=+285.308864954" Mar 13 01:18:06.862873 master-0 kubenswrapper[7110]: I0313 01:18:06.862744 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-955fcfb87-jvdz8"] Mar 13 01:18:06.863398 master-0 kubenswrapper[7110]: I0313 01:18:06.863007 7110 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jvdz8" podUID="23fbbe97-906a-4bce-9ab0-bf633d4f9dd7" containerName="kube-rbac-proxy" containerID="cri-o://d17ca8713f562a177a0df2d89877ffc2f34db776972d8955d2c457d6bbc16214" gracePeriod=30 Mar 13 01:18:06.863398 master-0 kubenswrapper[7110]: I0313 01:18:06.863120 7110 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jvdz8" podUID="23fbbe97-906a-4bce-9ab0-bf633d4f9dd7" containerName="machine-approver-controller" containerID="cri-o://30d64955a2beef5aa415495589d7999948fcd62753769f6a0658edbc22d9fc94" gracePeriod=30 Mar 13 01:18:06.895651 master-0 kubenswrapper[7110]: I0313 01:18:06.895032 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-lnm8m"] Mar 13 01:18:06.897873 master-0 kubenswrapper[7110]: I0313 01:18:06.896241 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-lnm8m" Mar 13 01:18:06.897873 master-0 kubenswrapper[7110]: I0313 01:18:06.896374 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-pmkpj"] Mar 13 01:18:06.897873 master-0 kubenswrapper[7110]: I0313 01:18:06.897202 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-pmkpj" Mar 13 01:18:06.900407 master-0 kubenswrapper[7110]: I0313 01:18:06.900370 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Mar 13 01:18:06.900611 master-0 kubenswrapper[7110]: I0313 01:18:06.900594 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-zdrdj" Mar 13 01:18:06.900897 master-0 kubenswrapper[7110]: I0313 01:18:06.900790 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 13 01:18:06.910657 master-0 kubenswrapper[7110]: I0313 01:18:06.901026 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Mar 13 01:18:06.910657 master-0 kubenswrapper[7110]: I0313 01:18:06.901170 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Mar 13 01:18:06.910657 master-0 kubenswrapper[7110]: I0313 01:18:06.901239 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Mar 13 01:18:06.910657 master-0 kubenswrapper[7110]: I0313 01:18:06.901169 7110 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-sb45h" Mar 13 01:18:06.910657 master-0 kubenswrapper[7110]: I0313 01:18:06.901170 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Mar 13 01:18:06.976977 master-0 kubenswrapper[7110]: I0313 01:18:06.976836 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-7769569c45-zm2jl"] Mar 13 01:18:06.977664 master-0 kubenswrapper[7110]: I0313 01:18:06.977618 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-7769569c45-zm2jl" Mar 13 01:18:06.988695 master-0 kubenswrapper[7110]: I0313 01:18:06.988426 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-6b6t5" Mar 13 01:18:07.008654 master-0 kubenswrapper[7110]: I0313 01:18:06.999415 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-7769569c45-zm2jl"] Mar 13 01:18:07.068957 master-0 kubenswrapper[7110]: I0313 01:18:07.068905 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/372c7cb7-30fc-4575-8bb6-b1d68d9ffe68-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-559568b945-lnm8m\" (UID: \"372c7cb7-30fc-4575-8bb6-b1d68d9ffe68\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-lnm8m" Mar 13 01:18:07.069113 master-0 kubenswrapper[7110]: I0313 01:18:07.068960 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e5956ebf-01e4-4d4c-ae6d-b0995905c6d3-mcd-auth-proxy-config\") pod \"machine-config-daemon-pmkpj\" (UID: 
\"e5956ebf-01e4-4d4c-ae6d-b0995905c6d3\") " pod="openshift-machine-config-operator/machine-config-daemon-pmkpj" Mar 13 01:18:07.069113 master-0 kubenswrapper[7110]: I0313 01:18:07.068983 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/372c7cb7-30fc-4575-8bb6-b1d68d9ffe68-images\") pod \"cluster-cloud-controller-manager-operator-559568b945-lnm8m\" (UID: \"372c7cb7-30fc-4575-8bb6-b1d68d9ffe68\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-lnm8m" Mar 13 01:18:07.069113 master-0 kubenswrapper[7110]: I0313 01:18:07.069011 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e5956ebf-01e4-4d4c-ae6d-b0995905c6d3-rootfs\") pod \"machine-config-daemon-pmkpj\" (UID: \"e5956ebf-01e4-4d4c-ae6d-b0995905c6d3\") " pod="openshift-machine-config-operator/machine-config-daemon-pmkpj" Mar 13 01:18:07.069113 master-0 kubenswrapper[7110]: I0313 01:18:07.069036 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt79p\" (UniqueName: \"kubernetes.io/projected/e5956ebf-01e4-4d4c-ae6d-b0995905c6d3-kube-api-access-jt79p\") pod \"machine-config-daemon-pmkpj\" (UID: \"e5956ebf-01e4-4d4c-ae6d-b0995905c6d3\") " pod="openshift-machine-config-operator/machine-config-daemon-pmkpj" Mar 13 01:18:07.069113 master-0 kubenswrapper[7110]: I0313 01:18:07.069078 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/372c7cb7-30fc-4575-8bb6-b1d68d9ffe68-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-559568b945-lnm8m\" (UID: \"372c7cb7-30fc-4575-8bb6-b1d68d9ffe68\") " 
pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-lnm8m" Mar 13 01:18:07.069259 master-0 kubenswrapper[7110]: I0313 01:18:07.069110 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e5956ebf-01e4-4d4c-ae6d-b0995905c6d3-proxy-tls\") pod \"machine-config-daemon-pmkpj\" (UID: \"e5956ebf-01e4-4d4c-ae6d-b0995905c6d3\") " pod="openshift-machine-config-operator/machine-config-daemon-pmkpj" Mar 13 01:18:07.069259 master-0 kubenswrapper[7110]: I0313 01:18:07.069146 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/372c7cb7-30fc-4575-8bb6-b1d68d9ffe68-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-559568b945-lnm8m\" (UID: \"372c7cb7-30fc-4575-8bb6-b1d68d9ffe68\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-lnm8m" Mar 13 01:18:07.069259 master-0 kubenswrapper[7110]: I0313 01:18:07.069163 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpdx4\" (UniqueName: \"kubernetes.io/projected/372c7cb7-30fc-4575-8bb6-b1d68d9ffe68-kube-api-access-xpdx4\") pod \"cluster-cloud-controller-manager-operator-559568b945-lnm8m\" (UID: \"372c7cb7-30fc-4575-8bb6-b1d68d9ffe68\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-lnm8m" Mar 13 01:18:07.109079 master-0 kubenswrapper[7110]: I0313 01:18:07.109032 7110 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jvdz8" Mar 13 01:18:07.169748 master-0 kubenswrapper[7110]: I0313 01:18:07.169708 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e5956ebf-01e4-4d4c-ae6d-b0995905c6d3-proxy-tls\") pod \"machine-config-daemon-pmkpj\" (UID: \"e5956ebf-01e4-4d4c-ae6d-b0995905c6d3\") " pod="openshift-machine-config-operator/machine-config-daemon-pmkpj" Mar 13 01:18:07.169926 master-0 kubenswrapper[7110]: I0313 01:18:07.169759 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/372c7cb7-30fc-4575-8bb6-b1d68d9ffe68-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-559568b945-lnm8m\" (UID: \"372c7cb7-30fc-4575-8bb6-b1d68d9ffe68\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-lnm8m" Mar 13 01:18:07.169926 master-0 kubenswrapper[7110]: I0313 01:18:07.169783 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpdx4\" (UniqueName: \"kubernetes.io/projected/372c7cb7-30fc-4575-8bb6-b1d68d9ffe68-kube-api-access-xpdx4\") pod \"cluster-cloud-controller-manager-operator-559568b945-lnm8m\" (UID: \"372c7cb7-30fc-4575-8bb6-b1d68d9ffe68\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-lnm8m" Mar 13 01:18:07.170012 master-0 kubenswrapper[7110]: I0313 01:18:07.169971 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/372c7cb7-30fc-4575-8bb6-b1d68d9ffe68-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-559568b945-lnm8m\" (UID: \"372c7cb7-30fc-4575-8bb6-b1d68d9ffe68\") " 
pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-lnm8m" Mar 13 01:18:07.170083 master-0 kubenswrapper[7110]: I0313 01:18:07.170057 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e5956ebf-01e4-4d4c-ae6d-b0995905c6d3-mcd-auth-proxy-config\") pod \"machine-config-daemon-pmkpj\" (UID: \"e5956ebf-01e4-4d4c-ae6d-b0995905c6d3\") " pod="openshift-machine-config-operator/machine-config-daemon-pmkpj" Mar 13 01:18:07.170123 master-0 kubenswrapper[7110]: I0313 01:18:07.170097 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/372c7cb7-30fc-4575-8bb6-b1d68d9ffe68-images\") pod \"cluster-cloud-controller-manager-operator-559568b945-lnm8m\" (UID: \"372c7cb7-30fc-4575-8bb6-b1d68d9ffe68\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-lnm8m" Mar 13 01:18:07.171585 master-0 kubenswrapper[7110]: I0313 01:18:07.170370 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/372c7cb7-30fc-4575-8bb6-b1d68d9ffe68-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-559568b945-lnm8m\" (UID: \"372c7cb7-30fc-4575-8bb6-b1d68d9ffe68\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-lnm8m" Mar 13 01:18:07.171585 master-0 kubenswrapper[7110]: I0313 01:18:07.170437 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e5956ebf-01e4-4d4c-ae6d-b0995905c6d3-rootfs\") pod \"machine-config-daemon-pmkpj\" (UID: \"e5956ebf-01e4-4d4c-ae6d-b0995905c6d3\") " pod="openshift-machine-config-operator/machine-config-daemon-pmkpj" Mar 13 01:18:07.171585 master-0 kubenswrapper[7110]: I0313 01:18:07.170473 7110 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt79p\" (UniqueName: \"kubernetes.io/projected/e5956ebf-01e4-4d4c-ae6d-b0995905c6d3-kube-api-access-jt79p\") pod \"machine-config-daemon-pmkpj\" (UID: \"e5956ebf-01e4-4d4c-ae6d-b0995905c6d3\") " pod="openshift-machine-config-operator/machine-config-daemon-pmkpj" Mar 13 01:18:07.171585 master-0 kubenswrapper[7110]: I0313 01:18:07.170503 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2679c6e1-11c1-450c-b03a-30d7ee59ff6f-webhook-certs\") pod \"multus-admission-controller-7769569c45-zm2jl\" (UID: \"2679c6e1-11c1-450c-b03a-30d7ee59ff6f\") " pod="openshift-multus/multus-admission-controller-7769569c45-zm2jl" Mar 13 01:18:07.171585 master-0 kubenswrapper[7110]: I0313 01:18:07.170533 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/372c7cb7-30fc-4575-8bb6-b1d68d9ffe68-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-559568b945-lnm8m\" (UID: \"372c7cb7-30fc-4575-8bb6-b1d68d9ffe68\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-lnm8m" Mar 13 01:18:07.171585 master-0 kubenswrapper[7110]: I0313 01:18:07.170550 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n9fb\" (UniqueName: \"kubernetes.io/projected/2679c6e1-11c1-450c-b03a-30d7ee59ff6f-kube-api-access-4n9fb\") pod \"multus-admission-controller-7769569c45-zm2jl\" (UID: \"2679c6e1-11c1-450c-b03a-30d7ee59ff6f\") " pod="openshift-multus/multus-admission-controller-7769569c45-zm2jl" Mar 13 01:18:07.171585 master-0 kubenswrapper[7110]: I0313 01:18:07.170600 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/e5956ebf-01e4-4d4c-ae6d-b0995905c6d3-rootfs\") pod \"machine-config-daemon-pmkpj\" (UID: \"e5956ebf-01e4-4d4c-ae6d-b0995905c6d3\") " pod="openshift-machine-config-operator/machine-config-daemon-pmkpj" Mar 13 01:18:07.171585 master-0 kubenswrapper[7110]: I0313 01:18:07.170825 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/372c7cb7-30fc-4575-8bb6-b1d68d9ffe68-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-559568b945-lnm8m\" (UID: \"372c7cb7-30fc-4575-8bb6-b1d68d9ffe68\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-lnm8m" Mar 13 01:18:07.171585 master-0 kubenswrapper[7110]: I0313 01:18:07.170974 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e5956ebf-01e4-4d4c-ae6d-b0995905c6d3-mcd-auth-proxy-config\") pod \"machine-config-daemon-pmkpj\" (UID: \"e5956ebf-01e4-4d4c-ae6d-b0995905c6d3\") " pod="openshift-machine-config-operator/machine-config-daemon-pmkpj" Mar 13 01:18:07.171585 master-0 kubenswrapper[7110]: I0313 01:18:07.170984 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/372c7cb7-30fc-4575-8bb6-b1d68d9ffe68-images\") pod \"cluster-cloud-controller-manager-operator-559568b945-lnm8m\" (UID: \"372c7cb7-30fc-4575-8bb6-b1d68d9ffe68\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-lnm8m" Mar 13 01:18:07.172765 master-0 kubenswrapper[7110]: I0313 01:18:07.172713 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e5956ebf-01e4-4d4c-ae6d-b0995905c6d3-proxy-tls\") pod \"machine-config-daemon-pmkpj\" (UID: \"e5956ebf-01e4-4d4c-ae6d-b0995905c6d3\") " 
pod="openshift-machine-config-operator/machine-config-daemon-pmkpj" Mar 13 01:18:07.184142 master-0 kubenswrapper[7110]: I0313 01:18:07.174665 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/372c7cb7-30fc-4575-8bb6-b1d68d9ffe68-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-559568b945-lnm8m\" (UID: \"372c7cb7-30fc-4575-8bb6-b1d68d9ffe68\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-lnm8m" Mar 13 01:18:07.187238 master-0 kubenswrapper[7110]: I0313 01:18:07.187198 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpdx4\" (UniqueName: \"kubernetes.io/projected/372c7cb7-30fc-4575-8bb6-b1d68d9ffe68-kube-api-access-xpdx4\") pod \"cluster-cloud-controller-manager-operator-559568b945-lnm8m\" (UID: \"372c7cb7-30fc-4575-8bb6-b1d68d9ffe68\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-lnm8m" Mar 13 01:18:07.188892 master-0 kubenswrapper[7110]: I0313 01:18:07.188862 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt79p\" (UniqueName: \"kubernetes.io/projected/e5956ebf-01e4-4d4c-ae6d-b0995905c6d3-kube-api-access-jt79p\") pod \"machine-config-daemon-pmkpj\" (UID: \"e5956ebf-01e4-4d4c-ae6d-b0995905c6d3\") " pod="openshift-machine-config-operator/machine-config-daemon-pmkpj" Mar 13 01:18:07.272368 master-0 kubenswrapper[7110]: I0313 01:18:07.272277 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23fbbe97-906a-4bce-9ab0-bf633d4f9dd7-config\") pod \"23fbbe97-906a-4bce-9ab0-bf633d4f9dd7\" (UID: \"23fbbe97-906a-4bce-9ab0-bf633d4f9dd7\") " Mar 13 01:18:07.272368 master-0 kubenswrapper[7110]: I0313 01:18:07.272349 7110 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/23fbbe97-906a-4bce-9ab0-bf633d4f9dd7-machine-approver-tls\") pod \"23fbbe97-906a-4bce-9ab0-bf633d4f9dd7\" (UID: \"23fbbe97-906a-4bce-9ab0-bf633d4f9dd7\") " Mar 13 01:18:07.272368 master-0 kubenswrapper[7110]: I0313 01:18:07.272381 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/23fbbe97-906a-4bce-9ab0-bf633d4f9dd7-auth-proxy-config\") pod \"23fbbe97-906a-4bce-9ab0-bf633d4f9dd7\" (UID: \"23fbbe97-906a-4bce-9ab0-bf633d4f9dd7\") " Mar 13 01:18:07.272686 master-0 kubenswrapper[7110]: I0313 01:18:07.272457 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99zzf\" (UniqueName: \"kubernetes.io/projected/23fbbe97-906a-4bce-9ab0-bf633d4f9dd7-kube-api-access-99zzf\") pod \"23fbbe97-906a-4bce-9ab0-bf633d4f9dd7\" (UID: \"23fbbe97-906a-4bce-9ab0-bf633d4f9dd7\") " Mar 13 01:18:07.272686 master-0 kubenswrapper[7110]: I0313 01:18:07.272642 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2679c6e1-11c1-450c-b03a-30d7ee59ff6f-webhook-certs\") pod \"multus-admission-controller-7769569c45-zm2jl\" (UID: \"2679c6e1-11c1-450c-b03a-30d7ee59ff6f\") " pod="openshift-multus/multus-admission-controller-7769569c45-zm2jl" Mar 13 01:18:07.272686 master-0 kubenswrapper[7110]: I0313 01:18:07.272672 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n9fb\" (UniqueName: \"kubernetes.io/projected/2679c6e1-11c1-450c-b03a-30d7ee59ff6f-kube-api-access-4n9fb\") pod \"multus-admission-controller-7769569c45-zm2jl\" (UID: \"2679c6e1-11c1-450c-b03a-30d7ee59ff6f\") " pod="openshift-multus/multus-admission-controller-7769569c45-zm2jl" Mar 13 01:18:07.273267 master-0 kubenswrapper[7110]: I0313 01:18:07.273216 7110 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23fbbe97-906a-4bce-9ab0-bf633d4f9dd7-config" (OuterVolumeSpecName: "config") pod "23fbbe97-906a-4bce-9ab0-bf633d4f9dd7" (UID: "23fbbe97-906a-4bce-9ab0-bf633d4f9dd7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:18:07.273374 master-0 kubenswrapper[7110]: I0313 01:18:07.273338 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23fbbe97-906a-4bce-9ab0-bf633d4f9dd7-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "23fbbe97-906a-4bce-9ab0-bf633d4f9dd7" (UID: "23fbbe97-906a-4bce-9ab0-bf633d4f9dd7"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:18:07.275058 master-0 kubenswrapper[7110]: I0313 01:18:07.275012 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23fbbe97-906a-4bce-9ab0-bf633d4f9dd7-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "23fbbe97-906a-4bce-9ab0-bf633d4f9dd7" (UID: "23fbbe97-906a-4bce-9ab0-bf633d4f9dd7"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:18:07.275645 master-0 kubenswrapper[7110]: I0313 01:18:07.275601 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2679c6e1-11c1-450c-b03a-30d7ee59ff6f-webhook-certs\") pod \"multus-admission-controller-7769569c45-zm2jl\" (UID: \"2679c6e1-11c1-450c-b03a-30d7ee59ff6f\") " pod="openshift-multus/multus-admission-controller-7769569c45-zm2jl" Mar 13 01:18:07.276325 master-0 kubenswrapper[7110]: I0313 01:18:07.276280 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23fbbe97-906a-4bce-9ab0-bf633d4f9dd7-kube-api-access-99zzf" (OuterVolumeSpecName: "kube-api-access-99zzf") pod "23fbbe97-906a-4bce-9ab0-bf633d4f9dd7" (UID: "23fbbe97-906a-4bce-9ab0-bf633d4f9dd7"). InnerVolumeSpecName "kube-api-access-99zzf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:18:07.287730 master-0 kubenswrapper[7110]: I0313 01:18:07.287698 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-lnm8m" Mar 13 01:18:07.303953 master-0 kubenswrapper[7110]: W0313 01:18:07.303906 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod372c7cb7_30fc_4575_8bb6_b1d68d9ffe68.slice/crio-87a4e35d8f4be284a1dfb072b64498826ea9b5084519871759d3cc21b8a8778d WatchSource:0}: Error finding container 87a4e35d8f4be284a1dfb072b64498826ea9b5084519871759d3cc21b8a8778d: Status 404 returned error can't find the container with id 87a4e35d8f4be284a1dfb072b64498826ea9b5084519871759d3cc21b8a8778d Mar 13 01:18:07.374605 master-0 kubenswrapper[7110]: I0313 01:18:07.374533 7110 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23fbbe97-906a-4bce-9ab0-bf633d4f9dd7-config\") on node \"master-0\" DevicePath \"\"" Mar 13 01:18:07.374605 master-0 kubenswrapper[7110]: I0313 01:18:07.374588 7110 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/23fbbe97-906a-4bce-9ab0-bf633d4f9dd7-machine-approver-tls\") on node \"master-0\" DevicePath \"\"" Mar 13 01:18:07.374605 master-0 kubenswrapper[7110]: I0313 01:18:07.374611 7110 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/23fbbe97-906a-4bce-9ab0-bf633d4f9dd7-auth-proxy-config\") on node \"master-0\" DevicePath \"\"" Mar 13 01:18:07.374905 master-0 kubenswrapper[7110]: I0313 01:18:07.374652 7110 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99zzf\" (UniqueName: \"kubernetes.io/projected/23fbbe97-906a-4bce-9ab0-bf633d4f9dd7-kube-api-access-99zzf\") on node \"master-0\" DevicePath \"\"" Mar 13 01:18:07.404362 master-0 kubenswrapper[7110]: I0313 01:18:07.404257 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-pmkpj" Mar 13 01:18:07.427067 master-0 kubenswrapper[7110]: W0313 01:18:07.427002 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5956ebf_01e4_4d4c_ae6d_b0995905c6d3.slice/crio-ba1138e615d45d30d4c79c1da4c10f696c22641dfec7a0ee3bb68226581ee820 WatchSource:0}: Error finding container ba1138e615d45d30d4c79c1da4c10f696c22641dfec7a0ee3bb68226581ee820: Status 404 returned error can't find the container with id ba1138e615d45d30d4c79c1da4c10f696c22641dfec7a0ee3bb68226581ee820 Mar 13 01:18:07.509774 master-0 kubenswrapper[7110]: I0313 01:18:07.509708 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n9fb\" (UniqueName: \"kubernetes.io/projected/2679c6e1-11c1-450c-b03a-30d7ee59ff6f-kube-api-access-4n9fb\") pod \"multus-admission-controller-7769569c45-zm2jl\" (UID: \"2679c6e1-11c1-450c-b03a-30d7ee59ff6f\") " pod="openshift-multus/multus-admission-controller-7769569c45-zm2jl" Mar 13 01:18:07.689711 master-0 kubenswrapper[7110]: I0313 01:18:07.688689 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-7769569c45-zm2jl" Mar 13 01:18:07.762675 master-0 kubenswrapper[7110]: I0313 01:18:07.760885 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-84bf6db4f9-zt229"] Mar 13 01:18:07.762675 master-0 kubenswrapper[7110]: E0313 01:18:07.761073 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23fbbe97-906a-4bce-9ab0-bf633d4f9dd7" containerName="machine-approver-controller" Mar 13 01:18:07.762675 master-0 kubenswrapper[7110]: I0313 01:18:07.761084 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="23fbbe97-906a-4bce-9ab0-bf633d4f9dd7" containerName="machine-approver-controller" Mar 13 01:18:07.762675 master-0 kubenswrapper[7110]: E0313 01:18:07.761105 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23fbbe97-906a-4bce-9ab0-bf633d4f9dd7" containerName="kube-rbac-proxy" Mar 13 01:18:07.762675 master-0 kubenswrapper[7110]: I0313 01:18:07.761111 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="23fbbe97-906a-4bce-9ab0-bf633d4f9dd7" containerName="kube-rbac-proxy" Mar 13 01:18:07.762675 master-0 kubenswrapper[7110]: I0313 01:18:07.761195 7110 memory_manager.go:354] "RemoveStaleState removing state" podUID="23fbbe97-906a-4bce-9ab0-bf633d4f9dd7" containerName="machine-approver-controller" Mar 13 01:18:07.762675 master-0 kubenswrapper[7110]: I0313 01:18:07.761211 7110 memory_manager.go:354] "RemoveStaleState removing state" podUID="23fbbe97-906a-4bce-9ab0-bf633d4f9dd7" containerName="kube-rbac-proxy" Mar 13 01:18:07.762675 master-0 kubenswrapper[7110]: I0313 01:18:07.761712 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-zt229" Mar 13 01:18:07.771732 master-0 kubenswrapper[7110]: I0313 01:18:07.766870 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-n4b44" Mar 13 01:18:07.771732 master-0 kubenswrapper[7110]: I0313 01:18:07.767074 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 13 01:18:07.771732 master-0 kubenswrapper[7110]: I0313 01:18:07.767224 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 13 01:18:07.773082 master-0 kubenswrapper[7110]: I0313 01:18:07.772892 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 13 01:18:07.785922 master-0 kubenswrapper[7110]: I0313 01:18:07.783919 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-84bf6db4f9-zt229"] Mar 13 01:18:07.879685 master-0 kubenswrapper[7110]: I0313 01:18:07.879639 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqsdm\" (UniqueName: \"kubernetes.io/projected/1c24e17c-8bd9-4c23-9876-6f31c9da5cd1-kube-api-access-dqsdm\") pod \"machine-api-operator-84bf6db4f9-zt229\" (UID: \"1c24e17c-8bd9-4c23-9876-6f31c9da5cd1\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-zt229" Mar 13 01:18:07.879685 master-0 kubenswrapper[7110]: I0313 01:18:07.879690 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1c24e17c-8bd9-4c23-9876-6f31c9da5cd1-images\") pod \"machine-api-operator-84bf6db4f9-zt229\" (UID: \"1c24e17c-8bd9-4c23-9876-6f31c9da5cd1\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-zt229" Mar 13 01:18:07.880245 
master-0 kubenswrapper[7110]: I0313 01:18:07.879723 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c24e17c-8bd9-4c23-9876-6f31c9da5cd1-config\") pod \"machine-api-operator-84bf6db4f9-zt229\" (UID: \"1c24e17c-8bd9-4c23-9876-6f31c9da5cd1\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-zt229" Mar 13 01:18:07.880245 master-0 kubenswrapper[7110]: I0313 01:18:07.879744 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1c24e17c-8bd9-4c23-9876-6f31c9da5cd1-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-zt229\" (UID: \"1c24e17c-8bd9-4c23-9876-6f31c9da5cd1\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-zt229" Mar 13 01:18:07.986474 master-0 kubenswrapper[7110]: I0313 01:18:07.984014 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqsdm\" (UniqueName: \"kubernetes.io/projected/1c24e17c-8bd9-4c23-9876-6f31c9da5cd1-kube-api-access-dqsdm\") pod \"machine-api-operator-84bf6db4f9-zt229\" (UID: \"1c24e17c-8bd9-4c23-9876-6f31c9da5cd1\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-zt229" Mar 13 01:18:07.986474 master-0 kubenswrapper[7110]: I0313 01:18:07.984072 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1c24e17c-8bd9-4c23-9876-6f31c9da5cd1-images\") pod \"machine-api-operator-84bf6db4f9-zt229\" (UID: \"1c24e17c-8bd9-4c23-9876-6f31c9da5cd1\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-zt229" Mar 13 01:18:07.986474 master-0 kubenswrapper[7110]: I0313 01:18:07.984112 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c24e17c-8bd9-4c23-9876-6f31c9da5cd1-config\") pod 
\"machine-api-operator-84bf6db4f9-zt229\" (UID: \"1c24e17c-8bd9-4c23-9876-6f31c9da5cd1\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-zt229" Mar 13 01:18:07.986474 master-0 kubenswrapper[7110]: I0313 01:18:07.984139 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1c24e17c-8bd9-4c23-9876-6f31c9da5cd1-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-zt229\" (UID: \"1c24e17c-8bd9-4c23-9876-6f31c9da5cd1\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-zt229" Mar 13 01:18:07.986474 master-0 kubenswrapper[7110]: I0313 01:18:07.984997 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1c24e17c-8bd9-4c23-9876-6f31c9da5cd1-images\") pod \"machine-api-operator-84bf6db4f9-zt229\" (UID: \"1c24e17c-8bd9-4c23-9876-6f31c9da5cd1\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-zt229" Mar 13 01:18:07.986474 master-0 kubenswrapper[7110]: I0313 01:18:07.985400 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c24e17c-8bd9-4c23-9876-6f31c9da5cd1-config\") pod \"machine-api-operator-84bf6db4f9-zt229\" (UID: \"1c24e17c-8bd9-4c23-9876-6f31c9da5cd1\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-zt229" Mar 13 01:18:07.988534 master-0 kubenswrapper[7110]: I0313 01:18:07.988486 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1c24e17c-8bd9-4c23-9876-6f31c9da5cd1-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-zt229\" (UID: \"1c24e17c-8bd9-4c23-9876-6f31c9da5cd1\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-zt229" Mar 13 01:18:07.993734 master-0 kubenswrapper[7110]: I0313 01:18:07.993700 7110 generic.go:334] "Generic (PLEG): container finished" 
podID="23fbbe97-906a-4bce-9ab0-bf633d4f9dd7" containerID="30d64955a2beef5aa415495589d7999948fcd62753769f6a0658edbc22d9fc94" exitCode=0 Mar 13 01:18:07.993855 master-0 kubenswrapper[7110]: I0313 01:18:07.993842 7110 generic.go:334] "Generic (PLEG): container finished" podID="23fbbe97-906a-4bce-9ab0-bf633d4f9dd7" containerID="d17ca8713f562a177a0df2d89877ffc2f34db776972d8955d2c457d6bbc16214" exitCode=0 Mar 13 01:18:07.993937 master-0 kubenswrapper[7110]: I0313 01:18:07.993762 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jvdz8" event={"ID":"23fbbe97-906a-4bce-9ab0-bf633d4f9dd7","Type":"ContainerDied","Data":"30d64955a2beef5aa415495589d7999948fcd62753769f6a0658edbc22d9fc94"} Mar 13 01:18:07.993974 master-0 kubenswrapper[7110]: I0313 01:18:07.993953 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jvdz8" event={"ID":"23fbbe97-906a-4bce-9ab0-bf633d4f9dd7","Type":"ContainerDied","Data":"d17ca8713f562a177a0df2d89877ffc2f34db776972d8955d2c457d6bbc16214"} Mar 13 01:18:07.994005 master-0 kubenswrapper[7110]: I0313 01:18:07.993786 7110 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jvdz8" Mar 13 01:18:07.994099 master-0 kubenswrapper[7110]: I0313 01:18:07.994072 7110 scope.go:117] "RemoveContainer" containerID="30d64955a2beef5aa415495589d7999948fcd62753769f6a0658edbc22d9fc94" Mar 13 01:18:08.000820 master-0 kubenswrapper[7110]: I0313 01:18:08.000755 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqsdm\" (UniqueName: \"kubernetes.io/projected/1c24e17c-8bd9-4c23-9876-6f31c9da5cd1-kube-api-access-dqsdm\") pod \"machine-api-operator-84bf6db4f9-zt229\" (UID: \"1c24e17c-8bd9-4c23-9876-6f31c9da5cd1\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-zt229" Mar 13 01:18:08.012689 master-0 kubenswrapper[7110]: I0313 01:18:07.993969 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-jvdz8" event={"ID":"23fbbe97-906a-4bce-9ab0-bf633d4f9dd7","Type":"ContainerDied","Data":"1f0f66c570b42159d22d17bc71dbe363049699dde742de12df165fbf93335972"} Mar 13 01:18:08.012689 master-0 kubenswrapper[7110]: I0313 01:18:08.003413 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-lnm8m" event={"ID":"372c7cb7-30fc-4575-8bb6-b1d68d9ffe68","Type":"ContainerStarted","Data":"87a4e35d8f4be284a1dfb072b64498826ea9b5084519871759d3cc21b8a8778d"} Mar 13 01:18:08.012689 master-0 kubenswrapper[7110]: I0313 01:18:08.003436 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmkpj" event={"ID":"e5956ebf-01e4-4d4c-ae6d-b0995905c6d3","Type":"ContainerStarted","Data":"884a792f041730857627043a59c1f19997609417d4c5da58d81f2f5237f075b1"} Mar 13 01:18:08.012689 master-0 kubenswrapper[7110]: I0313 01:18:08.003456 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-pmkpj" event={"ID":"e5956ebf-01e4-4d4c-ae6d-b0995905c6d3","Type":"ContainerStarted","Data":"ecb5a113058bb964fe9c155a5ae981bb13b52682de0601fdacf2ecfeb3ca0ddc"} Mar 13 01:18:08.012689 master-0 kubenswrapper[7110]: I0313 01:18:08.003527 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmkpj" event={"ID":"e5956ebf-01e4-4d4c-ae6d-b0995905c6d3","Type":"ContainerStarted","Data":"ba1138e615d45d30d4c79c1da4c10f696c22641dfec7a0ee3bb68226581ee820"} Mar 13 01:18:08.021795 master-0 kubenswrapper[7110]: I0313 01:18:08.021760 7110 scope.go:117] "RemoveContainer" containerID="d17ca8713f562a177a0df2d89877ffc2f34db776972d8955d2c457d6bbc16214" Mar 13 01:18:08.030666 master-0 kubenswrapper[7110]: I0313 01:18:08.029411 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-pmkpj" podStartSLOduration=2.029396096 podStartE2EDuration="2.029396096s" podCreationTimestamp="2026-03-13 01:18:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:18:08.026410607 +0000 UTC m=+289.311437083" watchObservedRunningTime="2026-03-13 01:18:08.029396096 +0000 UTC m=+289.314422562" Mar 13 01:18:08.045980 master-0 kubenswrapper[7110]: I0313 01:18:08.045926 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-955fcfb87-jvdz8"] Mar 13 01:18:08.049916 master-0 kubenswrapper[7110]: I0313 01:18:08.048187 7110 scope.go:117] "RemoveContainer" containerID="30d64955a2beef5aa415495589d7999948fcd62753769f6a0658edbc22d9fc94" Mar 13 01:18:08.051550 master-0 kubenswrapper[7110]: E0313 01:18:08.051393 7110 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"30d64955a2beef5aa415495589d7999948fcd62753769f6a0658edbc22d9fc94\": container with ID starting with 30d64955a2beef5aa415495589d7999948fcd62753769f6a0658edbc22d9fc94 not found: ID does not exist" containerID="30d64955a2beef5aa415495589d7999948fcd62753769f6a0658edbc22d9fc94" Mar 13 01:18:08.051550 master-0 kubenswrapper[7110]: I0313 01:18:08.051432 7110 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30d64955a2beef5aa415495589d7999948fcd62753769f6a0658edbc22d9fc94"} err="failed to get container status \"30d64955a2beef5aa415495589d7999948fcd62753769f6a0658edbc22d9fc94\": rpc error: code = NotFound desc = could not find container \"30d64955a2beef5aa415495589d7999948fcd62753769f6a0658edbc22d9fc94\": container with ID starting with 30d64955a2beef5aa415495589d7999948fcd62753769f6a0658edbc22d9fc94 not found: ID does not exist" Mar 13 01:18:08.051550 master-0 kubenswrapper[7110]: I0313 01:18:08.051458 7110 scope.go:117] "RemoveContainer" containerID="d17ca8713f562a177a0df2d89877ffc2f34db776972d8955d2c457d6bbc16214" Mar 13 01:18:08.051822 master-0 kubenswrapper[7110]: E0313 01:18:08.051748 7110 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d17ca8713f562a177a0df2d89877ffc2f34db776972d8955d2c457d6bbc16214\": container with ID starting with d17ca8713f562a177a0df2d89877ffc2f34db776972d8955d2c457d6bbc16214 not found: ID does not exist" containerID="d17ca8713f562a177a0df2d89877ffc2f34db776972d8955d2c457d6bbc16214" Mar 13 01:18:08.051822 master-0 kubenswrapper[7110]: I0313 01:18:08.051767 7110 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d17ca8713f562a177a0df2d89877ffc2f34db776972d8955d2c457d6bbc16214"} err="failed to get container status \"d17ca8713f562a177a0df2d89877ffc2f34db776972d8955d2c457d6bbc16214\": rpc error: code = NotFound desc = could not find container 
\"d17ca8713f562a177a0df2d89877ffc2f34db776972d8955d2c457d6bbc16214\": container with ID starting with d17ca8713f562a177a0df2d89877ffc2f34db776972d8955d2c457d6bbc16214 not found: ID does not exist" Mar 13 01:18:08.051822 master-0 kubenswrapper[7110]: I0313 01:18:08.051780 7110 scope.go:117] "RemoveContainer" containerID="30d64955a2beef5aa415495589d7999948fcd62753769f6a0658edbc22d9fc94" Mar 13 01:18:08.052054 master-0 kubenswrapper[7110]: I0313 01:18:08.051989 7110 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30d64955a2beef5aa415495589d7999948fcd62753769f6a0658edbc22d9fc94"} err="failed to get container status \"30d64955a2beef5aa415495589d7999948fcd62753769f6a0658edbc22d9fc94\": rpc error: code = NotFound desc = could not find container \"30d64955a2beef5aa415495589d7999948fcd62753769f6a0658edbc22d9fc94\": container with ID starting with 30d64955a2beef5aa415495589d7999948fcd62753769f6a0658edbc22d9fc94 not found: ID does not exist" Mar 13 01:18:08.052054 master-0 kubenswrapper[7110]: I0313 01:18:08.052010 7110 scope.go:117] "RemoveContainer" containerID="d17ca8713f562a177a0df2d89877ffc2f34db776972d8955d2c457d6bbc16214" Mar 13 01:18:08.052209 master-0 kubenswrapper[7110]: I0313 01:18:08.052181 7110 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d17ca8713f562a177a0df2d89877ffc2f34db776972d8955d2c457d6bbc16214"} err="failed to get container status \"d17ca8713f562a177a0df2d89877ffc2f34db776972d8955d2c457d6bbc16214\": rpc error: code = NotFound desc = could not find container \"d17ca8713f562a177a0df2d89877ffc2f34db776972d8955d2c457d6bbc16214\": container with ID starting with d17ca8713f562a177a0df2d89877ffc2f34db776972d8955d2c457d6bbc16214 not found: ID does not exist" Mar 13 01:18:08.055056 master-0 kubenswrapper[7110]: I0313 01:18:08.054835 7110 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-955fcfb87-jvdz8"] Mar 
13 01:18:08.109650 master-0 kubenswrapper[7110]: I0313 01:18:08.100012 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-zt229" Mar 13 01:18:08.110667 master-0 kubenswrapper[7110]: I0313 01:18:08.110625 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-754bdc9f9d-knlw8"] Mar 13 01:18:08.111542 master-0 kubenswrapper[7110]: I0313 01:18:08.111527 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-knlw8" Mar 13 01:18:08.117649 master-0 kubenswrapper[7110]: I0313 01:18:08.114337 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 13 01:18:08.117649 master-0 kubenswrapper[7110]: I0313 01:18:08.114577 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 13 01:18:08.117649 master-0 kubenswrapper[7110]: I0313 01:18:08.115414 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 13 01:18:08.117649 master-0 kubenswrapper[7110]: I0313 01:18:08.115572 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 13 01:18:08.117649 master-0 kubenswrapper[7110]: I0313 01:18:08.115740 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-bclr4" Mar 13 01:18:08.123645 master-0 kubenswrapper[7110]: I0313 01:18:08.119596 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 13 01:18:08.207068 master-0 kubenswrapper[7110]: I0313 01:18:08.206975 7110 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-multus/multus-admission-controller-7769569c45-zm2jl"] Mar 13 01:18:08.289354 master-0 kubenswrapper[7110]: I0313 01:18:08.289267 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/33dfdc31-54a4-4249-99ae-a15180514659-machine-approver-tls\") pod \"machine-approver-754bdc9f9d-knlw8\" (UID: \"33dfdc31-54a4-4249-99ae-a15180514659\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-knlw8" Mar 13 01:18:08.289354 master-0 kubenswrapper[7110]: I0313 01:18:08.289330 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-898lt\" (UniqueName: \"kubernetes.io/projected/33dfdc31-54a4-4249-99ae-a15180514659-kube-api-access-898lt\") pod \"machine-approver-754bdc9f9d-knlw8\" (UID: \"33dfdc31-54a4-4249-99ae-a15180514659\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-knlw8" Mar 13 01:18:08.289892 master-0 kubenswrapper[7110]: I0313 01:18:08.289362 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/33dfdc31-54a4-4249-99ae-a15180514659-auth-proxy-config\") pod \"machine-approver-754bdc9f9d-knlw8\" (UID: \"33dfdc31-54a4-4249-99ae-a15180514659\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-knlw8" Mar 13 01:18:08.290226 master-0 kubenswrapper[7110]: I0313 01:18:08.290086 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33dfdc31-54a4-4249-99ae-a15180514659-config\") pod \"machine-approver-754bdc9f9d-knlw8\" (UID: \"33dfdc31-54a4-4249-99ae-a15180514659\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-knlw8" Mar 13 01:18:08.391882 master-0 kubenswrapper[7110]: I0313 01:18:08.391780 7110 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33dfdc31-54a4-4249-99ae-a15180514659-config\") pod \"machine-approver-754bdc9f9d-knlw8\" (UID: \"33dfdc31-54a4-4249-99ae-a15180514659\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-knlw8" Mar 13 01:18:08.391882 master-0 kubenswrapper[7110]: I0313 01:18:08.391841 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/33dfdc31-54a4-4249-99ae-a15180514659-machine-approver-tls\") pod \"machine-approver-754bdc9f9d-knlw8\" (UID: \"33dfdc31-54a4-4249-99ae-a15180514659\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-knlw8" Mar 13 01:18:08.391882 master-0 kubenswrapper[7110]: I0313 01:18:08.391876 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-898lt\" (UniqueName: \"kubernetes.io/projected/33dfdc31-54a4-4249-99ae-a15180514659-kube-api-access-898lt\") pod \"machine-approver-754bdc9f9d-knlw8\" (UID: \"33dfdc31-54a4-4249-99ae-a15180514659\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-knlw8" Mar 13 01:18:08.392100 master-0 kubenswrapper[7110]: I0313 01:18:08.391896 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/33dfdc31-54a4-4249-99ae-a15180514659-auth-proxy-config\") pod \"machine-approver-754bdc9f9d-knlw8\" (UID: \"33dfdc31-54a4-4249-99ae-a15180514659\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-knlw8" Mar 13 01:18:08.396481 master-0 kubenswrapper[7110]: I0313 01:18:08.395300 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33dfdc31-54a4-4249-99ae-a15180514659-config\") pod \"machine-approver-754bdc9f9d-knlw8\" (UID: 
\"33dfdc31-54a4-4249-99ae-a15180514659\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-knlw8" Mar 13 01:18:08.397608 master-0 kubenswrapper[7110]: I0313 01:18:08.397572 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/33dfdc31-54a4-4249-99ae-a15180514659-auth-proxy-config\") pod \"machine-approver-754bdc9f9d-knlw8\" (UID: \"33dfdc31-54a4-4249-99ae-a15180514659\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-knlw8" Mar 13 01:18:08.401513 master-0 kubenswrapper[7110]: I0313 01:18:08.401313 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/33dfdc31-54a4-4249-99ae-a15180514659-machine-approver-tls\") pod \"machine-approver-754bdc9f9d-knlw8\" (UID: \"33dfdc31-54a4-4249-99ae-a15180514659\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-knlw8" Mar 13 01:18:08.417264 master-0 kubenswrapper[7110]: I0313 01:18:08.415654 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-898lt\" (UniqueName: \"kubernetes.io/projected/33dfdc31-54a4-4249-99ae-a15180514659-kube-api-access-898lt\") pod \"machine-approver-754bdc9f9d-knlw8\" (UID: \"33dfdc31-54a4-4249-99ae-a15180514659\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-knlw8" Mar 13 01:18:08.465893 master-0 kubenswrapper[7110]: I0313 01:18:08.465710 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7c6989d6c4-bxqp2_f9b713fb-64ce-4a01-951c-1f31df62e1ae/authentication-operator/1.log" Mar 13 01:18:08.482502 master-0 kubenswrapper[7110]: I0313 01:18:08.482466 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-knlw8" Mar 13 01:18:08.514410 master-0 kubenswrapper[7110]: W0313 01:18:08.514366 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33dfdc31_54a4_4249_99ae_a15180514659.slice/crio-3d12917c6787f42e0bad6cb6459212ad6bf9ab1345cc33bb530c5d6e353e83a3 WatchSource:0}: Error finding container 3d12917c6787f42e0bad6cb6459212ad6bf9ab1345cc33bb530c5d6e353e83a3: Status 404 returned error can't find the container with id 3d12917c6787f42e0bad6cb6459212ad6bf9ab1345cc33bb530c5d6e353e83a3 Mar 13 01:18:08.562754 master-0 kubenswrapper[7110]: I0313 01:18:08.562717 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-84bf6db4f9-zt229"] Mar 13 01:18:08.576296 master-0 kubenswrapper[7110]: W0313 01:18:08.576269 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c24e17c_8bd9_4c23_9876_6f31c9da5cd1.slice/crio-c50effcf2f5fc891493cebf4edbffd60b3e47884c10225842c0382c940f83e36 WatchSource:0}: Error finding container c50effcf2f5fc891493cebf4edbffd60b3e47884c10225842c0382c940f83e36: Status 404 returned error can't find the container with id c50effcf2f5fc891493cebf4edbffd60b3e47884c10225842c0382c940f83e36 Mar 13 01:18:08.670620 master-0 kubenswrapper[7110]: I0313 01:18:08.670530 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7c6989d6c4-bxqp2_f9b713fb-64ce-4a01-951c-1f31df62e1ae/authentication-operator/2.log" Mar 13 01:18:08.920258 master-0 kubenswrapper[7110]: I0313 01:18:08.920202 7110 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23fbbe97-906a-4bce-9ab0-bf633d4f9dd7" path="/var/lib/kubelet/pods/23fbbe97-906a-4bce-9ab0-bf633d4f9dd7/volumes" Mar 13 01:18:09.004036 master-0 kubenswrapper[7110]: 
I0313 01:18:09.003957 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-knlw8" event={"ID":"33dfdc31-54a4-4249-99ae-a15180514659","Type":"ContainerStarted","Data":"4c32d9e4412752c87d1ee4cd650cdb2cb23242fbddb597c5152f445f6b017bdc"} Mar 13 01:18:09.004036 master-0 kubenswrapper[7110]: I0313 01:18:09.004000 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-knlw8" event={"ID":"33dfdc31-54a4-4249-99ae-a15180514659","Type":"ContainerStarted","Data":"3d12917c6787f42e0bad6cb6459212ad6bf9ab1345cc33bb530c5d6e353e83a3"} Mar 13 01:18:09.005940 master-0 kubenswrapper[7110]: I0313 01:18:09.005917 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-zt229" event={"ID":"1c24e17c-8bd9-4c23-9876-6f31c9da5cd1","Type":"ContainerStarted","Data":"1848ae0dc4f507d3dbd623fb664c63190062c00de5b6fa781f847cd37341986e"} Mar 13 01:18:09.006002 master-0 kubenswrapper[7110]: I0313 01:18:09.005942 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-zt229" event={"ID":"1c24e17c-8bd9-4c23-9876-6f31c9da5cd1","Type":"ContainerStarted","Data":"c50effcf2f5fc891493cebf4edbffd60b3e47884c10225842c0382c940f83e36"} Mar 13 01:18:09.007671 master-0 kubenswrapper[7110]: I0313 01:18:09.007610 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-7769569c45-zm2jl" event={"ID":"2679c6e1-11c1-450c-b03a-30d7ee59ff6f","Type":"ContainerStarted","Data":"c75774122f86364cbb2037bd356bbbad5d4c638ff9b0ea91ed1c5645b4b0a5e2"} Mar 13 01:18:09.007730 master-0 kubenswrapper[7110]: I0313 01:18:09.007672 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-7769569c45-zm2jl" 
event={"ID":"2679c6e1-11c1-450c-b03a-30d7ee59ff6f","Type":"ContainerStarted","Data":"c8bfd0ac311b1657eee9f6e460d76c3b97867545a733bd30edc441dfb4a82394"} Mar 13 01:18:09.007730 master-0 kubenswrapper[7110]: I0313 01:18:09.007688 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-7769569c45-zm2jl" event={"ID":"2679c6e1-11c1-450c-b03a-30d7ee59ff6f","Type":"ContainerStarted","Data":"3ac90b7e141885c73870d9744a9126cb8648da1eed1822b13844b812ecb6dc82"} Mar 13 01:18:09.032774 master-0 kubenswrapper[7110]: I0313 01:18:09.032533 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-7769569c45-zm2jl" podStartSLOduration=3.032511512 podStartE2EDuration="3.032511512s" podCreationTimestamp="2026-03-13 01:18:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:18:09.027012808 +0000 UTC m=+290.312039274" watchObservedRunningTime="2026-03-13 01:18:09.032511512 +0000 UTC m=+290.317537978" Mar 13 01:18:09.069414 master-0 kubenswrapper[7110]: I0313 01:18:09.069377 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-78885b775b-jrrjv_57eb2020-1560-4352-8b86-76db59de933a/fix-audit-permissions/0.log" Mar 13 01:18:09.072771 master-0 kubenswrapper[7110]: I0313 01:18:09.072113 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-tq7n6"] Mar 13 01:18:09.072771 master-0 kubenswrapper[7110]: I0313 01:18:09.072355 7110 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/multus-admission-controller-8d675b596-tq7n6" podUID="2bd94289-7109-4419-9a51-bd289082b9f5" containerName="multus-admission-controller" containerID="cri-o://cabc0d0daac0ff5b74f3e06882a4fbae2aaadefec9cc5e2009027b89d0897c41" gracePeriod=30 Mar 13 01:18:09.072771 
master-0 kubenswrapper[7110]: I0313 01:18:09.072421 7110 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/multus-admission-controller-8d675b596-tq7n6" podUID="2bd94289-7109-4419-9a51-bd289082b9f5" containerName="kube-rbac-proxy" containerID="cri-o://2feba70148d30ccf8b16cda1bcbd40be5871412538af7a3c70d1cbdb9b96ea4e" gracePeriod=30 Mar 13 01:18:09.263056 master-0 kubenswrapper[7110]: I0313 01:18:09.262957 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-78885b775b-jrrjv_57eb2020-1560-4352-8b86-76db59de933a/oauth-apiserver/0.log" Mar 13 01:18:09.469745 master-0 kubenswrapper[7110]: I0313 01:18:09.469701 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5884b9cd56-h4kkj_21cbea73-f779-43e4-b5ba-d6fa06275d34/etcd-operator/0.log" Mar 13 01:18:09.650293 master-0 kubenswrapper[7110]: I0313 01:18:09.648501 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"] Mar 13 01:18:09.650293 master-0 kubenswrapper[7110]: I0313 01:18:09.649297 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 13 01:18:09.653645 master-0 kubenswrapper[7110]: I0313 01:18:09.653378 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"] Mar 13 01:18:09.695388 master-0 kubenswrapper[7110]: I0313 01:18:09.694413 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 13 01:18:09.696285 master-0 kubenswrapper[7110]: I0313 01:18:09.696259 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-k6d2j" Mar 13 01:18:09.718581 master-0 kubenswrapper[7110]: I0313 01:18:09.718493 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/690f916b-6f87-42d9-8168-392a9177bee9-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"690f916b-6f87-42d9-8168-392a9177bee9\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 13 01:18:09.718718 master-0 kubenswrapper[7110]: I0313 01:18:09.718599 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/690f916b-6f87-42d9-8168-392a9177bee9-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"690f916b-6f87-42d9-8168-392a9177bee9\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 13 01:18:09.718718 master-0 kubenswrapper[7110]: I0313 01:18:09.718706 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/690f916b-6f87-42d9-8168-392a9177bee9-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"690f916b-6f87-42d9-8168-392a9177bee9\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 13 01:18:09.718984 master-0 
kubenswrapper[7110]: I0313 01:18:09.718952 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5884b9cd56-h4kkj_21cbea73-f779-43e4-b5ba-d6fa06275d34/etcd-operator/1.log" Mar 13 01:18:09.824372 master-0 kubenswrapper[7110]: I0313 01:18:09.821714 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/690f916b-6f87-42d9-8168-392a9177bee9-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"690f916b-6f87-42d9-8168-392a9177bee9\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 13 01:18:09.824372 master-0 kubenswrapper[7110]: I0313 01:18:09.821813 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/690f916b-6f87-42d9-8168-392a9177bee9-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"690f916b-6f87-42d9-8168-392a9177bee9\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 13 01:18:09.824372 master-0 kubenswrapper[7110]: I0313 01:18:09.821901 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/690f916b-6f87-42d9-8168-392a9177bee9-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"690f916b-6f87-42d9-8168-392a9177bee9\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 13 01:18:09.824372 master-0 kubenswrapper[7110]: I0313 01:18:09.822100 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/690f916b-6f87-42d9-8168-392a9177bee9-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"690f916b-6f87-42d9-8168-392a9177bee9\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 13 01:18:09.824372 master-0 kubenswrapper[7110]: I0313 01:18:09.822151 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/690f916b-6f87-42d9-8168-392a9177bee9-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"690f916b-6f87-42d9-8168-392a9177bee9\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 13 01:18:09.842439 master-0 kubenswrapper[7110]: I0313 01:18:09.842387 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/690f916b-6f87-42d9-8168-392a9177bee9-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"690f916b-6f87-42d9-8168-392a9177bee9\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 13 01:18:09.868341 master-0 kubenswrapper[7110]: I0313 01:18:09.868288 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/setup/0.log" Mar 13 01:18:09.951413 master-0 kubenswrapper[7110]: I0313 01:18:09.951117 7110 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k52lh" Mar 13 01:18:09.951413 master-0 kubenswrapper[7110]: I0313 01:18:09.951246 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k52lh" Mar 13 01:18:09.981160 master-0 kubenswrapper[7110]: I0313 01:18:09.981070 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z254g" Mar 13 01:18:09.981160 master-0 kubenswrapper[7110]: I0313 01:18:09.981119 7110 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-z254g" Mar 13 01:18:10.018611 master-0 kubenswrapper[7110]: I0313 01:18:10.018544 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-knlw8" 
event={"ID":"33dfdc31-54a4-4249-99ae-a15180514659","Type":"ContainerStarted","Data":"d4128612049d2903866c89ea3ac616fb89c5c7677c3ff52ca9d870714f95087e"} Mar 13 01:18:10.020575 master-0 kubenswrapper[7110]: I0313 01:18:10.020364 7110 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bbptx" Mar 13 01:18:10.021868 master-0 kubenswrapper[7110]: I0313 01:18:10.020589 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bbptx" Mar 13 01:18:10.022548 master-0 kubenswrapper[7110]: I0313 01:18:10.022432 7110 generic.go:334] "Generic (PLEG): container finished" podID="2bd94289-7109-4419-9a51-bd289082b9f5" containerID="2feba70148d30ccf8b16cda1bcbd40be5871412538af7a3c70d1cbdb9b96ea4e" exitCode=0 Mar 13 01:18:10.024338 master-0 kubenswrapper[7110]: I0313 01:18:10.023081 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-tq7n6" event={"ID":"2bd94289-7109-4419-9a51-bd289082b9f5","Type":"ContainerDied","Data":"2feba70148d30ccf8b16cda1bcbd40be5871412538af7a3c70d1cbdb9b96ea4e"} Mar 13 01:18:10.025807 master-0 kubenswrapper[7110]: I0313 01:18:10.025684 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 13 01:18:10.028883 master-0 kubenswrapper[7110]: I0313 01:18:10.028528 7110 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z254g" Mar 13 01:18:10.053132 master-0 kubenswrapper[7110]: I0313 01:18:10.050000 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-knlw8" podStartSLOduration=2.049978364 podStartE2EDuration="2.049978364s" podCreationTimestamp="2026-03-13 01:18:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:18:10.040689501 +0000 UTC m=+291.325715987" watchObservedRunningTime="2026-03-13 01:18:10.049978364 +0000 UTC m=+291.335004830" Mar 13 01:18:10.065065 master-0 kubenswrapper[7110]: I0313 01:18:10.064788 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-ensure-env-vars/0.log" Mar 13 01:18:10.080798 master-0 kubenswrapper[7110]: I0313 01:18:10.080528 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z254g" Mar 13 01:18:10.087031 master-0 kubenswrapper[7110]: I0313 01:18:10.086989 7110 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bbptx" Mar 13 01:18:10.263134 master-0 kubenswrapper[7110]: I0313 01:18:10.263030 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-resources-copy/0.log" Mar 13 01:18:10.466925 master-0 kubenswrapper[7110]: I0313 01:18:10.466884 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcdctl/0.log" Mar 13 01:18:10.672511 
master-0 kubenswrapper[7110]: I0313 01:18:10.672457 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd/0.log" Mar 13 01:18:10.864011 master-0 kubenswrapper[7110]: I0313 01:18:10.863966 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-metrics/0.log" Mar 13 01:18:10.996876 master-0 kubenswrapper[7110]: I0313 01:18:10.996736 7110 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k52lh" podUID="3f9728b4-4e1e-4165-a276-3daa00e95839" containerName="registry-server" probeResult="failure" output=< Mar 13 01:18:10.996876 master-0 kubenswrapper[7110]: timeout: failed to connect service ":50051" within 1s Mar 13 01:18:10.996876 master-0 kubenswrapper[7110]: > Mar 13 01:18:11.095652 master-0 kubenswrapper[7110]: I0313 01:18:11.089281 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-readyz/0.log" Mar 13 01:18:11.104898 master-0 kubenswrapper[7110]: I0313 01:18:11.104544 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bbptx" Mar 13 01:18:11.266729 master-0 kubenswrapper[7110]: I0313 01:18:11.265899 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-rev/0.log" Mar 13 01:18:11.295278 master-0 kubenswrapper[7110]: I0313 01:18:11.293493 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-ff46b7bdf-g7wfh"] Mar 13 01:18:11.295278 master-0 kubenswrapper[7110]: I0313 01:18:11.295022 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-g7wfh" Mar 13 01:18:11.295278 master-0 kubenswrapper[7110]: I0313 01:18:11.298971 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-ps7fb" Mar 13 01:18:11.295278 master-0 kubenswrapper[7110]: I0313 01:18:11.299168 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 13 01:18:11.320419 master-0 kubenswrapper[7110]: I0313 01:18:11.320202 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-ff46b7bdf-g7wfh"] Mar 13 01:18:11.348834 master-0 kubenswrapper[7110]: I0313 01:18:11.348725 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/93871019-3d0c-4081-9afe-19b6dd108ec6-proxy-tls\") pod \"machine-config-controller-ff46b7bdf-g7wfh\" (UID: \"93871019-3d0c-4081-9afe-19b6dd108ec6\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-g7wfh" Mar 13 01:18:11.348834 master-0 kubenswrapper[7110]: I0313 01:18:11.348831 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/93871019-3d0c-4081-9afe-19b6dd108ec6-mcc-auth-proxy-config\") pod \"machine-config-controller-ff46b7bdf-g7wfh\" (UID: \"93871019-3d0c-4081-9afe-19b6dd108ec6\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-g7wfh" Mar 13 01:18:11.349097 master-0 kubenswrapper[7110]: I0313 01:18:11.348922 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9nfk\" (UniqueName: \"kubernetes.io/projected/93871019-3d0c-4081-9afe-19b6dd108ec6-kube-api-access-s9nfk\") pod 
\"machine-config-controller-ff46b7bdf-g7wfh\" (UID: \"93871019-3d0c-4081-9afe-19b6dd108ec6\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-g7wfh" Mar 13 01:18:11.449806 master-0 kubenswrapper[7110]: I0313 01:18:11.449705 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/93871019-3d0c-4081-9afe-19b6dd108ec6-mcc-auth-proxy-config\") pod \"machine-config-controller-ff46b7bdf-g7wfh\" (UID: \"93871019-3d0c-4081-9afe-19b6dd108ec6\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-g7wfh" Mar 13 01:18:11.449993 master-0 kubenswrapper[7110]: I0313 01:18:11.449913 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9nfk\" (UniqueName: \"kubernetes.io/projected/93871019-3d0c-4081-9afe-19b6dd108ec6-kube-api-access-s9nfk\") pod \"machine-config-controller-ff46b7bdf-g7wfh\" (UID: \"93871019-3d0c-4081-9afe-19b6dd108ec6\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-g7wfh" Mar 13 01:18:11.450722 master-0 kubenswrapper[7110]: I0313 01:18:11.450676 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/93871019-3d0c-4081-9afe-19b6dd108ec6-proxy-tls\") pod \"machine-config-controller-ff46b7bdf-g7wfh\" (UID: \"93871019-3d0c-4081-9afe-19b6dd108ec6\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-g7wfh" Mar 13 01:18:11.451240 master-0 kubenswrapper[7110]: I0313 01:18:11.451196 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/93871019-3d0c-4081-9afe-19b6dd108ec6-mcc-auth-proxy-config\") pod \"machine-config-controller-ff46b7bdf-g7wfh\" (UID: \"93871019-3d0c-4081-9afe-19b6dd108ec6\") " 
pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-g7wfh" Mar 13 01:18:11.467479 master-0 kubenswrapper[7110]: I0313 01:18:11.467453 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/93871019-3d0c-4081-9afe-19b6dd108ec6-proxy-tls\") pod \"machine-config-controller-ff46b7bdf-g7wfh\" (UID: \"93871019-3d0c-4081-9afe-19b6dd108ec6\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-g7wfh" Mar 13 01:18:11.470428 master-0 kubenswrapper[7110]: I0313 01:18:11.470397 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9nfk\" (UniqueName: \"kubernetes.io/projected/93871019-3d0c-4081-9afe-19b6dd108ec6-kube-api-access-s9nfk\") pod \"machine-config-controller-ff46b7bdf-g7wfh\" (UID: \"93871019-3d0c-4081-9afe-19b6dd108ec6\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-g7wfh" Mar 13 01:18:11.472344 master-0 kubenswrapper[7110]: I0313 01:18:11.472317 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-1-master-0_36b2d6ee-3ae7-444b-b327-f024a8a06ab7/installer/0.log" Mar 13 01:18:11.624851 master-0 kubenswrapper[7110]: I0313 01:18:11.623947 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-g7wfh" Mar 13 01:18:11.665623 master-0 kubenswrapper[7110]: I0313 01:18:11.665567 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-68bd585b-mlslx_b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35/kube-apiserver-operator/0.log" Mar 13 01:18:11.866799 master-0 kubenswrapper[7110]: I0313 01:18:11.866762 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-68bd585b-mlslx_b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35/kube-apiserver-operator/1.log" Mar 13 01:18:11.984384 master-0 kubenswrapper[7110]: I0313 01:18:11.984338 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"] Mar 13 01:18:11.992931 master-0 kubenswrapper[7110]: W0313 01:18:11.992859 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod690f916b_6f87_42d9_8168_392a9177bee9.slice/crio-e1a9f8a9b9f4f12cdb02f0899fd6c6ca89ad08ee7dce767d1b7f5c4f67fb87f9 WatchSource:0}: Error finding container e1a9f8a9b9f4f12cdb02f0899fd6c6ca89ad08ee7dce767d1b7f5c4f67fb87f9: Status 404 returned error can't find the container with id e1a9f8a9b9f4f12cdb02f0899fd6c6ca89ad08ee7dce767d1b7f5c4f67fb87f9 Mar 13 01:18:12.029389 master-0 kubenswrapper[7110]: I0313 01:18:12.029345 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-ff46b7bdf-g7wfh"] Mar 13 01:18:12.036273 master-0 kubenswrapper[7110]: W0313 01:18:12.036237 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93871019_3d0c_4081_9afe_19b6dd108ec6.slice/crio-4a82c4d1f4dd0703b5ca166acc233f980e0154f3140fea6bcb51b2baef68cc9c WatchSource:0}: Error finding container 
4a82c4d1f4dd0703b5ca166acc233f980e0154f3140fea6bcb51b2baef68cc9c: Status 404 returned error can't find the container with id 4a82c4d1f4dd0703b5ca166acc233f980e0154f3140fea6bcb51b2baef68cc9c Mar 13 01:18:12.037570 master-0 kubenswrapper[7110]: I0313 01:18:12.037525 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-lnm8m" event={"ID":"372c7cb7-30fc-4575-8bb6-b1d68d9ffe68","Type":"ContainerStarted","Data":"904f10ba4b3a84cf1955d48de30060073ce2c5688519de186736725302931d40"} Mar 13 01:18:12.039165 master-0 kubenswrapper[7110]: I0313 01:18:12.039129 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" event={"ID":"690f916b-6f87-42d9-8168-392a9177bee9","Type":"ContainerStarted","Data":"e1a9f8a9b9f4f12cdb02f0899fd6c6ca89ad08ee7dce767d1b7f5c4f67fb87f9"} Mar 13 01:18:12.063921 master-0 kubenswrapper[7110]: I0313 01:18:12.063885 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_bootstrap-kube-apiserver-master-0_5f77c8e18b751d90bc0dfe2d4e304050/setup/0.log" Mar 13 01:18:12.264805 master-0 kubenswrapper[7110]: I0313 01:18:12.264712 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_bootstrap-kube-apiserver-master-0_5f77c8e18b751d90bc0dfe2d4e304050/kube-apiserver/0.log" Mar 13 01:18:12.279424 master-0 kubenswrapper[7110]: I0313 01:18:12.279375 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-kjwgg"] Mar 13 01:18:12.280219 master-0 kubenswrapper[7110]: I0313 01:18:12.280182 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-kjwgg" Mar 13 01:18:12.285721 master-0 kubenswrapper[7110]: I0313 01:18:12.284550 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Mar 13 01:18:12.285721 master-0 kubenswrapper[7110]: I0313 01:18:12.284763 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7c67b67d47-5fv6h"] Mar 13 01:18:12.285721 master-0 kubenswrapper[7110]: I0313 01:18:12.285563 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-5fv6h" Mar 13 01:18:12.292104 master-0 kubenswrapper[7110]: I0313 01:18:12.291124 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-79f8cd6fdd-cnrhm"] Mar 13 01:18:12.292104 master-0 kubenswrapper[7110]: I0313 01:18:12.291936 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" Mar 13 01:18:12.298142 master-0 kubenswrapper[7110]: I0313 01:18:12.295167 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-hdx2d"] Mar 13 01:18:12.298142 master-0 kubenswrapper[7110]: I0313 01:18:12.298039 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 13 01:18:12.298289 master-0 kubenswrapper[7110]: I0313 01:18:12.298278 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 13 01:18:12.298663 master-0 kubenswrapper[7110]: I0313 01:18:12.298384 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 13 01:18:12.298663 master-0 kubenswrapper[7110]: I0313 01:18:12.298498 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 13 01:18:12.298663 master-0 kubenswrapper[7110]: I0313 01:18:12.298603 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 13 01:18:12.299520 master-0 kubenswrapper[7110]: I0313 01:18:12.298819 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 13 01:18:12.299520 master-0 kubenswrapper[7110]: I0313 01:18:12.299081 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-kjwgg"] Mar 13 01:18:12.299520 master-0 kubenswrapper[7110]: I0313 01:18:12.299185 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-hdx2d" Mar 13 01:18:12.309910 master-0 kubenswrapper[7110]: I0313 01:18:12.309838 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7c67b67d47-5fv6h"] Mar 13 01:18:12.322450 master-0 kubenswrapper[7110]: I0313 01:18:12.322408 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-jf79b" Mar 13 01:18:12.322645 master-0 kubenswrapper[7110]: I0313 01:18:12.322612 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Mar 13 01:18:12.376898 master-0 kubenswrapper[7110]: I0313 01:18:12.376839 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ac2a4c90-32db-4464-8c47-acbcafbcd5d0-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-hdx2d\" (UID: \"ac2a4c90-32db-4464-8c47-acbcafbcd5d0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-hdx2d" Mar 13 01:18:12.376898 master-0 kubenswrapper[7110]: I0313 01:18:12.376880 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0671fdd0-b358-40f9-ae49-2c5a9004edb3-metrics-certs\") pod \"router-default-79f8cd6fdd-cnrhm\" (UID: \"0671fdd0-b358-40f9-ae49-2c5a9004edb3\") " pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" Mar 13 01:18:12.376898 master-0 kubenswrapper[7110]: I0313 01:18:12.376898 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7mn4\" (UniqueName: \"kubernetes.io/projected/48375ae2-d4b4-4db4-b832-3e3db1834fb9-kube-api-access-q7mn4\") pod \"network-check-source-7c67b67d47-5fv6h\" (UID: \"48375ae2-d4b4-4db4-b832-3e3db1834fb9\") " pod="openshift-network-diagnostics/network-check-source-7c67b67d47-5fv6h" Mar 13 
01:18:12.377142 master-0 kubenswrapper[7110]: I0313 01:18:12.376926 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0671fdd0-b358-40f9-ae49-2c5a9004edb3-stats-auth\") pod \"router-default-79f8cd6fdd-cnrhm\" (UID: \"0671fdd0-b358-40f9-ae49-2c5a9004edb3\") " pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" Mar 13 01:18:12.377142 master-0 kubenswrapper[7110]: I0313 01:18:12.376944 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0671fdd0-b358-40f9-ae49-2c5a9004edb3-service-ca-bundle\") pod \"router-default-79f8cd6fdd-cnrhm\" (UID: \"0671fdd0-b358-40f9-ae49-2c5a9004edb3\") " pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" Mar 13 01:18:12.377142 master-0 kubenswrapper[7110]: I0313 01:18:12.376960 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftn5x\" (UniqueName: \"kubernetes.io/projected/0671fdd0-b358-40f9-ae49-2c5a9004edb3-kube-api-access-ftn5x\") pod \"router-default-79f8cd6fdd-cnrhm\" (UID: \"0671fdd0-b358-40f9-ae49-2c5a9004edb3\") " pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" Mar 13 01:18:12.377142 master-0 kubenswrapper[7110]: I0313 01:18:12.376978 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0671fdd0-b358-40f9-ae49-2c5a9004edb3-default-certificate\") pod \"router-default-79f8cd6fdd-cnrhm\" (UID: \"0671fdd0-b358-40f9-ae49-2c5a9004edb3\") " pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" Mar 13 01:18:12.377142 master-0 kubenswrapper[7110]: I0313 01:18:12.376996 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw5hs\" (UniqueName: 
\"kubernetes.io/projected/ac2a4c90-32db-4464-8c47-acbcafbcd5d0-kube-api-access-sw5hs\") pod \"cni-sysctl-allowlist-ds-hdx2d\" (UID: \"ac2a4c90-32db-4464-8c47-acbcafbcd5d0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-hdx2d" Mar 13 01:18:12.377283 master-0 kubenswrapper[7110]: I0313 01:18:12.377147 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/ac2a4c90-32db-4464-8c47-acbcafbcd5d0-ready\") pod \"cni-sysctl-allowlist-ds-hdx2d\" (UID: \"ac2a4c90-32db-4464-8c47-acbcafbcd5d0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-hdx2d" Mar 13 01:18:12.377315 master-0 kubenswrapper[7110]: I0313 01:18:12.377182 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ac2a4c90-32db-4464-8c47-acbcafbcd5d0-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-hdx2d\" (UID: \"ac2a4c90-32db-4464-8c47-acbcafbcd5d0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-hdx2d" Mar 13 01:18:12.377504 master-0 kubenswrapper[7110]: I0313 01:18:12.377472 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/4d313ee4-3bb9-44a9-ad80-8e00540ef1e7-tls-certificates\") pod \"prometheus-operator-admission-webhook-8464df8497-kjwgg\" (UID: \"4d313ee4-3bb9-44a9-ad80-8e00540ef1e7\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-kjwgg" Mar 13 01:18:12.478568 master-0 kubenswrapper[7110]: I0313 01:18:12.478505 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftn5x\" (UniqueName: \"kubernetes.io/projected/0671fdd0-b358-40f9-ae49-2c5a9004edb3-kube-api-access-ftn5x\") pod \"router-default-79f8cd6fdd-cnrhm\" (UID: \"0671fdd0-b358-40f9-ae49-2c5a9004edb3\") " pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" Mar 
13 01:18:12.478568 master-0 kubenswrapper[7110]: I0313 01:18:12.478550 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0671fdd0-b358-40f9-ae49-2c5a9004edb3-service-ca-bundle\") pod \"router-default-79f8cd6fdd-cnrhm\" (UID: \"0671fdd0-b358-40f9-ae49-2c5a9004edb3\") " pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" Mar 13 01:18:12.478568 master-0 kubenswrapper[7110]: I0313 01:18:12.478575 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0671fdd0-b358-40f9-ae49-2c5a9004edb3-default-certificate\") pod \"router-default-79f8cd6fdd-cnrhm\" (UID: \"0671fdd0-b358-40f9-ae49-2c5a9004edb3\") " pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" Mar 13 01:18:12.478826 master-0 kubenswrapper[7110]: I0313 01:18:12.478593 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw5hs\" (UniqueName: \"kubernetes.io/projected/ac2a4c90-32db-4464-8c47-acbcafbcd5d0-kube-api-access-sw5hs\") pod \"cni-sysctl-allowlist-ds-hdx2d\" (UID: \"ac2a4c90-32db-4464-8c47-acbcafbcd5d0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-hdx2d" Mar 13 01:18:12.478826 master-0 kubenswrapper[7110]: I0313 01:18:12.478627 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/ac2a4c90-32db-4464-8c47-acbcafbcd5d0-ready\") pod \"cni-sysctl-allowlist-ds-hdx2d\" (UID: \"ac2a4c90-32db-4464-8c47-acbcafbcd5d0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-hdx2d" Mar 13 01:18:12.478826 master-0 kubenswrapper[7110]: I0313 01:18:12.478660 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ac2a4c90-32db-4464-8c47-acbcafbcd5d0-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-hdx2d\" (UID: 
\"ac2a4c90-32db-4464-8c47-acbcafbcd5d0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-hdx2d" Mar 13 01:18:12.478826 master-0 kubenswrapper[7110]: I0313 01:18:12.478697 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/4d313ee4-3bb9-44a9-ad80-8e00540ef1e7-tls-certificates\") pod \"prometheus-operator-admission-webhook-8464df8497-kjwgg\" (UID: \"4d313ee4-3bb9-44a9-ad80-8e00540ef1e7\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-kjwgg" Mar 13 01:18:12.478826 master-0 kubenswrapper[7110]: I0313 01:18:12.478743 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ac2a4c90-32db-4464-8c47-acbcafbcd5d0-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-hdx2d\" (UID: \"ac2a4c90-32db-4464-8c47-acbcafbcd5d0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-hdx2d" Mar 13 01:18:12.478826 master-0 kubenswrapper[7110]: I0313 01:18:12.478758 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0671fdd0-b358-40f9-ae49-2c5a9004edb3-metrics-certs\") pod \"router-default-79f8cd6fdd-cnrhm\" (UID: \"0671fdd0-b358-40f9-ae49-2c5a9004edb3\") " pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" Mar 13 01:18:12.478826 master-0 kubenswrapper[7110]: I0313 01:18:12.478778 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7mn4\" (UniqueName: \"kubernetes.io/projected/48375ae2-d4b4-4db4-b832-3e3db1834fb9-kube-api-access-q7mn4\") pod \"network-check-source-7c67b67d47-5fv6h\" (UID: \"48375ae2-d4b4-4db4-b832-3e3db1834fb9\") " pod="openshift-network-diagnostics/network-check-source-7c67b67d47-5fv6h" Mar 13 01:18:12.478826 master-0 kubenswrapper[7110]: I0313 01:18:12.478796 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"stats-auth\" (UniqueName: \"kubernetes.io/secret/0671fdd0-b358-40f9-ae49-2c5a9004edb3-stats-auth\") pod \"router-default-79f8cd6fdd-cnrhm\" (UID: \"0671fdd0-b358-40f9-ae49-2c5a9004edb3\") " pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" Mar 13 01:18:12.479688 master-0 kubenswrapper[7110]: I0313 01:18:12.479658 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ac2a4c90-32db-4464-8c47-acbcafbcd5d0-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-hdx2d\" (UID: \"ac2a4c90-32db-4464-8c47-acbcafbcd5d0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-hdx2d" Mar 13 01:18:12.480080 master-0 kubenswrapper[7110]: I0313 01:18:12.480038 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0671fdd0-b358-40f9-ae49-2c5a9004edb3-service-ca-bundle\") pod \"router-default-79f8cd6fdd-cnrhm\" (UID: \"0671fdd0-b358-40f9-ae49-2c5a9004edb3\") " pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" Mar 13 01:18:12.480820 master-0 kubenswrapper[7110]: I0313 01:18:12.480787 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/ac2a4c90-32db-4464-8c47-acbcafbcd5d0-ready\") pod \"cni-sysctl-allowlist-ds-hdx2d\" (UID: \"ac2a4c90-32db-4464-8c47-acbcafbcd5d0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-hdx2d" Mar 13 01:18:12.481024 master-0 kubenswrapper[7110]: I0313 01:18:12.480994 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ac2a4c90-32db-4464-8c47-acbcafbcd5d0-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-hdx2d\" (UID: \"ac2a4c90-32db-4464-8c47-acbcafbcd5d0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-hdx2d" Mar 13 01:18:12.481857 master-0 kubenswrapper[7110]: I0313 01:18:12.481817 7110 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0671fdd0-b358-40f9-ae49-2c5a9004edb3-stats-auth\") pod \"router-default-79f8cd6fdd-cnrhm\" (UID: \"0671fdd0-b358-40f9-ae49-2c5a9004edb3\") " pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" Mar 13 01:18:12.482541 master-0 kubenswrapper[7110]: I0313 01:18:12.482506 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0671fdd0-b358-40f9-ae49-2c5a9004edb3-metrics-certs\") pod \"router-default-79f8cd6fdd-cnrhm\" (UID: \"0671fdd0-b358-40f9-ae49-2c5a9004edb3\") " pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" Mar 13 01:18:12.482783 master-0 kubenswrapper[7110]: I0313 01:18:12.482753 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/4d313ee4-3bb9-44a9-ad80-8e00540ef1e7-tls-certificates\") pod \"prometheus-operator-admission-webhook-8464df8497-kjwgg\" (UID: \"4d313ee4-3bb9-44a9-ad80-8e00540ef1e7\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-kjwgg" Mar 13 01:18:12.484980 master-0 kubenswrapper[7110]: I0313 01:18:12.484953 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0671fdd0-b358-40f9-ae49-2c5a9004edb3-default-certificate\") pod \"router-default-79f8cd6fdd-cnrhm\" (UID: \"0671fdd0-b358-40f9-ae49-2c5a9004edb3\") " pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" Mar 13 01:18:12.643053 master-0 kubenswrapper[7110]: I0313 01:18:12.642935 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-kjwgg" Mar 13 01:18:12.655366 master-0 kubenswrapper[7110]: I0313 01:18:12.655326 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_bootstrap-kube-apiserver-master-0_5f77c8e18b751d90bc0dfe2d4e304050/kube-apiserver-insecure-readyz/0.log" Mar 13 01:18:12.668301 master-0 kubenswrapper[7110]: I0313 01:18:12.668255 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_47806631-9d60-4658-832d-f160f93f42ea/installer/0.log" Mar 13 01:18:12.671354 master-0 kubenswrapper[7110]: I0313 01:18:12.671289 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7mn4\" (UniqueName: \"kubernetes.io/projected/48375ae2-d4b4-4db4-b832-3e3db1834fb9-kube-api-access-q7mn4\") pod \"network-check-source-7c67b67d47-5fv6h\" (UID: \"48375ae2-d4b4-4db4-b832-3e3db1834fb9\") " pod="openshift-network-diagnostics/network-check-source-7c67b67d47-5fv6h" Mar 13 01:18:12.672158 master-0 kubenswrapper[7110]: I0313 01:18:12.672139 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw5hs\" (UniqueName: \"kubernetes.io/projected/ac2a4c90-32db-4464-8c47-acbcafbcd5d0-kube-api-access-sw5hs\") pod \"cni-sysctl-allowlist-ds-hdx2d\" (UID: \"ac2a4c90-32db-4464-8c47-acbcafbcd5d0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-hdx2d" Mar 13 01:18:12.679402 master-0 kubenswrapper[7110]: I0313 01:18:12.679321 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftn5x\" (UniqueName: \"kubernetes.io/projected/0671fdd0-b358-40f9-ae49-2c5a9004edb3-kube-api-access-ftn5x\") pod \"router-default-79f8cd6fdd-cnrhm\" (UID: \"0671fdd0-b358-40f9-ae49-2c5a9004edb3\") " pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" Mar 13 01:18:12.689178 master-0 kubenswrapper[7110]: I0313 01:18:12.689109 7110 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-hdx2d" Mar 13 01:18:12.717462 master-0 kubenswrapper[7110]: W0313 01:18:12.717382 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac2a4c90_32db_4464_8c47_acbcafbcd5d0.slice/crio-8e6b36b6fc7f96835dacfb4827ca38e8971ee7d14058842c077463b70715d284 WatchSource:0}: Error finding container 8e6b36b6fc7f96835dacfb4827ca38e8971ee7d14058842c077463b70715d284: Status 404 returned error can't find the container with id 8e6b36b6fc7f96835dacfb4827ca38e8971ee7d14058842c077463b70715d284 Mar 13 01:18:12.880152 master-0 kubenswrapper[7110]: I0313 01:18:12.880112 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-3-master-0_29e096ea-ca9d-477b-b0aa-1d10244d51d9/installer/0.log" Mar 13 01:18:12.966392 master-0 kubenswrapper[7110]: I0313 01:18:12.963865 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-5fv6h" Mar 13 01:18:12.977300 master-0 kubenswrapper[7110]: I0313 01:18:12.977257 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" Mar 13 01:18:13.002832 master-0 kubenswrapper[7110]: W0313 01:18:13.002781 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0671fdd0_b358_40f9_ae49_2c5a9004edb3.slice/crio-7d7ebfb8fe2ccbcaa9d3a1c3f2519c36849a66ccd8f79f4fea8a8ff50785f679 WatchSource:0}: Error finding container 7d7ebfb8fe2ccbcaa9d3a1c3f2519c36849a66ccd8f79f4fea8a8ff50785f679: Status 404 returned error can't find the container with id 7d7ebfb8fe2ccbcaa9d3a1c3f2519c36849a66ccd8f79f4fea8a8ff50785f679 Mar 13 01:18:13.053096 master-0 kubenswrapper[7110]: I0313 01:18:13.052815 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-lnm8m" event={"ID":"372c7cb7-30fc-4575-8bb6-b1d68d9ffe68","Type":"ContainerStarted","Data":"7d1b306a00e2f892ec3846df537645839763a85b452540677bdd0f381e63d686"} Mar 13 01:18:13.053096 master-0 kubenswrapper[7110]: I0313 01:18:13.052870 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-lnm8m" event={"ID":"372c7cb7-30fc-4575-8bb6-b1d68d9ffe68","Type":"ContainerStarted","Data":"a2001794f73a162fbfc6dea653fb7764cf71881334967818dd15d887d2c42880"} Mar 13 01:18:13.054023 master-0 kubenswrapper[7110]: I0313 01:18:13.053974 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" event={"ID":"0671fdd0-b358-40f9-ae49-2c5a9004edb3","Type":"ContainerStarted","Data":"7d7ebfb8fe2ccbcaa9d3a1c3f2519c36849a66ccd8f79f4fea8a8ff50785f679"} Mar 13 01:18:13.055726 master-0 kubenswrapper[7110]: I0313 01:18:13.055683 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" 
event={"ID":"690f916b-6f87-42d9-8168-392a9177bee9","Type":"ContainerStarted","Data":"f3c19acecbccf7bd6716e7d44a9b0fc9bb63ca007ca5d04b416b934ef2cbe52c"} Mar 13 01:18:13.057184 master-0 kubenswrapper[7110]: I0313 01:18:13.056888 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-hdx2d" event={"ID":"ac2a4c90-32db-4464-8c47-acbcafbcd5d0","Type":"ContainerStarted","Data":"8e6b36b6fc7f96835dacfb4827ca38e8971ee7d14058842c077463b70715d284"} Mar 13 01:18:13.059445 master-0 kubenswrapper[7110]: I0313 01:18:13.059404 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-g7wfh" event={"ID":"93871019-3d0c-4081-9afe-19b6dd108ec6","Type":"ContainerStarted","Data":"fd1d18f6baa95b22bed3d37f8927776b5c5d98b2e99e7637fd5820559ef6427b"} Mar 13 01:18:13.059534 master-0 kubenswrapper[7110]: I0313 01:18:13.059449 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-g7wfh" event={"ID":"93871019-3d0c-4081-9afe-19b6dd108ec6","Type":"ContainerStarted","Data":"3a10917547442a55f0af92439b344052e9b73ef7d7bdf470aada2ad5959830bc"} Mar 13 01:18:13.059534 master-0 kubenswrapper[7110]: I0313 01:18:13.059460 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-g7wfh" event={"ID":"93871019-3d0c-4081-9afe-19b6dd108ec6","Type":"ContainerStarted","Data":"4a82c4d1f4dd0703b5ca166acc233f980e0154f3140fea6bcb51b2baef68cc9c"} Mar 13 01:18:13.612871 master-0 kubenswrapper[7110]: I0313 01:18:13.609587 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-lnm8m" podStartSLOduration=3.336999167 podStartE2EDuration="7.609562848s" podCreationTimestamp="2026-03-13 01:18:06 +0000 UTC" firstStartedPulling="2026-03-13 
01:18:07.305706627 +0000 UTC m=+288.590733093" lastFinishedPulling="2026-03-13 01:18:11.578270308 +0000 UTC m=+292.863296774" observedRunningTime="2026-03-13 01:18:13.6073459 +0000 UTC m=+294.892372366" watchObservedRunningTime="2026-03-13 01:18:13.609562848 +0000 UTC m=+294.894589354" Mar 13 01:18:13.618863 master-0 kubenswrapper[7110]: I0313 01:18:13.618480 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-kjwgg"] Mar 13 01:18:13.691891 master-0 kubenswrapper[7110]: I0313 01:18:13.691807 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-3-retry-1-master-0_728435e4-9fdb-4fea-9f5b-eb5ff5444da0/installer/0.log" Mar 13 01:18:14.060134 master-0 kubenswrapper[7110]: I0313 01:18:14.060083 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7c67b67d47-5fv6h"] Mar 13 01:18:14.083456 master-0 kubenswrapper[7110]: I0313 01:18:14.083382 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" podStartSLOduration=5.083363991 podStartE2EDuration="5.083363991s" podCreationTimestamp="2026-03-13 01:18:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:18:14.081611975 +0000 UTC m=+295.366638461" watchObservedRunningTime="2026-03-13 01:18:14.083363991 +0000 UTC m=+295.368390457" Mar 13 01:18:14.095276 master-0 kubenswrapper[7110]: I0313 01:18:14.094196 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-86d7cdfdfb-v9pv6_58035e42-37d8-48f6-9861-9b4ce6014119/kube-controller-manager-operator/0.log" Mar 13 01:18:14.102664 master-0 kubenswrapper[7110]: I0313 01:18:14.101481 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/cni-sysctl-allowlist-ds-hdx2d" event={"ID":"ac2a4c90-32db-4464-8c47-acbcafbcd5d0","Type":"ContainerStarted","Data":"3c6ff0fe1111a981e0f82680b651025845befe1e63f28e04da68d060f2f82f77"} Mar 13 01:18:14.102664 master-0 kubenswrapper[7110]: I0313 01:18:14.102456 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-hdx2d" Mar 13 01:18:14.113011 master-0 kubenswrapper[7110]: I0313 01:18:14.109358 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-86d7cdfdfb-v9pv6_58035e42-37d8-48f6-9861-9b4ce6014119/kube-controller-manager-operator/1.log" Mar 13 01:18:14.113011 master-0 kubenswrapper[7110]: I0313 01:18:14.110333 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-kjwgg" event={"ID":"4d313ee4-3bb9-44a9-ad80-8e00540ef1e7","Type":"ContainerStarted","Data":"5f0a23a29ec1be227442f950c7b43af141e31a2152ab46cc286a5229950b1bae"} Mar 13 01:18:14.127781 master-0 kubenswrapper[7110]: I0313 01:18:14.125505 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_bootstrap-kube-controller-manager-master-0_f78c05e1499b533b83f091333d61f045/kube-controller-manager/3.log" Mar 13 01:18:14.160689 master-0 kubenswrapper[7110]: I0313 01:18:14.154258 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-hdx2d" podStartSLOduration=2.154235695 podStartE2EDuration="2.154235695s" podCreationTimestamp="2026-03-13 01:18:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:18:14.151514684 +0000 UTC m=+295.436541150" watchObservedRunningTime="2026-03-13 01:18:14.154235695 +0000 UTC m=+295.439262161" Mar 13 01:18:14.160689 master-0 kubenswrapper[7110]: I0313 01:18:14.154351 
7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-g7wfh" podStartSLOduration=3.154345968 podStartE2EDuration="3.154345968s" podCreationTimestamp="2026-03-13 01:18:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:18:14.118775407 +0000 UTC m=+295.403801873" watchObservedRunningTime="2026-03-13 01:18:14.154345968 +0000 UTC m=+295.439372434" Mar 13 01:18:14.161124 master-0 kubenswrapper[7110]: I0313 01:18:14.161083 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_bootstrap-kube-controller-manager-master-0_f78c05e1499b533b83f091333d61f045/kube-controller-manager/4.log" Mar 13 01:18:14.191598 master-0 kubenswrapper[7110]: I0313 01:18:14.186581 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-hdx2d" Mar 13 01:18:14.264733 master-0 kubenswrapper[7110]: I0313 01:18:14.264689 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_bootstrap-kube-controller-manager-master-0_f78c05e1499b533b83f091333d61f045/cluster-policy-controller/0.log" Mar 13 01:18:14.327532 master-0 kubenswrapper[7110]: I0313 01:18:14.325789 7110 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 13 01:18:14.475690 master-0 kubenswrapper[7110]: I0313 01:18:14.475613 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_bootstrap-kube-scheduler-master-0_a1a56802af72ce1aac6b5077f1695ac0/kube-scheduler/0.log" Mar 13 01:18:14.668479 master-0 kubenswrapper[7110]: I0313 01:18:14.667963 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_bootstrap-kube-scheduler-master-0_a1a56802af72ce1aac6b5077f1695ac0/kube-scheduler/1.log" Mar 13 01:18:14.863793 master-0 kubenswrapper[7110]: 
I0313 01:18:14.863738 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-6-master-0_0b3a64f4-e94f-4916-8c91-a255d987735d/installer/0.log" Mar 13 01:18:15.064775 master-0 kubenswrapper[7110]: I0313 01:18:15.063570 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_revision-pruner-6-master-0_9c9c1c81-eae9-4481-9870-b598deb1dcac/pruner/0.log" Mar 13 01:18:15.117832 master-0 kubenswrapper[7110]: I0313 01:18:15.117784 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-5fv6h" event={"ID":"48375ae2-d4b4-4db4-b832-3e3db1834fb9","Type":"ContainerStarted","Data":"413763d6750693e98f10ee26b7d5a65b72db3afb864d04cf8231231c2007edea"} Mar 13 01:18:15.117974 master-0 kubenswrapper[7110]: I0313 01:18:15.117841 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-5fv6h" event={"ID":"48375ae2-d4b4-4db4-b832-3e3db1834fb9","Type":"ContainerStarted","Data":"19ce38027ae9e3d0076b8c83191fabde1e4e81b393c760835578ba3bc36b41b2"} Mar 13 01:18:15.140800 master-0 kubenswrapper[7110]: I0313 01:18:15.140679 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-5fv6h" podStartSLOduration=352.140650865 podStartE2EDuration="5m52.140650865s" podCreationTimestamp="2026-03-13 01:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:18:15.135183832 +0000 UTC m=+296.420210298" watchObservedRunningTime="2026-03-13 01:18:15.140650865 +0000 UTC m=+296.425677341" Mar 13 01:18:15.268841 master-0 kubenswrapper[7110]: I0313 01:18:15.268777 7110 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5c74bfc494-wbmqn_22587300-2448-4862-9fd8-68197d17a9f2/kube-scheduler-operator-container/0.log" Mar 13 01:18:15.462999 master-0 kubenswrapper[7110]: I0313 01:18:15.462952 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5c74bfc494-wbmqn_22587300-2448-4862-9fd8-68197d17a9f2/kube-scheduler-operator-container/1.log" Mar 13 01:18:15.888377 master-0 kubenswrapper[7110]: I0313 01:18:15.888257 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xn5t5_4738c93d-62e6-44ce-a289-e646b9302e71/egress-router-binary-copy/0.log" Mar 13 01:18:15.897506 master-0 kubenswrapper[7110]: I0313 01:18:15.896602 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xn5t5_4738c93d-62e6-44ce-a289-e646b9302e71/cni-plugins/0.log" Mar 13 01:18:16.426407 master-0 kubenswrapper[7110]: I0313 01:18:16.424735 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xn5t5_4738c93d-62e6-44ce-a289-e646b9302e71/bond-cni-plugin/0.log" Mar 13 01:18:16.723652 master-0 kubenswrapper[7110]: I0313 01:18:16.723066 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xn5t5_4738c93d-62e6-44ce-a289-e646b9302e71/routeoverride-cni/0.log" Mar 13 01:18:16.769611 master-0 kubenswrapper[7110]: I0313 01:18:16.769524 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xn5t5_4738c93d-62e6-44ce-a289-e646b9302e71/whereabouts-cni-bincopy/0.log" Mar 13 01:18:17.107832 master-0 kubenswrapper[7110]: I0313 01:18:17.107072 7110 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xn5t5_4738c93d-62e6-44ce-a289-e646b9302e71/whereabouts-cni/0.log" Mar 13 01:18:17.117462 master-0 kubenswrapper[7110]: I0313 01:18:17.117397 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xn5t5_4738c93d-62e6-44ce-a289-e646b9302e71/kube-multus-additional-cni-plugins/0.log" Mar 13 01:18:17.124560 master-0 kubenswrapper[7110]: I0313 01:18:17.124535 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-8d675b596-tq7n6_2bd94289-7109-4419-9a51-bd289082b9f5/multus-admission-controller/0.log" Mar 13 01:18:17.203657 master-0 kubenswrapper[7110]: I0313 01:18:17.200459 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-4gpcz"] Mar 13 01:18:17.203657 master-0 kubenswrapper[7110]: I0313 01:18:17.201297 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-4gpcz" Mar 13 01:18:17.203657 master-0 kubenswrapper[7110]: I0313 01:18:17.202986 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qdr99" Mar 13 01:18:17.204055 master-0 kubenswrapper[7110]: I0313 01:18:17.203927 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 13 01:18:17.204103 master-0 kubenswrapper[7110]: I0313 01:18:17.204036 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 13 01:18:17.263447 master-0 kubenswrapper[7110]: I0313 01:18:17.263390 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-8d675b596-tq7n6_2bd94289-7109-4419-9a51-bd289082b9f5/kube-rbac-proxy/0.log" Mar 13 
01:18:17.290166 master-0 kubenswrapper[7110]: I0313 01:18:17.290124 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/edda0d03-fdb2-4130-8f73-8057efd5815c-node-bootstrap-token\") pod \"machine-config-server-4gpcz\" (UID: \"edda0d03-fdb2-4130-8f73-8057efd5815c\") " pod="openshift-machine-config-operator/machine-config-server-4gpcz" Mar 13 01:18:17.290322 master-0 kubenswrapper[7110]: I0313 01:18:17.290170 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/edda0d03-fdb2-4130-8f73-8057efd5815c-certs\") pod \"machine-config-server-4gpcz\" (UID: \"edda0d03-fdb2-4130-8f73-8057efd5815c\") " pod="openshift-machine-config-operator/machine-config-server-4gpcz" Mar 13 01:18:17.290322 master-0 kubenswrapper[7110]: I0313 01:18:17.290266 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5gmv\" (UniqueName: \"kubernetes.io/projected/edda0d03-fdb2-4130-8f73-8057efd5815c-kube-api-access-h5gmv\") pod \"machine-config-server-4gpcz\" (UID: \"edda0d03-fdb2-4130-8f73-8057efd5815c\") " pod="openshift-machine-config-operator/machine-config-server-4gpcz" Mar 13 01:18:17.391261 master-0 kubenswrapper[7110]: I0313 01:18:17.391138 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5gmv\" (UniqueName: \"kubernetes.io/projected/edda0d03-fdb2-4130-8f73-8057efd5815c-kube-api-access-h5gmv\") pod \"machine-config-server-4gpcz\" (UID: \"edda0d03-fdb2-4130-8f73-8057efd5815c\") " pod="openshift-machine-config-operator/machine-config-server-4gpcz" Mar 13 01:18:17.391261 master-0 kubenswrapper[7110]: I0313 01:18:17.391213 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/edda0d03-fdb2-4130-8f73-8057efd5815c-node-bootstrap-token\") pod \"machine-config-server-4gpcz\" (UID: \"edda0d03-fdb2-4130-8f73-8057efd5815c\") " pod="openshift-machine-config-operator/machine-config-server-4gpcz" Mar 13 01:18:17.391261 master-0 kubenswrapper[7110]: I0313 01:18:17.391237 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/edda0d03-fdb2-4130-8f73-8057efd5815c-certs\") pod \"machine-config-server-4gpcz\" (UID: \"edda0d03-fdb2-4130-8f73-8057efd5815c\") " pod="openshift-machine-config-operator/machine-config-server-4gpcz" Mar 13 01:18:17.397702 master-0 kubenswrapper[7110]: I0313 01:18:17.397415 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/edda0d03-fdb2-4130-8f73-8057efd5815c-node-bootstrap-token\") pod \"machine-config-server-4gpcz\" (UID: \"edda0d03-fdb2-4130-8f73-8057efd5815c\") " pod="openshift-machine-config-operator/machine-config-server-4gpcz" Mar 13 01:18:17.398393 master-0 kubenswrapper[7110]: I0313 01:18:17.398371 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/edda0d03-fdb2-4130-8f73-8057efd5815c-certs\") pod \"machine-config-server-4gpcz\" (UID: \"edda0d03-fdb2-4130-8f73-8057efd5815c\") " pod="openshift-machine-config-operator/machine-config-server-4gpcz" Mar 13 01:18:17.413146 master-0 kubenswrapper[7110]: I0313 01:18:17.413095 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5gmv\" (UniqueName: \"kubernetes.io/projected/edda0d03-fdb2-4130-8f73-8057efd5815c-kube-api-access-h5gmv\") pod \"machine-config-server-4gpcz\" (UID: \"edda0d03-fdb2-4130-8f73-8057efd5815c\") " pod="openshift-machine-config-operator/machine-config-server-4gpcz" Mar 13 01:18:17.464765 master-0 kubenswrapper[7110]: I0313 01:18:17.464209 7110 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-multus_multus-rvt5h_2937cbe2-3125-4c3f-96f8-2febeb5942cc/kube-multus/0.log" Mar 13 01:18:17.518672 master-0 kubenswrapper[7110]: I0313 01:18:17.518616 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-4gpcz" Mar 13 01:18:17.663683 master-0 kubenswrapper[7110]: I0313 01:18:17.663651 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-zh5fh_e68ab3cb-c372-45d9-a758-beaf4c213714/network-metrics-daemon/0.log" Mar 13 01:18:17.861567 master-0 kubenswrapper[7110]: I0313 01:18:17.861506 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-zh5fh_e68ab3cb-c372-45d9-a758-beaf4c213714/kube-rbac-proxy/0.log" Mar 13 01:18:18.269602 master-0 kubenswrapper[7110]: I0313 01:18:18.269516 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-control-plane-66b55d57d-cjmvd_1308fba1-a50d-48b3-b272-7bef44727b7f/ovnkube-cluster-manager/0.log" Mar 13 01:18:18.464610 master-0 kubenswrapper[7110]: I0313 01:18:18.464535 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-control-plane-66b55d57d-cjmvd_1308fba1-a50d-48b3-b272-7bef44727b7f/kube-rbac-proxy/0.log" Mar 13 01:18:18.671183 master-0 kubenswrapper[7110]: I0313 01:18:18.671130 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-control-plane-66b55d57d-cjmvd_1308fba1-a50d-48b3-b272-7bef44727b7f/ovnkube-cluster-manager/1.log" Mar 13 01:18:18.868421 master-0 kubenswrapper[7110]: I0313 01:18:18.868374 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v56ct_4edb3e1a-9082-4fc2-ae6f-99d49c078a34/kubecfg-setup/0.log" Mar 13 01:18:19.062563 master-0 kubenswrapper[7110]: I0313 01:18:19.062452 7110 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v56ct_4edb3e1a-9082-4fc2-ae6f-99d49c078a34/ovn-controller/0.log" Mar 13 01:18:19.106320 master-0 kubenswrapper[7110]: I0313 01:18:19.106247 7110 scope.go:117] "RemoveContainer" containerID="2d1ba7ec4846defd3b04a175ca5a3b9796ffce1a2ede0d1ea47e737fb6974a90" Mar 13 01:18:19.264842 master-0 kubenswrapper[7110]: I0313 01:18:19.263990 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v56ct_4edb3e1a-9082-4fc2-ae6f-99d49c078a34/ovn-acl-logging/0.log" Mar 13 01:18:19.465395 master-0 kubenswrapper[7110]: I0313 01:18:19.464713 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v56ct_4edb3e1a-9082-4fc2-ae6f-99d49c078a34/kube-rbac-proxy-node/0.log" Mar 13 01:18:19.664285 master-0 kubenswrapper[7110]: I0313 01:18:19.662484 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v56ct_4edb3e1a-9082-4fc2-ae6f-99d49c078a34/kube-rbac-proxy-ovn-metrics/0.log" Mar 13 01:18:19.864658 master-0 kubenswrapper[7110]: I0313 01:18:19.864586 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v56ct_4edb3e1a-9082-4fc2-ae6f-99d49c078a34/northd/0.log" Mar 13 01:18:20.013446 master-0 kubenswrapper[7110]: W0313 01:18:20.013399 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedda0d03_fdb2_4130_8f73_8057efd5815c.slice/crio-df3c13cf240e0897eb807bf8a91255037c9e2476ca44bc2ccd1aad2e71463498 WatchSource:0}: Error finding container df3c13cf240e0897eb807bf8a91255037c9e2476ca44bc2ccd1aad2e71463498: Status 404 returned error can't find the container with id df3c13cf240e0897eb807bf8a91255037c9e2476ca44bc2ccd1aad2e71463498 Mar 13 01:18:20.018264 master-0 kubenswrapper[7110]: I0313 01:18:20.018226 7110 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-k52lh" Mar 13 01:18:20.075762 master-0 kubenswrapper[7110]: I0313 01:18:20.075715 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-k52lh" Mar 13 01:18:20.173821 master-0 kubenswrapper[7110]: I0313 01:18:20.173733 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-4gpcz" event={"ID":"edda0d03-fdb2-4130-8f73-8057efd5815c","Type":"ContainerStarted","Data":"df3c13cf240e0897eb807bf8a91255037c9e2476ca44bc2ccd1aad2e71463498"} Mar 13 01:18:21.182534 master-0 kubenswrapper[7110]: I0313 01:18:21.182464 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-kjwgg" event={"ID":"4d313ee4-3bb9-44a9-ad80-8e00540ef1e7","Type":"ContainerStarted","Data":"a0beff0548dc89e3eddfbc4d73bdac22ebf75fa5e75296d06029066e2708e943"} Mar 13 01:18:21.184678 master-0 kubenswrapper[7110]: I0313 01:18:21.184612 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-4gpcz" event={"ID":"edda0d03-fdb2-4130-8f73-8057efd5815c","Type":"ContainerStarted","Data":"581f4518b2025f29d1124b712492b62d765adabd9251be91282ecbf98e44533f"} Mar 13 01:18:21.749223 master-0 kubenswrapper[7110]: I0313 01:18:21.749180 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v56ct_4edb3e1a-9082-4fc2-ae6f-99d49c078a34/nbdb/0.log" Mar 13 01:18:21.764705 master-0 kubenswrapper[7110]: I0313 01:18:21.762703 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v56ct_4edb3e1a-9082-4fc2-ae6f-99d49c078a34/sbdb/0.log" Mar 13 01:18:21.773410 master-0 kubenswrapper[7110]: I0313 01:18:21.773380 7110 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v56ct_4edb3e1a-9082-4fc2-ae6f-99d49c078a34/ovnkube-controller/0.log" Mar 13 01:18:21.784758 master-0 kubenswrapper[7110]: I0313 01:18:21.784718 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-7c67b67d47-5fv6h_48375ae2-d4b4-4db4-b832-3e3db1834fb9/check-endpoints/0.log" Mar 13 01:18:21.806789 master-0 kubenswrapper[7110]: I0313 01:18:21.806749 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-xs8pt_d5456c8b-3c98-4824-8700-a04e9c12fb2e/network-check-target-container/0.log" Mar 13 01:18:21.825556 master-0 kubenswrapper[7110]: I0313 01:18:21.825512 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-znqwc_d1153bb3-30dd-458f-b0a4-c05358a8b3f8/approver/0.log" Mar 13 01:18:21.839733 master-0 kubenswrapper[7110]: I0313 01:18:21.839674 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-znqwc_d1153bb3-30dd-458f-b0a4-c05358a8b3f8/webhook/0.log" Mar 13 01:18:21.851602 master-0 kubenswrapper[7110]: I0313 01:18:21.851567 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-znqwc_d1153bb3-30dd-458f-b0a4-c05358a8b3f8/approver/1.log" Mar 13 01:18:21.863797 master-0 kubenswrapper[7110]: I0313 01:18:21.863754 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-qclwv_46662e51-44af-4732-83a1-9509a579b373/iptables-alerter/0.log" Mar 13 01:18:22.067039 master-0 kubenswrapper[7110]: I0313 01:18:22.066919 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7c649bf6d4-bdc4j_bfc49699-9428-4bff-804d-da0e60551759/network-operator/0.log" Mar 13 01:18:22.192795 master-0 kubenswrapper[7110]: I0313 
01:18:22.192721 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" event={"ID":"0671fdd0-b358-40f9-ae49-2c5a9004edb3","Type":"ContainerStarted","Data":"1e0fcce31d2e2166ce78dfb55bc928ed4de8e1614fc84db1f62c529d76a3c284"} Mar 13 01:18:22.194265 master-0 kubenswrapper[7110]: I0313 01:18:22.194227 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-zt229" event={"ID":"1c24e17c-8bd9-4c23-9876-6f31c9da5cd1","Type":"ContainerStarted","Data":"9920e6a0f414f5524ee15784a68a40cecc25e4656d0ff55f7b9dbec55000a82e"} Mar 13 01:18:22.232752 master-0 kubenswrapper[7110]: I0313 01:18:22.232678 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" podStartSLOduration=254.21576334 podStartE2EDuration="4m21.232613859s" podCreationTimestamp="2026-03-13 01:14:01 +0000 UTC" firstStartedPulling="2026-03-13 01:18:13.005047857 +0000 UTC m=+294.290074313" lastFinishedPulling="2026-03-13 01:18:20.021898366 +0000 UTC m=+301.306924832" observedRunningTime="2026-03-13 01:18:22.230588786 +0000 UTC m=+303.515615252" watchObservedRunningTime="2026-03-13 01:18:22.232613859 +0000 UTC m=+303.517640335" Mar 13 01:18:22.277801 master-0 kubenswrapper[7110]: I0313 01:18:22.277719 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7c649bf6d4-bdc4j_bfc49699-9428-4bff-804d-da0e60551759/network-operator/1.log" Mar 13 01:18:22.300968 master-0 kubenswrapper[7110]: I0313 01:18:22.300873 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-zt229" podStartSLOduration=4.005141708 podStartE2EDuration="15.300852574s" podCreationTimestamp="2026-03-13 01:18:07 +0000 UTC" firstStartedPulling="2026-03-13 01:18:08.715658445 +0000 UTC m=+290.000684911" lastFinishedPulling="2026-03-13 
01:18:20.011369311 +0000 UTC m=+301.296395777" observedRunningTime="2026-03-13 01:18:22.25866474 +0000 UTC m=+303.543691216" watchObservedRunningTime="2026-03-13 01:18:22.300852574 +0000 UTC m=+303.585879050" Mar 13 01:18:22.301506 master-0 kubenswrapper[7110]: I0313 01:18:22.301457 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-kjwgg" podStartSLOduration=219.914413954 podStartE2EDuration="3m46.30145158s" podCreationTimestamp="2026-03-13 01:14:36 +0000 UTC" firstStartedPulling="2026-03-13 01:18:13.636340389 +0000 UTC m=+294.921366855" lastFinishedPulling="2026-03-13 01:18:20.023378015 +0000 UTC m=+301.308404481" observedRunningTime="2026-03-13 01:18:22.298168924 +0000 UTC m=+303.583195390" watchObservedRunningTime="2026-03-13 01:18:22.30145158 +0000 UTC m=+303.586478056" Mar 13 01:18:22.320562 master-0 kubenswrapper[7110]: I0313 01:18:22.320309 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-4gpcz" podStartSLOduration=5.320285802 podStartE2EDuration="5.320285802s" podCreationTimestamp="2026-03-13 01:18:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:18:22.317075708 +0000 UTC m=+303.602102184" watchObservedRunningTime="2026-03-13 01:18:22.320285802 +0000 UTC m=+303.605312258" Mar 13 01:18:22.467785 master-0 kubenswrapper[7110]: I0313 01:18:22.467658 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-799b6db4d7-mvmt2_486c7e33-3dd8-4a98-87e3-8216ee2e05ef/openshift-apiserver-operator/0.log" Mar 13 01:18:22.644827 master-0 kubenswrapper[7110]: I0313 01:18:22.644680 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-kjwgg" 
Mar 13 01:18:22.651866 master-0 kubenswrapper[7110]: I0313 01:18:22.651807 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-kjwgg" Mar 13 01:18:22.666800 master-0 kubenswrapper[7110]: I0313 01:18:22.666755 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-799b6db4d7-mvmt2_486c7e33-3dd8-4a98-87e3-8216ee2e05ef/openshift-apiserver-operator/1.log" Mar 13 01:18:22.861251 master-0 kubenswrapper[7110]: I0313 01:18:22.861193 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver_apiserver-69c74d8d69-jpj8z_738ebdcd-b78b-495a-b8f2-84af11a7d35c/fix-audit-permissions/0.log" Mar 13 01:18:22.977982 master-0 kubenswrapper[7110]: I0313 01:18:22.977888 7110 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" Mar 13 01:18:22.977982 master-0 kubenswrapper[7110]: I0313 01:18:22.977936 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" Mar 13 01:18:22.980858 master-0 kubenswrapper[7110]: I0313 01:18:22.980798 7110 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-cnrhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 01:18:22.980858 master-0 kubenswrapper[7110]: [-]has-synced failed: reason withheld Mar 13 01:18:22.980858 master-0 kubenswrapper[7110]: [+]process-running ok Mar 13 01:18:22.980858 master-0 kubenswrapper[7110]: healthz check failed Mar 13 01:18:22.981054 master-0 kubenswrapper[7110]: I0313 01:18:22.980893 7110 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" podUID="0671fdd0-b358-40f9-ae49-2c5a9004edb3" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 01:18:23.068106 master-0 kubenswrapper[7110]: I0313 01:18:23.068046 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver_apiserver-69c74d8d69-jpj8z_738ebdcd-b78b-495a-b8f2-84af11a7d35c/openshift-apiserver/0.log" Mar 13 01:18:23.275480 master-0 kubenswrapper[7110]: I0313 01:18:23.275321 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver_apiserver-69c74d8d69-jpj8z_738ebdcd-b78b-495a-b8f2-84af11a7d35c/openshift-apiserver-check-endpoints/0.log" Mar 13 01:18:23.466537 master-0 kubenswrapper[7110]: I0313 01:18:23.466465 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5884b9cd56-h4kkj_21cbea73-f779-43e4-b5ba-d6fa06275d34/etcd-operator/0.log" Mar 13 01:18:23.603863 master-0 kubenswrapper[7110]: I0313 01:18:23.603696 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5ff8674d55-6fh8b"] Mar 13 01:18:23.604564 master-0 kubenswrapper[7110]: I0313 01:18:23.604534 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5ff8674d55-6fh8b" Mar 13 01:18:23.607188 master-0 kubenswrapper[7110]: I0313 01:18:23.607150 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-vdkmw" Mar 13 01:18:23.607301 master-0 kubenswrapper[7110]: I0313 01:18:23.607183 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Mar 13 01:18:23.607493 master-0 kubenswrapper[7110]: I0313 01:18:23.607464 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Mar 13 01:18:23.607582 master-0 kubenswrapper[7110]: I0313 01:18:23.607543 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Mar 13 01:18:23.633468 master-0 kubenswrapper[7110]: I0313 01:18:23.633420 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5ff8674d55-6fh8b"] Mar 13 01:18:23.670042 master-0 kubenswrapper[7110]: I0313 01:18:23.669967 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5884b9cd56-h4kkj_21cbea73-f779-43e4-b5ba-d6fa06275d34/etcd-operator/1.log" Mar 13 01:18:23.724711 master-0 kubenswrapper[7110]: I0313 01:18:23.724658 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/e2b5ad07-fa01-4330-9dce-6da3444657ab-prometheus-operator-tls\") pod \"prometheus-operator-5ff8674d55-6fh8b\" (UID: \"e2b5ad07-fa01-4330-9dce-6da3444657ab\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-6fh8b" Mar 13 01:18:23.724711 master-0 kubenswrapper[7110]: I0313 01:18:23.724714 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e2b5ad07-fa01-4330-9dce-6da3444657ab-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5ff8674d55-6fh8b\" (UID: \"e2b5ad07-fa01-4330-9dce-6da3444657ab\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-6fh8b" Mar 13 01:18:23.724948 master-0 kubenswrapper[7110]: I0313 01:18:23.724864 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e2b5ad07-fa01-4330-9dce-6da3444657ab-metrics-client-ca\") pod \"prometheus-operator-5ff8674d55-6fh8b\" (UID: \"e2b5ad07-fa01-4330-9dce-6da3444657ab\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-6fh8b" Mar 13 01:18:23.725073 master-0 kubenswrapper[7110]: I0313 01:18:23.725042 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtrb2\" (UniqueName: \"kubernetes.io/projected/e2b5ad07-fa01-4330-9dce-6da3444657ab-kube-api-access-rtrb2\") pod \"prometheus-operator-5ff8674d55-6fh8b\" (UID: \"e2b5ad07-fa01-4330-9dce-6da3444657ab\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-6fh8b" Mar 13 01:18:23.826538 master-0 kubenswrapper[7110]: I0313 01:18:23.826477 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/e2b5ad07-fa01-4330-9dce-6da3444657ab-prometheus-operator-tls\") pod \"prometheus-operator-5ff8674d55-6fh8b\" (UID: \"e2b5ad07-fa01-4330-9dce-6da3444657ab\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-6fh8b" Mar 13 01:18:23.826538 master-0 kubenswrapper[7110]: I0313 01:18:23.826538 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/e2b5ad07-fa01-4330-9dce-6da3444657ab-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5ff8674d55-6fh8b\" (UID: \"e2b5ad07-fa01-4330-9dce-6da3444657ab\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-6fh8b" Mar 13 01:18:23.826871 master-0 kubenswrapper[7110]: I0313 01:18:23.826718 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e2b5ad07-fa01-4330-9dce-6da3444657ab-metrics-client-ca\") pod \"prometheus-operator-5ff8674d55-6fh8b\" (UID: \"e2b5ad07-fa01-4330-9dce-6da3444657ab\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-6fh8b" Mar 13 01:18:23.826907 master-0 kubenswrapper[7110]: I0313 01:18:23.826889 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtrb2\" (UniqueName: \"kubernetes.io/projected/e2b5ad07-fa01-4330-9dce-6da3444657ab-kube-api-access-rtrb2\") pod \"prometheus-operator-5ff8674d55-6fh8b\" (UID: \"e2b5ad07-fa01-4330-9dce-6da3444657ab\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-6fh8b" Mar 13 01:18:23.829773 master-0 kubenswrapper[7110]: I0313 01:18:23.827558 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e2b5ad07-fa01-4330-9dce-6da3444657ab-metrics-client-ca\") pod \"prometheus-operator-5ff8674d55-6fh8b\" (UID: \"e2b5ad07-fa01-4330-9dce-6da3444657ab\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-6fh8b" Mar 13 01:18:23.829773 master-0 kubenswrapper[7110]: I0313 01:18:23.829749 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e2b5ad07-fa01-4330-9dce-6da3444657ab-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5ff8674d55-6fh8b\" (UID: \"e2b5ad07-fa01-4330-9dce-6da3444657ab\") " 
pod="openshift-monitoring/prometheus-operator-5ff8674d55-6fh8b" Mar 13 01:18:23.833644 master-0 kubenswrapper[7110]: I0313 01:18:23.832228 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/e2b5ad07-fa01-4330-9dce-6da3444657ab-prometheus-operator-tls\") pod \"prometheus-operator-5ff8674d55-6fh8b\" (UID: \"e2b5ad07-fa01-4330-9dce-6da3444657ab\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-6fh8b" Mar 13 01:18:23.852032 master-0 kubenswrapper[7110]: I0313 01:18:23.851986 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtrb2\" (UniqueName: \"kubernetes.io/projected/e2b5ad07-fa01-4330-9dce-6da3444657ab-kube-api-access-rtrb2\") pod \"prometheus-operator-5ff8674d55-6fh8b\" (UID: \"e2b5ad07-fa01-4330-9dce-6da3444657ab\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-6fh8b" Mar 13 01:18:23.865308 master-0 kubenswrapper[7110]: I0313 01:18:23.865230 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_catalog-operator-7d9c49f57b-h46pz_7f35cc1e-3376-4dbd-b215-2a32bf62cc71/catalog-operator/0.log" Mar 13 01:18:23.924623 master-0 kubenswrapper[7110]: I0313 01:18:23.924568 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5ff8674d55-6fh8b" Mar 13 01:18:23.980139 master-0 kubenswrapper[7110]: I0313 01:18:23.980095 7110 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-cnrhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 01:18:23.980139 master-0 kubenswrapper[7110]: [-]has-synced failed: reason withheld Mar 13 01:18:23.980139 master-0 kubenswrapper[7110]: [+]process-running ok Mar 13 01:18:23.980139 master-0 kubenswrapper[7110]: healthz check failed Mar 13 01:18:23.980386 master-0 kubenswrapper[7110]: I0313 01:18:23.980156 7110 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" podUID="0671fdd0-b358-40f9-ae49-2c5a9004edb3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 01:18:24.087911 master-0 kubenswrapper[7110]: I0313 01:18:24.087850 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_olm-operator-d64cfc9db-8l7kq_6c88187c-d011-4043-a6d3-4a8a7ec4e204/olm-operator/0.log" Mar 13 01:18:24.262786 master-0 kubenswrapper[7110]: I0313 01:18:24.262502 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-854648ff6d-nrzpj_0d4e6150-432c-4a11-b5a6-4d62dd701fc8/kube-rbac-proxy/0.log" Mar 13 01:18:24.403455 master-0 kubenswrapper[7110]: I0313 01:18:24.403303 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5ff8674d55-6fh8b"] Mar 13 01:18:24.414221 master-0 kubenswrapper[7110]: W0313 01:18:24.414173 7110 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2b5ad07_fa01_4330_9dce_6da3444657ab.slice/crio-be414d776e8ab0a4d034e6fb89d00100a9499bbf4d15ddb731d27f9835e7bf82 WatchSource:0}: Error finding container be414d776e8ab0a4d034e6fb89d00100a9499bbf4d15ddb731d27f9835e7bf82: Status 404 returned error can't find the container with id be414d776e8ab0a4d034e6fb89d00100a9499bbf4d15ddb731d27f9835e7bf82 Mar 13 01:18:24.419340 master-0 kubenswrapper[7110]: I0313 01:18:24.419103 7110 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 01:18:24.465694 master-0 kubenswrapper[7110]: I0313 01:18:24.465607 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-854648ff6d-nrzpj_0d4e6150-432c-4a11-b5a6-4d62dd701fc8/package-server-manager/0.log" Mar 13 01:18:24.668972 master-0 kubenswrapper[7110]: I0313 01:18:24.668908 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_packageserver-68f6795949-v9w8g_2ce47660-f7cc-4669-a00d-83422f0f6d55/packageserver/0.log" Mar 13 01:18:24.981500 master-0 kubenswrapper[7110]: I0313 01:18:24.981311 7110 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-cnrhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 01:18:24.981500 master-0 kubenswrapper[7110]: [-]has-synced failed: reason withheld Mar 13 01:18:24.981500 master-0 kubenswrapper[7110]: [+]process-running ok Mar 13 01:18:24.981500 master-0 kubenswrapper[7110]: healthz check failed Mar 13 01:18:24.981500 master-0 kubenswrapper[7110]: I0313 01:18:24.981384 7110 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" podUID="0671fdd0-b358-40f9-ae49-2c5a9004edb3" containerName="router" probeResult="failure" output="HTTP probe 
failed with statuscode: 500" Mar 13 01:18:25.216509 master-0 kubenswrapper[7110]: I0313 01:18:25.216429 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5ff8674d55-6fh8b" event={"ID":"e2b5ad07-fa01-4330-9dce-6da3444657ab","Type":"ContainerStarted","Data":"be414d776e8ab0a4d034e6fb89d00100a9499bbf4d15ddb731d27f9835e7bf82"} Mar 13 01:18:25.982656 master-0 kubenswrapper[7110]: I0313 01:18:25.982584 7110 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-cnrhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 01:18:25.982656 master-0 kubenswrapper[7110]: [-]has-synced failed: reason withheld Mar 13 01:18:25.982656 master-0 kubenswrapper[7110]: [+]process-running ok Mar 13 01:18:25.982656 master-0 kubenswrapper[7110]: healthz check failed Mar 13 01:18:25.982656 master-0 kubenswrapper[7110]: I0313 01:18:25.982649 7110 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" podUID="0671fdd0-b358-40f9-ae49-2c5a9004edb3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 01:18:26.226426 master-0 kubenswrapper[7110]: I0313 01:18:26.226368 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5ff8674d55-6fh8b" event={"ID":"e2b5ad07-fa01-4330-9dce-6da3444657ab","Type":"ContainerStarted","Data":"b46f9770418285643ca5f7f35c50eeaf7e01ac86d1e69c23bd8c42ec6872497c"} Mar 13 01:18:26.982089 master-0 kubenswrapper[7110]: I0313 01:18:26.981966 7110 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-cnrhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 01:18:26.982089 master-0 kubenswrapper[7110]: 
[-]has-synced failed: reason withheld Mar 13 01:18:26.982089 master-0 kubenswrapper[7110]: [+]process-running ok Mar 13 01:18:26.982089 master-0 kubenswrapper[7110]: healthz check failed Mar 13 01:18:26.984371 master-0 kubenswrapper[7110]: I0313 01:18:26.982104 7110 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" podUID="0671fdd0-b358-40f9-ae49-2c5a9004edb3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 01:18:27.102760 master-0 kubenswrapper[7110]: I0313 01:18:27.101595 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-lnm8m"] Mar 13 01:18:27.102760 master-0 kubenswrapper[7110]: I0313 01:18:27.101975 7110 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-lnm8m" podUID="372c7cb7-30fc-4575-8bb6-b1d68d9ffe68" containerName="cluster-cloud-controller-manager" containerID="cri-o://904f10ba4b3a84cf1955d48de30060073ce2c5688519de186736725302931d40" gracePeriod=30 Mar 13 01:18:27.102760 master-0 kubenswrapper[7110]: I0313 01:18:27.102045 7110 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-lnm8m" podUID="372c7cb7-30fc-4575-8bb6-b1d68d9ffe68" containerName="kube-rbac-proxy" containerID="cri-o://7d1b306a00e2f892ec3846df537645839763a85b452540677bdd0f381e63d686" gracePeriod=30 Mar 13 01:18:27.102760 master-0 kubenswrapper[7110]: I0313 01:18:27.102103 7110 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-lnm8m" podUID="372c7cb7-30fc-4575-8bb6-b1d68d9ffe68" containerName="config-sync-controllers" 
containerID="cri-o://a2001794f73a162fbfc6dea653fb7764cf71881334967818dd15d887d2c42880" gracePeriod=30 Mar 13 01:18:27.234821 master-0 kubenswrapper[7110]: I0313 01:18:27.234684 7110 generic.go:334] "Generic (PLEG): container finished" podID="372c7cb7-30fc-4575-8bb6-b1d68d9ffe68" containerID="7d1b306a00e2f892ec3846df537645839763a85b452540677bdd0f381e63d686" exitCode=0 Mar 13 01:18:27.234821 master-0 kubenswrapper[7110]: I0313 01:18:27.234715 7110 generic.go:334] "Generic (PLEG): container finished" podID="372c7cb7-30fc-4575-8bb6-b1d68d9ffe68" containerID="a2001794f73a162fbfc6dea653fb7764cf71881334967818dd15d887d2c42880" exitCode=0 Mar 13 01:18:27.234821 master-0 kubenswrapper[7110]: I0313 01:18:27.234723 7110 generic.go:334] "Generic (PLEG): container finished" podID="372c7cb7-30fc-4575-8bb6-b1d68d9ffe68" containerID="904f10ba4b3a84cf1955d48de30060073ce2c5688519de186736725302931d40" exitCode=0 Mar 13 01:18:27.234821 master-0 kubenswrapper[7110]: I0313 01:18:27.234770 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-lnm8m" event={"ID":"372c7cb7-30fc-4575-8bb6-b1d68d9ffe68","Type":"ContainerDied","Data":"7d1b306a00e2f892ec3846df537645839763a85b452540677bdd0f381e63d686"} Mar 13 01:18:27.234821 master-0 kubenswrapper[7110]: I0313 01:18:27.234796 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-lnm8m" event={"ID":"372c7cb7-30fc-4575-8bb6-b1d68d9ffe68","Type":"ContainerDied","Data":"a2001794f73a162fbfc6dea653fb7764cf71881334967818dd15d887d2c42880"} Mar 13 01:18:27.234821 master-0 kubenswrapper[7110]: I0313 01:18:27.234811 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-lnm8m" 
event={"ID":"372c7cb7-30fc-4575-8bb6-b1d68d9ffe68","Type":"ContainerDied","Data":"904f10ba4b3a84cf1955d48de30060073ce2c5688519de186736725302931d40"} Mar 13 01:18:27.236398 master-0 kubenswrapper[7110]: I0313 01:18:27.236341 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5ff8674d55-6fh8b" event={"ID":"e2b5ad07-fa01-4330-9dce-6da3444657ab","Type":"ContainerStarted","Data":"0e4f55e2b8073de5d9074b681c317fe1d3f5790a0689fd003d0ff5fc7da43c76"} Mar 13 01:18:27.264614 master-0 kubenswrapper[7110]: I0313 01:18:27.264496 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5ff8674d55-6fh8b" podStartSLOduration=2.7020210220000003 podStartE2EDuration="4.264443799s" podCreationTimestamp="2026-03-13 01:18:23 +0000 UTC" firstStartedPulling="2026-03-13 01:18:24.419027355 +0000 UTC m=+305.704053841" lastFinishedPulling="2026-03-13 01:18:25.981450152 +0000 UTC m=+307.266476618" observedRunningTime="2026-03-13 01:18:27.259804077 +0000 UTC m=+308.544830623" watchObservedRunningTime="2026-03-13 01:18:27.264443799 +0000 UTC m=+308.549470305" Mar 13 01:18:27.292062 master-0 kubenswrapper[7110]: I0313 01:18:27.291998 7110 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-lnm8m" Mar 13 01:18:27.399768 master-0 kubenswrapper[7110]: I0313 01:18:27.399691 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpdx4\" (UniqueName: \"kubernetes.io/projected/372c7cb7-30fc-4575-8bb6-b1d68d9ffe68-kube-api-access-xpdx4\") pod \"372c7cb7-30fc-4575-8bb6-b1d68d9ffe68\" (UID: \"372c7cb7-30fc-4575-8bb6-b1d68d9ffe68\") " Mar 13 01:18:27.400942 master-0 kubenswrapper[7110]: I0313 01:18:27.399854 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/372c7cb7-30fc-4575-8bb6-b1d68d9ffe68-images\") pod \"372c7cb7-30fc-4575-8bb6-b1d68d9ffe68\" (UID: \"372c7cb7-30fc-4575-8bb6-b1d68d9ffe68\") " Mar 13 01:18:27.400942 master-0 kubenswrapper[7110]: I0313 01:18:27.399896 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/372c7cb7-30fc-4575-8bb6-b1d68d9ffe68-cloud-controller-manager-operator-tls\") pod \"372c7cb7-30fc-4575-8bb6-b1d68d9ffe68\" (UID: \"372c7cb7-30fc-4575-8bb6-b1d68d9ffe68\") " Mar 13 01:18:27.400942 master-0 kubenswrapper[7110]: I0313 01:18:27.399934 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/372c7cb7-30fc-4575-8bb6-b1d68d9ffe68-auth-proxy-config\") pod \"372c7cb7-30fc-4575-8bb6-b1d68d9ffe68\" (UID: \"372c7cb7-30fc-4575-8bb6-b1d68d9ffe68\") " Mar 13 01:18:27.400942 master-0 kubenswrapper[7110]: I0313 01:18:27.399965 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/372c7cb7-30fc-4575-8bb6-b1d68d9ffe68-host-etc-kube\") pod \"372c7cb7-30fc-4575-8bb6-b1d68d9ffe68\" (UID: 
\"372c7cb7-30fc-4575-8bb6-b1d68d9ffe68\") " Mar 13 01:18:27.400942 master-0 kubenswrapper[7110]: I0313 01:18:27.400168 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/372c7cb7-30fc-4575-8bb6-b1d68d9ffe68-host-etc-kube" (OuterVolumeSpecName: "host-etc-kube") pod "372c7cb7-30fc-4575-8bb6-b1d68d9ffe68" (UID: "372c7cb7-30fc-4575-8bb6-b1d68d9ffe68"). InnerVolumeSpecName "host-etc-kube". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:18:27.400942 master-0 kubenswrapper[7110]: I0313 01:18:27.400710 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/372c7cb7-30fc-4575-8bb6-b1d68d9ffe68-images" (OuterVolumeSpecName: "images") pod "372c7cb7-30fc-4575-8bb6-b1d68d9ffe68" (UID: "372c7cb7-30fc-4575-8bb6-b1d68d9ffe68"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:18:27.400942 master-0 kubenswrapper[7110]: I0313 01:18:27.400826 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/372c7cb7-30fc-4575-8bb6-b1d68d9ffe68-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "372c7cb7-30fc-4575-8bb6-b1d68d9ffe68" (UID: "372c7cb7-30fc-4575-8bb6-b1d68d9ffe68"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:18:27.402043 master-0 kubenswrapper[7110]: I0313 01:18:27.401346 7110 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/372c7cb7-30fc-4575-8bb6-b1d68d9ffe68-auth-proxy-config\") on node \"master-0\" DevicePath \"\"" Mar 13 01:18:27.402043 master-0 kubenswrapper[7110]: I0313 01:18:27.401405 7110 reconciler_common.go:293] "Volume detached for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/372c7cb7-30fc-4575-8bb6-b1d68d9ffe68-host-etc-kube\") on node \"master-0\" DevicePath \"\"" Mar 13 01:18:27.402043 master-0 kubenswrapper[7110]: I0313 01:18:27.401425 7110 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/372c7cb7-30fc-4575-8bb6-b1d68d9ffe68-images\") on node \"master-0\" DevicePath \"\"" Mar 13 01:18:27.415365 master-0 kubenswrapper[7110]: I0313 01:18:27.415324 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/372c7cb7-30fc-4575-8bb6-b1d68d9ffe68-cloud-controller-manager-operator-tls" (OuterVolumeSpecName: "cloud-controller-manager-operator-tls") pod "372c7cb7-30fc-4575-8bb6-b1d68d9ffe68" (UID: "372c7cb7-30fc-4575-8bb6-b1d68d9ffe68"). InnerVolumeSpecName "cloud-controller-manager-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:18:27.415365 master-0 kubenswrapper[7110]: I0313 01:18:27.415328 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/372c7cb7-30fc-4575-8bb6-b1d68d9ffe68-kube-api-access-xpdx4" (OuterVolumeSpecName: "kube-api-access-xpdx4") pod "372c7cb7-30fc-4575-8bb6-b1d68d9ffe68" (UID: "372c7cb7-30fc-4575-8bb6-b1d68d9ffe68"). InnerVolumeSpecName "kube-api-access-xpdx4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:18:27.502981 master-0 kubenswrapper[7110]: I0313 01:18:27.502773 7110 reconciler_common.go:293] "Volume detached for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/372c7cb7-30fc-4575-8bb6-b1d68d9ffe68-cloud-controller-manager-operator-tls\") on node \"master-0\" DevicePath \"\"" Mar 13 01:18:27.502981 master-0 kubenswrapper[7110]: I0313 01:18:27.502836 7110 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpdx4\" (UniqueName: \"kubernetes.io/projected/372c7cb7-30fc-4575-8bb6-b1d68d9ffe68-kube-api-access-xpdx4\") on node \"master-0\" DevicePath \"\"" Mar 13 01:18:27.981750 master-0 kubenswrapper[7110]: I0313 01:18:27.981601 7110 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-cnrhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 01:18:27.981750 master-0 kubenswrapper[7110]: [-]has-synced failed: reason withheld Mar 13 01:18:27.981750 master-0 kubenswrapper[7110]: [+]process-running ok Mar 13 01:18:27.981750 master-0 kubenswrapper[7110]: healthz check failed Mar 13 01:18:27.981750 master-0 kubenswrapper[7110]: I0313 01:18:27.981737 7110 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" podUID="0671fdd0-b358-40f9-ae49-2c5a9004edb3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 01:18:28.249100 master-0 kubenswrapper[7110]: I0313 01:18:28.248950 7110 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-lnm8m" Mar 13 01:18:28.249100 master-0 kubenswrapper[7110]: I0313 01:18:28.248940 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-lnm8m" event={"ID":"372c7cb7-30fc-4575-8bb6-b1d68d9ffe68","Type":"ContainerDied","Data":"87a4e35d8f4be284a1dfb072b64498826ea9b5084519871759d3cc21b8a8778d"} Mar 13 01:18:28.249100 master-0 kubenswrapper[7110]: I0313 01:18:28.249015 7110 scope.go:117] "RemoveContainer" containerID="7d1b306a00e2f892ec3846df537645839763a85b452540677bdd0f381e63d686" Mar 13 01:18:28.271982 master-0 kubenswrapper[7110]: I0313 01:18:28.271919 7110 scope.go:117] "RemoveContainer" containerID="a2001794f73a162fbfc6dea653fb7764cf71881334967818dd15d887d2c42880" Mar 13 01:18:28.300111 master-0 kubenswrapper[7110]: I0313 01:18:28.299689 7110 scope.go:117] "RemoveContainer" containerID="904f10ba4b3a84cf1955d48de30060073ce2c5688519de186736725302931d40" Mar 13 01:18:28.359587 master-0 kubenswrapper[7110]: I0313 01:18:28.359513 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-lnm8m"] Mar 13 01:18:28.395981 master-0 kubenswrapper[7110]: I0313 01:18:28.395894 7110 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-lnm8m"] Mar 13 01:18:28.456466 master-0 kubenswrapper[7110]: I0313 01:18:28.456390 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n"] Mar 13 01:18:28.456888 master-0 kubenswrapper[7110]: E0313 01:18:28.456852 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="372c7cb7-30fc-4575-8bb6-b1d68d9ffe68" 
containerName="config-sync-controllers" Mar 13 01:18:28.456888 master-0 kubenswrapper[7110]: I0313 01:18:28.456882 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="372c7cb7-30fc-4575-8bb6-b1d68d9ffe68" containerName="config-sync-controllers" Mar 13 01:18:28.456888 master-0 kubenswrapper[7110]: E0313 01:18:28.456893 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="372c7cb7-30fc-4575-8bb6-b1d68d9ffe68" containerName="kube-rbac-proxy" Mar 13 01:18:28.457138 master-0 kubenswrapper[7110]: I0313 01:18:28.456901 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="372c7cb7-30fc-4575-8bb6-b1d68d9ffe68" containerName="kube-rbac-proxy" Mar 13 01:18:28.457138 master-0 kubenswrapper[7110]: E0313 01:18:28.456917 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="372c7cb7-30fc-4575-8bb6-b1d68d9ffe68" containerName="cluster-cloud-controller-manager" Mar 13 01:18:28.457138 master-0 kubenswrapper[7110]: I0313 01:18:28.456925 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="372c7cb7-30fc-4575-8bb6-b1d68d9ffe68" containerName="cluster-cloud-controller-manager" Mar 13 01:18:28.457138 master-0 kubenswrapper[7110]: I0313 01:18:28.457071 7110 memory_manager.go:354] "RemoveStaleState removing state" podUID="372c7cb7-30fc-4575-8bb6-b1d68d9ffe68" containerName="kube-rbac-proxy" Mar 13 01:18:28.457138 master-0 kubenswrapper[7110]: I0313 01:18:28.457087 7110 memory_manager.go:354] "RemoveStaleState removing state" podUID="372c7cb7-30fc-4575-8bb6-b1d68d9ffe68" containerName="config-sync-controllers" Mar 13 01:18:28.457138 master-0 kubenswrapper[7110]: I0313 01:18:28.457104 7110 memory_manager.go:354] "RemoveStaleState removing state" podUID="372c7cb7-30fc-4575-8bb6-b1d68d9ffe68" containerName="cluster-cloud-controller-manager" Mar 13 01:18:28.458077 master-0 kubenswrapper[7110]: I0313 01:18:28.458038 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" Mar 13 01:18:28.465434 master-0 kubenswrapper[7110]: I0313 01:18:28.465400 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Mar 13 01:18:28.465522 master-0 kubenswrapper[7110]: I0313 01:18:28.465436 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-sb45h" Mar 13 01:18:28.465719 master-0 kubenswrapper[7110]: I0313 01:18:28.465625 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Mar 13 01:18:28.465872 master-0 kubenswrapper[7110]: I0313 01:18:28.465843 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Mar 13 01:18:28.465930 master-0 kubenswrapper[7110]: I0313 01:18:28.465880 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Mar 13 01:18:28.470489 master-0 kubenswrapper[7110]: I0313 01:18:28.470461 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Mar 13 01:18:28.629741 master-0 kubenswrapper[7110]: I0313 01:18:28.627402 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n\" (UID: \"b3a9c0f6-cfde-4ae8-952a-00e2fb862482\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" Mar 13 01:18:28.629741 
master-0 kubenswrapper[7110]: I0313 01:18:28.627478 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n\" (UID: \"b3a9c0f6-cfde-4ae8-952a-00e2fb862482\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" Mar 13 01:18:28.629741 master-0 kubenswrapper[7110]: I0313 01:18:28.627588 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n\" (UID: \"b3a9c0f6-cfde-4ae8-952a-00e2fb862482\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" Mar 13 01:18:28.629741 master-0 kubenswrapper[7110]: I0313 01:18:28.627735 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42sqh\" (UniqueName: \"kubernetes.io/projected/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-kube-api-access-42sqh\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n\" (UID: \"b3a9c0f6-cfde-4ae8-952a-00e2fb862482\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" Mar 13 01:18:28.629741 master-0 kubenswrapper[7110]: I0313 01:18:28.628227 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-images\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n\" (UID: \"b3a9c0f6-cfde-4ae8-952a-00e2fb862482\") " 
pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" Mar 13 01:18:28.730674 master-0 kubenswrapper[7110]: I0313 01:18:28.729722 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n\" (UID: \"b3a9c0f6-cfde-4ae8-952a-00e2fb862482\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" Mar 13 01:18:28.730674 master-0 kubenswrapper[7110]: I0313 01:18:28.730005 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42sqh\" (UniqueName: \"kubernetes.io/projected/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-kube-api-access-42sqh\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n\" (UID: \"b3a9c0f6-cfde-4ae8-952a-00e2fb862482\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" Mar 13 01:18:28.730674 master-0 kubenswrapper[7110]: I0313 01:18:28.730435 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-images\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n\" (UID: \"b3a9c0f6-cfde-4ae8-952a-00e2fb862482\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" Mar 13 01:18:28.730674 master-0 kubenswrapper[7110]: I0313 01:18:28.730479 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n\" (UID: \"b3a9c0f6-cfde-4ae8-952a-00e2fb862482\") " 
pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" Mar 13 01:18:28.730674 master-0 kubenswrapper[7110]: I0313 01:18:28.730526 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n\" (UID: \"b3a9c0f6-cfde-4ae8-952a-00e2fb862482\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" Mar 13 01:18:28.730674 master-0 kubenswrapper[7110]: I0313 01:18:28.730668 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n\" (UID: \"b3a9c0f6-cfde-4ae8-952a-00e2fb862482\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" Mar 13 01:18:28.731258 master-0 kubenswrapper[7110]: I0313 01:18:28.730805 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n\" (UID: \"b3a9c0f6-cfde-4ae8-952a-00e2fb862482\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" Mar 13 01:18:28.733700 master-0 kubenswrapper[7110]: I0313 01:18:28.731437 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-images\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n\" (UID: \"b3a9c0f6-cfde-4ae8-952a-00e2fb862482\") " 
pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" Mar 13 01:18:28.735504 master-0 kubenswrapper[7110]: I0313 01:18:28.734275 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n\" (UID: \"b3a9c0f6-cfde-4ae8-952a-00e2fb862482\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" Mar 13 01:18:28.748826 master-0 kubenswrapper[7110]: I0313 01:18:28.748783 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42sqh\" (UniqueName: \"kubernetes.io/projected/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-kube-api-access-42sqh\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n\" (UID: \"b3a9c0f6-cfde-4ae8-952a-00e2fb862482\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" Mar 13 01:18:28.779865 master-0 kubenswrapper[7110]: I0313 01:18:28.779572 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" Mar 13 01:18:28.923862 master-0 kubenswrapper[7110]: I0313 01:18:28.923324 7110 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="372c7cb7-30fc-4575-8bb6-b1d68d9ffe68" path="/var/lib/kubelet/pods/372c7cb7-30fc-4575-8bb6-b1d68d9ffe68/volumes" Mar 13 01:18:28.979941 master-0 kubenswrapper[7110]: I0313 01:18:28.978376 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-74cc79fd76-6btfg"] Mar 13 01:18:28.979941 master-0 kubenswrapper[7110]: I0313 01:18:28.979467 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-6btfg" Mar 13 01:18:28.984096 master-0 kubenswrapper[7110]: I0313 01:18:28.984000 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Mar 13 01:18:28.984268 master-0 kubenswrapper[7110]: I0313 01:18:28.984245 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-2pmf7" Mar 13 01:18:28.984935 master-0 kubenswrapper[7110]: I0313 01:18:28.984345 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Mar 13 01:18:28.985319 master-0 kubenswrapper[7110]: I0313 01:18:28.985241 7110 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-cnrhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 01:18:28.985319 master-0 kubenswrapper[7110]: [-]has-synced failed: reason withheld Mar 13 01:18:28.985319 master-0 kubenswrapper[7110]: [+]process-running ok Mar 13 01:18:28.985319 master-0 kubenswrapper[7110]: healthz check failed Mar 13 01:18:28.985319 master-0 kubenswrapper[7110]: I0313 01:18:28.985276 7110 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" podUID="0671fdd0-b358-40f9-ae49-2c5a9004edb3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 01:18:28.991303 master-0 kubenswrapper[7110]: I0313 01:18:28.991259 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-2hgwj"] Mar 13 01:18:28.998049 master-0 kubenswrapper[7110]: I0313 01:18:28.997972 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd"] Mar 13 01:18:29.000668 master-0 
kubenswrapper[7110]: I0313 01:18:28.998877 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:18:29.000668 master-0 kubenswrapper[7110]: I0313 01:18:28.999151 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" Mar 13 01:18:29.003194 master-0 kubenswrapper[7110]: I0313 01:18:29.003142 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Mar 13 01:18:29.004516 master-0 kubenswrapper[7110]: I0313 01:18:29.003841 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-tkjm8" Mar 13 01:18:29.004516 master-0 kubenswrapper[7110]: I0313 01:18:29.003988 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Mar 13 01:18:29.004516 master-0 kubenswrapper[7110]: I0313 01:18:29.004093 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Mar 13 01:18:29.004516 master-0 kubenswrapper[7110]: I0313 01:18:29.004201 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Mar 13 01:18:29.004516 master-0 kubenswrapper[7110]: I0313 01:18:29.004310 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-4lcm6" Mar 13 01:18:29.004516 master-0 kubenswrapper[7110]: I0313 01:18:29.004480 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Mar 13 01:18:29.006095 master-0 kubenswrapper[7110]: I0313 01:18:29.005304 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-74cc79fd76-6btfg"] Mar 13 
01:18:29.024438 master-0 kubenswrapper[7110]: I0313 01:18:29.024396 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd"] Mar 13 01:18:29.140683 master-0 kubenswrapper[7110]: I0313 01:18:29.140569 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-node-exporter-wtmp\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:18:29.140683 master-0 kubenswrapper[7110]: I0313 01:18:29.140610 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwxhc\" (UniqueName: \"kubernetes.io/projected/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-kube-api-access-pwxhc\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:18:29.140683 master-0 kubenswrapper[7110]: I0313 01:18:29.140646 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/30f7537e-93ed-466b-ba24-78141d004b2f-metrics-client-ca\") pod \"kube-state-metrics-68b88f8cb5-plwwd\" (UID: \"30f7537e-93ed-466b-ba24-78141d004b2f\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" Mar 13 01:18:29.140683 master-0 kubenswrapper[7110]: I0313 01:18:29.140669 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkjrm\" (UniqueName: \"kubernetes.io/projected/30f7537e-93ed-466b-ba24-78141d004b2f-kube-api-access-jkjrm\") pod \"kube-state-metrics-68b88f8cb5-plwwd\" (UID: \"30f7537e-93ed-466b-ba24-78141d004b2f\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" Mar 13 01:18:29.140925 master-0 kubenswrapper[7110]: 
I0313 01:18:29.140746 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-sys\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:18:29.140925 master-0 kubenswrapper[7110]: I0313 01:18:29.140770 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-node-exporter-tls\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:18:29.140925 master-0 kubenswrapper[7110]: I0313 01:18:29.140893 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cd7cca05-3da7-42cf-af64-6e94050e58c0-metrics-client-ca\") pod \"openshift-state-metrics-74cc79fd76-6btfg\" (UID: \"cd7cca05-3da7-42cf-af64-6e94050e58c0\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-6btfg" Mar 13 01:18:29.141014 master-0 kubenswrapper[7110]: I0313 01:18:29.140925 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/cd7cca05-3da7-42cf-af64-6e94050e58c0-openshift-state-metrics-tls\") pod \"openshift-state-metrics-74cc79fd76-6btfg\" (UID: \"cd7cca05-3da7-42cf-af64-6e94050e58c0\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-6btfg" Mar 13 01:18:29.141014 master-0 kubenswrapper[7110]: I0313 01:18:29.140953 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/cd7cca05-3da7-42cf-af64-6e94050e58c0-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-74cc79fd76-6btfg\" (UID: \"cd7cca05-3da7-42cf-af64-6e94050e58c0\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-6btfg" Mar 13 01:18:29.141014 master-0 kubenswrapper[7110]: I0313 01:18:29.141003 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-metrics-client-ca\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:18:29.141091 master-0 kubenswrapper[7110]: I0313 01:18:29.141033 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-root\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:18:29.141091 master-0 kubenswrapper[7110]: I0313 01:18:29.141079 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/30f7537e-93ed-466b-ba24-78141d004b2f-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-68b88f8cb5-plwwd\" (UID: \"30f7537e-93ed-466b-ba24-78141d004b2f\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" Mar 13 01:18:29.141151 master-0 kubenswrapper[7110]: I0313 01:18:29.141102 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx8zl\" (UniqueName: \"kubernetes.io/projected/cd7cca05-3da7-42cf-af64-6e94050e58c0-kube-api-access-gx8zl\") pod \"openshift-state-metrics-74cc79fd76-6btfg\" (UID: 
\"cd7cca05-3da7-42cf-af64-6e94050e58c0\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-6btfg" Mar 13 01:18:29.141151 master-0 kubenswrapper[7110]: I0313 01:18:29.141129 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/30f7537e-93ed-466b-ba24-78141d004b2f-volume-directive-shadow\") pod \"kube-state-metrics-68b88f8cb5-plwwd\" (UID: \"30f7537e-93ed-466b-ba24-78141d004b2f\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" Mar 13 01:18:29.141203 master-0 kubenswrapper[7110]: I0313 01:18:29.141149 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/30f7537e-93ed-466b-ba24-78141d004b2f-kube-state-metrics-tls\") pod \"kube-state-metrics-68b88f8cb5-plwwd\" (UID: \"30f7537e-93ed-466b-ba24-78141d004b2f\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" Mar 13 01:18:29.141203 master-0 kubenswrapper[7110]: I0313 01:18:29.141167 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:18:29.141203 master-0 kubenswrapper[7110]: I0313 01:18:29.141189 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-node-exporter-textfile\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:18:29.141293 master-0 kubenswrapper[7110]: I0313 01:18:29.141211 7110 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/30f7537e-93ed-466b-ba24-78141d004b2f-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-68b88f8cb5-plwwd\" (UID: \"30f7537e-93ed-466b-ba24-78141d004b2f\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" Mar 13 01:18:29.242729 master-0 kubenswrapper[7110]: I0313 01:18:29.242681 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/30f7537e-93ed-466b-ba24-78141d004b2f-volume-directive-shadow\") pod \"kube-state-metrics-68b88f8cb5-plwwd\" (UID: \"30f7537e-93ed-466b-ba24-78141d004b2f\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" Mar 13 01:18:29.242970 master-0 kubenswrapper[7110]: I0313 01:18:29.242933 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/30f7537e-93ed-466b-ba24-78141d004b2f-kube-state-metrics-tls\") pod \"kube-state-metrics-68b88f8cb5-plwwd\" (UID: \"30f7537e-93ed-466b-ba24-78141d004b2f\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" Mar 13 01:18:29.243017 master-0 kubenswrapper[7110]: I0313 01:18:29.242995 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:18:29.243061 master-0 kubenswrapper[7110]: I0313 01:18:29.243042 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-node-exporter-textfile\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:18:29.243102 master-0 kubenswrapper[7110]: I0313 01:18:29.243081 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/30f7537e-93ed-466b-ba24-78141d004b2f-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-68b88f8cb5-plwwd\" (UID: \"30f7537e-93ed-466b-ba24-78141d004b2f\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" Mar 13 01:18:29.243236 master-0 kubenswrapper[7110]: I0313 01:18:29.243162 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-node-exporter-wtmp\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:18:29.243236 master-0 kubenswrapper[7110]: I0313 01:18:29.243211 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwxhc\" (UniqueName: \"kubernetes.io/projected/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-kube-api-access-pwxhc\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:18:29.243373 master-0 kubenswrapper[7110]: I0313 01:18:29.243247 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/30f7537e-93ed-466b-ba24-78141d004b2f-metrics-client-ca\") pod \"kube-state-metrics-68b88f8cb5-plwwd\" (UID: \"30f7537e-93ed-466b-ba24-78141d004b2f\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" Mar 13 01:18:29.243373 master-0 kubenswrapper[7110]: I0313 
01:18:29.243279 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkjrm\" (UniqueName: \"kubernetes.io/projected/30f7537e-93ed-466b-ba24-78141d004b2f-kube-api-access-jkjrm\") pod \"kube-state-metrics-68b88f8cb5-plwwd\" (UID: \"30f7537e-93ed-466b-ba24-78141d004b2f\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" Mar 13 01:18:29.243373 master-0 kubenswrapper[7110]: I0313 01:18:29.243324 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/30f7537e-93ed-466b-ba24-78141d004b2f-volume-directive-shadow\") pod \"kube-state-metrics-68b88f8cb5-plwwd\" (UID: \"30f7537e-93ed-466b-ba24-78141d004b2f\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" Mar 13 01:18:29.243373 master-0 kubenswrapper[7110]: I0313 01:18:29.243337 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-sys\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:18:29.243487 master-0 kubenswrapper[7110]: I0313 01:18:29.243392 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-sys\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:18:29.243487 master-0 kubenswrapper[7110]: I0313 01:18:29.243463 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-node-exporter-tls\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:18:29.243549 master-0 
kubenswrapper[7110]: I0313 01:18:29.243513 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cd7cca05-3da7-42cf-af64-6e94050e58c0-metrics-client-ca\") pod \"openshift-state-metrics-74cc79fd76-6btfg\" (UID: \"cd7cca05-3da7-42cf-af64-6e94050e58c0\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-6btfg" Mar 13 01:18:29.243579 master-0 kubenswrapper[7110]: I0313 01:18:29.243542 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/cd7cca05-3da7-42cf-af64-6e94050e58c0-openshift-state-metrics-tls\") pod \"openshift-state-metrics-74cc79fd76-6btfg\" (UID: \"cd7cca05-3da7-42cf-af64-6e94050e58c0\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-6btfg" Mar 13 01:18:29.243612 master-0 kubenswrapper[7110]: I0313 01:18:29.243578 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cd7cca05-3da7-42cf-af64-6e94050e58c0-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-74cc79fd76-6btfg\" (UID: \"cd7cca05-3da7-42cf-af64-6e94050e58c0\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-6btfg" Mar 13 01:18:29.243712 master-0 kubenswrapper[7110]: I0313 01:18:29.243629 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-metrics-client-ca\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:18:29.243755 master-0 kubenswrapper[7110]: I0313 01:18:29.243729 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: 
\"kubernetes.io/host-path/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-root\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:18:29.243798 master-0 kubenswrapper[7110]: I0313 01:18:29.243788 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/30f7537e-93ed-466b-ba24-78141d004b2f-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-68b88f8cb5-plwwd\" (UID: \"30f7537e-93ed-466b-ba24-78141d004b2f\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" Mar 13 01:18:29.243841 master-0 kubenswrapper[7110]: I0313 01:18:29.243826 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx8zl\" (UniqueName: \"kubernetes.io/projected/cd7cca05-3da7-42cf-af64-6e94050e58c0-kube-api-access-gx8zl\") pod \"openshift-state-metrics-74cc79fd76-6btfg\" (UID: \"cd7cca05-3da7-42cf-af64-6e94050e58c0\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-6btfg" Mar 13 01:18:29.244073 master-0 kubenswrapper[7110]: I0313 01:18:29.244035 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-node-exporter-wtmp\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:18:29.245178 master-0 kubenswrapper[7110]: I0313 01:18:29.244890 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/30f7537e-93ed-466b-ba24-78141d004b2f-metrics-client-ca\") pod \"kube-state-metrics-68b88f8cb5-plwwd\" (UID: \"30f7537e-93ed-466b-ba24-78141d004b2f\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" Mar 13 01:18:29.245178 master-0 
kubenswrapper[7110]: E0313 01:18:29.244970 7110 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Mar 13 01:18:29.245178 master-0 kubenswrapper[7110]: E0313 01:18:29.245012 7110 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd7cca05-3da7-42cf-af64-6e94050e58c0-openshift-state-metrics-tls podName:cd7cca05-3da7-42cf-af64-6e94050e58c0 nodeName:}" failed. No retries permitted until 2026-03-13 01:18:29.744997191 +0000 UTC m=+311.030023657 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/cd7cca05-3da7-42cf-af64-6e94050e58c0-openshift-state-metrics-tls") pod "openshift-state-metrics-74cc79fd76-6btfg" (UID: "cd7cca05-3da7-42cf-af64-6e94050e58c0") : secret "openshift-state-metrics-tls" not found Mar 13 01:18:29.245303 master-0 kubenswrapper[7110]: I0313 01:18:29.245210 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-node-exporter-textfile\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:18:29.245726 master-0 kubenswrapper[7110]: I0313 01:18:29.245688 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-metrics-client-ca\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:18:29.245903 master-0 kubenswrapper[7110]: I0313 01:18:29.245876 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: 
\"kubernetes.io/configmap/30f7537e-93ed-466b-ba24-78141d004b2f-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-68b88f8cb5-plwwd\" (UID: \"30f7537e-93ed-466b-ba24-78141d004b2f\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" Mar 13 01:18:29.246098 master-0 kubenswrapper[7110]: I0313 01:18:29.246078 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-root\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:18:29.247713 master-0 kubenswrapper[7110]: I0313 01:18:29.247688 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/30f7537e-93ed-466b-ba24-78141d004b2f-kube-state-metrics-tls\") pod \"kube-state-metrics-68b88f8cb5-plwwd\" (UID: \"30f7537e-93ed-466b-ba24-78141d004b2f\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" Mar 13 01:18:29.249513 master-0 kubenswrapper[7110]: I0313 01:18:29.248680 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cd7cca05-3da7-42cf-af64-6e94050e58c0-metrics-client-ca\") pod \"openshift-state-metrics-74cc79fd76-6btfg\" (UID: \"cd7cca05-3da7-42cf-af64-6e94050e58c0\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-6btfg" Mar 13 01:18:29.250773 master-0 kubenswrapper[7110]: I0313 01:18:29.250251 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-node-exporter-tls\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:18:29.251936 master-0 kubenswrapper[7110]: I0313 01:18:29.251903 7110 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cd7cca05-3da7-42cf-af64-6e94050e58c0-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-74cc79fd76-6btfg\" (UID: \"cd7cca05-3da7-42cf-af64-6e94050e58c0\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-6btfg" Mar 13 01:18:29.254182 master-0 kubenswrapper[7110]: I0313 01:18:29.254150 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/30f7537e-93ed-466b-ba24-78141d004b2f-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-68b88f8cb5-plwwd\" (UID: \"30f7537e-93ed-466b-ba24-78141d004b2f\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" Mar 13 01:18:29.264691 master-0 kubenswrapper[7110]: I0313 01:18:29.264659 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwxhc\" (UniqueName: \"kubernetes.io/projected/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-kube-api-access-pwxhc\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:18:29.268715 master-0 kubenswrapper[7110]: I0313 01:18:29.268678 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" event={"ID":"b3a9c0f6-cfde-4ae8-952a-00e2fb862482","Type":"ContainerStarted","Data":"c20b7880d0e62c91ace04a400f15380d02a7f587227b0e579de54f8b6b881459"} Mar 13 01:18:29.268779 master-0 kubenswrapper[7110]: I0313 01:18:29.268731 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" 
event={"ID":"b3a9c0f6-cfde-4ae8-952a-00e2fb862482","Type":"ContainerStarted","Data":"b70fd8156b9269ea1d32e5bd6b505f43cc5c2cda9055f9eab294a1ae160205e2"} Mar 13 01:18:29.274416 master-0 kubenswrapper[7110]: I0313 01:18:29.271029 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:18:29.274416 master-0 kubenswrapper[7110]: I0313 01:18:29.271898 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx8zl\" (UniqueName: \"kubernetes.io/projected/cd7cca05-3da7-42cf-af64-6e94050e58c0-kube-api-access-gx8zl\") pod \"openshift-state-metrics-74cc79fd76-6btfg\" (UID: \"cd7cca05-3da7-42cf-af64-6e94050e58c0\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-6btfg" Mar 13 01:18:29.274416 master-0 kubenswrapper[7110]: I0313 01:18:29.273078 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkjrm\" (UniqueName: \"kubernetes.io/projected/30f7537e-93ed-466b-ba24-78141d004b2f-kube-api-access-jkjrm\") pod \"kube-state-metrics-68b88f8cb5-plwwd\" (UID: \"30f7537e-93ed-466b-ba24-78141d004b2f\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" Mar 13 01:18:29.444567 master-0 kubenswrapper[7110]: I0313 01:18:29.444525 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:18:29.462122 master-0 kubenswrapper[7110]: I0313 01:18:29.461783 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" Mar 13 01:18:29.471444 master-0 kubenswrapper[7110]: W0313 01:18:29.471414 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04470d64_c6eb_4a62_ae75_2a1d3dfdd53a.slice/crio-7fc3fe55a4a51d95eea4a3bf4856b535210e68f1c5b298f7c9a69ccfa72f28c1 WatchSource:0}: Error finding container 7fc3fe55a4a51d95eea4a3bf4856b535210e68f1c5b298f7c9a69ccfa72f28c1: Status 404 returned error can't find the container with id 7fc3fe55a4a51d95eea4a3bf4856b535210e68f1c5b298f7c9a69ccfa72f28c1 Mar 13 01:18:29.773724 master-0 kubenswrapper[7110]: I0313 01:18:29.759313 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/cd7cca05-3da7-42cf-af64-6e94050e58c0-openshift-state-metrics-tls\") pod \"openshift-state-metrics-74cc79fd76-6btfg\" (UID: \"cd7cca05-3da7-42cf-af64-6e94050e58c0\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-6btfg" Mar 13 01:18:29.774198 master-0 kubenswrapper[7110]: I0313 01:18:29.773900 7110 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Mar 13 01:18:29.774333 master-0 kubenswrapper[7110]: I0313 01:18:29.774315 7110 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 13 01:18:29.774605 master-0 kubenswrapper[7110]: E0313 01:18:29.774592 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 13 01:18:29.774693 master-0 kubenswrapper[7110]: I0313 01:18:29.774683 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 13 01:18:29.774763 master-0 kubenswrapper[7110]: E0313 01:18:29.774753 
7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 13 01:18:29.774833 master-0 kubenswrapper[7110]: I0313 01:18:29.774824 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 13 01:18:29.774889 master-0 kubenswrapper[7110]: E0313 01:18:29.774880 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78c05e1499b533b83f091333d61f045" containerName="cluster-policy-controller" Mar 13 01:18:29.774943 master-0 kubenswrapper[7110]: I0313 01:18:29.774935 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78c05e1499b533b83f091333d61f045" containerName="cluster-policy-controller" Mar 13 01:18:29.775005 master-0 kubenswrapper[7110]: E0313 01:18:29.774996 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 13 01:18:29.775057 master-0 kubenswrapper[7110]: I0313 01:18:29.775049 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 13 01:18:29.775132 master-0 kubenswrapper[7110]: E0313 01:18:29.775119 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 13 01:18:29.775190 master-0 kubenswrapper[7110]: I0313 01:18:29.775181 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 13 01:18:29.775338 master-0 kubenswrapper[7110]: I0313 01:18:29.775327 7110 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 13 01:18:29.775399 master-0 kubenswrapper[7110]: I0313 01:18:29.775390 7110 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="f78c05e1499b533b83f091333d61f045" containerName="cluster-policy-controller" Mar 13 01:18:29.775456 master-0 kubenswrapper[7110]: I0313 01:18:29.775447 7110 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 13 01:18:29.775508 master-0 kubenswrapper[7110]: I0313 01:18:29.775500 7110 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 13 01:18:29.775567 master-0 kubenswrapper[7110]: I0313 01:18:29.775558 7110 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 13 01:18:29.775618 master-0 kubenswrapper[7110]: I0313 01:18:29.775610 7110 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 13 01:18:29.775817 master-0 kubenswrapper[7110]: E0313 01:18:29.775806 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 13 01:18:29.775880 master-0 kubenswrapper[7110]: I0313 01:18:29.775871 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 13 01:18:29.776031 master-0 kubenswrapper[7110]: I0313 01:18:29.775966 7110 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" containerName="cluster-policy-controller" containerID="cri-o://be52d87237e2c88231046564bd2dfcdbd780faa45f3647245e1d0a9837eb7182" gracePeriod=30 Mar 13 01:18:29.776232 master-0 kubenswrapper[7110]: I0313 01:18:29.776204 7110 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" containerID="cri-o://ae1c74ac713339ebe951cea485ddb317986dccb644eb4d3021ce0d21c709fe41" gracePeriod=30 Mar 13 01:18:29.796662 master-0 kubenswrapper[7110]: I0313 01:18:29.781260 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:18:29.797478 master-0 kubenswrapper[7110]: I0313 01:18:29.781415 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/cd7cca05-3da7-42cf-af64-6e94050e58c0-openshift-state-metrics-tls\") pod \"openshift-state-metrics-74cc79fd76-6btfg\" (UID: \"cd7cca05-3da7-42cf-af64-6e94050e58c0\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-6btfg" Mar 13 01:18:29.875101 master-0 kubenswrapper[7110]: I0313 01:18:29.875019 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b8addabb4549d659b9d15764990d3747-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"b8addabb4549d659b9d15764990d3747\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:18:29.875277 master-0 kubenswrapper[7110]: I0313 01:18:29.875127 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b8addabb4549d659b9d15764990d3747-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"b8addabb4549d659b9d15764990d3747\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:18:29.890915 master-0 kubenswrapper[7110]: I0313 01:18:29.890860 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd"] Mar 13 
01:18:29.910238 master-0 kubenswrapper[7110]: W0313 01:18:29.910169 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30f7537e_93ed_466b_ba24_78141d004b2f.slice/crio-24c5ca2ad81d656ed30391cef58568917e68e001311f4004a9e7bd98e34738e8 WatchSource:0}: Error finding container 24c5ca2ad81d656ed30391cef58568917e68e001311f4004a9e7bd98e34738e8: Status 404 returned error can't find the container with id 24c5ca2ad81d656ed30391cef58568917e68e001311f4004a9e7bd98e34738e8 Mar 13 01:18:29.925989 master-0 kubenswrapper[7110]: I0313 01:18:29.925947 7110 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 01:18:29.936222 master-0 kubenswrapper[7110]: I0313 01:18:29.935863 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 13 01:18:29.954041 master-0 kubenswrapper[7110]: I0313 01:18:29.952518 7110 kubelet.go:2706] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="62fe89be-419b-427f-ba06-a5b009e32680" Mar 13 01:18:29.976943 master-0 kubenswrapper[7110]: I0313 01:18:29.976854 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"f78c05e1499b533b83f091333d61f045\" (UID: \"f78c05e1499b533b83f091333d61f045\") " Mar 13 01:18:29.976943 master-0 kubenswrapper[7110]: I0313 01:18:29.976926 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"f78c05e1499b533b83f091333d61f045\" (UID: \"f78c05e1499b533b83f091333d61f045\") " Mar 13 01:18:29.976943 master-0 kubenswrapper[7110]: 
I0313 01:18:29.976947 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"f78c05e1499b533b83f091333d61f045\" (UID: \"f78c05e1499b533b83f091333d61f045\") "
Mar 13 01:18:29.977200 master-0 kubenswrapper[7110]: I0313 01:18:29.976964 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"f78c05e1499b533b83f091333d61f045\" (UID: \"f78c05e1499b533b83f091333d61f045\") "
Mar 13 01:18:29.977200 master-0 kubenswrapper[7110]: I0313 01:18:29.976990 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"f78c05e1499b533b83f091333d61f045\" (UID: \"f78c05e1499b533b83f091333d61f045\") "
Mar 13 01:18:29.977200 master-0 kubenswrapper[7110]: I0313 01:18:29.977187 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b8addabb4549d659b9d15764990d3747-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"b8addabb4549d659b9d15764990d3747\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 01:18:29.977301 master-0 kubenswrapper[7110]: I0313 01:18:29.977238 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b8addabb4549d659b9d15764990d3747-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"b8addabb4549d659b9d15764990d3747\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 01:18:29.977416 master-0 kubenswrapper[7110]: I0313 01:18:29.977329 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b8addabb4549d659b9d15764990d3747-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"b8addabb4549d659b9d15764990d3747\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 01:18:29.977416 master-0 kubenswrapper[7110]: I0313 01:18:29.977409 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host" (OuterVolumeSpecName: "ssl-certs-host") pod "f78c05e1499b533b83f091333d61f045" (UID: "f78c05e1499b533b83f091333d61f045"). InnerVolumeSpecName "ssl-certs-host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 01:18:29.977508 master-0 kubenswrapper[7110]: I0313 01:18:29.977437 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets" (OuterVolumeSpecName: "secrets") pod "f78c05e1499b533b83f091333d61f045" (UID: "f78c05e1499b533b83f091333d61f045"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 01:18:29.977508 master-0 kubenswrapper[7110]: I0313 01:18:29.977459 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b8addabb4549d659b9d15764990d3747-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"b8addabb4549d659b9d15764990d3747\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 01:18:29.977508 master-0 kubenswrapper[7110]: I0313 01:18:29.977457 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud" (OuterVolumeSpecName: "etc-kubernetes-cloud") pod "f78c05e1499b533b83f091333d61f045" (UID: "f78c05e1499b533b83f091333d61f045"). InnerVolumeSpecName "etc-kubernetes-cloud". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 01:18:29.977508 master-0 kubenswrapper[7110]: I0313 01:18:29.977472 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs" (OuterVolumeSpecName: "logs") pod "f78c05e1499b533b83f091333d61f045" (UID: "f78c05e1499b533b83f091333d61f045"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 01:18:29.977648 master-0 kubenswrapper[7110]: I0313 01:18:29.977461 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config" (OuterVolumeSpecName: "config") pod "f78c05e1499b533b83f091333d61f045" (UID: "f78c05e1499b533b83f091333d61f045"). InnerVolumeSpecName "config". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 01:18:29.979619 master-0 kubenswrapper[7110]: I0313 01:18:29.979583 7110 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-cnrhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 13 01:18:29.979619 master-0 kubenswrapper[7110]: [-]has-synced failed: reason withheld
Mar 13 01:18:29.979619 master-0 kubenswrapper[7110]: [+]process-running ok
Mar 13 01:18:29.979619 master-0 kubenswrapper[7110]: healthz check failed
Mar 13 01:18:29.979782 master-0 kubenswrapper[7110]: I0313 01:18:29.979627 7110 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" podUID="0671fdd0-b358-40f9-ae49-2c5a9004edb3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 01:18:29.997768 master-0 kubenswrapper[7110]: I0313 01:18:29.997257 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-6btfg"
Mar 13 01:18:30.078556 master-0 kubenswrapper[7110]: I0313 01:18:30.078488 7110 reconciler_common.go:293] "Volume detached for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") on node \"master-0\" DevicePath \"\""
Mar 13 01:18:30.078556 master-0 kubenswrapper[7110]: I0313 01:18:30.078525 7110 reconciler_common.go:293] "Volume detached for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") on node \"master-0\" DevicePath \"\""
Mar 13 01:18:30.078556 master-0 kubenswrapper[7110]: I0313 01:18:30.078536 7110 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") on node \"master-0\" DevicePath \"\""
Mar 13 01:18:30.078556 master-0 kubenswrapper[7110]: I0313 01:18:30.078545 7110 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") on node \"master-0\" DevicePath \"\""
Mar 13 01:18:30.078556 master-0 kubenswrapper[7110]: I0313 01:18:30.078553 7110 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") on node \"master-0\" DevicePath \"\""
Mar 13 01:18:30.230805 master-0 kubenswrapper[7110]: I0313 01:18:30.230662 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 01:18:30.299324 master-0 kubenswrapper[7110]: I0313 01:18:30.299233 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2hgwj" event={"ID":"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a","Type":"ContainerStarted","Data":"7fc3fe55a4a51d95eea4a3bf4856b535210e68f1c5b298f7c9a69ccfa72f28c1"}
Mar 13 01:18:30.303878 master-0 kubenswrapper[7110]: I0313 01:18:30.302929 7110 generic.go:334] "Generic (PLEG): container finished" podID="728435e4-9fdb-4fea-9f5b-eb5ff5444da0" containerID="8803ea2eb582c5693311e889e291d05e3059cc337f89e85079fab8e693f3beb8" exitCode=0
Mar 13 01:18:30.303878 master-0 kubenswrapper[7110]: I0313 01:18:30.303012 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-retry-1-master-0" event={"ID":"728435e4-9fdb-4fea-9f5b-eb5ff5444da0","Type":"ContainerDied","Data":"8803ea2eb582c5693311e889e291d05e3059cc337f89e85079fab8e693f3beb8"}
Mar 13 01:18:30.315122 master-0 kubenswrapper[7110]: I0313 01:18:30.308009 7110 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="ae1c74ac713339ebe951cea485ddb317986dccb644eb4d3021ce0d21c709fe41" exitCode=0
Mar 13 01:18:30.315122 master-0 kubenswrapper[7110]: I0313 01:18:30.308032 7110 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="be52d87237e2c88231046564bd2dfcdbd780faa45f3647245e1d0a9837eb7182" exitCode=0
Mar 13 01:18:30.315122 master-0 kubenswrapper[7110]: I0313 01:18:30.308088 7110 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89451c54a480759544b3c9bb8a4009d254601dcef79b111432b9e309f8ee32df"
Mar 13 01:18:30.315122 master-0 kubenswrapper[7110]: I0313 01:18:30.308104 7110 scope.go:117] "RemoveContainer" containerID="f90ce908443d9a8a4935d9c526f464812bacbaf55d563a98ad6570bafafa4d36"
Mar 13 01:18:30.315122 master-0 kubenswrapper[7110]: I0313 01:18:30.308114 7110 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 01:18:30.315122 master-0 kubenswrapper[7110]: I0313 01:18:30.312718 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" event={"ID":"b3a9c0f6-cfde-4ae8-952a-00e2fb862482","Type":"ContainerStarted","Data":"9c4066cdb45897ed4f69fcb12c6e6463de2070bbf91b47b272774c2582e358fc"}
Mar 13 01:18:30.315122 master-0 kubenswrapper[7110]: I0313 01:18:30.312792 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" event={"ID":"b3a9c0f6-cfde-4ae8-952a-00e2fb862482","Type":"ContainerStarted","Data":"705f3b1f8f6a29f9d66d96e7e64284c86692ae92fafef78a3e7d5b5411f4c2b9"}
Mar 13 01:18:30.316142 master-0 kubenswrapper[7110]: I0313 01:18:30.315428 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" event={"ID":"30f7537e-93ed-466b-ba24-78141d004b2f","Type":"ContainerStarted","Data":"24c5ca2ad81d656ed30391cef58568917e68e001311f4004a9e7bd98e34738e8"}
Mar 13 01:18:30.320990 master-0 kubenswrapper[7110]: I0313 01:18:30.320942 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"b8addabb4549d659b9d15764990d3747","Type":"ContainerStarted","Data":"0072057986d1a9c35e19db3f7ab2650e875b4c3fecae35f046b875511fe06154"}
Mar 13 01:18:30.388701 master-0 kubenswrapper[7110]: I0313 01:18:30.387241 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" podStartSLOduration=2.3872260770000002 podStartE2EDuration="2.387226077s" podCreationTimestamp="2026-03-13 01:18:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:18:30.386899889 +0000 UTC m=+311.671926355" watchObservedRunningTime="2026-03-13 01:18:30.387226077 +0000 UTC m=+311.672252543"
Mar 13 01:18:30.411286 master-0 kubenswrapper[7110]: I0313 01:18:30.411234 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-74cc79fd76-6btfg"]
Mar 13 01:18:30.921568 master-0 kubenswrapper[7110]: I0313 01:18:30.920740 7110 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f78c05e1499b533b83f091333d61f045" path="/var/lib/kubelet/pods/f78c05e1499b533b83f091333d61f045/volumes"
Mar 13 01:18:30.921568 master-0 kubenswrapper[7110]: I0313 01:18:30.921240 7110 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID=""
Mar 13 01:18:30.938916 master-0 kubenswrapper[7110]: I0313 01:18:30.938845 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"]
Mar 13 01:18:30.938916 master-0 kubenswrapper[7110]: I0313 01:18:30.938901 7110 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="62fe89be-419b-427f-ba06-a5b009e32680"
Mar 13 01:18:30.941543 master-0 kubenswrapper[7110]: I0313 01:18:30.941490 7110 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"]
Mar 13 01:18:30.941543 master-0 kubenswrapper[7110]: I0313 01:18:30.941536 7110 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="62fe89be-419b-427f-ba06-a5b009e32680"
Mar 13 01:18:30.980309 master-0 kubenswrapper[7110]: I0313 01:18:30.980262 7110 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-cnrhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 13 01:18:30.980309 master-0 kubenswrapper[7110]: [-]has-synced failed: reason withheld
Mar 13 01:18:30.980309 master-0 kubenswrapper[7110]: [+]process-running ok
Mar 13 01:18:30.980309 master-0 kubenswrapper[7110]: healthz check failed
Mar 13 01:18:30.980608 master-0 kubenswrapper[7110]: I0313 01:18:30.980316 7110 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" podUID="0671fdd0-b358-40f9-ae49-2c5a9004edb3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 01:18:31.333072 master-0 kubenswrapper[7110]: I0313 01:18:31.333000 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-6btfg" event={"ID":"cd7cca05-3da7-42cf-af64-6e94050e58c0","Type":"ContainerStarted","Data":"ea64d3b4313780d7d13a3dab7935308441248f41376a68c4808300e2ebba56b2"}
Mar 13 01:18:31.334116 master-0 kubenswrapper[7110]: I0313 01:18:31.334043 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-6btfg" event={"ID":"cd7cca05-3da7-42cf-af64-6e94050e58c0","Type":"ContainerStarted","Data":"78bc15b632613235d2195bdf740a5aaab2a5677c4f8c20084e1234dbfa6c8a91"}
Mar 13 01:18:31.334274 master-0 kubenswrapper[7110]: I0313 01:18:31.334081 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-6btfg" event={"ID":"cd7cca05-3da7-42cf-af64-6e94050e58c0","Type":"ContainerStarted","Data":"b26809d00df2d88f0387eef7498f3d90150a196ebaa102f4f43bf51209c487a9"}
Mar 13 01:18:31.337995 master-0 kubenswrapper[7110]: I0313 01:18:31.337928 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"b8addabb4549d659b9d15764990d3747","Type":"ContainerStarted","Data":"1f339c4da756baa27443d470023f6e1410367639df127c26cecf3952f778ca16"}
Mar 13 01:18:31.639144 master-0 kubenswrapper[7110]: I0313 01:18:31.639117 7110 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-retry-1-master-0"
Mar 13 01:18:31.698108 master-0 kubenswrapper[7110]: I0313 01:18:31.698060 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/728435e4-9fdb-4fea-9f5b-eb5ff5444da0-var-lock\") pod \"728435e4-9fdb-4fea-9f5b-eb5ff5444da0\" (UID: \"728435e4-9fdb-4fea-9f5b-eb5ff5444da0\") "
Mar 13 01:18:31.698265 master-0 kubenswrapper[7110]: I0313 01:18:31.698167 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/728435e4-9fdb-4fea-9f5b-eb5ff5444da0-kube-api-access\") pod \"728435e4-9fdb-4fea-9f5b-eb5ff5444da0\" (UID: \"728435e4-9fdb-4fea-9f5b-eb5ff5444da0\") "
Mar 13 01:18:31.698265 master-0 kubenswrapper[7110]: I0313 01:18:31.698235 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/728435e4-9fdb-4fea-9f5b-eb5ff5444da0-kubelet-dir\") pod \"728435e4-9fdb-4fea-9f5b-eb5ff5444da0\" (UID: \"728435e4-9fdb-4fea-9f5b-eb5ff5444da0\") "
Mar 13 01:18:31.698716 master-0 kubenswrapper[7110]: I0313 01:18:31.698687 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/728435e4-9fdb-4fea-9f5b-eb5ff5444da0-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "728435e4-9fdb-4fea-9f5b-eb5ff5444da0" (UID: "728435e4-9fdb-4fea-9f5b-eb5ff5444da0"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 01:18:31.698760 master-0 kubenswrapper[7110]: I0313 01:18:31.698718 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/728435e4-9fdb-4fea-9f5b-eb5ff5444da0-var-lock" (OuterVolumeSpecName: "var-lock") pod "728435e4-9fdb-4fea-9f5b-eb5ff5444da0" (UID: "728435e4-9fdb-4fea-9f5b-eb5ff5444da0"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 01:18:31.720062 master-0 kubenswrapper[7110]: I0313 01:18:31.719706 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/728435e4-9fdb-4fea-9f5b-eb5ff5444da0-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "728435e4-9fdb-4fea-9f5b-eb5ff5444da0" (UID: "728435e4-9fdb-4fea-9f5b-eb5ff5444da0"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 01:18:31.799872 master-0 kubenswrapper[7110]: I0313 01:18:31.799843 7110 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/728435e4-9fdb-4fea-9f5b-eb5ff5444da0-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 13 01:18:31.799977 master-0 kubenswrapper[7110]: I0313 01:18:31.799961 7110 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/728435e4-9fdb-4fea-9f5b-eb5ff5444da0-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 13 01:18:31.800126 master-0 kubenswrapper[7110]: I0313 01:18:31.800115 7110 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/728435e4-9fdb-4fea-9f5b-eb5ff5444da0-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 13 01:18:31.980797 master-0 kubenswrapper[7110]: I0313 01:18:31.980372 7110 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-cnrhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 13 01:18:31.980797 master-0 kubenswrapper[7110]: [-]has-synced failed: reason withheld
Mar 13 01:18:31.980797 master-0 kubenswrapper[7110]: [+]process-running ok
Mar 13 01:18:31.980797 master-0 kubenswrapper[7110]: healthz check failed
Mar 13 01:18:31.980797 master-0 kubenswrapper[7110]: I0313 01:18:31.980436 7110 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" podUID="0671fdd0-b358-40f9-ae49-2c5a9004edb3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 01:18:32.366904 master-0 kubenswrapper[7110]: I0313 01:18:32.366830 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"b8addabb4549d659b9d15764990d3747","Type":"ContainerStarted","Data":"9d118311f33a13bf01d58b99e2e28890870103c6d6d9e80b3f327feb4a6e5c10"}
Mar 13 01:18:32.366904 master-0 kubenswrapper[7110]: I0313 01:18:32.366897 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"b8addabb4549d659b9d15764990d3747","Type":"ContainerStarted","Data":"7146d2748a69888b0e230f968d6a455dc052e3a4f925338980f5ac24afb23fd4"}
Mar 13 01:18:32.366904 master-0 kubenswrapper[7110]: I0313 01:18:32.366920 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"b8addabb4549d659b9d15764990d3747","Type":"ContainerStarted","Data":"6ecf1cf9a4925a48c1305c992f6b26c6dc5493f27b0413a75a2a0cbd559a27b9"}
Mar 13 01:18:32.377102 master-0 kubenswrapper[7110]: I0313 01:18:32.377046 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" event={"ID":"30f7537e-93ed-466b-ba24-78141d004b2f","Type":"ContainerStarted","Data":"5facc13367bd2fec27a111e6734950591c5fb3c40b9c12943601a819d288d978"}
Mar 13 01:18:32.377102 master-0 kubenswrapper[7110]: I0313 01:18:32.377097 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" event={"ID":"30f7537e-93ed-466b-ba24-78141d004b2f","Type":"ContainerStarted","Data":"d65bf4cd73d878c1128e2da34864164f7258ebf2fe36dafd0cbc33e6915ed700"}
Mar 13 01:18:32.377102 master-0 kubenswrapper[7110]: I0313 01:18:32.377111 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" event={"ID":"30f7537e-93ed-466b-ba24-78141d004b2f","Type":"ContainerStarted","Data":"47c4abc061aa37ed56eb936e84b7d539b1fd1e8cec9bc0cb2e371456dc167bdc"}
Mar 13 01:18:32.379162 master-0 kubenswrapper[7110]: I0313 01:18:32.378693 7110 generic.go:334] "Generic (PLEG): container finished" podID="04470d64-c6eb-4a62-ae75-2a1d3dfdd53a" containerID="b73ff7b10ef47505ecf38b484d44b39c71e1d9b7b2e9d15b9f215185c43203db" exitCode=0
Mar 13 01:18:32.379162 master-0 kubenswrapper[7110]: I0313 01:18:32.378785 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2hgwj" event={"ID":"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a","Type":"ContainerDied","Data":"b73ff7b10ef47505ecf38b484d44b39c71e1d9b7b2e9d15b9f215185c43203db"}
Mar 13 01:18:32.393186 master-0 kubenswrapper[7110]: I0313 01:18:32.384996 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-retry-1-master-0" event={"ID":"728435e4-9fdb-4fea-9f5b-eb5ff5444da0","Type":"ContainerDied","Data":"2a77f8e58ee6b4b9d8e8d1a5c1202e86b111b4dbd37bf30068295cac4daecf86"}
Mar 13 01:18:32.393186 master-0 kubenswrapper[7110]: I0313 01:18:32.385046 7110 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a77f8e58ee6b4b9d8e8d1a5c1202e86b111b4dbd37bf30068295cac4daecf86"
Mar 13 01:18:32.393186 master-0 kubenswrapper[7110]: I0313 01:18:32.385116 7110 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-retry-1-master-0"
Mar 13 01:18:32.396687 master-0 kubenswrapper[7110]: I0313 01:18:32.394955 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=3.394937419 podStartE2EDuration="3.394937419s" podCreationTimestamp="2026-03-13 01:18:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:18:32.392461405 +0000 UTC m=+313.677487881" watchObservedRunningTime="2026-03-13 01:18:32.394937419 +0000 UTC m=+313.679963895"
Mar 13 01:18:32.465081 master-0 kubenswrapper[7110]: I0313 01:18:32.463313 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" podStartSLOduration=3.197897631 podStartE2EDuration="4.463293217s" podCreationTimestamp="2026-03-13 01:18:28 +0000 UTC" firstStartedPulling="2026-03-13 01:18:29.916074614 +0000 UTC m=+311.201101080" lastFinishedPulling="2026-03-13 01:18:31.18147021 +0000 UTC m=+312.466496666" observedRunningTime="2026-03-13 01:18:32.461804738 +0000 UTC m=+313.746831204" watchObservedRunningTime="2026-03-13 01:18:32.463293217 +0000 UTC m=+313.748319683"
Mar 13 01:18:32.979715 master-0 kubenswrapper[7110]: I0313 01:18:32.979545 7110 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-cnrhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 13 01:18:32.979715 master-0 kubenswrapper[7110]: [-]has-synced failed: reason withheld
Mar 13 01:18:32.979715 master-0 kubenswrapper[7110]: [+]process-running ok
Mar 13 01:18:32.979715 master-0 kubenswrapper[7110]: healthz check failed
Mar 13 01:18:32.979715 master-0 kubenswrapper[7110]: I0313 01:18:32.979666 7110 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" podUID="0671fdd0-b358-40f9-ae49-2c5a9004edb3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 01:18:33.397036 master-0 kubenswrapper[7110]: I0313 01:18:33.396855 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2hgwj" event={"ID":"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a","Type":"ContainerStarted","Data":"f0ef6b6be1bf6464d80a4ed0d8027b70cb9fbd6888ed521a07d3f244cf4ef4f1"}
Mar 13 01:18:33.397036 master-0 kubenswrapper[7110]: I0313 01:18:33.396930 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2hgwj" event={"ID":"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a","Type":"ContainerStarted","Data":"a8891c3de0ea3f8634f05d0f83839b839fc776ecaca857a79f015b6bf51d787a"}
Mar 13 01:18:33.400767 master-0 kubenswrapper[7110]: I0313 01:18:33.400711 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-6btfg" event={"ID":"cd7cca05-3da7-42cf-af64-6e94050e58c0","Type":"ContainerStarted","Data":"f0fd861fec3dc2f0ce9169e8cf3c411b63bb224d503bc3cc3463cc4f3e8118f2"}
Mar 13 01:18:33.463226 master-0 kubenswrapper[7110]: I0313 01:18:33.463134 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-6btfg" podStartSLOduration=4.053772776 podStartE2EDuration="5.463105308s" podCreationTimestamp="2026-03-13 01:18:28 +0000 UTC" firstStartedPulling="2026-03-13 01:18:30.894181376 +0000 UTC m=+312.179207862" lastFinishedPulling="2026-03-13 01:18:32.303513898 +0000 UTC m=+313.588540394" observedRunningTime="2026-03-13 01:18:33.461688961 +0000 UTC m=+314.746715467" watchObservedRunningTime="2026-03-13 01:18:33.463105308 +0000 UTC m=+314.748131814"
Mar 13 01:18:33.464954 master-0 kubenswrapper[7110]: I0313 01:18:33.464897 7110 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-2hgwj" podStartSLOduration=3.776416683 podStartE2EDuration="5.464883585s" podCreationTimestamp="2026-03-13 01:18:28 +0000 UTC" firstStartedPulling="2026-03-13 01:18:29.489820785 +0000 UTC m=+310.774847251" lastFinishedPulling="2026-03-13 01:18:31.178287687 +0000 UTC m=+312.463314153" observedRunningTime="2026-03-13 01:18:33.432813886 +0000 UTC m=+314.717840432" watchObservedRunningTime="2026-03-13 01:18:33.464883585 +0000 UTC m=+314.749910081"
Mar 13 01:18:33.981535 master-0 kubenswrapper[7110]: I0313 01:18:33.981435 7110 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-cnrhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 13 01:18:33.981535 master-0 kubenswrapper[7110]: [-]has-synced failed: reason withheld
Mar 13 01:18:33.981535 master-0 kubenswrapper[7110]: [+]process-running ok
Mar 13 01:18:33.981535 master-0 kubenswrapper[7110]: healthz check failed
Mar 13 01:18:33.981535 master-0 kubenswrapper[7110]: I0313 01:18:33.981527 7110 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" podUID="0671fdd0-b358-40f9-ae49-2c5a9004edb3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 01:18:34.980435 master-0 kubenswrapper[7110]: I0313 01:18:34.980358 7110 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-cnrhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 13 01:18:34.980435 master-0 kubenswrapper[7110]: [-]has-synced failed: reason withheld
Mar 13 01:18:34.980435 master-0 kubenswrapper[7110]: [+]process-running ok
Mar 13 01:18:34.980435 master-0 kubenswrapper[7110]: healthz check failed
Mar 13 01:18:34.980435 master-0 kubenswrapper[7110]: I0313 01:18:34.980456 7110 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" podUID="0671fdd0-b358-40f9-ae49-2c5a9004edb3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 01:18:35.980741 master-0 kubenswrapper[7110]: I0313 01:18:35.980679 7110 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-cnrhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 13 01:18:35.980741 master-0 kubenswrapper[7110]: [-]has-synced failed: reason withheld
Mar 13 01:18:35.980741 master-0 kubenswrapper[7110]: [+]process-running ok
Mar 13 01:18:35.980741 master-0 kubenswrapper[7110]: healthz check failed
Mar 13 01:18:35.981433 master-0 kubenswrapper[7110]: I0313 01:18:35.980776 7110 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" podUID="0671fdd0-b358-40f9-ae49-2c5a9004edb3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 01:18:36.981274 master-0 kubenswrapper[7110]: I0313 01:18:36.981198 7110 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-cnrhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 13 01:18:36.981274 master-0 kubenswrapper[7110]: [-]has-synced failed: reason withheld
Mar 13 01:18:36.981274 master-0 kubenswrapper[7110]: [+]process-running ok
Mar 13 01:18:36.981274 master-0 kubenswrapper[7110]: healthz check failed
Mar 13 01:18:36.982174 master-0 kubenswrapper[7110]: I0313 01:18:36.981281 7110 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" podUID="0671fdd0-b358-40f9-ae49-2c5a9004edb3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 01:18:37.983039 master-0 kubenswrapper[7110]: I0313 01:18:37.980716 7110 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-cnrhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 13 01:18:37.983039 master-0 kubenswrapper[7110]: [-]has-synced failed: reason withheld
Mar 13 01:18:37.983039 master-0 kubenswrapper[7110]: [+]process-running ok
Mar 13 01:18:37.983039 master-0 kubenswrapper[7110]: healthz check failed
Mar 13 01:18:37.983039 master-0 kubenswrapper[7110]: I0313 01:18:37.980776 7110 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" podUID="0671fdd0-b358-40f9-ae49-2c5a9004edb3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 01:18:38.981346 master-0 kubenswrapper[7110]: I0313 01:18:38.981251 7110 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-cnrhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 13 01:18:38.981346 master-0 kubenswrapper[7110]: [-]has-synced failed: reason withheld
Mar 13 01:18:38.981346 master-0 kubenswrapper[7110]: [+]process-running ok
Mar 13 01:18:38.981346 master-0 kubenswrapper[7110]: healthz check failed
Mar 13 01:18:38.981834 master-0 kubenswrapper[7110]: I0313 01:18:38.981352 7110 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" podUID="0671fdd0-b358-40f9-ae49-2c5a9004edb3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 01:18:39.453119 master-0 kubenswrapper[7110]: I0313 01:18:39.453058 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-8d675b596-tq7n6_2bd94289-7109-4419-9a51-bd289082b9f5/multus-admission-controller/0.log"
Mar 13 01:18:39.453866 master-0 kubenswrapper[7110]: I0313 01:18:39.453144 7110 generic.go:334] "Generic (PLEG): container finished" podID="2bd94289-7109-4419-9a51-bd289082b9f5" containerID="cabc0d0daac0ff5b74f3e06882a4fbae2aaadefec9cc5e2009027b89d0897c41" exitCode=137
Mar 13 01:18:39.453866 master-0 kubenswrapper[7110]: I0313 01:18:39.453194 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-tq7n6" event={"ID":"2bd94289-7109-4419-9a51-bd289082b9f5","Type":"ContainerDied","Data":"cabc0d0daac0ff5b74f3e06882a4fbae2aaadefec9cc5e2009027b89d0897c41"}
Mar 13 01:18:39.453866 master-0 kubenswrapper[7110]: I0313 01:18:39.453237 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-tq7n6" event={"ID":"2bd94289-7109-4419-9a51-bd289082b9f5","Type":"ContainerDied","Data":"ae486c0fd97e5367a80e37dacc128e4883d8d7004786d375adff859d49d1bf07"}
Mar 13 01:18:39.453866 master-0 kubenswrapper[7110]: I0313 01:18:39.453262 7110 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae486c0fd97e5367a80e37dacc128e4883d8d7004786d375adff859d49d1bf07"
Mar 13 01:18:39.476909 master-0 kubenswrapper[7110]: I0313 01:18:39.476767 7110 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-8d675b596-tq7n6_2bd94289-7109-4419-9a51-bd289082b9f5/multus-admission-controller/0.log"
Mar 13 01:18:39.476909 master-0 kubenswrapper[7110]: I0313 01:18:39.476836 7110 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-8d675b596-tq7n6"
Mar 13 01:18:39.623456 master-0 kubenswrapper[7110]: I0313 01:18:39.623354 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs\") pod \"2bd94289-7109-4419-9a51-bd289082b9f5\" (UID: \"2bd94289-7109-4419-9a51-bd289082b9f5\") "
Mar 13 01:18:39.623773 master-0 kubenswrapper[7110]: I0313 01:18:39.623668 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fx5mx\" (UniqueName: \"kubernetes.io/projected/2bd94289-7109-4419-9a51-bd289082b9f5-kube-api-access-fx5mx\") pod \"2bd94289-7109-4419-9a51-bd289082b9f5\" (UID: \"2bd94289-7109-4419-9a51-bd289082b9f5\") "
Mar 13 01:18:39.627957 master-0 kubenswrapper[7110]: I0313 01:18:39.627881 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "2bd94289-7109-4419-9a51-bd289082b9f5" (UID: "2bd94289-7109-4419-9a51-bd289082b9f5"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 01:18:39.628953 master-0 kubenswrapper[7110]: I0313 01:18:39.628884 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bd94289-7109-4419-9a51-bd289082b9f5-kube-api-access-fx5mx" (OuterVolumeSpecName: "kube-api-access-fx5mx") pod "2bd94289-7109-4419-9a51-bd289082b9f5" (UID: "2bd94289-7109-4419-9a51-bd289082b9f5"). InnerVolumeSpecName "kube-api-access-fx5mx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 01:18:39.726329 master-0 kubenswrapper[7110]: I0313 01:18:39.726042 7110 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2bd94289-7109-4419-9a51-bd289082b9f5-webhook-certs\") on node \"master-0\" DevicePath \"\""
Mar 13 01:18:39.726329 master-0 kubenswrapper[7110]: I0313 01:18:39.726093 7110 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fx5mx\" (UniqueName: \"kubernetes.io/projected/2bd94289-7109-4419-9a51-bd289082b9f5-kube-api-access-fx5mx\") on node \"master-0\" DevicePath \"\""
Mar 13 01:18:39.987963 master-0 kubenswrapper[7110]: I0313 01:18:39.984717 7110 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-cnrhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 13 01:18:39.987963 master-0 kubenswrapper[7110]: [-]has-synced failed: reason withheld
Mar 13 01:18:39.987963 master-0 kubenswrapper[7110]: [+]process-running ok
Mar 13 01:18:39.987963 master-0 kubenswrapper[7110]: healthz check failed
Mar 13 01:18:39.987963 master-0 kubenswrapper[7110]: I0313 01:18:39.984821 7110 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" podUID="0671fdd0-b358-40f9-ae49-2c5a9004edb3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 01:18:40.230900 master-0 kubenswrapper[7110]: I0313 01:18:40.230807 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 01:18:40.230900 master-0 kubenswrapper[7110]: I0313 01:18:40.230862 7110 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 01:18:40.230900 master-0 kubenswrapper[7110]: I0313 01:18:40.230879 7110 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 01:18:40.231314 master-0 kubenswrapper[7110]: I0313 01:18:40.231175 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 01:18:40.238093 master-0 kubenswrapper[7110]: I0313 01:18:40.237925 7110 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 01:18:40.239033 master-0 kubenswrapper[7110]: I0313 01:18:40.238980 7110 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 01:18:40.460853 master-0 kubenswrapper[7110]: I0313 01:18:40.460772 7110 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-8d675b596-tq7n6"
Mar 13 01:18:40.467366 master-0 kubenswrapper[7110]: I0313 01:18:40.467314 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 01:18:40.517406 master-0 kubenswrapper[7110]: I0313 01:18:40.517259 7110 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-tq7n6"]
Mar 13 01:18:40.525119 master-0 kubenswrapper[7110]: I0313 01:18:40.525059 7110 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-tq7n6"]
Mar 13 01:18:40.920244 master-0 kubenswrapper[7110]: I0313 01:18:40.920159 7110 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bd94289-7109-4419-9a51-bd289082b9f5" path="/var/lib/kubelet/pods/2bd94289-7109-4419-9a51-bd289082b9f5/volumes"
Mar 13 01:18:40.980739 master-0
kubenswrapper[7110]: I0313 01:18:40.980615 7110 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-cnrhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 01:18:40.980739 master-0 kubenswrapper[7110]: [-]has-synced failed: reason withheld Mar 13 01:18:40.980739 master-0 kubenswrapper[7110]: [+]process-running ok Mar 13 01:18:40.980739 master-0 kubenswrapper[7110]: healthz check failed Mar 13 01:18:40.981359 master-0 kubenswrapper[7110]: I0313 01:18:40.980772 7110 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" podUID="0671fdd0-b358-40f9-ae49-2c5a9004edb3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 01:18:41.475618 master-0 kubenswrapper[7110]: I0313 01:18:41.475531 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:18:41.980086 master-0 kubenswrapper[7110]: I0313 01:18:41.980020 7110 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-cnrhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 01:18:41.980086 master-0 kubenswrapper[7110]: [-]has-synced failed: reason withheld Mar 13 01:18:41.980086 master-0 kubenswrapper[7110]: [+]process-running ok Mar 13 01:18:41.980086 master-0 kubenswrapper[7110]: healthz check failed Mar 13 01:18:41.980086 master-0 kubenswrapper[7110]: I0313 01:18:41.980078 7110 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" podUID="0671fdd0-b358-40f9-ae49-2c5a9004edb3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 01:18:42.980765 
master-0 kubenswrapper[7110]: I0313 01:18:42.980676 7110 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-cnrhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 01:18:42.980765 master-0 kubenswrapper[7110]: [-]has-synced failed: reason withheld Mar 13 01:18:42.980765 master-0 kubenswrapper[7110]: [+]process-running ok Mar 13 01:18:42.980765 master-0 kubenswrapper[7110]: healthz check failed Mar 13 01:18:42.980765 master-0 kubenswrapper[7110]: I0313 01:18:42.980787 7110 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" podUID="0671fdd0-b358-40f9-ae49-2c5a9004edb3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 01:18:43.980486 master-0 kubenswrapper[7110]: I0313 01:18:43.980393 7110 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-cnrhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 01:18:43.980486 master-0 kubenswrapper[7110]: [-]has-synced failed: reason withheld Mar 13 01:18:43.980486 master-0 kubenswrapper[7110]: [+]process-running ok Mar 13 01:18:43.980486 master-0 kubenswrapper[7110]: healthz check failed Mar 13 01:18:43.981416 master-0 kubenswrapper[7110]: I0313 01:18:43.980486 7110 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" podUID="0671fdd0-b358-40f9-ae49-2c5a9004edb3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 01:18:44.980457 master-0 kubenswrapper[7110]: I0313 01:18:44.980384 7110 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-cnrhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe 
failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 01:18:44.980457 master-0 kubenswrapper[7110]: [-]has-synced failed: reason withheld Mar 13 01:18:44.980457 master-0 kubenswrapper[7110]: [+]process-running ok Mar 13 01:18:44.980457 master-0 kubenswrapper[7110]: healthz check failed Mar 13 01:18:44.980910 master-0 kubenswrapper[7110]: I0313 01:18:44.980471 7110 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" podUID="0671fdd0-b358-40f9-ae49-2c5a9004edb3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 01:18:45.980410 master-0 kubenswrapper[7110]: I0313 01:18:45.980358 7110 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-cnrhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 01:18:45.980410 master-0 kubenswrapper[7110]: [-]has-synced failed: reason withheld Mar 13 01:18:45.980410 master-0 kubenswrapper[7110]: [+]process-running ok Mar 13 01:18:45.980410 master-0 kubenswrapper[7110]: healthz check failed Mar 13 01:18:45.980843 master-0 kubenswrapper[7110]: I0313 01:18:45.980811 7110 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" podUID="0671fdd0-b358-40f9-ae49-2c5a9004edb3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 01:18:46.979784 master-0 kubenswrapper[7110]: I0313 01:18:46.979719 7110 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-cnrhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 01:18:46.979784 master-0 kubenswrapper[7110]: [-]has-synced failed: reason withheld Mar 13 01:18:46.979784 master-0 
kubenswrapper[7110]: [+]process-running ok Mar 13 01:18:46.979784 master-0 kubenswrapper[7110]: healthz check failed Mar 13 01:18:46.980429 master-0 kubenswrapper[7110]: I0313 01:18:46.979816 7110 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" podUID="0671fdd0-b358-40f9-ae49-2c5a9004edb3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 01:18:47.980867 master-0 kubenswrapper[7110]: I0313 01:18:47.980787 7110 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-cnrhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 01:18:47.980867 master-0 kubenswrapper[7110]: [-]has-synced failed: reason withheld Mar 13 01:18:47.980867 master-0 kubenswrapper[7110]: [+]process-running ok Mar 13 01:18:47.980867 master-0 kubenswrapper[7110]: healthz check failed Mar 13 01:18:47.981954 master-0 kubenswrapper[7110]: I0313 01:18:47.980881 7110 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" podUID="0671fdd0-b358-40f9-ae49-2c5a9004edb3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 01:18:48.981466 master-0 kubenswrapper[7110]: I0313 01:18:48.981374 7110 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-cnrhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 01:18:48.981466 master-0 kubenswrapper[7110]: [-]has-synced failed: reason withheld Mar 13 01:18:48.981466 master-0 kubenswrapper[7110]: [+]process-running ok Mar 13 01:18:48.981466 master-0 kubenswrapper[7110]: healthz check failed Mar 13 01:18:48.982471 master-0 kubenswrapper[7110]: I0313 01:18:48.981456 7110 prober.go:107] "Probe 
failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" podUID="0671fdd0-b358-40f9-ae49-2c5a9004edb3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 01:18:49.993085 master-0 kubenswrapper[7110]: I0313 01:18:49.980448 7110 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-cnrhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 01:18:49.993085 master-0 kubenswrapper[7110]: [-]has-synced failed: reason withheld Mar 13 01:18:49.993085 master-0 kubenswrapper[7110]: [+]process-running ok Mar 13 01:18:49.993085 master-0 kubenswrapper[7110]: healthz check failed Mar 13 01:18:49.993085 master-0 kubenswrapper[7110]: I0313 01:18:49.980510 7110 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" podUID="0671fdd0-b358-40f9-ae49-2c5a9004edb3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 01:18:49.993085 master-0 kubenswrapper[7110]: I0313 01:18:49.988890 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-5575f756f4-hqr5q"] Mar 13 01:18:49.993085 master-0 kubenswrapper[7110]: E0313 01:18:49.989321 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bd94289-7109-4419-9a51-bd289082b9f5" containerName="multus-admission-controller" Mar 13 01:18:49.993085 master-0 kubenswrapper[7110]: I0313 01:18:49.989338 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bd94289-7109-4419-9a51-bd289082b9f5" containerName="multus-admission-controller" Mar 13 01:18:49.993085 master-0 kubenswrapper[7110]: E0313 01:18:49.989369 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bd94289-7109-4419-9a51-bd289082b9f5" containerName="kube-rbac-proxy" Mar 13 01:18:49.993085 master-0 
kubenswrapper[7110]: I0313 01:18:49.989376 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bd94289-7109-4419-9a51-bd289082b9f5" containerName="kube-rbac-proxy" Mar 13 01:18:49.993085 master-0 kubenswrapper[7110]: E0313 01:18:49.989390 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="728435e4-9fdb-4fea-9f5b-eb5ff5444da0" containerName="installer" Mar 13 01:18:49.993085 master-0 kubenswrapper[7110]: I0313 01:18:49.989398 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="728435e4-9fdb-4fea-9f5b-eb5ff5444da0" containerName="installer" Mar 13 01:18:49.993085 master-0 kubenswrapper[7110]: I0313 01:18:49.989566 7110 memory_manager.go:354] "RemoveStaleState removing state" podUID="728435e4-9fdb-4fea-9f5b-eb5ff5444da0" containerName="installer" Mar 13 01:18:49.993085 master-0 kubenswrapper[7110]: I0313 01:18:49.989583 7110 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bd94289-7109-4419-9a51-bd289082b9f5" containerName="kube-rbac-proxy" Mar 13 01:18:49.993085 master-0 kubenswrapper[7110]: I0313 01:18:49.989590 7110 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bd94289-7109-4419-9a51-bd289082b9f5" containerName="multus-admission-controller" Mar 13 01:18:49.993085 master-0 kubenswrapper[7110]: I0313 01:18:49.990080 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:18:50.003656 master-0 kubenswrapper[7110]: I0313 01:18:50.002139 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-h6wj2" Mar 13 01:18:50.003656 master-0 kubenswrapper[7110]: I0313 01:18:50.002319 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Mar 13 01:18:50.003656 master-0 kubenswrapper[7110]: I0313 01:18:50.002434 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-b1qe6h41gh39q" Mar 13 01:18:50.003656 master-0 kubenswrapper[7110]: I0313 01:18:50.002527 7110 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Mar 13 01:18:50.003656 master-0 kubenswrapper[7110]: I0313 01:18:50.002620 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Mar 13 01:18:50.010340 master-0 kubenswrapper[7110]: I0313 01:18:50.010291 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-metrics-server-audit-profiles\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:18:50.010583 master-0 kubenswrapper[7110]: I0313 01:18:50.010569 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-audit-log\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:18:50.010702 master-0 kubenswrapper[7110]: 
I0313 01:18:50.010687 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-client-ca-bundle\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:18:50.010827 master-0 kubenswrapper[7110]: I0313 01:18:50.010815 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gj56\" (UniqueName: \"kubernetes.io/projected/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-kube-api-access-8gj56\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:18:50.010918 master-0 kubenswrapper[7110]: I0313 01:18:50.010906 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-secret-metrics-server-tls\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:18:50.010991 master-0 kubenswrapper[7110]: I0313 01:18:50.010977 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-secret-metrics-client-certs\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:18:50.011199 master-0 kubenswrapper[7110]: I0313 01:18:50.011185 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:18:50.021656 master-0 kubenswrapper[7110]: I0313 01:18:50.019811 7110 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5575f756f4-hqr5q"] Mar 13 01:18:50.021656 master-0 kubenswrapper[7110]: I0313 01:18:50.020913 7110 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Mar 13 01:18:50.052688 master-0 kubenswrapper[7110]: I0313 01:18:50.052500 7110 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Mar 13 01:18:50.052874 master-0 kubenswrapper[7110]: I0313 01:18:50.052732 7110 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver" containerID="cri-o://0445b88e121211b2f5fee8d8974ce092dbbbfd137fd485f9bdd95d5e9a6bd172" gracePeriod=15 Mar 13 01:18:50.052874 master-0 kubenswrapper[7110]: I0313 01:18:50.052858 7110 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://2d7298385249f13020b6f4aba996517ed9ff1e2913b22038f141e521d9ed6c60" gracePeriod=15 Mar 13 01:18:50.054840 master-0 kubenswrapper[7110]: I0313 01:18:50.053910 7110 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 13 01:18:50.054840 master-0 kubenswrapper[7110]: E0313 01:18:50.054179 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f77c8e18b751d90bc0dfe2d4e304050" 
containerName="kube-apiserver" Mar 13 01:18:50.054840 master-0 kubenswrapper[7110]: I0313 01:18:50.054193 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver" Mar 13 01:18:50.054840 master-0 kubenswrapper[7110]: E0313 01:18:50.054212 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="setup" Mar 13 01:18:50.054840 master-0 kubenswrapper[7110]: I0313 01:18:50.054219 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="setup" Mar 13 01:18:50.054840 master-0 kubenswrapper[7110]: E0313 01:18:50.054231 7110 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver-insecure-readyz" Mar 13 01:18:50.054840 master-0 kubenswrapper[7110]: I0313 01:18:50.054239 7110 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver-insecure-readyz" Mar 13 01:18:50.054840 master-0 kubenswrapper[7110]: I0313 01:18:50.054376 7110 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver" Mar 13 01:18:50.054840 master-0 kubenswrapper[7110]: I0313 01:18:50.054393 7110 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver-insecure-readyz" Mar 13 01:18:50.054840 master-0 kubenswrapper[7110]: I0313 01:18:50.054401 7110 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="setup" Mar 13 01:18:50.056077 master-0 kubenswrapper[7110]: I0313 01:18:50.056053 7110 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 13 01:18:50.057027 master-0 kubenswrapper[7110]: I0313 01:18:50.056916 7110 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:18:50.057385 master-0 kubenswrapper[7110]: I0313 01:18:50.057365 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:18:50.112005 master-0 kubenswrapper[7110]: I0313 01:18:50.111957 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:18:50.112204 master-0 kubenswrapper[7110]: I0313 01:18:50.112029 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:18:50.112204 master-0 kubenswrapper[7110]: I0313 01:18:50.112065 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:18:50.112204 master-0 kubenswrapper[7110]: I0313 01:18:50.112084 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"kube-apiserver-master-0\" (UID: 
\"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:18:50.112204 master-0 kubenswrapper[7110]: I0313 01:18:50.112105 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-metrics-server-audit-profiles\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:18:50.112204 master-0 kubenswrapper[7110]: I0313 01:18:50.112121 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:18:50.112204 master-0 kubenswrapper[7110]: I0313 01:18:50.112137 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-audit-log\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:18:50.112204 master-0 kubenswrapper[7110]: I0313 01:18:50.112155 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-client-ca-bundle\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:18:50.112204 master-0 kubenswrapper[7110]: I0313 01:18:50.112175 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:18:50.112204 master-0 kubenswrapper[7110]: I0313 01:18:50.112192 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:18:50.112204 master-0 kubenswrapper[7110]: I0313 01:18:50.112213 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gj56\" (UniqueName: \"kubernetes.io/projected/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-kube-api-access-8gj56\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:18:50.112646 master-0 kubenswrapper[7110]: I0313 01:18:50.112229 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:18:50.112646 master-0 kubenswrapper[7110]: I0313 01:18:50.112250 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-secret-metrics-server-tls\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " 
pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:18:50.112646 master-0 kubenswrapper[7110]: I0313 01:18:50.112268 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-secret-metrics-client-certs\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:18:50.112646 master-0 kubenswrapper[7110]: I0313 01:18:50.112289 7110 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:18:50.114494 master-0 kubenswrapper[7110]: I0313 01:18:50.113020 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:18:50.114494 master-0 kubenswrapper[7110]: I0313 01:18:50.114105 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-metrics-server-audit-profiles\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:18:50.114494 master-0 kubenswrapper[7110]: I0313 01:18:50.114190 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: 
\"kubernetes.io/empty-dir/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-audit-log\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:18:50.117333 master-0 kubenswrapper[7110]: I0313 01:18:50.117046 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-secret-metrics-server-tls\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:18:50.117333 master-0 kubenswrapper[7110]: I0313 01:18:50.117303 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-client-ca-bundle\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:18:50.117939 master-0 kubenswrapper[7110]: I0313 01:18:50.117914 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-secret-metrics-client-certs\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:18:50.128041 master-0 kubenswrapper[7110]: I0313 01:18:50.127673 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 13 01:18:50.135729 master-0 kubenswrapper[7110]: I0313 01:18:50.135683 7110 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 13 01:18:50.152362 master-0 kubenswrapper[7110]: I0313 01:18:50.152328 7110 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8gj56\" (UniqueName: \"kubernetes.io/projected/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-kube-api-access-8gj56\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:18:50.213324 master-0 kubenswrapper[7110]: I0313 01:18:50.213264 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:18:50.213324 master-0 kubenswrapper[7110]: I0313 01:18:50.213310 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:18:50.213529 master-0 kubenswrapper[7110]: I0313 01:18:50.213388 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:18:50.213529 master-0 kubenswrapper[7110]: I0313 01:18:50.213424 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:18:50.213529 master-0 kubenswrapper[7110]: I0313 01:18:50.213442 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:18:50.213529 master-0 kubenswrapper[7110]: I0313 01:18:50.213470 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:18:50.213529 master-0 kubenswrapper[7110]: I0313 01:18:50.213500 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:18:50.213748 master-0 kubenswrapper[7110]: I0313 01:18:50.213531 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:18:50.213748 master-0 kubenswrapper[7110]: I0313 01:18:50.213611 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:18:50.213748 master-0 kubenswrapper[7110]: I0313 01:18:50.213689 7110 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:18:50.213871 master-0 kubenswrapper[7110]: I0313 01:18:50.213749 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:18:50.213871 master-0 kubenswrapper[7110]: I0313 01:18:50.213791 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:18:50.213871 master-0 kubenswrapper[7110]: I0313 01:18:50.213810 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:18:50.213871 master-0 kubenswrapper[7110]: I0313 01:18:50.213839 7110 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:18:50.213871 master-0 kubenswrapper[7110]: I0313 01:18:50.213793 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:18:50.214028 master-0 kubenswrapper[7110]: I0313 01:18:50.213935 7110 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:18:50.336375 master-0 kubenswrapper[7110]: I0313 01:18:50.336314 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:18:50.432838 master-0 kubenswrapper[7110]: I0313 01:18:50.422293 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:18:50.432838 master-0 kubenswrapper[7110]: I0313 01:18:50.424752 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:18:50.472287 master-0 kubenswrapper[7110]: W0313 01:18:50.471953 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdcecc61ff5eeb08bd2a3ac12599e4f9.slice/crio-616c3040128a0d65e3ebb99fac58cf591c2a39e4fb92682249d6c1a457260134 WatchSource:0}: Error finding container 616c3040128a0d65e3ebb99fac58cf591c2a39e4fb92682249d6c1a457260134: Status 404 returned error can't find the container with id 616c3040128a0d65e3ebb99fac58cf591c2a39e4fb92682249d6c1a457260134 Mar 13 01:18:50.479160 master-0 kubenswrapper[7110]: E0313 01:18:50.479024 7110 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.189c41d3a149fcda openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:f417e14665db2ffffa887ce21c9ff0ed,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:18:50.47701833 +0000 UTC m=+331.762044796,LastTimestamp:2026-03-13 01:18:50.47701833 +0000 UTC m=+331.762044796,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:18:50.556318 master-0 kubenswrapper[7110]: I0313 01:18:50.556280 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"f417e14665db2ffffa887ce21c9ff0ed","Type":"ContainerStarted","Data":"92c3b2f339c88995c70507230cfa25808d3c4399c710b2454c51839f6048ccf5"} Mar 13 01:18:50.557658 master-0 kubenswrapper[7110]: I0313 01:18:50.557618 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"616c3040128a0d65e3ebb99fac58cf591c2a39e4fb92682249d6c1a457260134"} Mar 13 01:18:50.568417 master-0 kubenswrapper[7110]: I0313 01:18:50.568345 7110 generic.go:334] "Generic (PLEG): container finished" podID="5f77c8e18b751d90bc0dfe2d4e304050" containerID="2d7298385249f13020b6f4aba996517ed9ff1e2913b22038f141e521d9ed6c60" exitCode=0 Mar 13 01:18:50.571577 master-0 kubenswrapper[7110]: I0313 01:18:50.571332 7110 generic.go:334] "Generic (PLEG): container finished" podID="690f916b-6f87-42d9-8168-392a9177bee9" containerID="f3c19acecbccf7bd6716e7d44a9b0fc9bb63ca007ca5d04b416b934ef2cbe52c" exitCode=0 Mar 13 01:18:50.571577 master-0 kubenswrapper[7110]: I0313 01:18:50.571373 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" event={"ID":"690f916b-6f87-42d9-8168-392a9177bee9","Type":"ContainerDied","Data":"f3c19acecbccf7bd6716e7d44a9b0fc9bb63ca007ca5d04b416b934ef2cbe52c"} Mar 13 01:18:50.575900 master-0 kubenswrapper[7110]: I0313 01:18:50.575686 7110 status_manager.go:851] "Failed to get status for pod" podUID="f417e14665db2ffffa887ce21c9ff0ed" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:18:50.582237 master-0 kubenswrapper[7110]: I0313 01:18:50.581061 7110 status_manager.go:851] "Failed to get status for 
pod" podUID="690f916b-6f87-42d9-8168-392a9177bee9" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-1-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:18:50.582456 master-0 kubenswrapper[7110]: I0313 01:18:50.582428 7110 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:18:50.712471 master-0 kubenswrapper[7110]: I0313 01:18:50.712440 7110 patch_prober.go:28] interesting pod/bootstrap-kube-apiserver-master-0 container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.32.10:6443/readyz\": dial tcp 192.168.32.10:6443: connect: connection refused" start-of-body= Mar 13 01:18:50.712574 master-0 kubenswrapper[7110]: I0313 01:18:50.712486 7110 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.32.10:6443/readyz\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:18:50.980365 master-0 kubenswrapper[7110]: I0313 01:18:50.980304 7110 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-cnrhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 01:18:50.980365 master-0 kubenswrapper[7110]: [-]has-synced failed: reason withheld Mar 13 01:18:50.980365 master-0 kubenswrapper[7110]: [+]process-running ok Mar 13 01:18:50.980365 
master-0 kubenswrapper[7110]: healthz check failed Mar 13 01:18:50.980690 master-0 kubenswrapper[7110]: I0313 01:18:50.980369 7110 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" podUID="0671fdd0-b358-40f9-ae49-2c5a9004edb3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 01:18:51.053123 master-0 kubenswrapper[7110]: E0313 01:18:51.052185 7110 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 13 01:18:51.053123 master-0 kubenswrapper[7110]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_metrics-server-5575f756f4-hqr5q_openshift-monitoring_9db888f0-51b6-43cf-8337-69d2d5cc2b0a_0(8994edf1932856eca903b5f718841bdb7517c9ca02b4ea313e0c6be508cc7fde): error adding pod openshift-monitoring_metrics-server-5575f756f4-hqr5q to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"8994edf1932856eca903b5f718841bdb7517c9ca02b4ea313e0c6be508cc7fde" Netns:"/var/run/netns/4e02c9cc-7d1e-43ec-97d9-489e3aba7355" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-monitoring;K8S_POD_NAME=metrics-server-5575f756f4-hqr5q;K8S_POD_INFRA_CONTAINER_ID=8994edf1932856eca903b5f718841bdb7517c9ca02b4ea313e0c6be508cc7fde;K8S_POD_UID=9db888f0-51b6-43cf-8337-69d2d5cc2b0a" Path:"" ERRORED: error configuring pod [openshift-monitoring/metrics-server-5575f756f4-hqr5q] networking: Multus: [openshift-monitoring/metrics-server-5575f756f4-hqr5q/9db888f0-51b6-43cf-8337-69d2d5cc2b0a]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod metrics-server-5575f756f4-hqr5q in out of cluster comm: SetNetworkStatus: failed to update the pod metrics-server-5575f756f4-hqr5q in out of cluster comm: status update failed for pod /: Get 
"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/metrics-server-5575f756f4-hqr5q?timeout=1m0s": dial tcp 192.168.32.10:6443: connect: connection refused Mar 13 01:18:51.053123 master-0 kubenswrapper[7110]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 01:18:51.053123 master-0 kubenswrapper[7110]: > Mar 13 01:18:51.053123 master-0 kubenswrapper[7110]: E0313 01:18:51.052252 7110 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 13 01:18:51.053123 master-0 kubenswrapper[7110]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_metrics-server-5575f756f4-hqr5q_openshift-monitoring_9db888f0-51b6-43cf-8337-69d2d5cc2b0a_0(8994edf1932856eca903b5f718841bdb7517c9ca02b4ea313e0c6be508cc7fde): error adding pod openshift-monitoring_metrics-server-5575f756f4-hqr5q to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"8994edf1932856eca903b5f718841bdb7517c9ca02b4ea313e0c6be508cc7fde" Netns:"/var/run/netns/4e02c9cc-7d1e-43ec-97d9-489e3aba7355" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-monitoring;K8S_POD_NAME=metrics-server-5575f756f4-hqr5q;K8S_POD_INFRA_CONTAINER_ID=8994edf1932856eca903b5f718841bdb7517c9ca02b4ea313e0c6be508cc7fde;K8S_POD_UID=9db888f0-51b6-43cf-8337-69d2d5cc2b0a" Path:"" ERRORED: error configuring pod [openshift-monitoring/metrics-server-5575f756f4-hqr5q] networking: Multus: [openshift-monitoring/metrics-server-5575f756f4-hqr5q/9db888f0-51b6-43cf-8337-69d2d5cc2b0a]: error setting the networks status: 
SetPodNetworkStatusAnnotation: failed to update the pod metrics-server-5575f756f4-hqr5q in out of cluster comm: SetNetworkStatus: failed to update the pod metrics-server-5575f756f4-hqr5q in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/metrics-server-5575f756f4-hqr5q?timeout=1m0s": dial tcp 192.168.32.10:6443: connect: connection refused Mar 13 01:18:51.053123 master-0 kubenswrapper[7110]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 01:18:51.053123 master-0 kubenswrapper[7110]: > pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:18:51.053123 master-0 kubenswrapper[7110]: E0313 01:18:51.052272 7110 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 13 01:18:51.053123 master-0 kubenswrapper[7110]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_metrics-server-5575f756f4-hqr5q_openshift-monitoring_9db888f0-51b6-43cf-8337-69d2d5cc2b0a_0(8994edf1932856eca903b5f718841bdb7517c9ca02b4ea313e0c6be508cc7fde): error adding pod openshift-monitoring_metrics-server-5575f756f4-hqr5q to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"8994edf1932856eca903b5f718841bdb7517c9ca02b4ea313e0c6be508cc7fde" Netns:"/var/run/netns/4e02c9cc-7d1e-43ec-97d9-489e3aba7355" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-monitoring;K8S_POD_NAME=metrics-server-5575f756f4-hqr5q;K8S_POD_INFRA_CONTAINER_ID=8994edf1932856eca903b5f718841bdb7517c9ca02b4ea313e0c6be508cc7fde;K8S_POD_UID=9db888f0-51b6-43cf-8337-69d2d5cc2b0a" Path:"" ERRORED: error configuring pod [openshift-monitoring/metrics-server-5575f756f4-hqr5q] networking: Multus: [openshift-monitoring/metrics-server-5575f756f4-hqr5q/9db888f0-51b6-43cf-8337-69d2d5cc2b0a]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod metrics-server-5575f756f4-hqr5q in out of cluster comm: SetNetworkStatus: failed to update the pod metrics-server-5575f756f4-hqr5q in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/metrics-server-5575f756f4-hqr5q?timeout=1m0s": dial tcp 192.168.32.10:6443: connect: connection refused Mar 13 01:18:51.053123 master-0 kubenswrapper[7110]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 01:18:51.053123 master-0 kubenswrapper[7110]: > pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:18:51.053123 master-0 kubenswrapper[7110]: E0313 01:18:51.052333 7110 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"metrics-server-5575f756f4-hqr5q_openshift-monitoring(9db888f0-51b6-43cf-8337-69d2d5cc2b0a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"metrics-server-5575f756f4-hqr5q_openshift-monitoring(9db888f0-51b6-43cf-8337-69d2d5cc2b0a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_metrics-server-5575f756f4-hqr5q_openshift-monitoring_9db888f0-51b6-43cf-8337-69d2d5cc2b0a_0(8994edf1932856eca903b5f718841bdb7517c9ca02b4ea313e0c6be508cc7fde): error adding pod openshift-monitoring_metrics-server-5575f756f4-hqr5q to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"8994edf1932856eca903b5f718841bdb7517c9ca02b4ea313e0c6be508cc7fde\\\" Netns:\\\"/var/run/netns/4e02c9cc-7d1e-43ec-97d9-489e3aba7355\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-monitoring;K8S_POD_NAME=metrics-server-5575f756f4-hqr5q;K8S_POD_INFRA_CONTAINER_ID=8994edf1932856eca903b5f718841bdb7517c9ca02b4ea313e0c6be508cc7fde;K8S_POD_UID=9db888f0-51b6-43cf-8337-69d2d5cc2b0a\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-monitoring/metrics-server-5575f756f4-hqr5q] networking: Multus: [openshift-monitoring/metrics-server-5575f756f4-hqr5q/9db888f0-51b6-43cf-8337-69d2d5cc2b0a]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod metrics-server-5575f756f4-hqr5q in out of cluster comm: SetNetworkStatus: failed to update the pod metrics-server-5575f756f4-hqr5q in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/metrics-server-5575f756f4-hqr5q?timeout=1m0s\\\": dial tcp 192.168.32.10:6443: connect: connection refused\\n': StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" podUID="9db888f0-51b6-43cf-8337-69d2d5cc2b0a" Mar 13 01:18:51.579439 master-0 kubenswrapper[7110]: I0313 01:18:51.579380 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"f417e14665db2ffffa887ce21c9ff0ed","Type":"ContainerStarted","Data":"488ba7a7a8dd3c0fa5c308fff5e919aa8bf6afd116d2c4ea7906bc2734b837de"} Mar 13 01:18:51.580529 master-0 kubenswrapper[7110]: I0313 01:18:51.580471 7110 status_manager.go:851] "Failed to get status for pod" podUID="690f916b-6f87-42d9-8168-392a9177bee9" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-1-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:18:51.581469 master-0 kubenswrapper[7110]: I0313 01:18:51.581418 7110 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:18:51.581788 master-0 kubenswrapper[7110]: I0313 01:18:51.581745 7110 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" 
containerID="6504d7e301bf358b754655b91d4ab759cf0c2f8f3e948397175f8d84bd27b624" exitCode=0 Mar 13 01:18:51.581847 master-0 kubenswrapper[7110]: I0313 01:18:51.581814 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerDied","Data":"6504d7e301bf358b754655b91d4ab759cf0c2f8f3e948397175f8d84bd27b624"} Mar 13 01:18:51.582054 master-0 kubenswrapper[7110]: I0313 01:18:51.582028 7110 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:18:51.582483 master-0 kubenswrapper[7110]: I0313 01:18:51.582123 7110 status_manager.go:851] "Failed to get status for pod" podUID="f417e14665db2ffffa887ce21c9ff0ed" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:18:51.582545 master-0 kubenswrapper[7110]: I0313 01:18:51.582523 7110 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:18:51.584225 master-0 kubenswrapper[7110]: I0313 01:18:51.584029 7110 status_manager.go:851] "Failed to get status for pod" podUID="f417e14665db2ffffa887ce21c9ff0ed" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:18:51.584804 master-0 kubenswrapper[7110]: I0313 01:18:51.584751 7110 status_manager.go:851] "Failed to get status for pod" podUID="690f916b-6f87-42d9-8168-392a9177bee9" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-1-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:18:51.585304 master-0 kubenswrapper[7110]: I0313 01:18:51.585264 7110 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:18:51.980973 master-0 kubenswrapper[7110]: I0313 01:18:51.980918 7110 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-cnrhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 01:18:51.980973 master-0 kubenswrapper[7110]: [-]has-synced failed: reason withheld Mar 13 01:18:51.980973 master-0 kubenswrapper[7110]: [+]process-running ok Mar 13 01:18:51.980973 master-0 kubenswrapper[7110]: healthz check failed Mar 13 01:18:51.981194 master-0 kubenswrapper[7110]: I0313 
01:18:51.980986 7110 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" podUID="0671fdd0-b358-40f9-ae49-2c5a9004edb3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 01:18:52.017449 master-0 kubenswrapper[7110]: I0313 01:18:52.017329 7110 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 13 01:18:52.019276 master-0 kubenswrapper[7110]: I0313 01:18:52.019103 7110 status_manager.go:851] "Failed to get status for pod" podUID="f417e14665db2ffffa887ce21c9ff0ed" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:18:52.020519 master-0 kubenswrapper[7110]: I0313 01:18:52.020416 7110 status_manager.go:851] "Failed to get status for pod" podUID="690f916b-6f87-42d9-8168-392a9177bee9" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-1-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:18:52.021815 master-0 kubenswrapper[7110]: I0313 01:18:52.021617 7110 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:18:52.036804 master-0 kubenswrapper[7110]: I0313 01:18:52.036704 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/690f916b-6f87-42d9-8168-392a9177bee9-kube-api-access\") pod \"690f916b-6f87-42d9-8168-392a9177bee9\" (UID: \"690f916b-6f87-42d9-8168-392a9177bee9\") "
Mar 13 01:18:52.037490 master-0 kubenswrapper[7110]: I0313 01:18:52.037442 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/690f916b-6f87-42d9-8168-392a9177bee9-kubelet-dir\") pod \"690f916b-6f87-42d9-8168-392a9177bee9\" (UID: \"690f916b-6f87-42d9-8168-392a9177bee9\") "
Mar 13 01:18:52.037681 master-0 kubenswrapper[7110]: I0313 01:18:52.037511 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/690f916b-6f87-42d9-8168-392a9177bee9-var-lock\") pod \"690f916b-6f87-42d9-8168-392a9177bee9\" (UID: \"690f916b-6f87-42d9-8168-392a9177bee9\") "
Mar 13 01:18:52.038298 master-0 kubenswrapper[7110]: I0313 01:18:52.038024 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/690f916b-6f87-42d9-8168-392a9177bee9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "690f916b-6f87-42d9-8168-392a9177bee9" (UID: "690f916b-6f87-42d9-8168-392a9177bee9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 01:18:52.038298 master-0 kubenswrapper[7110]: I0313 01:18:52.038202 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/690f916b-6f87-42d9-8168-392a9177bee9-var-lock" (OuterVolumeSpecName: "var-lock") pod "690f916b-6f87-42d9-8168-392a9177bee9" (UID: "690f916b-6f87-42d9-8168-392a9177bee9"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 01:18:52.038491 master-0 kubenswrapper[7110]: I0313 01:18:52.038300 7110 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/690f916b-6f87-42d9-8168-392a9177bee9-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 13 01:18:52.041658 master-0 kubenswrapper[7110]: I0313 01:18:52.041579 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/690f916b-6f87-42d9-8168-392a9177bee9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "690f916b-6f87-42d9-8168-392a9177bee9" (UID: "690f916b-6f87-42d9-8168-392a9177bee9"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 01:18:52.140032 master-0 kubenswrapper[7110]: I0313 01:18:52.139989 7110 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/690f916b-6f87-42d9-8168-392a9177bee9-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 13 01:18:52.140429 master-0 kubenswrapper[7110]: I0313 01:18:52.140035 7110 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/690f916b-6f87-42d9-8168-392a9177bee9-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 13 01:18:52.454682 master-0 kubenswrapper[7110]: I0313 01:18:52.453722 7110 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 01:18:52.547293 master-0 kubenswrapper[7110]: I0313 01:18:52.547192 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"5f77c8e18b751d90bc0dfe2d4e304050\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") "
Mar 13 01:18:52.547293 master-0 kubenswrapper[7110]: I0313 01:18:52.547279 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"5f77c8e18b751d90bc0dfe2d4e304050\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") "
Mar 13 01:18:52.547459 master-0 kubenswrapper[7110]: I0313 01:18:52.547388 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"5f77c8e18b751d90bc0dfe2d4e304050\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") "
Mar 13 01:18:52.547459 master-0 kubenswrapper[7110]: I0313 01:18:52.547413 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"5f77c8e18b751d90bc0dfe2d4e304050\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") "
Mar 13 01:18:52.547459 master-0 kubenswrapper[7110]: I0313 01:18:52.547457 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"5f77c8e18b751d90bc0dfe2d4e304050\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") "
Mar 13 01:18:52.547555 master-0 kubenswrapper[7110]: I0313 01:18:52.547505 7110 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"5f77c8e18b751d90bc0dfe2d4e304050\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") "
Mar 13 01:18:52.548120 master-0 kubenswrapper[7110]: I0313 01:18:52.548092 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "5f77c8e18b751d90bc0dfe2d4e304050" (UID: "5f77c8e18b751d90bc0dfe2d4e304050"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 01:18:52.548181 master-0 kubenswrapper[7110]: I0313 01:18:52.548136 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host" (OuterVolumeSpecName: "ssl-certs-host") pod "5f77c8e18b751d90bc0dfe2d4e304050" (UID: "5f77c8e18b751d90bc0dfe2d4e304050"). InnerVolumeSpecName "ssl-certs-host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 01:18:52.548181 master-0 kubenswrapper[7110]: I0313 01:18:52.548170 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config" (OuterVolumeSpecName: "config") pod "5f77c8e18b751d90bc0dfe2d4e304050" (UID: "5f77c8e18b751d90bc0dfe2d4e304050"). InnerVolumeSpecName "config". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 01:18:52.548236 master-0 kubenswrapper[7110]: I0313 01:18:52.548193 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs" (OuterVolumeSpecName: "logs") pod "5f77c8e18b751d90bc0dfe2d4e304050" (UID: "5f77c8e18b751d90bc0dfe2d4e304050"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 01:18:52.548236 master-0 kubenswrapper[7110]: I0313 01:18:52.548215 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud" (OuterVolumeSpecName: "etc-kubernetes-cloud") pod "5f77c8e18b751d90bc0dfe2d4e304050" (UID: "5f77c8e18b751d90bc0dfe2d4e304050"). InnerVolumeSpecName "etc-kubernetes-cloud". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 01:18:52.548297 master-0 kubenswrapper[7110]: I0313 01:18:52.548239 7110 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets" (OuterVolumeSpecName: "secrets") pod "5f77c8e18b751d90bc0dfe2d4e304050" (UID: "5f77c8e18b751d90bc0dfe2d4e304050"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 01:18:52.599988 master-0 kubenswrapper[7110]: I0313 01:18:52.599902 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"208cb89282bf1ed747c2d5d200ffda0b8e2d1d13703537447889ef8c2f67cf1a"}
Mar 13 01:18:52.599988 master-0 kubenswrapper[7110]: I0313 01:18:52.599995 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"f494a23b4d1580cff31d9268d6fdc726455248c951c34158f5755806baa34285"}
Mar 13 01:18:52.600235 master-0 kubenswrapper[7110]: I0313 01:18:52.600012 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"f8cf1a80a339c43b2198f0c2a2cdc38630309c5c4e5c0430f6bc0266b483c766"}
Mar 13 01:18:52.601757 master-0 kubenswrapper[7110]: I0313 01:18:52.601713 7110 generic.go:334] "Generic (PLEG): container finished" podID="5f77c8e18b751d90bc0dfe2d4e304050" containerID="0445b88e121211b2f5fee8d8974ce092dbbbfd137fd485f9bdd95d5e9a6bd172" exitCode=0
Mar 13 01:18:52.601848 master-0 kubenswrapper[7110]: I0313 01:18:52.601783 7110 scope.go:117] "RemoveContainer" containerID="2d7298385249f13020b6f4aba996517ed9ff1e2913b22038f141e521d9ed6c60"
Mar 13 01:18:52.601915 master-0 kubenswrapper[7110]: I0313 01:18:52.601894 7110 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 01:18:52.610105 master-0 kubenswrapper[7110]: I0313 01:18:52.610057 7110 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 13 01:18:52.610521 master-0 kubenswrapper[7110]: I0313 01:18:52.610474 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" event={"ID":"690f916b-6f87-42d9-8168-392a9177bee9","Type":"ContainerDied","Data":"e1a9f8a9b9f4f12cdb02f0899fd6c6ca89ad08ee7dce767d1b7f5c4f67fb87f9"}
Mar 13 01:18:52.610521 master-0 kubenswrapper[7110]: I0313 01:18:52.610518 7110 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1a9f8a9b9f4f12cdb02f0899fd6c6ca89ad08ee7dce767d1b7f5c4f67fb87f9"
Mar 13 01:18:52.628184 master-0 kubenswrapper[7110]: I0313 01:18:52.628141 7110 scope.go:117] "RemoveContainer" containerID="0445b88e121211b2f5fee8d8974ce092dbbbfd137fd485f9bdd95d5e9a6bd172"
Mar 13 01:18:52.647928 master-0 kubenswrapper[7110]: I0313 01:18:52.647876 7110 scope.go:117] "RemoveContainer" containerID="7af43aa64c145f54e06604757091c85cae4a917cad1277500eafd26947523186"
Mar 13 01:18:52.650808 master-0 kubenswrapper[7110]: I0313 01:18:52.650781 7110 reconciler_common.go:293] "Volume detached for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") on node \"master-0\" DevicePath \"\""
Mar 13 01:18:52.650808 master-0 kubenswrapper[7110]: I0313 01:18:52.650807 7110 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") on node \"master-0\" DevicePath \"\""
Mar 13 01:18:52.650941 master-0 kubenswrapper[7110]: I0313 01:18:52.650820 7110 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") on node \"master-0\" DevicePath \"\""
Mar 13 01:18:52.650941 master-0 kubenswrapper[7110]: I0313 01:18:52.650833 7110 reconciler_common.go:293] "Volume detached for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") on node \"master-0\" DevicePath \"\""
Mar 13 01:18:52.650941 master-0 kubenswrapper[7110]: I0313 01:18:52.650846 7110 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") on node \"master-0\" DevicePath \"\""
Mar 13 01:18:52.650941 master-0 kubenswrapper[7110]: I0313 01:18:52.650858 7110 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") on node \"master-0\" DevicePath \"\""
Mar 13 01:18:52.679917 master-0 kubenswrapper[7110]: I0313 01:18:52.665950 7110 scope.go:117] "RemoveContainer" containerID="2d7298385249f13020b6f4aba996517ed9ff1e2913b22038f141e521d9ed6c60"
Mar 13 01:18:52.679917 master-0 kubenswrapper[7110]: E0313 01:18:52.666485 7110 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d7298385249f13020b6f4aba996517ed9ff1e2913b22038f141e521d9ed6c60\": container with ID starting with 2d7298385249f13020b6f4aba996517ed9ff1e2913b22038f141e521d9ed6c60 not found: ID does not exist" containerID="2d7298385249f13020b6f4aba996517ed9ff1e2913b22038f141e521d9ed6c60"
Mar 13 01:18:52.679917 master-0 kubenswrapper[7110]: I0313 01:18:52.666533 7110 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d7298385249f13020b6f4aba996517ed9ff1e2913b22038f141e521d9ed6c60"} err="failed to get container status \"2d7298385249f13020b6f4aba996517ed9ff1e2913b22038f141e521d9ed6c60\": rpc error: code = NotFound desc = could not find container \"2d7298385249f13020b6f4aba996517ed9ff1e2913b22038f141e521d9ed6c60\": container with ID starting with 2d7298385249f13020b6f4aba996517ed9ff1e2913b22038f141e521d9ed6c60 not found: ID does not exist"
Mar 13 01:18:52.679917 master-0 kubenswrapper[7110]: I0313 01:18:52.666571 7110 scope.go:117] "RemoveContainer" containerID="0445b88e121211b2f5fee8d8974ce092dbbbfd137fd485f9bdd95d5e9a6bd172"
Mar 13 01:18:52.679917 master-0 kubenswrapper[7110]: E0313 01:18:52.667133 7110 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0445b88e121211b2f5fee8d8974ce092dbbbfd137fd485f9bdd95d5e9a6bd172\": container with ID starting with 0445b88e121211b2f5fee8d8974ce092dbbbfd137fd485f9bdd95d5e9a6bd172 not found: ID does not exist" containerID="0445b88e121211b2f5fee8d8974ce092dbbbfd137fd485f9bdd95d5e9a6bd172"
Mar 13 01:18:52.679917 master-0 kubenswrapper[7110]: I0313 01:18:52.667172 7110 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0445b88e121211b2f5fee8d8974ce092dbbbfd137fd485f9bdd95d5e9a6bd172"} err="failed to get container status \"0445b88e121211b2f5fee8d8974ce092dbbbfd137fd485f9bdd95d5e9a6bd172\": rpc error: code = NotFound desc = could not find container \"0445b88e121211b2f5fee8d8974ce092dbbbfd137fd485f9bdd95d5e9a6bd172\": container with ID starting with 0445b88e121211b2f5fee8d8974ce092dbbbfd137fd485f9bdd95d5e9a6bd172 not found: ID does not exist"
Mar 13 01:18:52.679917 master-0 kubenswrapper[7110]: I0313 01:18:52.667199 7110 scope.go:117] "RemoveContainer" containerID="7af43aa64c145f54e06604757091c85cae4a917cad1277500eafd26947523186"
Mar 13 01:18:52.679917 master-0 kubenswrapper[7110]: E0313 01:18:52.667482 7110 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7af43aa64c145f54e06604757091c85cae4a917cad1277500eafd26947523186\": container with ID starting with 7af43aa64c145f54e06604757091c85cae4a917cad1277500eafd26947523186 not found: ID does not exist" containerID="7af43aa64c145f54e06604757091c85cae4a917cad1277500eafd26947523186"
Mar 13 01:18:52.679917 master-0 kubenswrapper[7110]: I0313 01:18:52.667515 7110 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7af43aa64c145f54e06604757091c85cae4a917cad1277500eafd26947523186"} err="failed to get container status \"7af43aa64c145f54e06604757091c85cae4a917cad1277500eafd26947523186\": rpc error: code = NotFound desc = could not find container \"7af43aa64c145f54e06604757091c85cae4a917cad1277500eafd26947523186\": container with ID starting with 7af43aa64c145f54e06604757091c85cae4a917cad1277500eafd26947523186 not found: ID does not exist"
Mar 13 01:18:52.917691 master-0 kubenswrapper[7110]: I0313 01:18:52.917250 7110 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f77c8e18b751d90bc0dfe2d4e304050" path="/var/lib/kubelet/pods/5f77c8e18b751d90bc0dfe2d4e304050/volumes"
Mar 13 01:18:52.917691 master-0 kubenswrapper[7110]: I0313 01:18:52.917672 7110 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID=""
Mar 13 01:18:52.980116 master-0 kubenswrapper[7110]: I0313 01:18:52.980064 7110 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-cnrhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 13 01:18:52.980116 master-0 kubenswrapper[7110]: [-]has-synced failed: reason withheld
Mar 13 01:18:52.980116 master-0 kubenswrapper[7110]: [+]process-running ok
Mar 13 01:18:52.980116 master-0 kubenswrapper[7110]: healthz check failed
Mar 13 01:18:52.980425 master-0 kubenswrapper[7110]: I0313 01:18:52.980122 7110 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" podUID="0671fdd0-b358-40f9-ae49-2c5a9004edb3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 01:18:53.992678 master-0 kubenswrapper[7110]: I0313 01:18:53.982762 7110 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-cnrhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 13 01:18:53.992678 master-0 kubenswrapper[7110]: [-]has-synced failed: reason withheld
Mar 13 01:18:53.992678 master-0 kubenswrapper[7110]: [+]process-running ok
Mar 13 01:18:53.992678 master-0 kubenswrapper[7110]: healthz check failed
Mar 13 01:18:53.992678 master-0 kubenswrapper[7110]: I0313 01:18:53.982816 7110 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" podUID="0671fdd0-b358-40f9-ae49-2c5a9004edb3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 01:18:54.980028 master-0 kubenswrapper[7110]: I0313 01:18:54.979984 7110 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-cnrhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 13 01:18:54.980028 master-0 kubenswrapper[7110]: [-]has-synced failed: reason withheld
Mar 13 01:18:54.980028 master-0 kubenswrapper[7110]: [+]process-running ok
Mar 13 01:18:54.980028 master-0 kubenswrapper[7110]: healthz check failed
Mar 13 01:18:54.980297 master-0 kubenswrapper[7110]: I0313 01:18:54.980039 7110 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" podUID="0671fdd0-b358-40f9-ae49-2c5a9004edb3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 01:18:55.980565 master-0 kubenswrapper[7110]: I0313 01:18:55.980498 7110 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-cnrhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 13 01:18:55.980565 master-0 kubenswrapper[7110]: [-]has-synced failed: reason withheld
Mar 13 01:18:55.980565 master-0 kubenswrapper[7110]: [+]process-running ok
Mar 13 01:18:55.980565 master-0 kubenswrapper[7110]: healthz check failed
Mar 13 01:18:55.981236 master-0 kubenswrapper[7110]: I0313 01:18:55.980596 7110 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" podUID="0671fdd0-b358-40f9-ae49-2c5a9004edb3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 01:18:56.981122 master-0 kubenswrapper[7110]: I0313 01:18:56.981034 7110 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-cnrhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 13 01:18:56.981122 master-0 kubenswrapper[7110]: [-]has-synced failed: reason withheld
Mar 13 01:18:56.981122 master-0 kubenswrapper[7110]: [+]process-running ok
Mar 13 01:18:56.981122 master-0 kubenswrapper[7110]: healthz check failed
Mar 13 01:18:56.982068 master-0 kubenswrapper[7110]: I0313 01:18:56.981123 7110 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" podUID="0671fdd0-b358-40f9-ae49-2c5a9004edb3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 01:18:57.980929 master-0 kubenswrapper[7110]: I0313 01:18:57.980850 7110 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-cnrhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 13 01:18:57.980929 master-0 kubenswrapper[7110]: [-]has-synced failed: reason withheld
Mar 13 01:18:57.980929 master-0 kubenswrapper[7110]: [+]process-running ok
Mar 13 01:18:57.980929 master-0 kubenswrapper[7110]: healthz check failed
Mar 13 01:18:57.981939 master-0 kubenswrapper[7110]: I0313 01:18:57.980956 7110 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" podUID="0671fdd0-b358-40f9-ae49-2c5a9004edb3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 01:18:58.720349 master-0 kubenswrapper[7110]: E0313 01:18:58.720301 7110 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="5.814s"
Mar 13 01:18:58.720349 master-0 kubenswrapper[7110]: I0313 01:18:58.720358 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 01:18:58.720588 master-0 kubenswrapper[7110]: I0313 01:18:58.720431 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"e5dcff415884960d9218f44f945518a8b20cab930c49510412686808bb43a09d"}
Mar 13 01:18:58.736986 master-0 kubenswrapper[7110]: I0313 01:18:58.734268 7110 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID=""
Mar 13 01:18:58.740891 master-0 kubenswrapper[7110]: I0313 01:18:58.740702 7110 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"2463d467448269901f4fcbebf08a1f5694b763780f210069e6a1981a0f7ca79a"}
Mar 13 01:18:58.740891 master-0 kubenswrapper[7110]: I0313 01:18:58.740847 7110 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 01:18:58.740891 master-0 kubenswrapper[7110]: I0313 01:18:58.740878 7110 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 01:18:58.741152 master-0 kubenswrapper[7110]: I0313 01:18:58.740931 7110 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 01:18:58.818107 master-0 kubenswrapper[7110]: W0313 01:18:58.818044 7110 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9db888f0_51b6_43cf_8337_69d2d5cc2b0a.slice/crio-2df89949d75d6b332f6e6e3505de7ef88eb15830a7e3e30680b63b8f7fc0d9ff WatchSource:0}: Error finding container 2df89949d75d6b332f6e6e3505de7ef88eb15830a7e3e30680b63b8f7fc0d9ff: Status 404 returned error can't find the container with id 2df89949d75d6b332f6e6e3505de7ef88eb15830a7e3e30680b63b8f7fc0d9ff
Mar 13 01:18:58.919660 master-0 kubenswrapper[7110]: I0313 01:18:58.918624 7110 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID=""
Mar 13 01:18:58.984219 master-0 kubenswrapper[7110]: I0313 01:18:58.984084 7110 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-cnrhm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 13 01:18:58.984219 master-0 kubenswrapper[7110]: [-]has-synced failed: reason withheld
Mar 13 01:18:58.984219 master-0 kubenswrapper[7110]: [+]process-running ok
Mar 13 01:18:58.984219 master-0 kubenswrapper[7110]: healthz check failed
Mar 13 01:18:58.984219 master-0 kubenswrapper[7110]: I0313 01:18:58.984164 7110 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" podUID="0671fdd0-b358-40f9-ae49-2c5a9004edb3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 01:18:59.096898 master-0 systemd[1]: Stopping Kubernetes Kubelet...
Mar 13 01:18:59.115473 master-0 systemd[1]: kubelet.service: Deactivated successfully.
Mar 13 01:18:59.115725 master-0 systemd[1]: Stopped Kubernetes Kubelet.
Mar 13 01:18:59.116565 master-0 systemd[1]: kubelet.service: Consumed 47.869s CPU time.
Mar 13 01:18:59.132546 master-0 systemd[1]: Starting Kubernetes Kubelet...
Mar 13 01:18:59.230910 master-0 kubenswrapper[19170]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 13 01:18:59.230910 master-0 kubenswrapper[19170]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 13 01:18:59.230910 master-0 kubenswrapper[19170]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 13 01:18:59.230910 master-0 kubenswrapper[19170]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 13 01:18:59.230910 master-0 kubenswrapper[19170]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 13 01:18:59.230910 master-0 kubenswrapper[19170]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 13 01:18:59.231542 master-0 kubenswrapper[19170]: I0313 01:18:59.230996 19170 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 13 01:18:59.233998 master-0 kubenswrapper[19170]: W0313 01:18:59.233814 19170 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 13 01:18:59.233998 master-0 kubenswrapper[19170]: W0313 01:18:59.233833 19170 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 13 01:18:59.233998 master-0 kubenswrapper[19170]: W0313 01:18:59.233839 19170 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 13 01:18:59.233998 master-0 kubenswrapper[19170]: W0313 01:18:59.233844 19170 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 13 01:18:59.233998 master-0 kubenswrapper[19170]: W0313 01:18:59.233849 19170 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 13 01:18:59.233998 master-0 kubenswrapper[19170]: W0313 01:18:59.233853 19170 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 13 01:18:59.233998 master-0 kubenswrapper[19170]: W0313 01:18:59.233859 19170 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 13 01:18:59.233998 master-0 kubenswrapper[19170]: W0313 01:18:59.233863 19170 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 13 01:18:59.233998 master-0 kubenswrapper[19170]: W0313 01:18:59.233867 19170 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 13 01:18:59.233998 master-0 kubenswrapper[19170]: W0313 01:18:59.233872 19170 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 13 01:18:59.233998 master-0 kubenswrapper[19170]: W0313 01:18:59.233876 19170 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 13 01:18:59.233998 master-0 kubenswrapper[19170]: W0313 01:18:59.233881 19170 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 13 01:18:59.233998 master-0 kubenswrapper[19170]: W0313 01:18:59.233885 19170 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 13 01:18:59.233998 master-0 kubenswrapper[19170]: W0313 01:18:59.233921 19170 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 13 01:18:59.233998 master-0 kubenswrapper[19170]: W0313 01:18:59.233928 19170 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 13 01:18:59.233998 master-0 kubenswrapper[19170]: W0313 01:18:59.233932 19170 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 13 01:18:59.233998 master-0 kubenswrapper[19170]: W0313 01:18:59.233946 19170 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 13 01:18:59.233998 master-0 kubenswrapper[19170]: W0313 01:18:59.233951 19170 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 13 01:18:59.233998 master-0 kubenswrapper[19170]: W0313 01:18:59.233956 19170 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 13 01:18:59.233998 master-0 kubenswrapper[19170]: W0313 01:18:59.233986 19170 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 13 01:18:59.234574 master-0 kubenswrapper[19170]: W0313 01:18:59.233992 19170 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 13 01:18:59.234574 master-0 kubenswrapper[19170]: W0313 01:18:59.233997 19170 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 13 01:18:59.234574 master-0 kubenswrapper[19170]: W0313 01:18:59.234002 19170 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 13 01:18:59.234574 master-0 kubenswrapper[19170]: W0313 01:18:59.234007 19170 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 13 01:18:59.234574 master-0 kubenswrapper[19170]: W0313 01:18:59.234014 19170 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 13 01:18:59.234574 master-0 kubenswrapper[19170]: W0313 01:18:59.234020 19170 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 13 01:18:59.234574 master-0 kubenswrapper[19170]: W0313 01:18:59.234026 19170 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 13 01:18:59.234574 master-0 kubenswrapper[19170]: W0313 01:18:59.234032 19170 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 13 01:18:59.234574 master-0 kubenswrapper[19170]: W0313 01:18:59.234038 19170 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 13 01:18:59.234574 master-0 kubenswrapper[19170]: W0313 01:18:59.234045 19170 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 13 01:18:59.234574 master-0 kubenswrapper[19170]: W0313 01:18:59.234052 19170 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 13 01:18:59.234574 master-0 kubenswrapper[19170]: W0313 01:18:59.234057 19170 feature_gate.go:330] unrecognized feature gate: Example
Mar 13 01:18:59.234574 master-0 kubenswrapper[19170]: W0313 01:18:59.234062 19170 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 13 01:18:59.234574 master-0 kubenswrapper[19170]: W0313 01:18:59.234067 19170 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 13 01:18:59.234574 master-0 kubenswrapper[19170]: W0313 01:18:59.234072 19170 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 13 01:18:59.234574 master-0 kubenswrapper[19170]: W0313 01:18:59.234077 19170 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 13 01:18:59.234574 master-0 kubenswrapper[19170]: W0313 01:18:59.234082 19170 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 13 01:18:59.234574 master-0 kubenswrapper[19170]: W0313 01:18:59.234087 19170 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 13 01:18:59.234574 master-0 kubenswrapper[19170]: W0313 01:18:59.234091 19170 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 13 01:18:59.234574 master-0 kubenswrapper[19170]: W0313 01:18:59.234096 19170 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 13 01:18:59.235208 master-0 kubenswrapper[19170]: W0313 01:18:59.234101 19170 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 13 01:18:59.235208 master-0 kubenswrapper[19170]: W0313 01:18:59.234107 19170 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 13 01:18:59.235208 master-0 kubenswrapper[19170]: W0313 01:18:59.234111 19170 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 13 01:18:59.235208 master-0 kubenswrapper[19170]: W0313 01:18:59.234116 19170 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 13 01:18:59.235208 master-0 kubenswrapper[19170]: W0313 01:18:59.234121 19170 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 13 01:18:59.235208 master-0 kubenswrapper[19170]: W0313 01:18:59.234126 19170 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 13 01:18:59.235208 master-0 kubenswrapper[19170]: W0313 01:18:59.234133 19170 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 13 01:18:59.235208 master-0 kubenswrapper[19170]: W0313 01:18:59.234139 19170 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 13 01:18:59.235208 master-0 kubenswrapper[19170]: W0313 01:18:59.234153 19170 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 13 01:18:59.235208 master-0 kubenswrapper[19170]: W0313 01:18:59.234159 19170 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 13 01:18:59.235208 master-0 kubenswrapper[19170]: W0313 01:18:59.234164 19170 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 13 01:18:59.235208 master-0 kubenswrapper[19170]: W0313 01:18:59.234169 19170 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 13 01:18:59.235208 master-0 kubenswrapper[19170]: W0313 01:18:59.234175 19170 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 13 01:18:59.235208 master-0 kubenswrapper[19170]: W0313 01:18:59.234181 19170 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 13 01:18:59.235208 master-0 kubenswrapper[19170]: W0313 01:18:59.234239 19170 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 13 01:18:59.235208 master-0 kubenswrapper[19170]: W0313 01:18:59.234247 19170 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 13 01:18:59.235208 master-0 kubenswrapper[19170]: W0313 01:18:59.234253 19170 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 13 01:18:59.235208 master-0 kubenswrapper[19170]: W0313 01:18:59.234257 19170 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 13 01:18:59.235208 master-0 kubenswrapper[19170]: W0313 01:18:59.234265 19170 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 13 01:18:59.236089 master-0 kubenswrapper[19170]: W0313 01:18:59.234270 19170 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 13 01:18:59.236089 master-0 kubenswrapper[19170]: W0313 01:18:59.234275 19170 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 13 01:18:59.236089 master-0 kubenswrapper[19170]: W0313 01:18:59.234279 19170 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 13 01:18:59.236089 master-0 kubenswrapper[19170]: W0313 01:18:59.234285 19170 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 13 01:18:59.236089 master-0 kubenswrapper[19170]: W0313 01:18:59.234291 19170 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 13 01:18:59.236089 master-0 kubenswrapper[19170]: W0313 01:18:59.234296 19170 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 13 01:18:59.236089 master-0 kubenswrapper[19170]: W0313 01:18:59.234300 19170 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 13 01:18:59.236089 master-0 kubenswrapper[19170]: W0313 01:18:59.234305 19170 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 13 01:18:59.236089 master-0 kubenswrapper[19170]: W0313 01:18:59.234309 19170 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 13 01:18:59.236089 master-0 kubenswrapper[19170]: W0313 01:18:59.234314 19170 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 13 01:18:59.236089 master-0 kubenswrapper[19170]: W0313 01:18:59.234318 19170 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 13 01:18:59.236089 master-0 kubenswrapper[19170]: W0313 01:18:59.234323 19170 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 13 01:18:59.236089 master-0 kubenswrapper[19170]: W0313 01:18:59.234328 19170 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 13 01:18:59.236089 master-0 kubenswrapper[19170]: I0313 01:18:59.234428 19170 flags.go:64] FLAG: --address="0.0.0.0" Mar 13 01:18:59.236089 master-0 kubenswrapper[19170]: I0313 01:18:59.234440 19170 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 13 01:18:59.236089 master-0 kubenswrapper[19170]: I0313 01:18:59.234453 19170 flags.go:64] FLAG: --anonymous-auth="true" Mar 13 01:18:59.236089 master-0 kubenswrapper[19170]: I0313 01:18:59.234460 19170 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 13 01:18:59.236089 master-0 kubenswrapper[19170]: I0313 01:18:59.234467 19170 flags.go:64] FLAG: 
--authentication-token-webhook="false" Mar 13 01:18:59.236089 master-0 kubenswrapper[19170]: I0313 01:18:59.234472 19170 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 13 01:18:59.236089 master-0 kubenswrapper[19170]: I0313 01:18:59.234479 19170 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 13 01:18:59.236089 master-0 kubenswrapper[19170]: I0313 01:18:59.234486 19170 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 13 01:18:59.236694 master-0 kubenswrapper[19170]: I0313 01:18:59.234491 19170 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 13 01:18:59.236694 master-0 kubenswrapper[19170]: I0313 01:18:59.234496 19170 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 13 01:18:59.236694 master-0 kubenswrapper[19170]: I0313 01:18:59.234502 19170 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 13 01:18:59.236694 master-0 kubenswrapper[19170]: I0313 01:18:59.234508 19170 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 13 01:18:59.236694 master-0 kubenswrapper[19170]: I0313 01:18:59.234524 19170 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 13 01:18:59.236694 master-0 kubenswrapper[19170]: I0313 01:18:59.234531 19170 flags.go:64] FLAG: --cgroup-root="" Mar 13 01:18:59.236694 master-0 kubenswrapper[19170]: I0313 01:18:59.234536 19170 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 13 01:18:59.236694 master-0 kubenswrapper[19170]: I0313 01:18:59.234541 19170 flags.go:64] FLAG: --client-ca-file="" Mar 13 01:18:59.236694 master-0 kubenswrapper[19170]: I0313 01:18:59.234547 19170 flags.go:64] FLAG: --cloud-config="" Mar 13 01:18:59.236694 master-0 kubenswrapper[19170]: I0313 01:18:59.234552 19170 flags.go:64] FLAG: --cloud-provider="" Mar 13 01:18:59.236694 master-0 kubenswrapper[19170]: I0313 01:18:59.234556 19170 flags.go:64] FLAG: --cluster-dns="[]" Mar 13 01:18:59.236694 master-0 kubenswrapper[19170]: I0313 01:18:59.234570 
19170 flags.go:64] FLAG: --cluster-domain="" Mar 13 01:18:59.236694 master-0 kubenswrapper[19170]: I0313 01:18:59.234575 19170 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 13 01:18:59.236694 master-0 kubenswrapper[19170]: I0313 01:18:59.234581 19170 flags.go:64] FLAG: --config-dir="" Mar 13 01:18:59.236694 master-0 kubenswrapper[19170]: I0313 01:18:59.234586 19170 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 13 01:18:59.236694 master-0 kubenswrapper[19170]: I0313 01:18:59.234592 19170 flags.go:64] FLAG: --container-log-max-files="5" Mar 13 01:18:59.236694 master-0 kubenswrapper[19170]: I0313 01:18:59.234600 19170 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 13 01:18:59.236694 master-0 kubenswrapper[19170]: I0313 01:18:59.234605 19170 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 13 01:18:59.236694 master-0 kubenswrapper[19170]: I0313 01:18:59.234611 19170 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 13 01:18:59.236694 master-0 kubenswrapper[19170]: I0313 01:18:59.234616 19170 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 13 01:18:59.236694 master-0 kubenswrapper[19170]: I0313 01:18:59.234621 19170 flags.go:64] FLAG: --contention-profiling="false" Mar 13 01:18:59.236694 master-0 kubenswrapper[19170]: I0313 01:18:59.234646 19170 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 13 01:18:59.236694 master-0 kubenswrapper[19170]: I0313 01:18:59.234652 19170 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 13 01:18:59.236694 master-0 kubenswrapper[19170]: I0313 01:18:59.234658 19170 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 13 01:18:59.236694 master-0 kubenswrapper[19170]: I0313 01:18:59.234663 19170 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 13 01:18:59.237339 master-0 kubenswrapper[19170]: I0313 01:18:59.234670 19170 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 13 01:18:59.237339 master-0 
kubenswrapper[19170]: I0313 01:18:59.234675 19170 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 13 01:18:59.237339 master-0 kubenswrapper[19170]: I0313 01:18:59.234680 19170 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 13 01:18:59.237339 master-0 kubenswrapper[19170]: I0313 01:18:59.234685 19170 flags.go:64] FLAG: --enable-load-reader="false" Mar 13 01:18:59.237339 master-0 kubenswrapper[19170]: I0313 01:18:59.234690 19170 flags.go:64] FLAG: --enable-server="true" Mar 13 01:18:59.237339 master-0 kubenswrapper[19170]: I0313 01:18:59.234696 19170 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 13 01:18:59.237339 master-0 kubenswrapper[19170]: I0313 01:18:59.234708 19170 flags.go:64] FLAG: --event-burst="100" Mar 13 01:18:59.237339 master-0 kubenswrapper[19170]: I0313 01:18:59.234714 19170 flags.go:64] FLAG: --event-qps="50" Mar 13 01:18:59.237339 master-0 kubenswrapper[19170]: I0313 01:18:59.234719 19170 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 13 01:18:59.237339 master-0 kubenswrapper[19170]: I0313 01:18:59.234724 19170 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 13 01:18:59.237339 master-0 kubenswrapper[19170]: I0313 01:18:59.234729 19170 flags.go:64] FLAG: --eviction-hard="" Mar 13 01:18:59.237339 master-0 kubenswrapper[19170]: I0313 01:18:59.234736 19170 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 13 01:18:59.237339 master-0 kubenswrapper[19170]: I0313 01:18:59.234741 19170 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 13 01:18:59.237339 master-0 kubenswrapper[19170]: I0313 01:18:59.234746 19170 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 13 01:18:59.237339 master-0 kubenswrapper[19170]: I0313 01:18:59.234751 19170 flags.go:64] FLAG: --eviction-soft="" Mar 13 01:18:59.237339 master-0 kubenswrapper[19170]: I0313 01:18:59.234762 19170 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 13 01:18:59.237339 master-0 kubenswrapper[19170]: 
I0313 01:18:59.234768 19170 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 13 01:18:59.237339 master-0 kubenswrapper[19170]: I0313 01:18:59.234773 19170 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 13 01:18:59.237339 master-0 kubenswrapper[19170]: I0313 01:18:59.234778 19170 flags.go:64] FLAG: --experimental-mounter-path="" Mar 13 01:18:59.237339 master-0 kubenswrapper[19170]: I0313 01:18:59.234783 19170 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 13 01:18:59.237339 master-0 kubenswrapper[19170]: I0313 01:18:59.234788 19170 flags.go:64] FLAG: --fail-swap-on="true" Mar 13 01:18:59.237339 master-0 kubenswrapper[19170]: I0313 01:18:59.234792 19170 flags.go:64] FLAG: --feature-gates="" Mar 13 01:18:59.237339 master-0 kubenswrapper[19170]: I0313 01:18:59.234799 19170 flags.go:64] FLAG: --file-check-frequency="20s" Mar 13 01:18:59.237339 master-0 kubenswrapper[19170]: I0313 01:18:59.234804 19170 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 13 01:18:59.237339 master-0 kubenswrapper[19170]: I0313 01:18:59.234810 19170 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 13 01:18:59.237998 master-0 kubenswrapper[19170]: I0313 01:18:59.234815 19170 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 13 01:18:59.237998 master-0 kubenswrapper[19170]: I0313 01:18:59.234821 19170 flags.go:64] FLAG: --healthz-port="10248" Mar 13 01:18:59.237998 master-0 kubenswrapper[19170]: I0313 01:18:59.234826 19170 flags.go:64] FLAG: --help="false" Mar 13 01:18:59.237998 master-0 kubenswrapper[19170]: I0313 01:18:59.234831 19170 flags.go:64] FLAG: --hostname-override="" Mar 13 01:18:59.237998 master-0 kubenswrapper[19170]: I0313 01:18:59.234837 19170 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 13 01:18:59.237998 master-0 kubenswrapper[19170]: I0313 01:18:59.234842 19170 flags.go:64] FLAG: --http-check-frequency="20s" Mar 13 01:18:59.237998 master-0 kubenswrapper[19170]: I0313 01:18:59.234850 19170 flags.go:64] 
FLAG: --image-credential-provider-bin-dir="" Mar 13 01:18:59.237998 master-0 kubenswrapper[19170]: I0313 01:18:59.234854 19170 flags.go:64] FLAG: --image-credential-provider-config="" Mar 13 01:18:59.237998 master-0 kubenswrapper[19170]: I0313 01:18:59.234858 19170 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 13 01:18:59.237998 master-0 kubenswrapper[19170]: I0313 01:18:59.234862 19170 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 13 01:18:59.237998 master-0 kubenswrapper[19170]: I0313 01:18:59.234866 19170 flags.go:64] FLAG: --image-service-endpoint="" Mar 13 01:18:59.237998 master-0 kubenswrapper[19170]: I0313 01:18:59.234870 19170 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 13 01:18:59.237998 master-0 kubenswrapper[19170]: I0313 01:18:59.234874 19170 flags.go:64] FLAG: --kube-api-burst="100" Mar 13 01:18:59.237998 master-0 kubenswrapper[19170]: I0313 01:18:59.234878 19170 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 13 01:18:59.237998 master-0 kubenswrapper[19170]: I0313 01:18:59.234883 19170 flags.go:64] FLAG: --kube-api-qps="50" Mar 13 01:18:59.237998 master-0 kubenswrapper[19170]: I0313 01:18:59.234887 19170 flags.go:64] FLAG: --kube-reserved="" Mar 13 01:18:59.237998 master-0 kubenswrapper[19170]: I0313 01:18:59.234891 19170 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 13 01:18:59.237998 master-0 kubenswrapper[19170]: I0313 01:18:59.234895 19170 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 13 01:18:59.237998 master-0 kubenswrapper[19170]: I0313 01:18:59.234899 19170 flags.go:64] FLAG: --kubelet-cgroups="" Mar 13 01:18:59.237998 master-0 kubenswrapper[19170]: I0313 01:18:59.234903 19170 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 13 01:18:59.237998 master-0 kubenswrapper[19170]: I0313 01:18:59.234907 19170 flags.go:64] FLAG: --lock-file="" Mar 13 01:18:59.237998 master-0 kubenswrapper[19170]: I0313 01:18:59.234911 19170 flags.go:64] FLAG: 
--log-cadvisor-usage="false" Mar 13 01:18:59.237998 master-0 kubenswrapper[19170]: I0313 01:18:59.234915 19170 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 13 01:18:59.237998 master-0 kubenswrapper[19170]: I0313 01:18:59.234919 19170 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 13 01:18:59.237998 master-0 kubenswrapper[19170]: I0313 01:18:59.234925 19170 flags.go:64] FLAG: --log-json-split-stream="false" Mar 13 01:18:59.237998 master-0 kubenswrapper[19170]: I0313 01:18:59.234929 19170 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 13 01:18:59.238720 master-0 kubenswrapper[19170]: I0313 01:18:59.234940 19170 flags.go:64] FLAG: --log-text-split-stream="false" Mar 13 01:18:59.238720 master-0 kubenswrapper[19170]: I0313 01:18:59.234944 19170 flags.go:64] FLAG: --logging-format="text" Mar 13 01:18:59.238720 master-0 kubenswrapper[19170]: I0313 01:18:59.234948 19170 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 13 01:18:59.238720 master-0 kubenswrapper[19170]: I0313 01:18:59.234953 19170 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 13 01:18:59.238720 master-0 kubenswrapper[19170]: I0313 01:18:59.234957 19170 flags.go:64] FLAG: --manifest-url="" Mar 13 01:18:59.238720 master-0 kubenswrapper[19170]: I0313 01:18:59.234961 19170 flags.go:64] FLAG: --manifest-url-header="" Mar 13 01:18:59.238720 master-0 kubenswrapper[19170]: I0313 01:18:59.234966 19170 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 13 01:18:59.238720 master-0 kubenswrapper[19170]: I0313 01:18:59.234971 19170 flags.go:64] FLAG: --max-open-files="1000000" Mar 13 01:18:59.238720 master-0 kubenswrapper[19170]: I0313 01:18:59.234978 19170 flags.go:64] FLAG: --max-pods="110" Mar 13 01:18:59.238720 master-0 kubenswrapper[19170]: I0313 01:18:59.234983 19170 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 13 01:18:59.238720 master-0 kubenswrapper[19170]: I0313 01:18:59.235031 19170 flags.go:64] FLAG: 
--maximum-dead-containers-per-container="1" Mar 13 01:18:59.238720 master-0 kubenswrapper[19170]: I0313 01:18:59.235092 19170 flags.go:64] FLAG: --memory-manager-policy="None" Mar 13 01:18:59.238720 master-0 kubenswrapper[19170]: I0313 01:18:59.235104 19170 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 13 01:18:59.238720 master-0 kubenswrapper[19170]: I0313 01:18:59.235109 19170 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 13 01:18:59.238720 master-0 kubenswrapper[19170]: I0313 01:18:59.235114 19170 flags.go:64] FLAG: --node-ip="192.168.32.10" Mar 13 01:18:59.238720 master-0 kubenswrapper[19170]: I0313 01:18:59.235119 19170 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 13 01:18:59.238720 master-0 kubenswrapper[19170]: I0313 01:18:59.235130 19170 flags.go:64] FLAG: --node-status-max-images="50" Mar 13 01:18:59.238720 master-0 kubenswrapper[19170]: I0313 01:18:59.235135 19170 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 13 01:18:59.238720 master-0 kubenswrapper[19170]: I0313 01:18:59.235147 19170 flags.go:64] FLAG: --oom-score-adj="-999" Mar 13 01:18:59.238720 master-0 kubenswrapper[19170]: I0313 01:18:59.235153 19170 flags.go:64] FLAG: --pod-cidr="" Mar 13 01:18:59.238720 master-0 kubenswrapper[19170]: I0313 01:18:59.235158 19170 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1d605384f31a8085f78a96145c2c3dc51afe22721144196140a2699b7c07ebe3" Mar 13 01:18:59.238720 master-0 kubenswrapper[19170]: I0313 01:18:59.235167 19170 flags.go:64] FLAG: --pod-manifest-path="" Mar 13 01:18:59.238720 master-0 kubenswrapper[19170]: I0313 01:18:59.235171 19170 flags.go:64] FLAG: --pod-max-pids="-1" Mar 13 01:18:59.239261 master-0 kubenswrapper[19170]: I0313 01:18:59.235177 19170 flags.go:64] FLAG: --pods-per-core="0" Mar 13 01:18:59.239261 master-0 kubenswrapper[19170]: I0313 01:18:59.235181 
19170 flags.go:64] FLAG: --port="10250" Mar 13 01:18:59.239261 master-0 kubenswrapper[19170]: I0313 01:18:59.235187 19170 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 13 01:18:59.239261 master-0 kubenswrapper[19170]: I0313 01:18:59.235192 19170 flags.go:64] FLAG: --provider-id="" Mar 13 01:18:59.239261 master-0 kubenswrapper[19170]: I0313 01:18:59.235197 19170 flags.go:64] FLAG: --qos-reserved="" Mar 13 01:18:59.239261 master-0 kubenswrapper[19170]: I0313 01:18:59.235202 19170 flags.go:64] FLAG: --read-only-port="10255" Mar 13 01:18:59.239261 master-0 kubenswrapper[19170]: I0313 01:18:59.235206 19170 flags.go:64] FLAG: --register-node="true" Mar 13 01:18:59.239261 master-0 kubenswrapper[19170]: I0313 01:18:59.235211 19170 flags.go:64] FLAG: --register-schedulable="true" Mar 13 01:18:59.239261 master-0 kubenswrapper[19170]: I0313 01:18:59.235216 19170 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 13 01:18:59.239261 master-0 kubenswrapper[19170]: I0313 01:18:59.235223 19170 flags.go:64] FLAG: --registry-burst="10" Mar 13 01:18:59.239261 master-0 kubenswrapper[19170]: I0313 01:18:59.235227 19170 flags.go:64] FLAG: --registry-qps="5" Mar 13 01:18:59.239261 master-0 kubenswrapper[19170]: I0313 01:18:59.235231 19170 flags.go:64] FLAG: --reserved-cpus="" Mar 13 01:18:59.239261 master-0 kubenswrapper[19170]: I0313 01:18:59.235236 19170 flags.go:64] FLAG: --reserved-memory="" Mar 13 01:18:59.239261 master-0 kubenswrapper[19170]: I0313 01:18:59.235249 19170 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 13 01:18:59.239261 master-0 kubenswrapper[19170]: I0313 01:18:59.235253 19170 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 13 01:18:59.239261 master-0 kubenswrapper[19170]: I0313 01:18:59.235257 19170 flags.go:64] FLAG: --rotate-certificates="false" Mar 13 01:18:59.239261 master-0 kubenswrapper[19170]: I0313 01:18:59.235261 19170 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 13 
01:18:59.239261 master-0 kubenswrapper[19170]: I0313 01:18:59.235266 19170 flags.go:64] FLAG: --runonce="false" Mar 13 01:18:59.239261 master-0 kubenswrapper[19170]: I0313 01:18:59.235269 19170 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 13 01:18:59.239261 master-0 kubenswrapper[19170]: I0313 01:18:59.235273 19170 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 13 01:18:59.239261 master-0 kubenswrapper[19170]: I0313 01:18:59.235278 19170 flags.go:64] FLAG: --seccomp-default="false" Mar 13 01:18:59.239261 master-0 kubenswrapper[19170]: I0313 01:18:59.235282 19170 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 13 01:18:59.239261 master-0 kubenswrapper[19170]: I0313 01:18:59.235297 19170 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 13 01:18:59.239261 master-0 kubenswrapper[19170]: I0313 01:18:59.235301 19170 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 13 01:18:59.239261 master-0 kubenswrapper[19170]: I0313 01:18:59.235306 19170 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 13 01:18:59.239261 master-0 kubenswrapper[19170]: I0313 01:18:59.235310 19170 flags.go:64] FLAG: --storage-driver-password="root" Mar 13 01:18:59.239912 master-0 kubenswrapper[19170]: I0313 01:18:59.235314 19170 flags.go:64] FLAG: --storage-driver-secure="false" Mar 13 01:18:59.239912 master-0 kubenswrapper[19170]: I0313 01:18:59.235318 19170 flags.go:64] FLAG: --storage-driver-table="stats" Mar 13 01:18:59.239912 master-0 kubenswrapper[19170]: I0313 01:18:59.235322 19170 flags.go:64] FLAG: --storage-driver-user="root" Mar 13 01:18:59.239912 master-0 kubenswrapper[19170]: I0313 01:18:59.235326 19170 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 13 01:18:59.239912 master-0 kubenswrapper[19170]: I0313 01:18:59.235330 19170 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 13 01:18:59.239912 master-0 kubenswrapper[19170]: I0313 01:18:59.235334 19170 flags.go:64] FLAG: --system-cgroups="" Mar 13 
01:18:59.239912 master-0 kubenswrapper[19170]: I0313 01:18:59.235338 19170 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Mar 13 01:18:59.239912 master-0 kubenswrapper[19170]: I0313 01:18:59.235345 19170 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 13 01:18:59.239912 master-0 kubenswrapper[19170]: I0313 01:18:59.235349 19170 flags.go:64] FLAG: --tls-cert-file="" Mar 13 01:18:59.239912 master-0 kubenswrapper[19170]: I0313 01:18:59.235353 19170 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 13 01:18:59.239912 master-0 kubenswrapper[19170]: I0313 01:18:59.235557 19170 flags.go:64] FLAG: --tls-min-version="" Mar 13 01:18:59.239912 master-0 kubenswrapper[19170]: I0313 01:18:59.235562 19170 flags.go:64] FLAG: --tls-private-key-file="" Mar 13 01:18:59.239912 master-0 kubenswrapper[19170]: I0313 01:18:59.235568 19170 flags.go:64] FLAG: --topology-manager-policy="none" Mar 13 01:18:59.239912 master-0 kubenswrapper[19170]: I0313 01:18:59.235572 19170 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 13 01:18:59.239912 master-0 kubenswrapper[19170]: I0313 01:18:59.235576 19170 flags.go:64] FLAG: --topology-manager-scope="container" Mar 13 01:18:59.239912 master-0 kubenswrapper[19170]: I0313 01:18:59.235580 19170 flags.go:64] FLAG: --v="2" Mar 13 01:18:59.239912 master-0 kubenswrapper[19170]: I0313 01:18:59.235591 19170 flags.go:64] FLAG: --version="false" Mar 13 01:18:59.239912 master-0 kubenswrapper[19170]: I0313 01:18:59.235597 19170 flags.go:64] FLAG: --vmodule="" Mar 13 01:18:59.239912 master-0 kubenswrapper[19170]: I0313 01:18:59.235602 19170 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 13 01:18:59.239912 master-0 kubenswrapper[19170]: I0313 01:18:59.235607 19170 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 13 01:18:59.239912 master-0 kubenswrapper[19170]: W0313 01:18:59.235750 19170 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 13 01:18:59.239912 
master-0 kubenswrapper[19170]: W0313 01:18:59.235757 19170 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 13 01:18:59.239912 master-0 kubenswrapper[19170]: W0313 01:18:59.235762 19170 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 13 01:18:59.239912 master-0 kubenswrapper[19170]: W0313 01:18:59.235773 19170 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 13 01:18:59.240467 master-0 kubenswrapper[19170]: W0313 01:18:59.235778 19170 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 13 01:18:59.240467 master-0 kubenswrapper[19170]: W0313 01:18:59.235782 19170 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 13 01:18:59.240467 master-0 kubenswrapper[19170]: W0313 01:18:59.235786 19170 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 13 01:18:59.240467 master-0 kubenswrapper[19170]: W0313 01:18:59.235790 19170 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 13 01:18:59.240467 master-0 kubenswrapper[19170]: W0313 01:18:59.235796 19170 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 13 01:18:59.240467 master-0 kubenswrapper[19170]: W0313 01:18:59.235799 19170 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 13 01:18:59.240467 master-0 kubenswrapper[19170]: W0313 01:18:59.235803 19170 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 13 01:18:59.240467 master-0 kubenswrapper[19170]: W0313 01:18:59.235808 19170 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 13 01:18:59.240467 master-0 kubenswrapper[19170]: W0313 01:18:59.235812 19170 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 13 01:18:59.240467 master-0 kubenswrapper[19170]: W0313 01:18:59.235817 19170 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 13 01:18:59.240467 master-0 kubenswrapper[19170]: W0313 01:18:59.235821 19170 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 13 01:18:59.240467 master-0 kubenswrapper[19170]: W0313 01:18:59.235825 19170 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 13 01:18:59.240467 master-0 kubenswrapper[19170]: W0313 01:18:59.235829 19170 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 13 01:18:59.240467 master-0 kubenswrapper[19170]: W0313 01:18:59.235833 19170 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 13 01:18:59.240467 master-0 kubenswrapper[19170]: W0313 01:18:59.235836 19170 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 13 01:18:59.240467 master-0 kubenswrapper[19170]: W0313 01:18:59.235840 19170 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 13 01:18:59.240467 master-0 kubenswrapper[19170]: W0313 01:18:59.235843 19170 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 13 01:18:59.240467 master-0 kubenswrapper[19170]: W0313 01:18:59.235847 19170 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 13 01:18:59.240467 master-0 kubenswrapper[19170]: W0313 01:18:59.235850 19170 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 13 01:18:59.241009 master-0 kubenswrapper[19170]: W0313 01:18:59.235854 19170 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 13 01:18:59.241009 master-0 kubenswrapper[19170]: W0313 01:18:59.235857 19170 feature_gate.go:330] unrecognized feature gate: 
AdditionalRoutingCapabilities Mar 13 01:18:59.241009 master-0 kubenswrapper[19170]: W0313 01:18:59.235860 19170 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 13 01:18:59.241009 master-0 kubenswrapper[19170]: W0313 01:18:59.235864 19170 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 13 01:18:59.241009 master-0 kubenswrapper[19170]: W0313 01:18:59.235867 19170 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 13 01:18:59.241009 master-0 kubenswrapper[19170]: W0313 01:18:59.235871 19170 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 13 01:18:59.241009 master-0 kubenswrapper[19170]: W0313 01:18:59.235874 19170 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 13 01:18:59.241009 master-0 kubenswrapper[19170]: W0313 01:18:59.235878 19170 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 13 01:18:59.241009 master-0 kubenswrapper[19170]: W0313 01:18:59.235881 19170 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 13 01:18:59.241009 master-0 kubenswrapper[19170]: W0313 01:18:59.235885 19170 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 13 01:18:59.241009 master-0 kubenswrapper[19170]: W0313 01:18:59.235888 19170 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 13 01:18:59.241009 master-0 kubenswrapper[19170]: W0313 01:18:59.235892 19170 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 13 01:18:59.241009 master-0 kubenswrapper[19170]: W0313 01:18:59.235895 19170 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 13 01:18:59.241009 master-0 kubenswrapper[19170]: W0313 01:18:59.235899 19170 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 13 01:18:59.241009 master-0 kubenswrapper[19170]: W0313 01:18:59.235902 19170 feature_gate.go:330] unrecognized feature 
gate: NetworkSegmentation Mar 13 01:18:59.241009 master-0 kubenswrapper[19170]: W0313 01:18:59.235906 19170 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 13 01:18:59.241009 master-0 kubenswrapper[19170]: W0313 01:18:59.235915 19170 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 13 01:18:59.241009 master-0 kubenswrapper[19170]: W0313 01:18:59.235923 19170 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 13 01:18:59.241009 master-0 kubenswrapper[19170]: W0313 01:18:59.235926 19170 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 13 01:18:59.241009 master-0 kubenswrapper[19170]: W0313 01:18:59.235930 19170 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 13 01:18:59.241528 master-0 kubenswrapper[19170]: W0313 01:18:59.235933 19170 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 13 01:18:59.241528 master-0 kubenswrapper[19170]: W0313 01:18:59.235937 19170 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 13 01:18:59.241528 master-0 kubenswrapper[19170]: W0313 01:18:59.235940 19170 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 13 01:18:59.241528 master-0 kubenswrapper[19170]: W0313 01:18:59.235943 19170 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 13 01:18:59.241528 master-0 kubenswrapper[19170]: W0313 01:18:59.235947 19170 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 13 01:18:59.241528 master-0 kubenswrapper[19170]: W0313 01:18:59.235950 19170 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 13 01:18:59.241528 master-0 kubenswrapper[19170]: W0313 01:18:59.235953 19170 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 13 01:18:59.241528 master-0 kubenswrapper[19170]: W0313 01:18:59.235957 19170 feature_gate.go:330] unrecognized feature gate: 
ClusterMonitoringConfig Mar 13 01:18:59.241528 master-0 kubenswrapper[19170]: W0313 01:18:59.235960 19170 feature_gate.go:330] unrecognized feature gate: Example Mar 13 01:18:59.241528 master-0 kubenswrapper[19170]: W0313 01:18:59.235964 19170 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 13 01:18:59.241528 master-0 kubenswrapper[19170]: W0313 01:18:59.235968 19170 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 13 01:18:59.241528 master-0 kubenswrapper[19170]: W0313 01:18:59.235972 19170 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 13 01:18:59.241528 master-0 kubenswrapper[19170]: W0313 01:18:59.235976 19170 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 13 01:18:59.241528 master-0 kubenswrapper[19170]: W0313 01:18:59.235980 19170 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 13 01:18:59.241528 master-0 kubenswrapper[19170]: W0313 01:18:59.235983 19170 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 13 01:18:59.241528 master-0 kubenswrapper[19170]: W0313 01:18:59.235987 19170 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 13 01:18:59.241528 master-0 kubenswrapper[19170]: W0313 01:18:59.235990 19170 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 13 01:18:59.241528 master-0 kubenswrapper[19170]: W0313 01:18:59.235993 19170 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 13 01:18:59.241528 master-0 kubenswrapper[19170]: W0313 01:18:59.236000 19170 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 13 01:18:59.241528 master-0 kubenswrapper[19170]: W0313 01:18:59.236003 19170 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 13 01:18:59.242105 master-0 kubenswrapper[19170]: W0313 01:18:59.236006 19170 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 13 
01:18:59.242105 master-0 kubenswrapper[19170]: W0313 01:18:59.236010 19170 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 13 01:18:59.242105 master-0 kubenswrapper[19170]: W0313 01:18:59.236013 19170 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 13 01:18:59.242105 master-0 kubenswrapper[19170]: W0313 01:18:59.236017 19170 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 13 01:18:59.242105 master-0 kubenswrapper[19170]: W0313 01:18:59.236020 19170 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 13 01:18:59.242105 master-0 kubenswrapper[19170]: W0313 01:18:59.236024 19170 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 13 01:18:59.242105 master-0 kubenswrapper[19170]: W0313 01:18:59.236027 19170 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 13 01:18:59.242105 master-0 kubenswrapper[19170]: W0313 01:18:59.236031 19170 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 13 01:18:59.242105 master-0 kubenswrapper[19170]: W0313 01:18:59.236034 19170 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 13 01:18:59.242105 master-0 kubenswrapper[19170]: I0313 01:18:59.236049 19170 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 13 01:18:59.245973 master-0 kubenswrapper[19170]: I0313 01:18:59.245922 19170 server.go:491] "Kubelet version" kubeletVersion="v1.31.14" Mar 13 
01:18:59.245973 master-0 kubenswrapper[19170]: I0313 01:18:59.245960 19170 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 13 01:18:59.246104 master-0 kubenswrapper[19170]: W0313 01:18:59.246040 19170 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 13 01:18:59.246104 master-0 kubenswrapper[19170]: W0313 01:18:59.246050 19170 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 13 01:18:59.246104 master-0 kubenswrapper[19170]: W0313 01:18:59.246056 19170 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 13 01:18:59.246104 master-0 kubenswrapper[19170]: W0313 01:18:59.246063 19170 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 13 01:18:59.246104 master-0 kubenswrapper[19170]: W0313 01:18:59.246068 19170 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 13 01:18:59.246104 master-0 kubenswrapper[19170]: W0313 01:18:59.246072 19170 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 13 01:18:59.246104 master-0 kubenswrapper[19170]: W0313 01:18:59.246075 19170 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 13 01:18:59.246104 master-0 kubenswrapper[19170]: W0313 01:18:59.246079 19170 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 13 01:18:59.246104 master-0 kubenswrapper[19170]: W0313 01:18:59.246082 19170 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 13 01:18:59.246104 master-0 kubenswrapper[19170]: W0313 01:18:59.246086 19170 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 13 01:18:59.246104 master-0 kubenswrapper[19170]: W0313 01:18:59.246089 19170 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 13 01:18:59.246104 master-0 kubenswrapper[19170]: W0313 01:18:59.246093 19170 feature_gate.go:330] unrecognized feature 
gate: NetworkLiveMigration Mar 13 01:18:59.246104 master-0 kubenswrapper[19170]: W0313 01:18:59.246096 19170 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 13 01:18:59.246104 master-0 kubenswrapper[19170]: W0313 01:18:59.246100 19170 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 13 01:18:59.246104 master-0 kubenswrapper[19170]: W0313 01:18:59.246104 19170 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 13 01:18:59.246104 master-0 kubenswrapper[19170]: W0313 01:18:59.246108 19170 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 13 01:18:59.246104 master-0 kubenswrapper[19170]: W0313 01:18:59.246113 19170 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 13 01:18:59.246104 master-0 kubenswrapper[19170]: W0313 01:18:59.246117 19170 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 13 01:18:59.246104 master-0 kubenswrapper[19170]: W0313 01:18:59.246122 19170 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 13 01:18:59.246104 master-0 kubenswrapper[19170]: W0313 01:18:59.246127 19170 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 13 01:18:59.246765 master-0 kubenswrapper[19170]: W0313 01:18:59.246131 19170 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 13 01:18:59.246765 master-0 kubenswrapper[19170]: W0313 01:18:59.246136 19170 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 13 01:18:59.246765 master-0 kubenswrapper[19170]: W0313 01:18:59.246140 19170 feature_gate.go:330] unrecognized feature gate: Example Mar 13 01:18:59.246765 master-0 kubenswrapper[19170]: W0313 01:18:59.246144 19170 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 13 01:18:59.246765 master-0 kubenswrapper[19170]: W0313 01:18:59.246148 19170 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 13 01:18:59.246765 master-0 
kubenswrapper[19170]: W0313 01:18:59.246153 19170 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 13 01:18:59.246765 master-0 kubenswrapper[19170]: W0313 01:18:59.246157 19170 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 13 01:18:59.246765 master-0 kubenswrapper[19170]: W0313 01:18:59.246161 19170 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 13 01:18:59.246765 master-0 kubenswrapper[19170]: W0313 01:18:59.246164 19170 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 13 01:18:59.246765 master-0 kubenswrapper[19170]: W0313 01:18:59.246168 19170 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 13 01:18:59.246765 master-0 kubenswrapper[19170]: W0313 01:18:59.246172 19170 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 13 01:18:59.246765 master-0 kubenswrapper[19170]: W0313 01:18:59.246176 19170 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 13 01:18:59.246765 master-0 kubenswrapper[19170]: W0313 01:18:59.246179 19170 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 13 01:18:59.246765 master-0 kubenswrapper[19170]: W0313 01:18:59.246183 19170 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 13 01:18:59.246765 master-0 kubenswrapper[19170]: W0313 01:18:59.246188 19170 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 13 01:18:59.246765 master-0 kubenswrapper[19170]: W0313 01:18:59.246194 19170 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 13 01:18:59.246765 master-0 kubenswrapper[19170]: W0313 01:18:59.246198 19170 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 13 01:18:59.246765 master-0 kubenswrapper[19170]: W0313 01:18:59.246203 19170 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 13 01:18:59.246765 master-0 kubenswrapper[19170]: W0313 01:18:59.246206 19170 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 13 01:18:59.246765 master-0 kubenswrapper[19170]: W0313 01:18:59.246212 19170 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 13 01:18:59.247226 master-0 kubenswrapper[19170]: W0313 01:18:59.246216 19170 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 13 01:18:59.247226 master-0 kubenswrapper[19170]: W0313 01:18:59.246221 19170 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 13 01:18:59.247226 master-0 kubenswrapper[19170]: W0313 01:18:59.246225 19170 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 13 01:18:59.247226 master-0 kubenswrapper[19170]: W0313 01:18:59.246229 19170 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 13 01:18:59.247226 master-0 kubenswrapper[19170]: W0313 01:18:59.246234 19170 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 13 01:18:59.247226 master-0 kubenswrapper[19170]: W0313 01:18:59.246238 19170 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 13 01:18:59.247226 master-0 kubenswrapper[19170]: W0313 01:18:59.246242 19170 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 13 01:18:59.247226 master-0 kubenswrapper[19170]: W0313 01:18:59.246246 19170 feature_gate.go:330] unrecognized feature gate: 
SigstoreImageVerification Mar 13 01:18:59.247226 master-0 kubenswrapper[19170]: W0313 01:18:59.246249 19170 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 13 01:18:59.247226 master-0 kubenswrapper[19170]: W0313 01:18:59.246253 19170 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 13 01:18:59.247226 master-0 kubenswrapper[19170]: W0313 01:18:59.246257 19170 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 13 01:18:59.247226 master-0 kubenswrapper[19170]: W0313 01:18:59.246261 19170 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 13 01:18:59.247226 master-0 kubenswrapper[19170]: W0313 01:18:59.246265 19170 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 13 01:18:59.247226 master-0 kubenswrapper[19170]: W0313 01:18:59.246269 19170 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 13 01:18:59.247226 master-0 kubenswrapper[19170]: W0313 01:18:59.246272 19170 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 13 01:18:59.247226 master-0 kubenswrapper[19170]: W0313 01:18:59.246276 19170 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 13 01:18:59.247226 master-0 kubenswrapper[19170]: W0313 01:18:59.246280 19170 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 13 01:18:59.247226 master-0 kubenswrapper[19170]: W0313 01:18:59.246283 19170 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 13 01:18:59.247226 master-0 kubenswrapper[19170]: W0313 01:18:59.246287 19170 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 13 01:18:59.247226 master-0 kubenswrapper[19170]: W0313 01:18:59.246291 19170 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 13 01:18:59.247771 master-0 kubenswrapper[19170]: W0313 01:18:59.246294 19170 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 13 
01:18:59.247771 master-0 kubenswrapper[19170]: W0313 01:18:59.246299 19170 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 13 01:18:59.247771 master-0 kubenswrapper[19170]: W0313 01:18:59.246303 19170 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 13 01:18:59.247771 master-0 kubenswrapper[19170]: W0313 01:18:59.246307 19170 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 13 01:18:59.247771 master-0 kubenswrapper[19170]: W0313 01:18:59.246311 19170 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 13 01:18:59.247771 master-0 kubenswrapper[19170]: W0313 01:18:59.246314 19170 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 13 01:18:59.247771 master-0 kubenswrapper[19170]: W0313 01:18:59.246318 19170 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 13 01:18:59.247771 master-0 kubenswrapper[19170]: W0313 01:18:59.246322 19170 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 13 01:18:59.247771 master-0 kubenswrapper[19170]: W0313 01:18:59.246326 19170 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 13 01:18:59.247771 master-0 kubenswrapper[19170]: W0313 01:18:59.246329 19170 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 13 01:18:59.247771 master-0 kubenswrapper[19170]: W0313 01:18:59.246333 19170 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 13 01:18:59.247771 master-0 kubenswrapper[19170]: W0313 01:18:59.246339 19170 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 13 01:18:59.247771 master-0 kubenswrapper[19170]: I0313 01:18:59.246347 19170 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 13 01:18:59.247771 master-0 kubenswrapper[19170]: W0313 01:18:59.246501 19170 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 13 01:18:59.248119 master-0 kubenswrapper[19170]: W0313 01:18:59.246510 19170 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 13 01:18:59.248119 master-0 kubenswrapper[19170]: W0313 01:18:59.246516 19170 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 13 01:18:59.248119 master-0 kubenswrapper[19170]: W0313 01:18:59.246520 19170 feature_gate.go:330] unrecognized feature gate: Example Mar 13 01:18:59.248119 master-0 kubenswrapper[19170]: W0313 01:18:59.246525 19170 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 13 01:18:59.248119 master-0 kubenswrapper[19170]: W0313 01:18:59.246529 19170 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 13 01:18:59.248119 master-0 kubenswrapper[19170]: W0313 01:18:59.246533 19170 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 13 01:18:59.248119 master-0 kubenswrapper[19170]: W0313 01:18:59.246537 19170 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 13 01:18:59.248119 master-0 kubenswrapper[19170]: W0313 01:18:59.246542 19170 
feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 13 01:18:59.248119 master-0 kubenswrapper[19170]: W0313 01:18:59.246546 19170 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 13 01:18:59.248119 master-0 kubenswrapper[19170]: W0313 01:18:59.246550 19170 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 13 01:18:59.248119 master-0 kubenswrapper[19170]: W0313 01:18:59.246554 19170 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 13 01:18:59.248119 master-0 kubenswrapper[19170]: W0313 01:18:59.246558 19170 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 13 01:18:59.248119 master-0 kubenswrapper[19170]: W0313 01:18:59.246562 19170 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 13 01:18:59.248119 master-0 kubenswrapper[19170]: W0313 01:18:59.246567 19170 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 13 01:18:59.248119 master-0 kubenswrapper[19170]: W0313 01:18:59.246571 19170 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 13 01:18:59.248119 master-0 kubenswrapper[19170]: W0313 01:18:59.246575 19170 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 13 01:18:59.248119 master-0 kubenswrapper[19170]: W0313 01:18:59.246579 19170 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 13 01:18:59.248119 master-0 kubenswrapper[19170]: W0313 01:18:59.246583 19170 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 13 01:18:59.248119 master-0 kubenswrapper[19170]: W0313 01:18:59.246587 19170 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 13 01:18:59.248615 master-0 kubenswrapper[19170]: W0313 01:18:59.246591 19170 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 13 01:18:59.248615 master-0 kubenswrapper[19170]: W0313 01:18:59.246595 19170 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 13 01:18:59.248615 master-0 kubenswrapper[19170]: W0313 01:18:59.246599 19170 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 13 01:18:59.248615 master-0 kubenswrapper[19170]: W0313 01:18:59.246604 19170 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 13 01:18:59.248615 master-0 kubenswrapper[19170]: W0313 01:18:59.246608 19170 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 13 01:18:59.248615 master-0 kubenswrapper[19170]: W0313 01:18:59.246611 19170 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 13 01:18:59.248615 master-0 kubenswrapper[19170]: W0313 01:18:59.246615 19170 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 13 01:18:59.248615 master-0 kubenswrapper[19170]: W0313 01:18:59.246619 19170 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 13 
01:18:59.248615 master-0 kubenswrapper[19170]: W0313 01:18:59.246623 19170 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 13 01:18:59.248615 master-0 kubenswrapper[19170]: W0313 01:18:59.246672 19170 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 13 01:18:59.248615 master-0 kubenswrapper[19170]: W0313 01:18:59.246677 19170 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 13 01:18:59.248615 master-0 kubenswrapper[19170]: W0313 01:18:59.246682 19170 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 13 01:18:59.248615 master-0 kubenswrapper[19170]: W0313 01:18:59.246687 19170 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 13 01:18:59.248615 master-0 kubenswrapper[19170]: W0313 01:18:59.246692 19170 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 13 01:18:59.248615 master-0 kubenswrapper[19170]: W0313 01:18:59.246697 19170 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 13 01:18:59.248615 master-0 kubenswrapper[19170]: W0313 01:18:59.246701 19170 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 13 01:18:59.248615 master-0 kubenswrapper[19170]: W0313 01:18:59.246705 19170 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 13 01:18:59.248615 master-0 kubenswrapper[19170]: W0313 01:18:59.246709 19170 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 13 01:18:59.248615 master-0 kubenswrapper[19170]: W0313 01:18:59.246713 19170 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 13 01:18:59.248615 master-0 kubenswrapper[19170]: W0313 01:18:59.246717 19170 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 13 01:18:59.249388 master-0 kubenswrapper[19170]: W0313 01:18:59.246721 19170 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 13 
01:18:59.249388 master-0 kubenswrapper[19170]: W0313 01:18:59.246724 19170 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 13 01:18:59.249388 master-0 kubenswrapper[19170]: W0313 01:18:59.246729 19170 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 13 01:18:59.249388 master-0 kubenswrapper[19170]: W0313 01:18:59.246733 19170 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 13 01:18:59.249388 master-0 kubenswrapper[19170]: W0313 01:18:59.246737 19170 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 13 01:18:59.249388 master-0 kubenswrapper[19170]: W0313 01:18:59.246741 19170 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 13 01:18:59.249388 master-0 kubenswrapper[19170]: W0313 01:18:59.246745 19170 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 13 01:18:59.249388 master-0 kubenswrapper[19170]: W0313 01:18:59.246749 19170 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 13 01:18:59.249388 master-0 kubenswrapper[19170]: W0313 01:18:59.246752 19170 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 13 01:18:59.249388 master-0 kubenswrapper[19170]: W0313 01:18:59.246756 19170 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 13 01:18:59.249388 master-0 kubenswrapper[19170]: W0313 01:18:59.246760 19170 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 13 01:18:59.249388 master-0 kubenswrapper[19170]: W0313 01:18:59.246764 19170 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 13 01:18:59.249388 master-0 kubenswrapper[19170]: W0313 01:18:59.246768 19170 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 13 01:18:59.249388 master-0 kubenswrapper[19170]: W0313 01:18:59.246772 19170 feature_gate.go:330] unrecognized 
feature gate: CSIDriverSharedResource Mar 13 01:18:59.249388 master-0 kubenswrapper[19170]: W0313 01:18:59.246776 19170 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 13 01:18:59.249388 master-0 kubenswrapper[19170]: W0313 01:18:59.246781 19170 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 13 01:18:59.249388 master-0 kubenswrapper[19170]: W0313 01:18:59.246786 19170 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 13 01:18:59.249388 master-0 kubenswrapper[19170]: W0313 01:18:59.246790 19170 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 13 01:18:59.249388 master-0 kubenswrapper[19170]: W0313 01:18:59.246794 19170 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 13 01:18:59.250281 master-0 kubenswrapper[19170]: W0313 01:18:59.246798 19170 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 13 01:18:59.250281 master-0 kubenswrapper[19170]: W0313 01:18:59.246802 19170 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 13 01:18:59.250281 master-0 kubenswrapper[19170]: W0313 01:18:59.246806 19170 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 13 01:18:59.250281 master-0 kubenswrapper[19170]: W0313 01:18:59.246810 19170 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 13 01:18:59.250281 master-0 kubenswrapper[19170]: W0313 01:18:59.246813 19170 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 13 01:18:59.250281 master-0 kubenswrapper[19170]: W0313 01:18:59.246817 19170 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 13 01:18:59.250281 master-0 kubenswrapper[19170]: W0313 01:18:59.246822 19170 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 13 01:18:59.250281 master-0 kubenswrapper[19170]: W0313 01:18:59.246827 19170 feature_gate.go:330] 
unrecognized feature gate: BootcNodeManagement Mar 13 01:18:59.250281 master-0 kubenswrapper[19170]: W0313 01:18:59.246831 19170 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 13 01:18:59.250281 master-0 kubenswrapper[19170]: W0313 01:18:59.246834 19170 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 13 01:18:59.250281 master-0 kubenswrapper[19170]: W0313 01:18:59.246839 19170 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 13 01:18:59.250281 master-0 kubenswrapper[19170]: W0313 01:18:59.246843 19170 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 13 01:18:59.250281 master-0 kubenswrapper[19170]: W0313 01:18:59.246847 19170 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 13 01:18:59.250281 master-0 kubenswrapper[19170]: I0313 01:18:59.246853 19170 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 13 01:18:59.250281 master-0 kubenswrapper[19170]: I0313 01:18:59.247050 19170 server.go:940] "Client rotation is on, will bootstrap in background" Mar 13 01:18:59.251085 master-0 kubenswrapper[19170]: I0313 01:18:59.248687 19170 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Mar 13 01:18:59.251085 master-0 kubenswrapper[19170]: I0313 01:18:59.248758 19170 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Mar 13 01:18:59.251085 master-0 kubenswrapper[19170]: I0313 01:18:59.248977 19170 server.go:997] "Starting client certificate rotation" Mar 13 01:18:59.251085 master-0 kubenswrapper[19170]: I0313 01:18:59.248987 19170 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 13 01:18:59.251085 master-0 kubenswrapper[19170]: I0313 01:18:59.249571 19170 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 13 01:18:59.251085 master-0 kubenswrapper[19170]: I0313 01:18:59.249447 19170 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-03-14 01:03:12 +0000 UTC, rotation deadline is 2026-03-13 22:03:24.833169417 +0000 UTC Mar 13 01:18:59.251085 master-0 kubenswrapper[19170]: I0313 01:18:59.249684 19170 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 20h44m25.583489996s for next certificate rotation Mar 13 01:18:59.251085 master-0 kubenswrapper[19170]: I0313 01:18:59.250926 19170 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 13 01:18:59.253371 master-0 kubenswrapper[19170]: I0313 01:18:59.253351 19170 log.go:25] "Validated CRI v1 runtime API" Mar 13 01:18:59.263689 master-0 kubenswrapper[19170]: I0313 01:18:59.263484 19170 log.go:25] "Validated CRI v1 image API" Mar 13 01:18:59.269042 master-0 kubenswrapper[19170]: I0313 01:18:59.268998 19170 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 13 01:18:59.281103 master-0 kubenswrapper[19170]: I0313 01:18:59.280123 19170 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4 93b4eca9-1357-4499-ad5f-ae90bf0d6f4a:/dev/vda3] Mar 13 01:18:59.283105 master-0 kubenswrapper[19170]: I0313 01:18:59.280178 19170 fs.go:136] Filesystem partitions: 
map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0072057986d1a9c35e19db3f7ab2650e875b4c3fecae35f046b875511fe06154/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0072057986d1a9c35e19db3f7ab2650e875b4c3fecae35f046b875511fe06154/userdata/shm major:0 minor:60 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/04292570efd4dfe3d1e939e8bffe6e36601999e7f873882bc75b098b25f400d8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/04292570efd4dfe3d1e939e8bffe6e36601999e7f873882bc75b098b25f400d8/userdata/shm major:0 minor:708 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/16477c5f389a1fdfcf2af6bfe8b7efe63c0f62df56e3f2ed990e9acc1a597b7d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/16477c5f389a1fdfcf2af6bfe8b7efe63c0f62df56e3f2ed990e9acc1a597b7d/userdata/shm major:0 minor:680 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/19ce38027ae9e3d0076b8c83191fabde1e4e81b393c760835578ba3bc36b41b2/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/19ce38027ae9e3d0076b8c83191fabde1e4e81b393c760835578ba3bc36b41b2/userdata/shm major:0 minor:1046 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/24c5ca2ad81d656ed30391cef58568917e68e001311f4004a9e7bd98e34738e8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/24c5ca2ad81d656ed30391cef58568917e68e001311f4004a9e7bd98e34738e8/userdata/shm major:0 minor:1116 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/26154ef4eac655efbc62eb7de132a97eb19bcf76904f56cfb67a890cba3eae81/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/26154ef4eac655efbc62eb7de132a97eb19bcf76904f56cfb67a890cba3eae81/userdata/shm major:0 minor:266 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2b8d1e7059727645025fa018d52de63bdf9a901809577c6a277333b99385dad9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2b8d1e7059727645025fa018d52de63bdf9a901809577c6a277333b99385dad9/userdata/shm major:0 minor:105 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2caee3a1af6093e4a6a8dc6eb703af3a181ab48dbf5acd65ef8e61a08e467534/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2caee3a1af6093e4a6a8dc6eb703af3a181ab48dbf5acd65ef8e61a08e467534/userdata/shm major:0 minor:254 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2df89949d75d6b332f6e6e3505de7ef88eb15830a7e3e30680b63b8f7fc0d9ff/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2df89949d75d6b332f6e6e3505de7ef88eb15830a7e3e30680b63b8f7fc0d9ff/userdata/shm major:0 minor:97 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2ea2257b817f7a593cf8a5bc18fd54c7de892a301e19617876be4cc31d01237b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2ea2257b817f7a593cf8a5bc18fd54c7de892a301e19617876be4cc31d01237b/userdata/shm major:0 minor:331 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/331ce433a4bf4a47427394a7b370083a4bac521b9ca9a23033c6c4de736ca40b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/331ce433a4bf4a47427394a7b370083a4bac521b9ca9a23033c6c4de736ca40b/userdata/shm major:0 minor:46 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/339cc6449a0020231eef0158a934d4ae19f59a10f226d56a246c3dc49a8eebbe/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/339cc6449a0020231eef0158a934d4ae19f59a10f226d56a246c3dc49a8eebbe/userdata/shm major:0 minor:705 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/34d6500d42674d3ac28ad1da03d31ad6fc07a588196014c4a73a86965dd9deb9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/34d6500d42674d3ac28ad1da03d31ad6fc07a588196014c4a73a86965dd9deb9/userdata/shm major:0 minor:798 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/35121dd9298456a2fd716c2169a2e2eb4131993976aa53fb1bfd36bc3158f01e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/35121dd9298456a2fd716c2169a2e2eb4131993976aa53fb1bfd36bc3158f01e/userdata/shm major:0 minor:679 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/37ae98838b10d25174f8c8b9b4755c11a0c75ddb83ec17a66bff3690d741da86/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/37ae98838b10d25174f8c8b9b4755c11a0c75ddb83ec17a66bff3690d741da86/userdata/shm major:0 minor:477 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3ac90b7e141885c73870d9744a9126cb8648da1eed1822b13844b812ecb6dc82/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3ac90b7e141885c73870d9744a9126cb8648da1eed1822b13844b812ecb6dc82/userdata/shm major:0 minor:948 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3d12917c6787f42e0bad6cb6459212ad6bf9ab1345cc33bb530c5d6e353e83a3/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3d12917c6787f42e0bad6cb6459212ad6bf9ab1345cc33bb530c5d6e353e83a3/userdata/shm major:0 minor:964 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/3f750f4eaadd11866936791933f7a3cbf786b838bf1e7a9f9142487b42787b0b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3f750f4eaadd11866936791933f7a3cbf786b838bf1e7a9f9142487b42787b0b/userdata/shm major:0 minor:473 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4a78398e61786f95c561c9e0a3fad101b528e41e0057edc052306118db4ece14/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4a78398e61786f95c561c9e0a3fad101b528e41e0057edc052306118db4ece14/userdata/shm major:0 minor:268 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4a7eca1172ea3bfa17610dd111962352ef959bb3d01e721ed6e19fcbca116334/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4a7eca1172ea3bfa17610dd111962352ef959bb3d01e721ed6e19fcbca116334/userdata/shm major:0 minor:840 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4a82c4d1f4dd0703b5ca166acc233f980e0154f3140fea6bcb51b2baef68cc9c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4a82c4d1f4dd0703b5ca166acc233f980e0154f3140fea6bcb51b2baef68cc9c/userdata/shm major:0 minor:991 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4aad5a88dbaf47dc9a34673f221901d51a030ad11786e8c666462ce349290777/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4aad5a88dbaf47dc9a34673f221901d51a030ad11786e8c666462ce349290777/userdata/shm major:0 minor:126 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/50afa0bafdfdadd430cb50b2aa81b0c11200da9c802e7cb966b1902e4941db5a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/50afa0bafdfdadd430cb50b2aa81b0c11200da9c802e7cb966b1902e4941db5a/userdata/shm major:0 minor:44 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/54c97778ddb99650ecad3d1658417118e7e1ab09d346fbb4a70157b6a2cd1822/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/54c97778ddb99650ecad3d1658417118e7e1ab09d346fbb4a70157b6a2cd1822/userdata/shm major:0 minor:713 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/57d04e10dfd8defea76ad2a3d813dd852bfb7b6eaab6dec0d628dfff952603f9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/57d04e10dfd8defea76ad2a3d813dd852bfb7b6eaab6dec0d628dfff952603f9/userdata/shm major:0 minor:505 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5f0a23a29ec1be227442f950c7b43af141e31a2152ab46cc286a5229950b1bae/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5f0a23a29ec1be227442f950c7b43af141e31a2152ab46cc286a5229950b1bae/userdata/shm major:0 minor:1031 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/609f306bf915c2fd338574a4571df8d99042d408bdae6c3187fb345e69136829/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/609f306bf915c2fd338574a4571df8d99042d408bdae6c3187fb345e69136829/userdata/shm major:0 minor:114 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/616c3040128a0d65e3ebb99fac58cf591c2a39e4fb92682249d6c1a457260134/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/616c3040128a0d65e3ebb99fac58cf591c2a39e4fb92682249d6c1a457260134/userdata/shm major:0 minor:688 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/61a9c110238dfe2ea95596a01db7cf5d6ef10b0b43d6a9827c08c09d64d82e79/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/61a9c110238dfe2ea95596a01db7cf5d6ef10b0b43d6a9827c08c09d64d82e79/userdata/shm major:0 minor:475 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/6c434288b6cb191d119d6c19e3587bb5a67d5e3c45645324bb8a04e648bb9b70/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/6c434288b6cb191d119d6c19e3587bb5a67d5e3c45645324bb8a04e648bb9b70/userdata/shm major:0 minor:289 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/6cd94cfc20909e4d75ca589ee9ae860bb4dc06f0ab09921539ad1f3f2923f207/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/6cd94cfc20909e4d75ca589ee9ae860bb4dc06f0ab09921539ad1f3f2923f207/userdata/shm major:0 minor:251 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/6f8519eea623420c40da808c6cfff53da6452162ecb364a1c82aa4dfe3545fe2/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/6f8519eea623420c40da808c6cfff53da6452162ecb364a1c82aa4dfe3545fe2/userdata/shm major:0 minor:603 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/71ba48ea25a9442b7ddaf6a81ce73c503ec65916615ad4703b66f198c2ddd8c0/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/71ba48ea25a9442b7ddaf6a81ce73c503ec65916615ad4703b66f198c2ddd8c0/userdata/shm major:0 minor:119 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7ad0a5d9a8c967edc56620551d2d496053e4222b70097c6bc30b98a2ef86a101/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7ad0a5d9a8c967edc56620551d2d496053e4222b70097c6bc30b98a2ef86a101/userdata/shm major:0 minor:275 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7d309f2fa26be03ebd9a5013c3e9be2f5c2e833ce9081aa3f25580dc684568db/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7d309f2fa26be03ebd9a5013c3e9be2f5c2e833ce9081aa3f25580dc684568db/userdata/shm major:0 minor:244 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/7d7ebfb8fe2ccbcaa9d3a1c3f2519c36849a66ccd8f79f4fea8a8ff50785f679/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7d7ebfb8fe2ccbcaa9d3a1c3f2519c36849a66ccd8f79f4fea8a8ff50785f679/userdata/shm major:0 minor:1048 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7d8cfecb961af50ca95aeb8f7e1e1b3b55dbc52640073a438412ecf24f225a00/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7d8cfecb961af50ca95aeb8f7e1e1b3b55dbc52640073a438412ecf24f225a00/userdata/shm major:0 minor:241 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7fc3fe55a4a51d95eea4a3bf4856b535210e68f1c5b298f7c9a69ccfa72f28c1/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7fc3fe55a4a51d95eea4a3bf4856b535210e68f1c5b298f7c9a69ccfa72f28c1/userdata/shm major:0 minor:1112 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/89ed96ec0c29ed024929723b7fe3a674507469f0ef8e3cce0e32599fd800079d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/89ed96ec0c29ed024929723b7fe3a674507469f0ef8e3cce0e32599fd800079d/userdata/shm major:0 minor:99 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8e6b36b6fc7f96835dacfb4827ca38e8971ee7d14058842c077463b70715d284/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8e6b36b6fc7f96835dacfb4827ca38e8971ee7d14058842c077463b70715d284/userdata/shm major:0 minor:1036 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8f2613fc06a65ee0e558a2b7e31f18cf27ca9c4c8e8d8a194c2a8dcc4466dc64/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8f2613fc06a65ee0e558a2b7e31f18cf27ca9c4c8e8d8a194c2a8dcc4466dc64/userdata/shm major:0 minor:259 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/92c3b2f339c88995c70507230cfa25808d3c4399c710b2454c51839f6048ccf5/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/92c3b2f339c88995c70507230cfa25808d3c4399c710b2454c51839f6048ccf5/userdata/shm major:0 minor:704 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/94e09a1dd75367c606e9c0f6209f6e945683271c1483d15ae30d37382e33a6c7/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/94e09a1dd75367c606e9c0f6209f6e945683271c1483d15ae30d37382e33a6c7/userdata/shm major:0 minor:848 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9984727ecfd751906a7163096c1bf7f2f5c369a40b443c586dd90122b84df23e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9984727ecfd751906a7163096c1bf7f2f5c369a40b443c586dd90122b84df23e/userdata/shm major:0 minor:54 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9abb90df1fb36f7d743ddb849ea400a46f15eae6ffadde3a44f5e1ad0528227b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9abb90df1fb36f7d743ddb849ea400a46f15eae6ffadde3a44f5e1ad0528227b/userdata/shm major:0 minor:69 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9fa79744aaaa4051964ccef7d38d65ee7510ee60bd1d8d59df2b9df1c6c707da/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9fa79744aaaa4051964ccef7d38d65ee7510ee60bd1d8d59df2b9df1c6c707da/userdata/shm major:0 minor:68 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a4456327b2e0b6b9539a6903b6632e2c8ba74468c5ee5a6acc2e786114ebf53d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a4456327b2e0b6b9539a6903b6632e2c8ba74468c5ee5a6acc2e786114ebf53d/userdata/shm major:0 minor:142 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/a60b4d927a33db793635462ea60ad60607d5d184924a1912214647b401b2b973/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a60b4d927a33db793635462ea60ad60607d5d184924a1912214647b401b2b973/userdata/shm major:0 minor:371 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a628e92ac4f34b60f238b76d4fc08c8cab73f3dfd7d9d1150c95d95292472f21/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a628e92ac4f34b60f238b76d4fc08c8cab73f3dfd7d9d1150c95d95292472f21/userdata/shm major:0 minor:710 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a98b9ab8613fc067f4fb6c5ed8cb15effc604a80e170f3f4752b7a8241625877/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a98b9ab8613fc067f4fb6c5ed8cb15effc604a80e170f3f4752b7a8241625877/userdata/shm major:0 minor:440 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ab94900114a9122f23168251a131a59bc4f99b487faa4afb3e3bd743e5f9a00c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ab94900114a9122f23168251a131a59bc4f99b487faa4afb3e3bd743e5f9a00c/userdata/shm major:0 minor:248 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b018527cc19e60b658984a3b2cf8d02fa83e221b23e0763c86d4b53c72e80c7e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b018527cc19e60b658984a3b2cf8d02fa83e221b23e0763c86d4b53c72e80c7e/userdata/shm major:0 minor:619 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b19253ead78138ad6ad5306377a9183f46463e03e94ad36454e95491cb2e6272/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b19253ead78138ad6ad5306377a9183f46463e03e94ad36454e95491cb2e6272/userdata/shm major:0 minor:453 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/b26809d00df2d88f0387eef7498f3d90150a196ebaa102f4f43bf51209c487a9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b26809d00df2d88f0387eef7498f3d90150a196ebaa102f4f43bf51209c487a9/userdata/shm major:0 minor:58 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b4d8f238475f0e2a34d731e501507dcca53539022959bc9fece8f8ff1323377c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b4d8f238475f0e2a34d731e501507dcca53539022959bc9fece8f8ff1323377c/userdata/shm major:0 minor:805 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b53f7152f43c94f2398d1b740ffecfeaa1c0a37491b92964a127cf3cd4f8b71f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b53f7152f43c94f2398d1b740ffecfeaa1c0a37491b92964a127cf3cd4f8b71f/userdata/shm major:0 minor:301 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b656a6f1c70c3edb8a88d273e10ec19afe3e617046ee184903275fabe65867b3/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b656a6f1c70c3edb8a88d273e10ec19afe3e617046ee184903275fabe65867b3/userdata/shm major:0 minor:714 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b6d8f47788f03e55d3eeba0c1c7d2d37374cfceb0c113268fcb5709d2ce6f28c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b6d8f47788f03e55d3eeba0c1c7d2d37374cfceb0c113268fcb5709d2ce6f28c/userdata/shm major:0 minor:715 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b70fd8156b9269ea1d32e5bd6b505f43cc5c2cda9055f9eab294a1ae160205e2/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b70fd8156b9269ea1d32e5bd6b505f43cc5c2cda9055f9eab294a1ae160205e2/userdata/shm major:0 minor:684 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/ba1138e615d45d30d4c79c1da4c10f696c22641dfec7a0ee3bb68226581ee820/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ba1138e615d45d30d4c79c1da4c10f696c22641dfec7a0ee3bb68226581ee820/userdata/shm major:0 minor:788 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/bdd5af34bfad236139e626fcebfb16719c123b2551b988ca1c04bcedf0b2fdb1/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/bdd5af34bfad236139e626fcebfb16719c123b2551b988ca1c04bcedf0b2fdb1/userdata/shm major:0 minor:506 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/be414d776e8ab0a4d034e6fb89d00100a9499bbf4d15ddb731d27f9835e7bf82/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/be414d776e8ab0a4d034e6fb89d00100a9499bbf4d15ddb731d27f9835e7bf82/userdata/shm major:0 minor:1082 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c50effcf2f5fc891493cebf4edbffd60b3e47884c10225842c0382c940f83e36/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c50effcf2f5fc891493cebf4edbffd60b3e47884c10225842c0382c940f83e36/userdata/shm major:0 minor:950 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/cb94d42a599d6d70b76d1a519031e7bc4f9c33d2aa5f5a649a375c7a7bded9ac/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/cb94d42a599d6d70b76d1a519031e7bc4f9c33d2aa5f5a649a375c7a7bded9ac/userdata/shm major:0 minor:842 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/cf319252fb389a321939570359e80da282cec58b5fa4e03fa5a5ea1c1c063fe4/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/cf319252fb389a321939570359e80da282cec58b5fa4e03fa5a5ea1c1c063fe4/userdata/shm major:0 minor:272 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/d0d7ba4bdbd45759b508d00f36d1e06281f843bb6e1de6ed64932952a8078e77/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d0d7ba4bdbd45759b508d00f36d1e06281f843bb6e1de6ed64932952a8078e77/userdata/shm major:0 minor:414 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d50b6a32815a2be8ff70be8b06ebf59a9e49fcf3be49561e0d64e6a0e5b76848/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d50b6a32815a2be8ff70be8b06ebf59a9e49fcf3be49561e0d64e6a0e5b76848/userdata/shm major:0 minor:264 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d87d8f00b3827a6cc0d679f67563557686bd72d154906a3035b8f36d3110e48e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d87d8f00b3827a6cc0d679f67563557686bd72d154906a3035b8f36d3110e48e/userdata/shm major:0 minor:706 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/df3c13cf240e0897eb807bf8a91255037c9e2476ca44bc2ccd1aad2e71463498/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/df3c13cf240e0897eb807bf8a91255037c9e2476ca44bc2ccd1aad2e71463498/userdata/shm major:0 minor:1065 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e2f745a0f2d01e632b68dc10f033e64587a1682e72141091b5e270ae9c9ebd96/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e2f745a0f2d01e632b68dc10f033e64587a1682e72141091b5e270ae9c9ebd96/userdata/shm major:0 minor:479 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e4f5650e90a0b9cd7d74ca56ce88b46d396150848fd499e2751500a89710ed92/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e4f5650e90a0b9cd7d74ca56ce88b46d396150848fd499e2751500a89710ed92/userdata/shm major:0 minor:271 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/e850e3b221564f84a8bf622e78e98b3c458e66820fad70083408ddbec82a1cd3/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e850e3b221564f84a8bf622e78e98b3c458e66820fad70083408ddbec82a1cd3/userdata/shm major:0 minor:84 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e8a8f6655cd50f04721c79f72a49d0a15534cbdaa2b73fedfb36318faac24e0e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e8a8f6655cd50f04721c79f72a49d0a15534cbdaa2b73fedfb36318faac24e0e/userdata/shm major:0 minor:476 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/eeedfbb568950a2005b49c940a6eb5e45d4af2d8ddb401839d8110cff9f9ae07/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/eeedfbb568950a2005b49c940a6eb5e45d4af2d8ddb401839d8110cff9f9ae07/userdata/shm major:0 minor:472 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f7ec163ebceef5cfb5ab7329a8164e22ce5f45f24019f15e580acb9f1392f8cb/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f7ec163ebceef5cfb5ab7329a8164e22ce5f45f24019f15e580acb9f1392f8cb/userdata/shm major:0 minor:130 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f8dd90e4b919a4750dacd366cb8ce8129d02c4f3f75302771450ca85e994151e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f8dd90e4b919a4750dacd366cb8ce8129d02c4f3f75302771450ca85e994151e/userdata/shm major:0 minor:316 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/fdd7225d6e1fca05e0159dd0bc6d0e1e7a5fef522b8f6ef01baea69ca9c582b3/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/fdd7225d6e1fca05e0159dd0bc6d0e1e7a5fef522b8f6ef01baea69ca9c582b3/userdata/shm major:0 minor:672 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/039acb44-a9b3-4ad6-a091-be4d18edc34f/volumes/kubernetes.io~projected/kube-api-access-kkdbm:{mountpoint:/var/lib/kubelet/pods/039acb44-a9b3-4ad6-a091-be4d18edc34f/volumes/kubernetes.io~projected/kube-api-access-kkdbm major:0 minor:228 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/039acb44-a9b3-4ad6-a091-be4d18edc34f/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/039acb44-a9b3-4ad6-a091-be4d18edc34f/volumes/kubernetes.io~secret/serving-cert major:0 minor:216 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a/volumes/kubernetes.io~projected/kube-api-access-pwxhc:{mountpoint:/var/lib/kubelet/pods/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a/volumes/kubernetes.io~projected/kube-api-access-pwxhc major:0 minor:1018 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a/volumes/kubernetes.io~secret/node-exporter-kube-rbac-proxy-config:{mountpoint:/var/lib/kubelet/pods/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a/volumes/kubernetes.io~secret/node-exporter-kube-rbac-proxy-config major:0 minor:1005 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a/volumes/kubernetes.io~secret/node-exporter-tls:{mountpoint:/var/lib/kubelet/pods/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a/volumes/kubernetes.io~secret/node-exporter-tls major:0 minor:1010 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0671fdd0-b358-40f9-ae49-2c5a9004edb3/volumes/kubernetes.io~projected/kube-api-access-ftn5x:{mountpoint:/var/lib/kubelet/pods/0671fdd0-b358-40f9-ae49-2c5a9004edb3/volumes/kubernetes.io~projected/kube-api-access-ftn5x major:0 minor:1035 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0671fdd0-b358-40f9-ae49-2c5a9004edb3/volumes/kubernetes.io~secret/default-certificate:{mountpoint:/var/lib/kubelet/pods/0671fdd0-b358-40f9-ae49-2c5a9004edb3/volumes/kubernetes.io~secret/default-certificate major:0 minor:1030 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/0671fdd0-b358-40f9-ae49-2c5a9004edb3/volumes/kubernetes.io~secret/metrics-certs:{mountpoint:/var/lib/kubelet/pods/0671fdd0-b358-40f9-ae49-2c5a9004edb3/volumes/kubernetes.io~secret/metrics-certs major:0 minor:1026 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0671fdd0-b358-40f9-ae49-2c5a9004edb3/volumes/kubernetes.io~secret/stats-auth:{mountpoint:/var/lib/kubelet/pods/0671fdd0-b358-40f9-ae49-2c5a9004edb3/volumes/kubernetes.io~secret/stats-auth major:0 minor:1002 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0d4e6150-432c-4a11-b5a6-4d62dd701fc8/volumes/kubernetes.io~projected/kube-api-access-gn6w7:{mountpoint:/var/lib/kubelet/pods/0d4e6150-432c-4a11-b5a6-4d62dd701fc8/volumes/kubernetes.io~projected/kube-api-access-gn6w7 major:0 minor:224 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0d4e6150-432c-4a11-b5a6-4d62dd701fc8/volumes/kubernetes.io~secret/package-server-manager-serving-cert:{mountpoint:/var/lib/kubelet/pods/0d4e6150-432c-4a11-b5a6-4d62dd701fc8/volumes/kubernetes.io~secret/package-server-manager-serving-cert major:0 minor:698 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1308fba1-a50d-48b3-b272-7bef44727b7f/volumes/kubernetes.io~projected/kube-api-access-zv2rb:{mountpoint:/var/lib/kubelet/pods/1308fba1-a50d-48b3-b272-7bef44727b7f/volumes/kubernetes.io~projected/kube-api-access-zv2rb major:0 minor:125 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1308fba1-a50d-48b3-b272-7bef44727b7f/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert:{mountpoint:/var/lib/kubelet/pods/1308fba1-a50d-48b3-b272-7bef44727b7f/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert major:0 minor:124 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/13fac7b0-ce55-467d-9d0c-6a122d87cb3c/volumes/kubernetes.io~projected/kube-api-access-wh2bv:{mountpoint:/var/lib/kubelet/pods/13fac7b0-ce55-467d-9d0c-6a122d87cb3c/volumes/kubernetes.io~projected/kube-api-access-wh2bv major:0 minor:262 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/13fac7b0-ce55-467d-9d0c-6a122d87cb3c/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/13fac7b0-ce55-467d-9d0c-6a122d87cb3c/volumes/kubernetes.io~secret/metrics-tls major:0 minor:469 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1c24e17c-8bd9-4c23-9876-6f31c9da5cd1/volumes/kubernetes.io~projected/kube-api-access-dqsdm:{mountpoint:/var/lib/kubelet/pods/1c24e17c-8bd9-4c23-9876-6f31c9da5cd1/volumes/kubernetes.io~projected/kube-api-access-dqsdm major:0 minor:947 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1c24e17c-8bd9-4c23-9876-6f31c9da5cd1/volumes/kubernetes.io~secret/machine-api-operator-tls:{mountpoint:/var/lib/kubelet/pods/1c24e17c-8bd9-4c23-9876-6f31c9da5cd1/volumes/kubernetes.io~secret/machine-api-operator-tls major:0 minor:943 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/21cbea73-f779-43e4-b5ba-d6fa06275d34/volumes/kubernetes.io~projected/kube-api-access-wg54c:{mountpoint:/var/lib/kubelet/pods/21cbea73-f779-43e4-b5ba-d6fa06275d34/volumes/kubernetes.io~projected/kube-api-access-wg54c major:0 minor:256 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/21cbea73-f779-43e4-b5ba-d6fa06275d34/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/21cbea73-f779-43e4-b5ba-d6fa06275d34/volumes/kubernetes.io~secret/etcd-client major:0 minor:233 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/21cbea73-f779-43e4-b5ba-d6fa06275d34/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/21cbea73-f779-43e4-b5ba-d6fa06275d34/volumes/kubernetes.io~secret/serving-cert major:0 minor:231 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/22587300-2448-4862-9fd8-68197d17a9f2/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/22587300-2448-4862-9fd8-68197d17a9f2/volumes/kubernetes.io~projected/kube-api-access major:0 minor:229 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/22587300-2448-4862-9fd8-68197d17a9f2/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/22587300-2448-4862-9fd8-68197d17a9f2/volumes/kubernetes.io~secret/serving-cert major:0 minor:214 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2679c6e1-11c1-450c-b03a-30d7ee59ff6f/volumes/kubernetes.io~projected/kube-api-access-4n9fb:{mountpoint:/var/lib/kubelet/pods/2679c6e1-11c1-450c-b03a-30d7ee59ff6f/volumes/kubernetes.io~projected/kube-api-access-4n9fb major:0 minor:934 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2679c6e1-11c1-450c-b03a-30d7ee59ff6f/volumes/kubernetes.io~secret/webhook-certs:{mountpoint:/var/lib/kubelet/pods/2679c6e1-11c1-450c-b03a-30d7ee59ff6f/volumes/kubernetes.io~secret/webhook-certs major:0 minor:400 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2937cbe2-3125-4c3f-96f8-2febeb5942cc/volumes/kubernetes.io~projected/kube-api-access-spxfj:{mountpoint:/var/lib/kubelet/pods/2937cbe2-3125-4c3f-96f8-2febeb5942cc/volumes/kubernetes.io~projected/kube-api-access-spxfj major:0 minor:110 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2c4c579b-0643-47ac-a729-017c326b0ecc/volumes/kubernetes.io~projected/ca-certs:{mountpoint:/var/lib/kubelet/pods/2c4c579b-0643-47ac-a729-017c326b0ecc/volumes/kubernetes.io~projected/ca-certs major:0 minor:496 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2c4c579b-0643-47ac-a729-017c326b0ecc/volumes/kubernetes.io~projected/kube-api-access-g44dw:{mountpoint:/var/lib/kubelet/pods/2c4c579b-0643-47ac-a729-017c326b0ecc/volumes/kubernetes.io~projected/kube-api-access-g44dw major:0 minor:497 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2c4c579b-0643-47ac-a729-017c326b0ecc/volumes/kubernetes.io~secret/catalogserver-certs:{mountpoint:/var/lib/kubelet/pods/2c4c579b-0643-47ac-a729-017c326b0ecc/volumes/kubernetes.io~secret/catalogserver-certs major:0 minor:602 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/2ce47660-f7cc-4669-a00d-83422f0f6d55/volumes/kubernetes.io~projected/kube-api-access-d6frm:{mountpoint:/var/lib/kubelet/pods/2ce47660-f7cc-4669-a00d-83422f0f6d55/volumes/kubernetes.io~projected/kube-api-access-d6frm major:0 minor:869 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2ce47660-f7cc-4669-a00d-83422f0f6d55/volumes/kubernetes.io~secret/apiservice-cert:{mountpoint:/var/lib/kubelet/pods/2ce47660-f7cc-4669-a00d-83422f0f6d55/volumes/kubernetes.io~secret/apiservice-cert major:0 minor:856 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2ce47660-f7cc-4669-a00d-83422f0f6d55/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/2ce47660-f7cc-4669-a00d-83422f0f6d55/volumes/kubernetes.io~secret/webhook-cert major:0 minor:861 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3/volumes/kubernetes.io~projected/ca-certs:{mountpoint:/var/lib/kubelet/pods/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3/volumes/kubernetes.io~projected/ca-certs major:0 minor:499 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3/volumes/kubernetes.io~projected/kube-api-access-6v79j:{mountpoint:/var/lib/kubelet/pods/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3/volumes/kubernetes.io~projected/kube-api-access-6v79j major:0 minor:498 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/30f7537e-93ed-466b-ba24-78141d004b2f/volumes/kubernetes.io~projected/kube-api-access-jkjrm:{mountpoint:/var/lib/kubelet/pods/30f7537e-93ed-466b-ba24-78141d004b2f/volumes/kubernetes.io~projected/kube-api-access-jkjrm major:0 minor:1028 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/30f7537e-93ed-466b-ba24-78141d004b2f/volumes/kubernetes.io~secret/kube-state-metrics-kube-rbac-proxy-config:{mountpoint:/var/lib/kubelet/pods/30f7537e-93ed-466b-ba24-78141d004b2f/volumes/kubernetes.io~secret/kube-state-metrics-kube-rbac-proxy-config major:0 minor:1006 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/30f7537e-93ed-466b-ba24-78141d004b2f/volumes/kubernetes.io~secret/kube-state-metrics-tls:{mountpoint:/var/lib/kubelet/pods/30f7537e-93ed-466b-ba24-78141d004b2f/volumes/kubernetes.io~secret/kube-state-metrics-tls major:0 minor:1007 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/33dfdc31-54a4-4249-99ae-a15180514659/volumes/kubernetes.io~projected/kube-api-access-898lt:{mountpoint:/var/lib/kubelet/pods/33dfdc31-54a4-4249-99ae-a15180514659/volumes/kubernetes.io~projected/kube-api-access-898lt major:0 minor:958 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/33dfdc31-54a4-4249-99ae-a15180514659/volumes/kubernetes.io~secret/machine-approver-tls:{mountpoint:/var/lib/kubelet/pods/33dfdc31-54a4-4249-99ae-a15180514659/volumes/kubernetes.io~secret/machine-approver-tls major:0 minor:956 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3d2e7338-a6d6-4872-ab72-a4e631075ab3/volumes/kubernetes.io~projected/kube-api-access-vkgvg:{mountpoint:/var/lib/kubelet/pods/3d2e7338-a6d6-4872-ab72-a4e631075ab3/volumes/kubernetes.io~projected/kube-api-access-vkgvg major:0 minor:421 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3f9728b4-4e1e-4165-a276-3daa00e95839/volumes/kubernetes.io~projected/kube-api-access-xr6vn:{mountpoint:/var/lib/kubelet/pods/3f9728b4-4e1e-4165-a276-3daa00e95839/volumes/kubernetes.io~projected/kube-api-access-xr6vn major:0 minor:500 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/46662e51-44af-4732-83a1-9509a579b373/volumes/kubernetes.io~projected/kube-api-access-m5n7m:{mountpoint:/var/lib/kubelet/pods/46662e51-44af-4732-83a1-9509a579b373/volumes/kubernetes.io~projected/kube-api-access-m5n7m major:0 minor:246 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4738c93d-62e6-44ce-a289-e646b9302e71/volumes/kubernetes.io~projected/kube-api-access-9gt69:{mountpoint:/var/lib/kubelet/pods/4738c93d-62e6-44ce-a289-e646b9302e71/volumes/kubernetes.io~projected/kube-api-access-9gt69 major:0 minor:118 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/48375ae2-d4b4-4db4-b832-3e3db1834fb9/volumes/kubernetes.io~projected/kube-api-access-q7mn4:{mountpoint:/var/lib/kubelet/pods/48375ae2-d4b4-4db4-b832-3e3db1834fb9/volumes/kubernetes.io~projected/kube-api-access-q7mn4 major:0 minor:1032 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/486c7e33-3dd8-4a98-87e3-8216ee2e05ef/volumes/kubernetes.io~projected/kube-api-access-74tvv:{mountpoint:/var/lib/kubelet/pods/486c7e33-3dd8-4a98-87e3-8216ee2e05ef/volumes/kubernetes.io~projected/kube-api-access-74tvv major:0 minor:260 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/486c7e33-3dd8-4a98-87e3-8216ee2e05ef/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/486c7e33-3dd8-4a98-87e3-8216ee2e05ef/volumes/kubernetes.io~secret/serving-cert major:0 minor:232 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4976e608-07a0-4cef-8fdd-7cec3324b4b5/volumes/kubernetes.io~projected/kube-api-access-w2m48:{mountpoint:/var/lib/kubelet/pods/4976e608-07a0-4cef-8fdd-7cec3324b4b5/volumes/kubernetes.io~projected/kube-api-access-w2m48 major:0 minor:227 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4976e608-07a0-4cef-8fdd-7cec3324b4b5/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/4976e608-07a0-4cef-8fdd-7cec3324b4b5/volumes/kubernetes.io~secret/proxy-tls major:0 minor:700 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4c5174b9-ca9e-4917-ab3a-ca403ce4f017/volumes/kubernetes.io~projected/kube-api-access-mmvs5:{mountpoint:/var/lib/kubelet/pods/4c5174b9-ca9e-4917-ab3a-ca403ce4f017/volumes/kubernetes.io~projected/kube-api-access-mmvs5 major:0 minor:250 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4c5174b9-ca9e-4917-ab3a-ca403ce4f017/volumes/kubernetes.io~secret/apiservice-cert:{mountpoint:/var/lib/kubelet/pods/4c5174b9-ca9e-4917-ab3a-ca403ce4f017/volumes/kubernetes.io~secret/apiservice-cert major:0 minor:470 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/4c5174b9-ca9e-4917-ab3a-ca403ce4f017/volumes/kubernetes.io~secret/node-tuning-operator-tls:{mountpoint:/var/lib/kubelet/pods/4c5174b9-ca9e-4917-ab3a-ca403ce4f017/volumes/kubernetes.io~secret/node-tuning-operator-tls major:0 minor:471 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4d313ee4-3bb9-44a9-ad80-8e00540ef1e7/volumes/kubernetes.io~secret/tls-certificates:{mountpoint:/var/lib/kubelet/pods/4d313ee4-3bb9-44a9-ad80-8e00540ef1e7/volumes/kubernetes.io~secret/tls-certificates major:0 minor:1029 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4edb3e1a-9082-4fc2-ae6f-99d49c078a34/volume-subpaths/run-systemd/ovnkube-controller/6:{mountpoint:/var/lib/kubelet/pods/4edb3e1a-9082-4fc2-ae6f-99d49c078a34/volume-subpaths/run-systemd/ovnkube-controller/6 major:0 minor:24 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4edb3e1a-9082-4fc2-ae6f-99d49c078a34/volumes/kubernetes.io~projected/kube-api-access-6tfdv:{mountpoint:/var/lib/kubelet/pods/4edb3e1a-9082-4fc2-ae6f-99d49c078a34/volumes/kubernetes.io~projected/kube-api-access-6tfdv major:0 minor:131 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4edb3e1a-9082-4fc2-ae6f-99d49c078a34/volumes/kubernetes.io~secret/ovn-node-metrics-cert:{mountpoint:/var/lib/kubelet/pods/4edb3e1a-9082-4fc2-ae6f-99d49c078a34/volumes/kubernetes.io~secret/ovn-node-metrics-cert major:0 minor:128 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/57eb2020-1560-4352-8b86-76db59de933a/volumes/kubernetes.io~projected/kube-api-access-kd849:{mountpoint:/var/lib/kubelet/pods/57eb2020-1560-4352-8b86-76db59de933a/volumes/kubernetes.io~projected/kube-api-access-kd849 major:0 minor:504 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/57eb2020-1560-4352-8b86-76db59de933a/volumes/kubernetes.io~secret/encryption-config:{mountpoint:/var/lib/kubelet/pods/57eb2020-1560-4352-8b86-76db59de933a/volumes/kubernetes.io~secret/encryption-config major:0 minor:503 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/57eb2020-1560-4352-8b86-76db59de933a/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/57eb2020-1560-4352-8b86-76db59de933a/volumes/kubernetes.io~secret/etcd-client major:0 minor:501 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/57eb2020-1560-4352-8b86-76db59de933a/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/57eb2020-1560-4352-8b86-76db59de933a/volumes/kubernetes.io~secret/serving-cert major:0 minor:502 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/58035e42-37d8-48f6-9861-9b4ce6014119/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/58035e42-37d8-48f6-9861-9b4ce6014119/volumes/kubernetes.io~projected/kube-api-access major:0 minor:226 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/58035e42-37d8-48f6-9861-9b4ce6014119/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/58035e42-37d8-48f6-9861-9b4ce6014119/volumes/kubernetes.io~secret/serving-cert major:0 minor:213 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/58405741-598c-4bf5-bbc8-1ca8e3f10995/volumes/kubernetes.io~projected/kube-api-access-6qlks:{mountpoint:/var/lib/kubelet/pods/58405741-598c-4bf5-bbc8-1ca8e3f10995/volumes/kubernetes.io~projected/kube-api-access-6qlks major:0 minor:588 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/58405741-598c-4bf5-bbc8-1ca8e3f10995/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/58405741-598c-4bf5-bbc8-1ca8e3f10995/volumes/kubernetes.io~secret/metrics-tls major:0 minor:452 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/61fb4b86-f978-4ae1-80bc-18d2f386cbc2/volumes/kubernetes.io~projected/kube-api-access-lrf2s:{mountpoint:/var/lib/kubelet/pods/61fb4b86-f978-4ae1-80bc-18d2f386cbc2/volumes/kubernetes.io~projected/kube-api-access-lrf2s major:0 minor:258 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/61fb4b86-f978-4ae1-80bc-18d2f386cbc2/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/61fb4b86-f978-4ae1-80bc-18d2f386cbc2/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert major:0 minor:236 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/631f5719-2083-4c99-92cb-2ddc04022d86/volumes/kubernetes.io~projected/kube-api-access-jvn9n:{mountpoint:/var/lib/kubelet/pods/631f5719-2083-4c99-92cb-2ddc04022d86/volumes/kubernetes.io~projected/kube-api-access-jvn9n major:0 minor:395 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/631f5719-2083-4c99-92cb-2ddc04022d86/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/631f5719-2083-4c99-92cb-2ddc04022d86/volumes/kubernetes.io~secret/serving-cert major:0 minor:315 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/64477504-5cb6-42dc-a7eb-662981daec4a/volumes/kubernetes.io~projected/kube-api-access-gg77t:{mountpoint:/var/lib/kubelet/pods/64477504-5cb6-42dc-a7eb-662981daec4a/volumes/kubernetes.io~projected/kube-api-access-gg77t major:0 minor:366 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6a5ab1d5-dabd-45e7-a688-71a282f61e67/volumes/kubernetes.io~empty-dir/etc-tuned:{mountpoint:/var/lib/kubelet/pods/6a5ab1d5-dabd-45e7-a688-71a282f61e67/volumes/kubernetes.io~empty-dir/etc-tuned major:0 minor:572 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6a5ab1d5-dabd-45e7-a688-71a282f61e67/volumes/kubernetes.io~empty-dir/tmp:{mountpoint:/var/lib/kubelet/pods/6a5ab1d5-dabd-45e7-a688-71a282f61e67/volumes/kubernetes.io~empty-dir/tmp major:0 minor:573 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6a5ab1d5-dabd-45e7-a688-71a282f61e67/volumes/kubernetes.io~projected/kube-api-access-lpvtc:{mountpoint:/var/lib/kubelet/pods/6a5ab1d5-dabd-45e7-a688-71a282f61e67/volumes/kubernetes.io~projected/kube-api-access-lpvtc major:0 minor:575 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/6c88187c-d011-4043-a6d3-4a8a7ec4e204/volumes/kubernetes.io~projected/kube-api-access-7mnf5:{mountpoint:/var/lib/kubelet/pods/6c88187c-d011-4043-a6d3-4a8a7ec4e204/volumes/kubernetes.io~projected/kube-api-access-7mnf5 major:0 minor:257 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6c88187c-d011-4043-a6d3-4a8a7ec4e204/volumes/kubernetes.io~secret/srv-cert:{mountpoint:/var/lib/kubelet/pods/6c88187c-d011-4043-a6d3-4a8a7ec4e204/volumes/kubernetes.io~secret/srv-cert major:0 minor:702 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:217 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad/volumes/kubernetes.io~projected/kube-api-access-29w76:{mountpoint:/var/lib/kubelet/pods/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad/volumes/kubernetes.io~projected/kube-api-access-29w76 major:0 minor:219 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad/volumes/kubernetes.io~secret/metrics-tls major:0 minor:465 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/738ebdcd-b78b-495a-b8f2-84af11a7d35c/volumes/kubernetes.io~projected/kube-api-access-tl6k6:{mountpoint:/var/lib/kubelet/pods/738ebdcd-b78b-495a-b8f2-84af11a7d35c/volumes/kubernetes.io~projected/kube-api-access-tl6k6 major:0 minor:613 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/738ebdcd-b78b-495a-b8f2-84af11a7d35c/volumes/kubernetes.io~secret/encryption-config:{mountpoint:/var/lib/kubelet/pods/738ebdcd-b78b-495a-b8f2-84af11a7d35c/volumes/kubernetes.io~secret/encryption-config major:0 minor:610 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/738ebdcd-b78b-495a-b8f2-84af11a7d35c/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/738ebdcd-b78b-495a-b8f2-84af11a7d35c/volumes/kubernetes.io~secret/etcd-client major:0 minor:611 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/738ebdcd-b78b-495a-b8f2-84af11a7d35c/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/738ebdcd-b78b-495a-b8f2-84af11a7d35c/volumes/kubernetes.io~secret/serving-cert major:0 minor:609 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/77fd9062-0f7d-4255-92ca-7e4325daeddd/volumes/kubernetes.io~projected/kube-api-access-vtqjr:{mountpoint:/var/lib/kubelet/pods/77fd9062-0f7d-4255-92ca-7e4325daeddd/volumes/kubernetes.io~projected/kube-api-access-vtqjr major:0 minor:872 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/77fd9062-0f7d-4255-92ca-7e4325daeddd/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/77fd9062-0f7d-4255-92ca-7e4325daeddd/volumes/kubernetes.io~secret/serving-cert major:0 minor:862 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/78d2cd80-23b9-426d-a7ac-1daa27668a47/volumes/kubernetes.io~projected/kube-api-access-mxctn:{mountpoint:/var/lib/kubelet/pods/78d2cd80-23b9-426d-a7ac-1daa27668a47/volumes/kubernetes.io~projected/kube-api-access-mxctn major:0 minor:230 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/78d2cd80-23b9-426d-a7ac-1daa27668a47/volumes/kubernetes.io~secret/marketplace-operator-metrics:{mountpoint:/var/lib/kubelet/pods/78d2cd80-23b9-426d-a7ac-1daa27668a47/volumes/kubernetes.io~secret/marketplace-operator-metrics major:0 minor:701 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71/volumes/kubernetes.io~projected/kube-api-access-bz7v9:{mountpoint:/var/lib/kubelet/pods/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71/volumes/kubernetes.io~projected/kube-api-access-bz7v9 major:0 minor:220 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls:{mountpoint:/var/lib/kubelet/pods/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls major:0 minor:693 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7f35cc1e-3376-4dbd-b215-2a32bf62cc71/volumes/kubernetes.io~projected/kube-api-access-5zd92:{mountpoint:/var/lib/kubelet/pods/7f35cc1e-3376-4dbd-b215-2a32bf62cc71/volumes/kubernetes.io~projected/kube-api-access-5zd92 major:0 minor:240 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7f35cc1e-3376-4dbd-b215-2a32bf62cc71/volumes/kubernetes.io~secret/srv-cert:{mountpoint:/var/lib/kubelet/pods/7f35cc1e-3376-4dbd-b215-2a32bf62cc71/volumes/kubernetes.io~secret/srv-cert major:0 minor:699 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/85149f21-7ba8-4891-82ef-0fef3d5d7863/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/85149f21-7ba8-4891-82ef-0fef3d5d7863/volumes/kubernetes.io~projected/kube-api-access major:0 minor:466 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/85149f21-7ba8-4891-82ef-0fef3d5d7863/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/85149f21-7ba8-4891-82ef-0fef3d5d7863/volumes/kubernetes.io~secret/serving-cert major:0 minor:43 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8c6bf2d5-1881-4b63-b247-7e7426707fa1/volumes/kubernetes.io~projected/kube-api-access-vxf58:{mountpoint:/var/lib/kubelet/pods/8c6bf2d5-1881-4b63-b247-7e7426707fa1/volumes/kubernetes.io~projected/kube-api-access-vxf58 major:0 minor:222 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8c6bf2d5-1881-4b63-b247-7e7426707fa1/volumes/kubernetes.io~secret/cert:{mountpoint:/var/lib/kubelet/pods/8c6bf2d5-1881-4b63-b247-7e7426707fa1/volumes/kubernetes.io~secret/cert major:0 minor:464 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/8c6bf2d5-1881-4b63-b247-7e7426707fa1/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls:{mountpoint:/var/lib/kubelet/pods/8c6bf2d5-1881-4b63-b247-7e7426707fa1/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls major:0 minor:467 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/90e6e63d-3cf2-4bb5-883f-6219a0b52c3a/volumes/kubernetes.io~projected/kube-api-access-6hhwp:{mountpoint:/var/lib/kubelet/pods/90e6e63d-3cf2-4bb5-883f-6219a0b52c3a/volumes/kubernetes.io~projected/kube-api-access-6hhwp major:0 minor:799 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/90e6e63d-3cf2-4bb5-883f-6219a0b52c3a/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/90e6e63d-3cf2-4bb5-883f-6219a0b52c3a/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert major:0 minor:623 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/916d9fc9-388b-4506-a17c-36a7f626356a/volumes/kubernetes.io~projected/kube-api-access-jg7x6:{mountpoint:/var/lib/kubelet/pods/916d9fc9-388b-4506-a17c-36a7f626356a/volumes/kubernetes.io~projected/kube-api-access-jg7x6 major:0 minor:221 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/916d9fc9-388b-4506-a17c-36a7f626356a/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/916d9fc9-388b-4506-a17c-36a7f626356a/volumes/kubernetes.io~secret/serving-cert major:0 minor:209 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/93871019-3d0c-4081-9afe-19b6dd108ec6/volumes/kubernetes.io~projected/kube-api-access-s9nfk:{mountpoint:/var/lib/kubelet/pods/93871019-3d0c-4081-9afe-19b6dd108ec6/volumes/kubernetes.io~projected/kube-api-access-s9nfk major:0 minor:809 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/93871019-3d0c-4081-9afe-19b6dd108ec6/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/93871019-3d0c-4081-9afe-19b6dd108ec6/volumes/kubernetes.io~secret/proxy-tls major:0 minor:791 
fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/95d4e785-6663-417d-b380-6905773613c8/volumes/kubernetes.io~projected/kube-api-access-nhcll:{mountpoint:/var/lib/kubelet/pods/95d4e785-6663-417d-b380-6905773613c8/volumes/kubernetes.io~projected/kube-api-access-nhcll major:0 minor:225 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/95d4e785-6663-417d-b380-6905773613c8/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/95d4e785-6663-417d-b380-6905773613c8/volumes/kubernetes.io~secret/serving-cert major:0 minor:215 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9db888f0-51b6-43cf-8337-69d2d5cc2b0a/volumes/kubernetes.io~projected/kube-api-access-8gj56:{mountpoint:/var/lib/kubelet/pods/9db888f0-51b6-43cf-8337-69d2d5cc2b0a/volumes/kubernetes.io~projected/kube-api-access-8gj56 major:0 minor:96 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9db888f0-51b6-43cf-8337-69d2d5cc2b0a/volumes/kubernetes.io~secret/client-ca-bundle:{mountpoint:/var/lib/kubelet/pods/9db888f0-51b6-43cf-8337-69d2d5cc2b0a/volumes/kubernetes.io~secret/client-ca-bundle major:0 minor:93 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9db888f0-51b6-43cf-8337-69d2d5cc2b0a/volumes/kubernetes.io~secret/secret-metrics-client-certs:{mountpoint:/var/lib/kubelet/pods/9db888f0-51b6-43cf-8337-69d2d5cc2b0a/volumes/kubernetes.io~secret/secret-metrics-client-certs major:0 minor:95 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9db888f0-51b6-43cf-8337-69d2d5cc2b0a/volumes/kubernetes.io~secret/secret-metrics-server-tls:{mountpoint:/var/lib/kubelet/pods/9db888f0-51b6-43cf-8337-69d2d5cc2b0a/volumes/kubernetes.io~secret/secret-metrics-server-tls major:0 minor:89 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a561a1d1-b20f-45fd-9e0c-ee4399a1d31b/volumes/kubernetes.io~projected/kube-api-access-7gmkr:{mountpoint:/var/lib/kubelet/pods/a561a1d1-b20f-45fd-9e0c-ee4399a1d31b/volumes/kubernetes.io~projected/kube-api-access-7gmkr major:0 minor:835 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/a9462e2e-728d-4076-a876-31dbbd637581/volumes/kubernetes.io~projected/kube-api-access-sddd9:{mountpoint:/var/lib/kubelet/pods/a9462e2e-728d-4076-a876-31dbbd637581/volumes/kubernetes.io~projected/kube-api-access-sddd9 major:0 minor:313 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a9462e2e-728d-4076-a876-31dbbd637581/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/a9462e2e-728d-4076-a876-31dbbd637581/volumes/kubernetes.io~secret/serving-cert major:0 minor:304 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a9c7c6a4-4f5b-4807-932c-1b0f53ceed22/volumes/kubernetes.io~projected/kube-api-access-wwnml:{mountpoint:/var/lib/kubelet/pods/a9c7c6a4-4f5b-4807-932c-1b0f53ceed22/volumes/kubernetes.io~projected/kube-api-access-wwnml major:0 minor:838 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ac2a4c90-32db-4464-8c47-acbcafbcd5d0/volumes/kubernetes.io~projected/kube-api-access-sw5hs:{mountpoint:/var/lib/kubelet/pods/ac2a4c90-32db-4464-8c47-acbcafbcd5d0/volumes/kubernetes.io~projected/kube-api-access-sw5hs major:0 minor:1034 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ae44526f-5858-42a0-ba77-3a22f171456f/volumes/kubernetes.io~projected/kube-api-access-mz8jz:{mountpoint:/var/lib/kubelet/pods/ae44526f-5858-42a0-ba77-3a22f171456f/volumes/kubernetes.io~projected/kube-api-access-mz8jz major:0 minor:270 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b3a9c0f6-cfde-4ae8-952a-00e2fb862482/volumes/kubernetes.io~projected/kube-api-access-42sqh:{mountpoint:/var/lib/kubelet/pods/b3a9c0f6-cfde-4ae8-952a-00e2fb862482/volumes/kubernetes.io~projected/kube-api-access-42sqh major:0 minor:681 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b3a9c0f6-cfde-4ae8-952a-00e2fb862482/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls:{mountpoint:/var/lib/kubelet/pods/b3a9c0f6-cfde-4ae8-952a-00e2fb862482/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls major:0 minor:445 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35/volumes/kubernetes.io~projected/kube-api-access major:0 minor:247 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35/volumes/kubernetes.io~secret/serving-cert major:0 minor:234 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bfc49699-9428-4bff-804d-da0e60551759/volumes/kubernetes.io~projected/kube-api-access-zdzjn:{mountpoint:/var/lib/kubelet/pods/bfc49699-9428-4bff-804d-da0e60551759/volumes/kubernetes.io~projected/kube-api-access-zdzjn major:0 minor:98 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bfc49699-9428-4bff-804d-da0e60551759/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/bfc49699-9428-4bff-804d-da0e60551759/volumes/kubernetes.io~secret/metrics-tls major:0 minor:94 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ca2fa86b-a966-49dc-8577-d2b54b111d14/volumes/kubernetes.io~projected/kube-api-access-gwm5w:{mountpoint:/var/lib/kubelet/pods/ca2fa86b-a966-49dc-8577-d2b54b111d14/volumes/kubernetes.io~projected/kube-api-access-gwm5w major:0 minor:243 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ca2fa86b-a966-49dc-8577-d2b54b111d14/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/ca2fa86b-a966-49dc-8577-d2b54b111d14/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert major:0 minor:235 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cada5bf2-e208-4fd8-bdf5-de8cad31a665/volumes/kubernetes.io~projected/kube-api-access-929r9:{mountpoint:/var/lib/kubelet/pods/cada5bf2-e208-4fd8-bdf5-de8cad31a665/volumes/kubernetes.io~projected/kube-api-access-929r9 major:0 minor:314 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/cada5bf2-e208-4fd8-bdf5-de8cad31a665/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls:{mountpoint:/var/lib/kubelet/pods/cada5bf2-e208-4fd8-bdf5-de8cad31a665/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls major:0 minor:311 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cd7cca05-3da7-42cf-af64-6e94050e58c0/volumes/kubernetes.io~projected/kube-api-access-gx8zl:{mountpoint:/var/lib/kubelet/pods/cd7cca05-3da7-42cf-af64-6e94050e58c0/volumes/kubernetes.io~projected/kube-api-access-gx8zl major:0 minor:1027 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cd7cca05-3da7-42cf-af64-6e94050e58c0/volumes/kubernetes.io~secret/openshift-state-metrics-kube-rbac-proxy-config:{mountpoint:/var/lib/kubelet/pods/cd7cca05-3da7-42cf-af64-6e94050e58c0/volumes/kubernetes.io~secret/openshift-state-metrics-kube-rbac-proxy-config major:0 minor:1011 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cd7cca05-3da7-42cf-af64-6e94050e58c0/volumes/kubernetes.io~secret/openshift-state-metrics-tls:{mountpoint:/var/lib/kubelet/pods/cd7cca05-3da7-42cf-af64-6e94050e58c0/volumes/kubernetes.io~secret/openshift-state-metrics-tls major:0 minor:1118 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d1153bb3-30dd-458f-b0a4-c05358a8b3f8/volumes/kubernetes.io~projected/kube-api-access-srlst:{mountpoint:/var/lib/kubelet/pods/d1153bb3-30dd-458f-b0a4-c05358a8b3f8/volumes/kubernetes.io~projected/kube-api-access-srlst major:0 minor:138 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d1153bb3-30dd-458f-b0a4-c05358a8b3f8/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/d1153bb3-30dd-458f-b0a4-c05358a8b3f8/volumes/kubernetes.io~secret/webhook-cert major:0 minor:139 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d23bbaec-b635-4649-b26e-2829f32d21f0/volumes/kubernetes.io~projected/kube-api-access-szx9m:{mountpoint:/var/lib/kubelet/pods/d23bbaec-b635-4649-b26e-2829f32d21f0/volumes/kubernetes.io~projected/kube-api-access-szx9m major:0 
minor:873 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d278ed70-786c-4b6c-9f04-f08ede704569/volumes/kubernetes.io~projected/kube-api-access-7f82n:{mountpoint:/var/lib/kubelet/pods/d278ed70-786c-4b6c-9f04-f08ede704569/volumes/kubernetes.io~projected/kube-api-access-7f82n major:0 minor:875 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d278ed70-786c-4b6c-9f04-f08ede704569/volumes/kubernetes.io~secret/cert:{mountpoint:/var/lib/kubelet/pods/d278ed70-786c-4b6c-9f04-f08ede704569/volumes/kubernetes.io~secret/cert major:0 minor:874 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d5456c8b-3c98-4824-8700-a04e9c12fb2e/volumes/kubernetes.io~projected/kube-api-access-mnxgm:{mountpoint:/var/lib/kubelet/pods/d5456c8b-3c98-4824-8700-a04e9c12fb2e/volumes/kubernetes.io~projected/kube-api-access-mnxgm major:0 minor:309 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d56480e0-0885-41e5-a1fc-931a068fbadb/volumes/kubernetes.io~projected/kube-api-access-fppkf:{mountpoint:/var/lib/kubelet/pods/d56480e0-0885-41e5-a1fc-931a068fbadb/volumes/kubernetes.io~projected/kube-api-access-fppkf major:0 minor:263 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d56480e0-0885-41e5-a1fc-931a068fbadb/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/d56480e0-0885-41e5-a1fc-931a068fbadb/volumes/kubernetes.io~secret/serving-cert major:0 minor:238 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d81bcb58-efe3-4577-8e88-67f92c645f6f/volumes/kubernetes.io~projected/kube-api-access-k7wj9:{mountpoint:/var/lib/kubelet/pods/d81bcb58-efe3-4577-8e88-67f92c645f6f/volumes/kubernetes.io~projected/kube-api-access-k7wj9 major:0 minor:589 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/da44d750-31e5-46f4-b3ef-dd4384c22aaf/volumes/kubernetes.io~projected/kube-api-access-n4rfg:{mountpoint:/var/lib/kubelet/pods/da44d750-31e5-46f4-b3ef-dd4384c22aaf/volumes/kubernetes.io~projected/kube-api-access-n4rfg major:0 minor:413 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/da44d750-31e5-46f4-b3ef-dd4384c22aaf/volumes/kubernetes.io~secret/signing-key:{mountpoint:/var/lib/kubelet/pods/da44d750-31e5-46f4-b3ef-dd4384c22aaf/volumes/kubernetes.io~secret/signing-key major:0 minor:412 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e2b5ad07-fa01-4330-9dce-6da3444657ab/volumes/kubernetes.io~projected/kube-api-access-rtrb2:{mountpoint:/var/lib/kubelet/pods/e2b5ad07-fa01-4330-9dce-6da3444657ab/volumes/kubernetes.io~projected/kube-api-access-rtrb2 major:0 minor:1080 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e2b5ad07-fa01-4330-9dce-6da3444657ab/volumes/kubernetes.io~secret/prometheus-operator-kube-rbac-proxy-config:{mountpoint:/var/lib/kubelet/pods/e2b5ad07-fa01-4330-9dce-6da3444657ab/volumes/kubernetes.io~secret/prometheus-operator-kube-rbac-proxy-config major:0 minor:1074 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e2b5ad07-fa01-4330-9dce-6da3444657ab/volumes/kubernetes.io~secret/prometheus-operator-tls:{mountpoint:/var/lib/kubelet/pods/e2b5ad07-fa01-4330-9dce-6da3444657ab/volumes/kubernetes.io~secret/prometheus-operator-tls major:0 minor:1077 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e5956ebf-01e4-4d4c-ae6d-b0995905c6d3/volumes/kubernetes.io~projected/kube-api-access-jt79p:{mountpoint:/var/lib/kubelet/pods/e5956ebf-01e4-4d4c-ae6d-b0995905c6d3/volumes/kubernetes.io~projected/kube-api-access-jt79p major:0 minor:685 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e5956ebf-01e4-4d4c-ae6d-b0995905c6d3/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/e5956ebf-01e4-4d4c-ae6d-b0995905c6d3/volumes/kubernetes.io~secret/proxy-tls major:0 minor:678 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e68ab3cb-c372-45d9-a758-beaf4c213714/volumes/kubernetes.io~projected/kube-api-access-zjhjj:{mountpoint:/var/lib/kubelet/pods/e68ab3cb-c372-45d9-a758-beaf4c213714/volumes/kubernetes.io~projected/kube-api-access-zjhjj major:0 minor:123 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/e68ab3cb-c372-45d9-a758-beaf4c213714/volumes/kubernetes.io~secret/metrics-certs:{mountpoint:/var/lib/kubelet/pods/e68ab3cb-c372-45d9-a758-beaf4c213714/volumes/kubernetes.io~secret/metrics-certs major:0 minor:703 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ebf338e6-9725-47d9-8c7f-adbf11a44406/volumes/kubernetes.io~projected/kube-api-access-ltxpc:{mountpoint:/var/lib/kubelet/pods/ebf338e6-9725-47d9-8c7f-adbf11a44406/volumes/kubernetes.io~projected/kube-api-access-ltxpc major:0 minor:815 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ebf338e6-9725-47d9-8c7f-adbf11a44406/volumes/kubernetes.io~secret/samples-operator-tls:{mountpoint:/var/lib/kubelet/pods/ebf338e6-9725-47d9-8c7f-adbf11a44406/volumes/kubernetes.io~secret/samples-operator-tls major:0 minor:813 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/edda0d03-fdb2-4130-8f73-8057efd5815c/volumes/kubernetes.io~projected/kube-api-access-h5gmv:{mountpoint:/var/lib/kubelet/pods/edda0d03-fdb2-4130-8f73-8057efd5815c/volumes/kubernetes.io~projected/kube-api-access-h5gmv major:0 minor:1063 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/edda0d03-fdb2-4130-8f73-8057efd5815c/volumes/kubernetes.io~secret/certs:{mountpoint:/var/lib/kubelet/pods/edda0d03-fdb2-4130-8f73-8057efd5815c/volumes/kubernetes.io~secret/certs major:0 minor:1057 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/edda0d03-fdb2-4130-8f73-8057efd5815c/volumes/kubernetes.io~secret/node-bootstrap-token:{mountpoint:/var/lib/kubelet/pods/edda0d03-fdb2-4130-8f73-8057efd5815c/volumes/kubernetes.io~secret/node-bootstrap-token major:0 minor:1062 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f97819d0-2840-4352-a435-19ef1a8c22c9/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/f97819d0-2840-4352-a435-19ef1a8c22c9/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:218 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/f97819d0-2840-4352-a435-19ef1a8c22c9/volumes/kubernetes.io~projected/kube-api-access-fzklz:{mountpoint:/var/lib/kubelet/pods/f97819d0-2840-4352-a435-19ef1a8c22c9/volumes/kubernetes.io~projected/kube-api-access-fzklz major:0 minor:223 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f97819d0-2840-4352-a435-19ef1a8c22c9/volumes/kubernetes.io~secret/image-registry-operator-tls:{mountpoint:/var/lib/kubelet/pods/f97819d0-2840-4352-a435-19ef1a8c22c9/volumes/kubernetes.io~secret/image-registry-operator-tls major:0 minor:468 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f9b713fb-64ce-4a01-951c-1f31df62e1ae/volumes/kubernetes.io~projected/kube-api-access-4chtg:{mountpoint:/var/lib/kubelet/pods/f9b713fb-64ce-4a01-951c-1f31df62e1ae/volumes/kubernetes.io~projected/kube-api-access-4chtg major:0 minor:253 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f9b713fb-64ce-4a01-951c-1f31df62e1ae/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/f9b713fb-64ce-4a01-951c-1f31df62e1ae/volumes/kubernetes.io~secret/serving-cert major:0 minor:237 fsType:tmpfs blockSize:0} overlay_0-1001:{mountpoint:/var/lib/containers/storage/overlay/facb4ccee0a94187f2fb16e4142836ebe8e90e01eead07674bf4b7da31e067a9/merged major:0 minor:1001 fsType:overlay blockSize:0} overlay_0-1003:{mountpoint:/var/lib/containers/storage/overlay/3d1468b72fe055ca6011da2baeb2af44e40403c4ec1c0aa3b840d31bb3ae7b71/merged major:0 minor:1003 fsType:overlay blockSize:0} overlay_0-1008:{mountpoint:/var/lib/containers/storage/overlay/2b8d9bdd2321c965d55d9bbb1e77a36d438e3a85d6bac10f1362424d09432d26/merged major:0 minor:1008 fsType:overlay blockSize:0} overlay_0-101:{mountpoint:/var/lib/containers/storage/overlay/1068a8316a13c7cdeaaef3545bd8cc14495d985d39d355ee01c171279cd771c1/merged major:0 minor:101 fsType:overlay blockSize:0} overlay_0-1016:{mountpoint:/var/lib/containers/storage/overlay/b73fff5f09374fa531038e7a6919a595f1a849336958a208b897fd6d4488398f/merged major:0 minor:1016 
fsType:overlay blockSize:0} overlay_0-103:{mountpoint:/var/lib/containers/storage/overlay/499bfb3a268ed2ae8c38719a6a8a04a9d10a53a80540940251845c54aadba54e/merged major:0 minor:103 fsType:overlay blockSize:0} overlay_0-1038:{mountpoint:/var/lib/containers/storage/overlay/70469e852110066730d7bbabb6be5b6b058d23eee2d895226e7b7b8946a25be3/merged major:0 minor:1038 fsType:overlay blockSize:0} overlay_0-1040:{mountpoint:/var/lib/containers/storage/overlay/a8ef99ba5a00a3fd588bd3fa78925041b54d4a2c0d4841fbe1bb54920278451b/merged major:0 minor:1040 fsType:overlay blockSize:0} overlay_0-1042:{mountpoint:/var/lib/containers/storage/overlay/60a7c8212f53eb2520cf9a64066bd4aa1c4137f7de0e4d9b1c7d5afc444a686b/merged major:0 minor:1042 fsType:overlay blockSize:0} overlay_0-1050:{mountpoint:/var/lib/containers/storage/overlay/979724bd52b75657c984a34f139474d2a56f8e89c72c17681cfcf98d3ee3ef38/merged major:0 minor:1050 fsType:overlay blockSize:0} overlay_0-1053:{mountpoint:/var/lib/containers/storage/overlay/73c24d691a06659d97877c1c5efdbf3f34223a5bf5aa084c6aedf86e28c8ca0f/merged major:0 minor:1053 fsType:overlay blockSize:0} overlay_0-1055:{mountpoint:/var/lib/containers/storage/overlay/4cfb4e0d10c69a22fa345b133431591c9df4159be90551b89e920cef32c0eb4b/merged major:0 minor:1055 fsType:overlay blockSize:0} overlay_0-1064:{mountpoint:/var/lib/containers/storage/overlay/86e77824867634e0d7aed1bdc68779beb9b788c27b022e6d96b07937224cfa0c/merged major:0 minor:1064 fsType:overlay blockSize:0} overlay_0-1066:{mountpoint:/var/lib/containers/storage/overlay/494f4009aaf905f15e7c7f6deab57cc5f26fc0c838e88429501edda58ec01f85/merged major:0 minor:1066 fsType:overlay blockSize:0} overlay_0-1070:{mountpoint:/var/lib/containers/storage/overlay/bff72e2c2a504691e5f2c7f3f773fcde4d1eed4d8cdcd67e968906745e9b50a1/merged major:0 minor:1070 fsType:overlay blockSize:0} overlay_0-1072:{mountpoint:/var/lib/containers/storage/overlay/fa55c84643539fae03b531bd5e042636113d7213a416208dd3bcb25c683b5693/merged major:0 minor:1072 
fsType:overlay blockSize:0} overlay_0-1075:{mountpoint:/var/lib/containers/storage/overlay/09ab38c25938196851fa7c31d8cb07e423869723ff83f772bc3afa42837a6463/merged major:0 minor:1075 fsType:overlay blockSize:0} overlay_0-109:{mountpoint:/var/lib/containers/storage/overlay/8bcf9bdcc34a92b2026e82056758fa29c4f8bd19b6957af59445ee887a15b4c8/merged major:0 minor:109 fsType:overlay blockSize:0} overlay_0-1094:{mountpoint:/var/lib/containers/storage/overlay/02283eb0aefc94dba6a2bcab25c37cc5a2b073a36dd66cc9ea8866a7b8f77e8a/merged major:0 minor:1094 fsType:overlay blockSize:0} overlay_0-1096:{mountpoint:/var/lib/containers/storage/overlay/518073d92a09c7ce9ef275a41b88eacdf42678949ef515dd54bda2994051f1dc/merged major:0 minor:1096 fsType:overlay blockSize:0} overlay_0-1098:{mountpoint:/var/lib/containers/storage/overlay/146919cf4c2ada076ef0fc489eb60135b2f84497b20c7f84c494161690aa446a/merged major:0 minor:1098 fsType:overlay blockSize:0} overlay_0-1104:{mountpoint:/var/lib/containers/storage/overlay/0789a647108f3f2a9c5eb1459aa3cf0c6cd981090798d4efa94672e8c9005429/merged major:0 minor:1104 fsType:overlay blockSize:0} overlay_0-1114:{mountpoint:/var/lib/containers/storage/overlay/f61c3562cdc1ba2f486333bf7d0141b943ef5d23e40b8712f97e60b3eb64d60e/merged major:0 minor:1114 fsType:overlay blockSize:0} overlay_0-1125:{mountpoint:/var/lib/containers/storage/overlay/12f80acc7d1067b51b9170c820d2a9aea66fa17e5ac7af323bffb6c1f5b48abe/merged major:0 minor:1125 fsType:overlay blockSize:0} overlay_0-1130:{mountpoint:/var/lib/containers/storage/overlay/7e608370d58a1329c1fdd248bf231bcb33bc0f9f4bca8c919838c60aca78f4d1/merged major:0 minor:1130 fsType:overlay blockSize:0} overlay_0-1134:{mountpoint:/var/lib/containers/storage/overlay/7441b1e5d3d56431262f36f8d9a81d7a03f6aa5bfc7b37283770382b1f69ce29/merged major:0 minor:1134 fsType:overlay blockSize:0} overlay_0-1136:{mountpoint:/var/lib/containers/storage/overlay/f2c4a2cdd581346fa09f0418b467baaf3e4a4d7db6ac5cb8b921b3441dc31eb3/merged major:0 minor:1136 
fsType:overlay blockSize:0} overlay_0-1148:{mountpoint:/var/lib/containers/storage/overlay/33ba2774e00f5b5693db0c809fe8206ba5f244257c7e449d17f8c7c54f36c163/merged major:0 minor:1148 fsType:overlay blockSize:0} overlay_0-1157:{mountpoint:/var/lib/containers/storage/overlay/e0174878fb06208427448ca9d821f6ce90f6025278b75dd2b3a6a46dd8525fa4/merged major:0 minor:1157 fsType:overlay blockSize:0} overlay_0-116:{mountpoint:/var/lib/containers/storage/overlay/9f53ad407aa6750466461f31dfd38703cc3e2e3edf47253264cf6c9225d8de57/merged major:0 minor:116 fsType:overlay blockSize:0} overlay_0-1162:{mountpoint:/var/lib/containers/storage/overlay/c2dcd50b9a7786545906c780af8802e3a4935152dc3d8ac74eed6ce88549a45e/merged major:0 minor:1162 fsType:overlay blockSize:0} overlay_0-1173:{mountpoint:/var/lib/containers/storage/overlay/421ceb967fb72b8339e30b72ca98b97ef95d2f86fa0f3d91e30ef8fa71d21d16/merged major:0 minor:1173 fsType:overlay blockSize:0} overlay_0-121:{mountpoint:/var/lib/containers/storage/overlay/bca064206bf5f192337d981eea506766cbc8e42e2303c4baee2649a45d5b3198/merged major:0 minor:121 fsType:overlay blockSize:0} overlay_0-129:{mountpoint:/var/lib/containers/storage/overlay/ae0170fa71b1afaee67f9931020f7edb95659a89d9f6f5e5f2dbcaa26d36f41e/merged major:0 minor:129 fsType:overlay blockSize:0} overlay_0-134:{mountpoint:/var/lib/containers/storage/overlay/b84d62c1501ec0c901404e96a0934bb10d4703eb81b8872aa964453fdfe38e99/merged major:0 minor:134 fsType:overlay blockSize:0} overlay_0-136:{mountpoint:/var/lib/containers/storage/overlay/b6868d91e2f510d0d7d429fc1b6414c2c1cae3f86f79724b39556629095e1b5d/merged major:0 minor:136 fsType:overlay blockSize:0} overlay_0-140:{mountpoint:/var/lib/containers/storage/overlay/68091b7f6a577c845f3d91becae021d32a8c8f85f79e1176a475b8d47520288c/merged major:0 minor:140 fsType:overlay blockSize:0} overlay_0-150:{mountpoint:/var/lib/containers/storage/overlay/7073f091a6c5657ca4314c5a03ae9f229bc11c7e795ba62bbd1029d7f0177318/merged major:0 minor:150 
fsType:overlay blockSize:0} overlay_0-152:{mountpoint:/var/lib/containers/storage/overlay/0e6ee2560fdcb0f0c13f465566a4b3c7ed7242b68ea3e9c79cf4eac8f3012cf6/merged major:0 minor:152 fsType:overlay blockSize:0} overlay_0-154:{mountpoint:/var/lib/containers/storage/overlay/1bc7dc69602705553258ca3faf0c402d94ca402a0d4c1f33af77bdb80157fd29/merged major:0 minor:154 fsType:overlay blockSize:0} overlay_0-155:{mountpoint:/var/lib/containers/storage/overlay/68b58935e094ebe365e78abe46e1bfeddf21b69bc21ecd6cc641b74e55c4113f/merged major:0 minor:155 fsType:overlay blockSize:0} overlay_0-157:{mountpoint:/var/lib/containers/storage/overlay/a31ccfef9ce0ddb516042c7972351251eceaf026d04fe9156d440fe8fc5b2e57/merged major:0 minor:157 fsType:overlay blockSize:0} overlay_0-161:{mountpoint:/var/lib/containers/storage/overlay/da0ca95ccc186dd2a05024ffa191c14fdb94a60e0125708f1b7fedf07554362c/merged major:0 minor:161 fsType:overlay blockSize:0} overlay_0-163:{mountpoint:/var/lib/containers/storage/overlay/995a36f63b99af5a3873445846c48b423f305b302119aa292503c2e23d7a0928/merged major:0 minor:163 fsType:overlay blockSize:0} overlay_0-168:{mountpoint:/var/lib/containers/storage/overlay/5f2e43e92011b2a5e570ac5378931476d882c3768ba6a6de4b37a8c66412a27e/merged major:0 minor:168 fsType:overlay blockSize:0} overlay_0-170:{mountpoint:/var/lib/containers/storage/overlay/9e3300bf56f97cbda957fe691341ee9b73339ec40de382a5b0001ab72b94430c/merged major:0 minor:170 fsType:overlay blockSize:0} overlay_0-176:{mountpoint:/var/lib/containers/storage/overlay/bc0dd8cc88ef64ec8cfe903077dd53528f8aefd18ef9f0a5fcdf15b5c0bcb6d9/merged major:0 minor:176 fsType:overlay blockSize:0} overlay_0-184:{mountpoint:/var/lib/containers/storage/overlay/0771ceed00491afc2390c5cc2d51dd174a9c72143076da9244eadcbe13cfdef3/merged major:0 minor:184 fsType:overlay blockSize:0} overlay_0-189:{mountpoint:/var/lib/containers/storage/overlay/aa4e9169af8e6742d6ba9252cffe67bc4547f68f3573a4d4895973fa6665b1bb/merged major:0 minor:189 fsType:overlay 
blockSize:0} overlay_0-194:{mountpoint:/var/lib/containers/storage/overlay/b35e881b8ddf864e9c22f1576096b331ebd4d5629c8f323c2675493161365fb6/merged major:0 minor:194 fsType:overlay blockSize:0} overlay_0-199:{mountpoint:/var/lib/containers/storage/overlay/98e53109cbc139608a9c5a6f9a06bff83dc90ac813ca303ed457573bf7e1adda/merged major:0 minor:199 fsType:overlay blockSize:0} overlay_0-204:{mountpoint:/var/lib/containers/storage/overlay/79f5cb6a92a2567d5bf4923e2617ed95203a1bad1b7a88a5289f5a395492492b/merged major:0 minor:204 fsType:overlay blockSize:0} overlay_0-277:{mountpoint:/var/lib/containers/storage/overlay/09d324d7e2c2d3c295ce1871fe4bba404be1551ef1b747b5b529f9f99d911d38/merged major:0 minor:277 fsType:overlay blockSize:0} overlay_0-279:{mountpoint:/var/lib/containers/storage/overlay/4b499bbd05372b43c9eefa7ed027c0793a5d47f8d5b5758276f1d458c7e36123/merged major:0 minor:279 fsType:overlay blockSize:0} overlay_0-281:{mountpoint:/var/lib/containers/storage/overlay/1963df254dacf9fa35ac3229d45db1f915610070a1bffa2757f8f2d97fa00cc0/merged major:0 minor:281 fsType:overlay blockSize:0} overlay_0-283:{mountpoint:/var/lib/containers/storage/overlay/af8da6b97b7c552ae861c8798865c65e6c96376f046e10f9cb574b9c89ff5ae2/merged major:0 minor:283 fsType:overlay blockSize:0} overlay_0-285:{mountpoint:/var/lib/containers/storage/overlay/a15b83df0e9b3f2e68d3868e07dac6db1518deeec3eb99eb50654b620ac07613/merged major:0 minor:285 fsType:overlay blockSize:0} overlay_0-287:{mountpoint:/var/lib/containers/storage/overlay/fe8ed84c5f02692408d6cb49a53386fe55c7b6f10dfb11e4b0b338aa8af48643/merged major:0 minor:287 fsType:overlay blockSize:0} overlay_0-291:{mountpoint:/var/lib/containers/storage/overlay/b0c1b0bc33127a174250fc6d0a25646906422aa82d78711a4ab8451216d9f1ce/merged major:0 minor:291 fsType:overlay blockSize:0} overlay_0-293:{mountpoint:/var/lib/containers/storage/overlay/5558f118d9b3a97727c4938ef2b1cb3d39672616b2980a71c559226d6236cffe/merged major:0 minor:293 fsType:overlay blockSize:0} 
overlay_0-295:{mountpoint:/var/lib/containers/storage/overlay/3c7c218429efc1a92d2c3854fddfe625b7f654cb9c05a59fab4ca5a2861b9eb7/merged major:0 minor:295 fsType:overlay blockSize:0} overlay_0-297:{mountpoint:/var/lib/containers/storage/overlay/de2e5801a25dafb20b1a9e08d6bc38d94edba2715f174cc379074d4edfffa536/merged major:0 minor:297 fsType:overlay blockSize:0} overlay_0-299:{mountpoint:/var/lib/containers/storage/overlay/bd586057a42d2d6ab3eb10e60ec74387467f74d16578afd7d4a72e2048c7b616/merged major:0 minor:299 fsType:overlay blockSize:0} overlay_0-303:{mountpoint:/var/lib/containers/storage/overlay/516c8832ebc1ce91db1cda6a3b88e34289fdac1a8407699fcd666c324de3f29b/merged major:0 minor:303 fsType:overlay blockSize:0} overlay_0-305:{mountpoint:/var/lib/containers/storage/overlay/bdccdfa165a11588a066b17865836563c68838eb6ce5d93123c4118446ace6b1/merged major:0 minor:305 fsType:overlay blockSize:0} overlay_0-307:{mountpoint:/var/lib/containers/storage/overlay/2cf6e6d258541867cac0e52bbadc776d8af7a6b46cb93e15049c6d4885fa5bd0/merged major:0 minor:307 fsType:overlay blockSize:0} overlay_0-310:{mountpoint:/var/lib/containers/storage/overlay/41eb529aaa94aa0d57132514dd41c12787d769ed21c20891552e34402f7663a9/merged major:0 minor:310 fsType:overlay blockSize:0} overlay_0-318:{mountpoint:/var/lib/containers/storage/overlay/e45f670b22465c6ad97d251d50d6822bcb497e60199389946c8e4d39c72c1d7f/merged major:0 minor:318 fsType:overlay blockSize:0} overlay_0-320:{mountpoint:/var/lib/containers/storage/overlay/ac42b65757ae590c6d318110267e39f5ad8e674f88e7074eeb3d95faa7a73862/merged major:0 minor:320 fsType:overlay blockSize:0} overlay_0-323:{mountpoint:/var/lib/containers/storage/overlay/73dddb13e161f33c16d1b03d53e4d9fe9facecc5ccf8034f4416cb8b6cd2f1da/merged major:0 minor:323 fsType:overlay blockSize:0} overlay_0-324:{mountpoint:/var/lib/containers/storage/overlay/90b0b51a40a84e61b525f0f00ff25f34d21d82b923367a63bc051686d3eecd41/merged major:0 minor:324 fsType:overlay blockSize:0} 
overlay_0-328:{mountpoint:/var/lib/containers/storage/overlay/c0e9fa0f3af51316fc959e02b51bb9f8257be8ddb5c690ca2d689d2f57d90fab/merged major:0 minor:328 fsType:overlay blockSize:0} overlay_0-335:{mountpoint:/var/lib/containers/storage/overlay/046c517c1f0bfbda1f6abb240bb0d097fbe59f43cd49cf72b4ed41c122228c2b/merged major:0 minor:335 fsType:overlay blockSize:0} overlay_0-340:{mountpoint:/var/lib/containers/storage/overlay/b64889e25d128f0f15607474746bcd63664bf426c30a8105ee5925c8aa2dc90f/merged major:0 minor:340 fsType:overlay blockSize:0} overlay_0-343:{mountpoint:/var/lib/containers/storage/overlay/9e65d163704e284dad443f4be41a1a1c5315680bdfbc9ffcc5336f8d60e15cf6/merged major:0 minor:343 fsType:overlay blockSize:0} overlay_0-345:{mountpoint:/var/lib/containers/storage/overlay/85ebd2f1de228e980089558a6ae212cc7e03738e83d7ea8a833ff3968c013563/merged major:0 minor:345 fsType:overlay blockSize:0} overlay_0-348:{mountpoint:/var/lib/containers/storage/overlay/62196a4eb112e3a868e71783f9dbe3ff1a04a0013d7337c3d466e900e8f63f4e/merged major:0 minor:348 fsType:overlay blockSize:0} overlay_0-349:{mountpoint:/var/lib/containers/storage/overlay/ec71a2019edd755968478e3b76821b7b46630f0b30527e134515a23826f2e7e9/merged major:0 minor:349 fsType:overlay blockSize:0} overlay_0-360:{mountpoint:/var/lib/containers/storage/overlay/31256c402140c63206f65940c5273721bf2875b73e5272e3f98ea3ed2a18539a/merged major:0 minor:360 fsType:overlay blockSize:0} overlay_0-373:{mountpoint:/var/lib/containers/storage/overlay/05d328b610871ccf67112f5b050b7dc238945a5a2207e375612eace02acb820c/merged major:0 minor:373 fsType:overlay blockSize:0} overlay_0-377:{mountpoint:/var/lib/containers/storage/overlay/222f5aa90a911343d9fe27da518c44027a262b917ae9e0dbdbb43c6a36a95c86/merged major:0 minor:377 fsType:overlay blockSize:0} overlay_0-383:{mountpoint:/var/lib/containers/storage/overlay/5e665983d0b7b85558cb3c1e394125417cd73491ba0567271b61b55fcba3d824/merged major:0 minor:383 fsType:overlay blockSize:0} 
overlay_0-385:{mountpoint:/var/lib/containers/storage/overlay/20165025aa7d65a88ff51eda641a5bc8c64acec2d06e60829cb204b90dcf65c2/merged major:0 minor:385 fsType:overlay blockSize:0} overlay_0-402:{mountpoint:/var/lib/containers/storage/overlay/6061f87d03c123bd6150a874f54086d1b31b758742c02bfd0a31e11f61d56d01/merged major:0 minor:402 fsType:overlay blockSize:0} overlay_0-405:{mountpoint:/var/lib/containers/storage/overlay/79b7cfd0ebfa67b25494f277c4cee11f05991d7812c5494f904ffb0f04d5c01e/merged major:0 minor:405 fsType:overlay blockSize:0} overlay_0-41:{mountpoint:/var/lib/containers/storage/overlay/18510d80c7aa60422ad4a1736af3f4c330ce52691593d4b47504f8f5e0360541/merged major:0 minor:41 fsType:overlay blockSize:0} overlay_0-416:{mountpoint:/var/lib/containers/storage/overlay/7fbae254a393164bfaa940daa593d2b0b30158c2e351479114b9c15028eb0e9b/merged major:0 minor:416 fsType:overlay blockSize:0} overlay_0-420:{mountpoint:/var/lib/containers/storage/overlay/34348a5ec85c6507111ff1a3e98307a3b3b8271a1f65ca4d8cbaaf8adae101fe/merged major:0 minor:420 fsType:overlay blockSize:0} overlay_0-422:{mountpoint:/var/lib/containers/storage/overlay/2e4e2518669df7d3018fe8376cf4ec2e056834566d0d9cab5aec81af9eb41012/merged major:0 minor:422 fsType:overlay blockSize:0} overlay_0-432:{mountpoint:/var/lib/containers/storage/overlay/cb3526bd419ce4cbfaa8810f7acacb30d73363a89f4bfaa33fc0a3b7fd8fd2a5/merged major:0 minor:432 fsType:overlay blockSize:0} overlay_0-442:{mountpoint:/var/lib/containers/storage/overlay/cb89cfd606af0848890f22a6861dc3edd0f56edbf0fbc4729423adb7a52e353e/merged major:0 minor:442 fsType:overlay blockSize:0} overlay_0-446:{mountpoint:/var/lib/containers/storage/overlay/966718eea24fae7a4421f223a0f21d2702f89bcf59dcc0181b5ac9d60e3455bc/merged major:0 minor:446 fsType:overlay blockSize:0} overlay_0-448:{mountpoint:/var/lib/containers/storage/overlay/1b5ed1462a4f5854ffab9e57586f12525f4f0dd95898886202e3f125750849b4/merged major:0 minor:448 fsType:overlay blockSize:0} 
overlay_0-45:{mountpoint:/var/lib/containers/storage/overlay/02993dfaa26a8d094f5b60911a425e9eaf455452dc13de13b27cbd4003cd8d73/merged major:0 minor:45 fsType:overlay blockSize:0} overlay_0-454:{mountpoint:/var/lib/containers/storage/overlay/9a0a1d71d8c5fc6e55f6e0df672bbeba7f60c01cffe2b18d9ed7e39871ceae2d/merged major:0 minor:454 fsType:overlay blockSize:0} overlay_0-461:{mountpoint:/var/lib/containers/storage/overlay/7e2a45b7902a049e37c855c57d5da339a856059962a5c0854f80602d66d7671b/merged major:0 minor:461 fsType:overlay blockSize:0} overlay_0-48:{mountpoint:/var/lib/containers/storage/overlay/9fda4e172b1b596523bc6e67a7405be93d7c9033f0cd501d8a68de1a944c7784/merged major:0 minor:48 fsType:overlay blockSize:0} overlay_0-483:{mountpoint:/var/lib/containers/storage/overlay/34b1b4c199a2d0dc67a278acf1de1340f7d66b42056e369a2d25cf0dd6b93c24/merged major:0 minor:483 fsType:overlay blockSize:0} overlay_0-486:{mountpoint:/var/lib/containers/storage/overlay/d5e90e180fe2ecec1c53b549ddd5b7a891fdb3b324330eba78f86a90ce463ff4/merged major:0 minor:486 fsType:overlay blockSize:0} overlay_0-488:{mountpoint:/var/lib/containers/storage/overlay/0ed65b18ca0876298f133b379bdc498aaecbd8e3aed5ab1c4a7808c114f30a07/merged major:0 minor:488 fsType:overlay blockSize:0} overlay_0-490:{mountpoint:/var/lib/containers/storage/overlay/50d33d59a6c0c0d259e3d1be6aa9c9ad64f4f6c3e87cb1874d4df7f100c9b408/merged major:0 minor:490 fsType:overlay blockSize:0} overlay_0-492:{mountpoint:/var/lib/containers/storage/overlay/3c0d9667d489c88b95ce35543c4539210efd6f9a8baef17da52221f1bb20c459/merged major:0 minor:492 fsType:overlay blockSize:0} overlay_0-494:{mountpoint:/var/lib/containers/storage/overlay/95f64a1692d0e302fd68dd0b217d8045eec6e38cf594b7dfe308cb59024e1f78/merged major:0 minor:494 fsType:overlay blockSize:0} overlay_0-50:{mountpoint:/var/lib/containers/storage/overlay/e4c8ca11220224c68f529f558d3b2adf3f513e7141cc4371040291e1254051e3/merged major:0 minor:50 fsType:overlay blockSize:0} 
overlay_0-508:{mountpoint:/var/lib/containers/storage/overlay/edaae53d73afe078cd39daf6873e7c847ca5dcbeaf763555c4c3be66b64dc4c3/merged major:0 minor:508 fsType:overlay blockSize:0} overlay_0-510:{mountpoint:/var/lib/containers/storage/overlay/da95fb4a2fe3deba89f7d43a588b490a4d4db62e7510b068b5b4bef7ab248cc4/merged major:0 minor:510 fsType:overlay blockSize:0} overlay_0-511:{mountpoint:/var/lib/containers/storage/overlay/569ab9191a9871802128a3a309f493f3ddd3728766ce8cf644b97ea1f9c86fbf/merged major:0 minor:511 fsType:overlay blockSize:0} overlay_0-513:{mountpoint:/var/lib/containers/storage/overlay/d226335785158c285b8338497bfca186f2d0e3bdad1ed8cad8edbd29ce447d16/merged major:0 minor:513 fsType:overlay blockSize:0} overlay_0-515:{mountpoint:/var/lib/containers/storage/overlay/46028166e3dee477a489b6e195a73ebb28877d7c24b1ac3242495714de1215f5/merged major:0 minor:515 fsType:overlay blockSize:0} overlay_0-517:{mountpoint:/var/lib/containers/storage/overlay/2caea703dd92c66b4c9d4cdf8e285020db8e4be996f2635a2aa39252ce709938/merged major:0 minor:517 fsType:overlay blockSize:0} overlay_0-520:{mountpoint:/var/lib/containers/storage/overlay/b7d0b8db1ec172aff4f680c2c698fd06c688a03a01be868601a7394ec1c13390/merged major:0 minor:520 fsType:overlay blockSize:0} overlay_0-527:{mountpoint:/var/lib/containers/storage/overlay/54e65e66502661be764d46147f6415116c0bc014547c05192b5984245802ab0a/merged major:0 minor:527 fsType:overlay blockSize:0} overlay_0-528:{mountpoint:/var/lib/containers/storage/overlay/45b1810a2f503e4eb1324d82d77225dfd641a12bf19246b0223f2ec68a8270e5/merged major:0 minor:528 fsType:overlay blockSize:0} overlay_0-530:{mountpoint:/var/lib/containers/storage/overlay/32251acb6c74263ed078c9cd325d1662fd0a2207bd9ab78aa127b5a36ac0876d/merged major:0 minor:530 fsType:overlay blockSize:0} overlay_0-532:{mountpoint:/var/lib/containers/storage/overlay/15e11834f97bd9ae25803a86d957e28f3c795e32d7532938cc64d82bd9baaf86/merged major:0 minor:532 fsType:overlay blockSize:0} 
overlay_0-540:{mountpoint:/var/lib/containers/storage/overlay/039a024fdaaacca9c603bf0f93760d4506eab87537e1b8e9601837533eb235ef/merged major:0 minor:540 fsType:overlay blockSize:0} overlay_0-548:{mountpoint:/var/lib/containers/storage/overlay/54e9762d5f87f11a8d4f6a3b43baf53f603cdb588a6ac5af4e60fe550d8a5840/merged major:0 minor:548 fsType:overlay blockSize:0} overlay_0-555:{mountpoint:/var/lib/containers/storage/overlay/0037de9aa1daec6af3dbcba6ae8b917ce0c350cff276219b57851c7aab9bb7af/merged major:0 minor:555 fsType:overlay blockSize:0} overlay_0-56:{mountpoint:/var/lib/containers/storage/overlay/f78f163f1c5f692c7f7cbba02bf5e68bc71718f847e8b3de5b69a635c4662941/merged major:0 minor:56 fsType:overlay blockSize:0} overlay_0-561:{mountpoint:/var/lib/containers/storage/overlay/b450e27a04c166bdf4d87b4c5f48520f08cfd92bc721adba16267f76dba9dcd7/merged major:0 minor:561 fsType:overlay blockSize:0} overlay_0-565:{mountpoint:/var/lib/containers/storage/overlay/534de5e0469003800b9b0be5e6fe3bdfea78fcff78ad26de681266c2f24a1518/merged major:0 minor:565 fsType:overlay blockSize:0} overlay_0-568:{mountpoint:/var/lib/containers/storage/overlay/10e335b17aa00b2c6806590e367c9f56db6db7daa70b07f08c50f97f7b2d54c5/merged major:0 minor:568 fsType:overlay blockSize:0} overlay_0-577:{mountpoint:/var/lib/containers/storage/overlay/bb9b77c88a342ce8ffdb83392d2b636a95df94fa16030a190b1c044813249b86/merged major:0 minor:577 fsType:overlay blockSize:0} overlay_0-579:{mountpoint:/var/lib/containers/storage/overlay/319b8dba5884674d72ca7cfcd53a90cedd43f51e1ec0c259b31fa9cd7b7531fa/merged major:0 minor:579 fsType:overlay blockSize:0} overlay_0-584:{mountpoint:/var/lib/containers/storage/overlay/47efde2baf0aeed3cc25437b13006ea766d4254ae514cfab66b019b59a860f19/merged major:0 minor:584 fsType:overlay blockSize:0} overlay_0-586:{mountpoint:/var/lib/containers/storage/overlay/5251924a24e791157a121714e994f96f14b7d0e509acd0dc3af3af6206c0d46e/merged major:0 minor:586 fsType:overlay blockSize:0} 
overlay_0-597:{mountpoint:/var/lib/containers/storage/overlay/ae4c8e98be4d7c884f88443bdc5239c3658d1c5fedf758e036453c22fe5f769f/merged major:0 minor:597 fsType:overlay blockSize:0} overlay_0-600:{mountpoint:/var/lib/containers/storage/overlay/6cdcc24b708682ef5fe5f233f691fe4fca2468653343ff89c6e9c8d6a38156c3/merged major:0 minor:600 fsType:overlay blockSize:0} overlay_0-605:{mountpoint:/var/lib/containers/storage/overlay/b09e07a6d567e6c951acfd26534f4fcc819a12f192d634b8d3e5ddc2cc0c9488/merged major:0 minor:605 fsType:overlay blockSize:0} overlay_0-607:{mountpoint:/var/lib/containers/storage/overlay/cb448f35193f5b923dd01e40f7ed85009868d8979ff87a8afa96ec1df3a7014a/merged major:0 minor:607 fsType:overlay blockSize:0} overlay_0-621:{mountpoint:/var/lib/containers/storage/overlay/88618b0728eae3dee8978c3203be002caa59183be6815b97a56141b37dc46927/merged major:0 minor:621 fsType:overlay blockSize:0} overlay_0-624:{mountpoint:/var/lib/containers/storage/overlay/a94a5d09d356c37e1ec3dfb49632a23cb0ce0c2b2154c346f162e00ab59703c7/merged major:0 minor:624 fsType:overlay blockSize:0} overlay_0-627:{mountpoint:/var/lib/containers/storage/overlay/4a920b9469d837ee4614911440b4edbe66ce8922739086a854ae4095869ebc03/merged major:0 minor:627 fsType:overlay blockSize:0} overlay_0-63:{mountpoint:/var/lib/containers/storage/overlay/1c32b1108661dccc2ad9ee195770f1d975718538fdc33cd852db073335f04840/merged major:0 minor:63 fsType:overlay blockSize:0} overlay_0-631:{mountpoint:/var/lib/containers/storage/overlay/37dd6756b9bc733e24e9b47f232de315589dfa592d0743b1b185717a572b898c/merged major:0 minor:631 fsType:overlay blockSize:0} overlay_0-633:{mountpoint:/var/lib/containers/storage/overlay/7636cfd1b71169690e2f18e89fcaf34bf0784d556c8d7befccee32a52007b870/merged major:0 minor:633 fsType:overlay blockSize:0} overlay_0-636:{mountpoint:/var/lib/containers/storage/overlay/6bafb9272522956bd5b93d5075369dfd176803f627d276dce9013d360aee44c2/merged major:0 minor:636 fsType:overlay blockSize:0} 
overlay_0-657:{mountpoint:/var/lib/containers/storage/overlay/55bf34a7c47d8aa46caebbf8260476f8e72b30922973760100f5be20c0f48bb8/merged major:0 minor:657 fsType:overlay blockSize:0} overlay_0-66:{mountpoint:/var/lib/containers/storage/overlay/54d2e8850f2f99f3aa6755cecb725306369624810aabb65ac327a08d4a8666d9/merged major:0 minor:66 fsType:overlay blockSize:0} overlay_0-661:{mountpoint:/var/lib/containers/storage/overlay/ffd252ca0600f203f0c02fab5f3a8b993e7d334b1dcfa3be5691d8819a7b725e/merged major:0 minor:661 fsType:overlay blockSize:0} overlay_0-67:{mountpoint:/var/lib/containers/storage/overlay/27c23a6df5e50d4756169da4d7effa233032c4ae41cd65246fc271bf121e6f0a/merged major:0 minor:67 fsType:overlay blockSize:0} overlay_0-686:{mountpoint:/var/lib/containers/storage/overlay/b461e4b2881c6e5b7297b9f843a551c8dbb0449510af88fee6ab8c1fa0ac97cb/merged major:0 minor:686 fsType:overlay blockSize:0} overlay_0-690:{mountpoint:/var/lib/containers/storage/overlay/4ecf29a1f0c2935d1956c072e5f3b63d81bca689a28f6bfe945862b603f7e8ee/merged major:0 minor:690 fsType:overlay blockSize:0} overlay_0-694:{mountpoint:/var/lib/containers/storage/overlay/35ebb5f9a55efd9bf43f3c199e34bcf4453a93951935c51d8a8c263ca04e1690/merged major:0 minor:694 fsType:overlay blockSize:0} overlay_0-71:{mountpoint:/var/lib/containers/storage/overlay/0bd2fab834f2e74124c740176e8b39ad10fbd2210046d8d454c3dd84d08f0219/merged major:0 minor:71 fsType:overlay blockSize:0} overlay_0-712:{mountpoint:/var/lib/containers/storage/overlay/4220b69a491aeba4bc7cbcddc3b142f655e630176c43712b6f44376904973029/merged major:0 minor:712 fsType:overlay blockSize:0} overlay_0-721:{mountpoint:/var/lib/containers/storage/overlay/87d306b14db0e4cb05f59befc789ac8396964fb770ea4e33fd4784964ebe6e0b/merged major:0 minor:721 fsType:overlay blockSize:0} overlay_0-723:{mountpoint:/var/lib/containers/storage/overlay/5eb523369ceea4ca2826164844afd37aa842a74fe766e7f915cf065307f0618d/merged major:0 minor:723 fsType:overlay blockSize:0} 
overlay_0-725:{mountpoint:/var/lib/containers/storage/overlay/ce5c9912866a35077b0c4833f7309838a7da31fa8e950737070559b284654f39/merged major:0 minor:725 fsType:overlay blockSize:0} overlay_0-727:{mountpoint:/var/lib/containers/storage/overlay/0d04ba1eb66a87cd8056d59c26fee0f69a7cd91f97dcdf6eba866e30f7c734e4/merged major:0 minor:727 fsType:overlay blockSize:0} overlay_0-729:{mountpoint:/var/lib/containers/storage/overlay/0091d15c919052c2dd951d131b624580fc0865704a8d968846170e786583b9b6/merged major:0 minor:729 fsType:overlay blockSize:0} overlay_0-731:{mountpoint:/var/lib/containers/storage/overlay/4a1e054c18e3f6f2a464e3ed254241c95e4d8ee3267a5b865351fef39303331f/merged major:0 minor:731 fsType:overlay blockSize:0} overlay_0-738:{mountpoint:/var/lib/containers/storage/overlay/00cc6b45a1b740d1b4e35e1157b77cabed3c8980b16da5f08ef8956b4a37528f/merged major:0 minor:738 fsType:overlay blockSize:0} overlay_0-739:{mountpoint:/var/lib/containers/storage/overlay/e15a79b332083c0f9bf34fdc6ace3109c14e8b70ded800f29d68eb05573ea424/merged major:0 minor:739 fsType:overlay blockSize:0} overlay_0-74:{mountpoint:/var/lib/containers/storage/overlay/606ccec1a6ced0a03fe902e55ac160015eefa14612e51a8b7ae1aa076851b17b/merged major:0 minor:74 fsType:overlay blockSize:0} overlay_0-741:{mountpoint:/var/lib/containers/storage/overlay/c3de1a39d3d7c71a5d18fe395dc517cc0ed0827798b4013564b551815a754f0f/merged major:0 minor:741 fsType:overlay blockSize:0} overlay_0-743:{mountpoint:/var/lib/containers/storage/overlay/e2d106138bd513666279c229701160ccc9cf98245e13797d0613454226dd3443/merged major:0 minor:743 fsType:overlay blockSize:0} overlay_0-745:{mountpoint:/var/lib/containers/storage/overlay/8c6b5d01b6413068a4cf88aea30257273e47e3b4bd2b98b10af6706a86b78228/merged major:0 minor:745 fsType:overlay blockSize:0} overlay_0-755:{mountpoint:/var/lib/containers/storage/overlay/5e71f681caf03a0626e93e80e2631cffdbc5dc025bf811ca48f145770e846de8/merged major:0 minor:755 fsType:overlay blockSize:0} 
overlay_0-757:{mountpoint:/var/lib/containers/storage/overlay/c967844ac37667440fff0688a65611504d0f0bd1c2ce900ac4c618e45bec53bc/merged major:0 minor:757 fsType:overlay blockSize:0} overlay_0-766:{mountpoint:/var/lib/containers/storage/overlay/6cd95193b222553d53ef2cd6d28c26e5d9626397e41ba5f3ef758beb48302f3b/merged major:0 minor:766 fsType:overlay blockSize:0} overlay_0-778:{mountpoint:/var/lib/containers/storage/overlay/d2e0e7ca27a893e29f8ef3d46543c6be4209785b4e204464689b698fe598dc8c/merged major:0 minor:778 fsType:overlay blockSize:0} overlay_0-78:{mountpoint:/var/lib/containers/storage/overlay/1e42306cfc6d2c80a77ec1a69273b10be34373280bf8bd2de494560399910e09/merged major:0 minor:78 fsType:overlay blockSize:0} overlay_0-780:{mountpoint:/var/lib/containers/storage/overlay/811fdba8b09d7bcffc6c5f9daaa06527ab24340484cfefaf5fdcabc2c69594e7/merged major:0 minor:780 fsType:overlay blockSize:0} overlay_0-786:{mountpoint:/var/lib/containers/storage/overlay/5758a591792b869df2f2b0c3cbe807ecc40009a4b068628f193235a216cdff6a/merged major:0 minor:786 fsType:overlay blockSize:0} overlay_0-79:{mountpoint:/var/lib/containers/storage/overlay/073f57ae29239ac06ac4e9588e55ce11825b9d620e05ff8c99ff74ed40df0c96/merged major:0 minor:79 fsType:overlay blockSize:0} overlay_0-790:{mountpoint:/var/lib/containers/storage/overlay/4bcc0ac60bc33a5ba3bb2ef00eb06f341dc3f9d4b460ec8054c65bbc8d486ed9/merged major:0 minor:790 fsType:overlay blockSize:0} overlay_0-793:{mountpoint:/var/lib/containers/storage/overlay/26cc094ffaacad27537ca10c408ba52bafa8f5025808ade2d946b40a39274461/merged major:0 minor:793 fsType:overlay blockSize:0} overlay_0-795:{mountpoint:/var/lib/containers/storage/overlay/f9648d2180cb887d9c2dd0993810a25108e71b86f78434059a3c51fad7837e35/merged major:0 minor:795 fsType:overlay blockSize:0} overlay_0-797:{mountpoint:/var/lib/containers/storage/overlay/6d5c9a76fac1821621f4992453a8b553fbe6267572c6ab89335e382017e26074/merged major:0 minor:797 fsType:overlay blockSize:0} 
overlay_0-800:{mountpoint:/var/lib/containers/storage/overlay/eefbeeb6597db42bc0a5e1f9d0007bbde0104cdc0c7632083ae48aadb387d289/merged major:0 minor:800 fsType:overlay blockSize:0} overlay_0-807:{mountpoint:/var/lib/containers/storage/overlay/417e55f5fc4c26881db58bea339186e9911fa36adc8ab76923e5d2504b8eaa7e/merged major:0 minor:807 fsType:overlay blockSize:0} overlay_0-817:{mountpoint:/var/lib/containers/storage/overlay/8e6cba11d940c33d1af39d4351bda08dc81b447a53c230ca47ed63ab17a8a04e/merged major:0 minor:817 fsType:overlay blockSize:0} overlay_0-83:{mountpoint:/var/lib/containers/storage/overlay/cc5f4e00f946bc69111e57ea01bb846d358a2f90a9fccaa8edc865ffe4e41126/merged major:0 minor:83 fsType:overlay blockSize:0} overlay_0-836:{mountpoint:/var/lib/containers/storage/overlay/d1a4bed2c13ef2349eec76c66a1c07fe1d0bdeecec572bdedfebec95857db7b1/merged major:0 minor:836 fsType:overlay blockSize:0} overlay_0-849:{mountpoint:/var/lib/containers/storage/overlay/89adc8b71f9387df2b0235d8ad0e343b2ad3011ca51a415eee73268574a4ed4a/merged major:0 minor:849 fsType:overlay blockSize:0} overlay_0-851:{mountpoint:/var/lib/containers/storage/overlay/2553ed0f4f68cd140710c9b1b0df9c00aaf4b17456c4feae70e7c2f51bfdc122/merged major:0 minor:851 fsType:overlay blockSize:0} overlay_0-86:{mountpoint:/var/lib/containers/storage/overlay/b708bcc1746e57c86135174787e489ebb72379827e8e7a843d2602a961887881/merged major:0 minor:86 fsType:overlay blockSize:0} overlay_0-864:{mountpoint:/var/lib/containers/storage/overlay/16b128f3206fcea7966b37b6023b617b84f938d6b753a1905f7edaf0850e9f22/merged major:0 minor:864 fsType:overlay blockSize:0} overlay_0-870:{mountpoint:/var/lib/containers/storage/overlay/523c164b3f12f5d5d0d159f9dc120b64ee8f618cec4701c41c5e1789dd557893/merged major:0 minor:870 fsType:overlay blockSize:0} overlay_0-876:{mountpoint:/var/lib/containers/storage/overlay/7a07c6e89984acec2bbc1be5189b227846ccd3e64e1d2c207b134e9fd256bd0f/merged major:0 minor:876 fsType:overlay blockSize:0} 
overlay_0-885:{mountpoint:/var/lib/containers/storage/overlay/d4a59f972d6ac48f9d6555f1deb3b02953bc67c3ccfe9a0a0c9f98654fdf37b4/merged major:0 minor:885 fsType:overlay blockSize:0} overlay_0-887:{mountpoint:/var/lib/containers/storage/overlay/1b4b8d84483473d290a1dc1e4c761bf042698a6122deab5faddd855aca4e1bb4/merged major:0 minor:887 fsType:overlay blockSize:0} overlay_0-898:{mountpoint:/var/lib/containers/storage/overlay/25015fe5da557931ba7baddd5a9a505b54e42ef9987239c15ca4ca4bd0210889/merged major:0 minor:898 fsType:overlay blockSize:0} overlay_0-90:{mountpoint:/var/lib/containers/storage/overlay/021598a96ebdff98bb7a6f255f668b84f4f545ad30db9cc49d36fdd6d9cdd0e6/merged major:0 minor:90 fsType:overlay blockSize:0} overlay_0-900:{mountpoint:/var/lib/containers/storage/overlay/c09dba55abf4b5e874c52a1f611c7bbdaf9b295d59df5d4f938f195fddcaf543/merged major:0 minor:900 fsType:overlay blockSize:0} overlay_0-912:{mountpoint:/var/lib/containers/storage/overlay/2d9a81f253eda0516739172438f221b09857fd06d4f86452c7eaebdc9c8b8f2d/merged major:0 minor:912 fsType:overlay blockSize:0} overlay_0-918:{mountpoint:/var/lib/containers/storage/overlay/8d46dc49496dfb46e63d244480ca581c92be969262305b41d1e71a9c9831e918/merged major:0 minor:918 fsType:overlay blockSize:0} overlay_0-92:{mountpoint:/var/lib/containers/storage/overlay/5608d9b19c887f5dd0f9288914e1652919dc08776d28af82028ac9f0dcd970f7/merged major:0 minor:92 fsType:overlay blockSize:0} overlay_0-920:{mountpoint:/var/lib/containers/storage/overlay/711ec45f4bcfba170eb529c46e43a9b6b06265f3648833787ac43c1d678eb198/merged major:0 minor:920 fsType:overlay blockSize:0} overlay_0-922:{mountpoint:/var/lib/containers/storage/overlay/2515bb8aefee9444e1c31f3c677d7d6197ae6f53afc24ebe272df85613de0a3a/merged major:0 minor:922 fsType:overlay blockSize:0} overlay_0-928:{mountpoint:/var/lib/containers/storage/overlay/0faafeac313a3e5e704f6282f525ad08abc4ed52811a41469727d71ae2962aed/merged major:0 
minor:928 fsType:overlay blockSize:0} overlay_0-952:{mountpoint:/var/lib/containers/storage/overlay/f50749229bbd6ce03f343aa16a200a475a0e29f908ec9258d84abc5da35976b5/merged major:0 minor:952 fsType:overlay blockSize:0} overlay_0-954:{mountpoint:/var/lib/containers/storage/overlay/0cab7e5d4747bbd8a775b1fffabdb974618a56d0d15c633213cc09e9e242f29d/merged major:0 minor:954 fsType:overlay blockSize:0} overlay_0-957:{mountpoint:/var/lib/containers/storage/overlay/7a042f7320e0606ca92633a9c3c6fd1c144d7d0de306fc2c5964047ca38bd588/merged major:0 minor:957 fsType:overlay blockSize:0} overlay_0-966:{mountpoint:/var/lib/containers/storage/overlay/aefa25982e53a787477eae9ba3ab11a84f19062d5b36709ce97e90885ba2c76e/merged major:0 minor:966 fsType:overlay blockSize:0} overlay_0-968:{mountpoint:/var/lib/containers/storage/overlay/b1f651622cddfb4648e72b6492e4bd9a2520a4cb6619b88954002ca217d0be1f/merged major:0 minor:968 fsType:overlay blockSize:0} overlay_0-970:{mountpoint:/var/lib/containers/storage/overlay/3a9b2eb9a9464f68c8fd95bef809dd3a05190e30077bcc30d99eb63d8a841ffc/merged major:0 minor:970 fsType:overlay blockSize:0} overlay_0-972:{mountpoint:/var/lib/containers/storage/overlay/417eb74ab7e6bcee301a2713daa061bac31e8230593cdcf8c559e7341db4d17c/merged major:0 minor:972 fsType:overlay blockSize:0} overlay_0-978:{mountpoint:/var/lib/containers/storage/overlay/3095a53a15e61a4a6d745a389042d04c5129da92d545f4919c99d7f027f8ac76/merged major:0 minor:978 fsType:overlay blockSize:0} overlay_0-986:{mountpoint:/var/lib/containers/storage/overlay/4d1e236dec28472651b6e7c383c9d3d3b37455c7e3a00c58129b0f638e12e6d4/merged major:0 minor:986 fsType:overlay blockSize:0} overlay_0-993:{mountpoint:/var/lib/containers/storage/overlay/f0085958ea8f382c6bb75fef6a0131037dd7c270e5433065faf49f88a777c1ef/merged major:0 minor:993 fsType:overlay blockSize:0}] Mar 13 01:18:59.310242 master-0 kubenswrapper[19170]: I0313 01:18:59.308487 19170 manager.go:217] Machine: {Timestamp:2026-03-13 01:18:59.307711326 +0000 UTC 
m=+0.115832306 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:d267e942efb840478924173af659d9c8 SystemUUID:d267e942-efb8-4047-8924-173af659d9c8 BootID:beebd46b-80cb-4497-a098-674e9838eb1c Filesystems:[{Device:overlay_0-446 DeviceMajor:0 DeviceMinor:446 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/04292570efd4dfe3d1e939e8bffe6e36601999e7f873882bc75b098b25f400d8/userdata/shm DeviceMajor:0 DeviceMinor:708 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-870 DeviceMajor:0 DeviceMinor:870 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7d8cfecb961af50ca95aeb8f7e1e1b3b55dbc52640073a438412ecf24f225a00/userdata/shm DeviceMajor:0 DeviceMinor:241 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-520 DeviceMajor:0 DeviceMinor:520 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-597 DeviceMajor:0 DeviceMinor:597 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-797 DeviceMajor:0 DeviceMinor:797 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/77fd9062-0f7d-4255-92ca-7e4325daeddd/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:862 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1098 DeviceMajor:0 DeviceMinor:1098 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-161 DeviceMajor:0 DeviceMinor:161 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/f97819d0-2840-4352-a435-19ef1a8c22c9/volumes/kubernetes.io~projected/kube-api-access-fzklz DeviceMajor:0 DeviceMinor:223 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a60b4d927a33db793635462ea60ad60607d5d184924a1912214647b401b2b973/userdata/shm DeviceMajor:0 DeviceMinor:371 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-885 DeviceMajor:0 DeviceMinor:885 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1c24e17c-8bd9-4c23-9876-6f31c9da5cd1/volumes/kubernetes.io~projected/kube-api-access-dqsdm DeviceMajor:0 DeviceMinor:947 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a/volumes/kubernetes.io~secret/node-exporter-tls DeviceMajor:0 DeviceMinor:1010 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-579 DeviceMajor:0 DeviceMinor:579 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/738ebdcd-b78b-495a-b8f2-84af11a7d35c/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:609 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-297 DeviceMajor:0 DeviceMinor:297 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-743 DeviceMajor:0 DeviceMinor:743 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b3a9c0f6-cfde-4ae8-952a-00e2fb862482/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls DeviceMajor:0 DeviceMinor:445 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1104 DeviceMajor:0 DeviceMinor:1104 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-152 DeviceMajor:0 DeviceMinor:152 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/7d309f2fa26be03ebd9a5013c3e9be2f5c2e833ce9081aa3f25580dc684568db/userdata/shm DeviceMajor:0 DeviceMinor:244 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/6c88187c-d011-4043-a6d3-4a8a7ec4e204/volumes/kubernetes.io~projected/kube-api-access-7mnf5 DeviceMajor:0 DeviceMinor:257 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-416 DeviceMajor:0 DeviceMinor:416 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/58405741-598c-4bf5-bbc8-1ca8e3f10995/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:452 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/6c434288b6cb191d119d6c19e3587bb5a67d5e3c45645324bb8a04e648bb9b70/userdata/shm DeviceMajor:0 DeviceMinor:289 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/57d04e10dfd8defea76ad2a3d813dd852bfb7b6eaab6dec0d628dfff952603f9/userdata/shm DeviceMajor:0 DeviceMinor:505 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad/volumes/kubernetes.io~projected/kube-api-access-29w76 DeviceMajor:0 DeviceMinor:219 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2df89949d75d6b332f6e6e3505de7ef88eb15830a7e3e30680b63b8f7fc0d9ff/userdata/shm DeviceMajor:0 DeviceMinor:97 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-320 DeviceMajor:0 DeviceMinor:320 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-360 DeviceMajor:0 DeviceMinor:360 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/30f7537e-93ed-466b-ba24-78141d004b2f/volumes/kubernetes.io~secret/kube-state-metrics-tls DeviceMajor:0 DeviceMinor:1007 
Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b656a6f1c70c3edb8a88d273e10ec19afe3e617046ee184903275fabe65867b3/userdata/shm DeviceMajor:0 DeviceMinor:714 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-324 DeviceMajor:0 DeviceMinor:324 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/be414d776e8ab0a4d034e6fb89d00100a9499bbf4d15ddb731d27f9835e7bf82/userdata/shm DeviceMajor:0 DeviceMinor:1082 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-631 DeviceMajor:0 DeviceMinor:631 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ba1138e615d45d30d4c79c1da4c10f696c22641dfec7a0ee3bb68226581ee820/userdata/shm DeviceMajor:0 DeviceMinor:788 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/4d313ee4-3bb9-44a9-ad80-8e00540ef1e7/volumes/kubernetes.io~secret/tls-certificates DeviceMajor:0 DeviceMinor:1029 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b26809d00df2d88f0387eef7498f3d90150a196ebaa102f4f43bf51209c487a9/userdata/shm DeviceMajor:0 DeviceMinor:58 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-157 DeviceMajor:0 DeviceMinor:157 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b53f7152f43c94f2398d1b740ffecfeaa1c0a37491b92964a127cf3cd4f8b71f/userdata/shm DeviceMajor:0 DeviceMinor:301 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-422 DeviceMajor:0 DeviceMinor:422 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/4c5174b9-ca9e-4917-ab3a-ca403ce4f017/volumes/kubernetes.io~secret/node-tuning-operator-tls DeviceMajor:0 DeviceMinor:471 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ebf338e6-9725-47d9-8c7f-adbf11a44406/volumes/kubernetes.io~secret/samples-operator-tls DeviceMajor:0 DeviceMinor:813 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-343 DeviceMajor:0 DeviceMinor:343 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/2679c6e1-11c1-450c-b03a-30d7ee59ff6f/volumes/kubernetes.io~projected/kube-api-access-4n9fb DeviceMajor:0 DeviceMinor:934 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-83 DeviceMajor:0 DeviceMinor:83 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/fdd7225d6e1fca05e0159dd0bc6d0e1e7a5fef522b8f6ef01baea69ca9c582b3/userdata/shm DeviceMajor:0 DeviceMinor:672 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/7f35cc1e-3376-4dbd-b215-2a32bf62cc71/volumes/kubernetes.io~projected/kube-api-access-5zd92 DeviceMajor:0 DeviceMinor:240 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:465 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e2f745a0f2d01e632b68dc10f033e64587a1682e72141091b5e270ae9c9ebd96/userdata/shm DeviceMajor:0 DeviceMinor:479 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/57eb2020-1560-4352-8b86-76db59de933a/volumes/kubernetes.io~projected/kube-api-access-kd849 DeviceMajor:0 DeviceMinor:504 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-515 DeviceMajor:0 DeviceMinor:515 Capacity:214143315968 
Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-745 DeviceMajor:0 DeviceMinor:745 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-45 DeviceMajor:0 DeviceMinor:45 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f9b713fb-64ce-4a01-951c-1f31df62e1ae/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:237 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1148 DeviceMajor:0 DeviceMinor:1148 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-279 DeviceMajor:0 DeviceMinor:279 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-310 DeviceMajor:0 DeviceMinor:310 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/da44d750-31e5-46f4-b3ef-dd4384c22aaf/volumes/kubernetes.io~projected/kube-api-access-n4rfg DeviceMajor:0 DeviceMinor:413 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-432 DeviceMajor:0 DeviceMinor:432 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/2c4c579b-0643-47ac-a729-017c326b0ecc/volumes/kubernetes.io~projected/ca-certs DeviceMajor:0 DeviceMinor:496 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/edda0d03-fdb2-4130-8f73-8057efd5815c/volumes/kubernetes.io~secret/certs DeviceMajor:0 DeviceMinor:1057 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-168 DeviceMajor:0 DeviceMinor:168 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/64477504-5cb6-42dc-a7eb-662981daec4a/volumes/kubernetes.io~projected/kube-api-access-gg77t DeviceMajor:0 DeviceMinor:366 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a9462e2e-728d-4076-a876-31dbbd637581/volumes/kubernetes.io~projected/kube-api-access-sddd9 
DeviceMajor:0 DeviceMinor:313 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-66 DeviceMajor:0 DeviceMinor:66 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d50b6a32815a2be8ff70be8b06ebf59a9e49fcf3be49561e0d64e6a0e5b76848/userdata/shm DeviceMajor:0 DeviceMinor:264 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-373 DeviceMajor:0 DeviceMinor:373 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/6f8519eea623420c40da808c6cfff53da6452162ecb364a1c82aa4dfe3545fe2/userdata/shm DeviceMajor:0 DeviceMinor:603 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-510 DeviceMajor:0 DeviceMinor:510 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-109 DeviceMajor:0 DeviceMinor:109 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-661 DeviceMajor:0 DeviceMinor:661 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/13fac7b0-ce55-467d-9d0c-6a122d87cb3c/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:469 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a628e92ac4f34b60f238b76d4fc08c8cab73f3dfd7d9d1150c95d95292472f21/userdata/shm DeviceMajor:0 DeviceMinor:710 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/631f5719-2083-4c99-92cb-2ddc04022d86/volumes/kubernetes.io~projected/kube-api-access-jvn9n DeviceMajor:0 DeviceMinor:395 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-624 DeviceMajor:0 DeviceMinor:624 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/2ce47660-f7cc-4669-a00d-83422f0f6d55/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:861 
Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/b3a9c0f6-cfde-4ae8-952a-00e2fb862482/volumes/kubernetes.io~projected/kube-api-access-42sqh DeviceMajor:0 DeviceMinor:681 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1136 DeviceMajor:0 DeviceMinor:1136 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e68ab3cb-c372-45d9-a758-beaf4c213714/volumes/kubernetes.io~projected/kube-api-access-zjhjj DeviceMajor:0 DeviceMinor:123 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/1308fba1-a50d-48b3-b272-7bef44727b7f/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert DeviceMajor:0 DeviceMinor:124 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ab94900114a9122f23168251a131a59bc4f99b487faa4afb3e3bd743e5f9a00c/userdata/shm DeviceMajor:0 DeviceMinor:248 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/486c7e33-3dd8-4a98-87e3-8216ee2e05ef/volumes/kubernetes.io~projected/kube-api-access-74tvv DeviceMajor:0 DeviceMinor:260 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/19ce38027ae9e3d0076b8c83191fabde1e4e81b393c760835578ba3bc36b41b2/userdata/shm DeviceMajor:0 DeviceMinor:1046 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-694 DeviceMajor:0 DeviceMinor:694 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/78d2cd80-23b9-426d-a7ac-1daa27668a47/volumes/kubernetes.io~secret/marketplace-operator-metrics DeviceMajor:0 DeviceMinor:701 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/16477c5f389a1fdfcf2af6bfe8b7efe63c0f62df56e3f2ed990e9acc1a597b7d/userdata/shm DeviceMajor:0 DeviceMinor:680 Capacity:67108864 Type:vfs 
Inodes:4108170 HasInodes:true} {Device:overlay_0-968 DeviceMajor:0 DeviceMinor:968 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1134 DeviceMajor:0 DeviceMinor:1134 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-176 DeviceMajor:0 DeviceMinor:176 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:247 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-170 DeviceMajor:0 DeviceMinor:170 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/30f7537e-93ed-466b-ba24-78141d004b2f/volumes/kubernetes.io~projected/kube-api-access-jkjrm DeviceMajor:0 DeviceMinor:1028 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/85149f21-7ba8-4891-82ef-0fef3d5d7863/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:466 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-67 DeviceMajor:0 DeviceMinor:67 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-184 DeviceMajor:0 DeviceMinor:184 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-281 DeviceMajor:0 DeviceMinor:281 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-528 DeviceMajor:0 DeviceMinor:528 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1c24e17c-8bd9-4c23-9876-6f31c9da5cd1/volumes/kubernetes.io~secret/machine-api-operator-tls DeviceMajor:0 DeviceMinor:943 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-293 DeviceMajor:0 DeviceMinor:293 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-405 DeviceMajor:0 DeviceMinor:405 Capacity:214143315968 Type:vfs 
Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d1153bb3-30dd-458f-b0a4-c05358a8b3f8/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:139 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-328 DeviceMajor:0 DeviceMinor:328 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9984727ecfd751906a7163096c1bf7f2f5c369a40b443c586dd90122b84df23e/userdata/shm DeviceMajor:0 DeviceMinor:54 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/61fb4b86-f978-4ae1-80bc-18d2f386cbc2/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert DeviceMajor:0 DeviceMinor:236 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-721 DeviceMajor:0 DeviceMinor:721 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-565 DeviceMajor:0 DeviceMinor:565 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/4c5174b9-ca9e-4917-ab3a-ca403ce4f017/volumes/kubernetes.io~projected/kube-api-access-mmvs5 DeviceMajor:0 DeviceMinor:250 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/eeedfbb568950a2005b49c940a6eb5e45d4af2d8ddb401839d8110cff9f9ae07/userdata/shm DeviceMajor:0 DeviceMinor:472 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-561 DeviceMajor:0 DeviceMinor:561 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6c88187c-d011-4043-a6d3-4a8a7ec4e204/volumes/kubernetes.io~secret/srv-cert DeviceMajor:0 DeviceMinor:702 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/631f5719-2083-4c99-92cb-2ddc04022d86/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:315 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-442 DeviceMajor:0 
DeviceMinor:442 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-786 DeviceMajor:0 DeviceMinor:786 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-48 DeviceMajor:0 DeviceMinor:48 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-586 DeviceMajor:0 DeviceMinor:586 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8e6b36b6fc7f96835dacfb4827ca38e8971ee7d14058842c077463b70715d284/userdata/shm DeviceMajor:0 DeviceMinor:1036 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/8c6bf2d5-1881-4b63-b247-7e7426707fa1/volumes/kubernetes.io~projected/kube-api-access-vxf58 DeviceMajor:0 DeviceMinor:222 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-532 DeviceMajor:0 DeviceMinor:532 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-402 DeviceMajor:0 DeviceMinor:402 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-918 DeviceMajor:0 DeviceMinor:918 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-71 DeviceMajor:0 DeviceMinor:71 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/4edb3e1a-9082-4fc2-ae6f-99d49c078a34/volumes/kubernetes.io~projected/kube-api-access-6tfdv DeviceMajor:0 DeviceMinor:131 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/50afa0bafdfdadd430cb50b2aa81b0c11200da9c802e7cb966b1902e4941db5a/userdata/shm DeviceMajor:0 DeviceMinor:44 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-972 DeviceMajor:0 DeviceMinor:972 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e2b5ad07-fa01-4330-9dce-6da3444657ab/volumes/kubernetes.io~projected/kube-api-access-rtrb2 DeviceMajor:0 
DeviceMinor:1080 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/21cbea73-f779-43e4-b5ba-d6fa06275d34/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:231 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1157 DeviceMajor:0 DeviceMinor:1157 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/95d4e785-6663-417d-b380-6905773613c8/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:215 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-305 DeviceMajor:0 DeviceMinor:305 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-121 DeviceMajor:0 DeviceMinor:121 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b6d8f47788f03e55d3eeba0c1c7d2d37374cfceb0c113268fcb5709d2ce6f28c/userdata/shm DeviceMajor:0 DeviceMinor:715 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-348 DeviceMajor:0 DeviceMinor:348 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d81bcb58-efe3-4577-8e88-67f92c645f6f/volumes/kubernetes.io~projected/kube-api-access-k7wj9 DeviceMajor:0 DeviceMinor:589 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/331ce433a4bf4a47427394a7b370083a4bac521b9ca9a23033c6c4de736ca40b/userdata/shm DeviceMajor:0 DeviceMinor:46 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a4456327b2e0b6b9539a6903b6632e2c8ba74468c5ee5a6acc2e786114ebf53d/userdata/shm DeviceMajor:0 DeviceMinor:142 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-725 DeviceMajor:0 DeviceMinor:725 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-136 DeviceMajor:0 DeviceMinor:136 Capacity:214143315968 
Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8c6bf2d5-1881-4b63-b247-7e7426707fa1/volumes/kubernetes.io~secret/cert DeviceMajor:0 DeviceMinor:464 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/d278ed70-786c-4b6c-9f04-f08ede704569/volumes/kubernetes.io~projected/kube-api-access-7f82n DeviceMajor:0 DeviceMinor:875 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1008 DeviceMajor:0 DeviceMinor:1008 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0671fdd0-b358-40f9-ae49-2c5a9004edb3/volumes/kubernetes.io~secret/default-certificate DeviceMajor:0 DeviceMinor:1030 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1094 DeviceMajor:0 DeviceMinor:1094 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-986 DeviceMajor:0 DeviceMinor:986 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:234 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-530 DeviceMajor:0 DeviceMinor:530 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d87d8f00b3827a6cc0d679f67563557686bd72d154906a3035b8f36d3110e48e/userdata/shm DeviceMajor:0 DeviceMinor:706 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/3f9728b4-4e1e-4165-a276-3daa00e95839/volumes/kubernetes.io~projected/kube-api-access-xr6vn DeviceMajor:0 DeviceMinor:500 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-920 DeviceMajor:0 DeviceMinor:920 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0671fdd0-b358-40f9-ae49-2c5a9004edb3/volumes/kubernetes.io~secret/stats-auth DeviceMajor:0 
DeviceMinor:1002 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/edda0d03-fdb2-4130-8f73-8057efd5815c/volumes/kubernetes.io~secret/node-bootstrap-token DeviceMajor:0 DeviceMinor:1062 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/90e6e63d-3cf2-4bb5-883f-6219a0b52c3a/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert DeviceMajor:0 DeviceMinor:623 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/e5956ebf-01e4-4d4c-ae6d-b0995905c6d3/volumes/kubernetes.io~projected/kube-api-access-jt79p DeviceMajor:0 DeviceMinor:685 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7d7ebfb8fe2ccbcaa9d3a1c3f2519c36849a66ccd8f79f4fea8a8ff50785f679/userdata/shm DeviceMajor:0 DeviceMinor:1048 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-303 DeviceMajor:0 DeviceMinor:303 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-79 DeviceMajor:0 DeviceMinor:79 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/92c3b2f339c88995c70507230cfa25808d3c4399c710b2454c51839f6048ccf5/userdata/shm DeviceMajor:0 DeviceMinor:704 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-140 DeviceMajor:0 DeviceMinor:140 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/039acb44-a9b3-4ad6-a091-be4d18edc34f/volumes/kubernetes.io~projected/kube-api-access-kkdbm DeviceMajor:0 DeviceMinor:228 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/46662e51-44af-4732-83a1-9509a579b373/volumes/kubernetes.io~projected/kube-api-access-m5n7m DeviceMajor:0 DeviceMinor:246 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/57eb2020-1560-4352-8b86-76db59de933a/volumes/kubernetes.io~secret/encryption-config DeviceMajor:0 DeviceMinor:503 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-633 DeviceMajor:0 DeviceMinor:633 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-636 DeviceMajor:0 DeviceMinor:636 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-954 DeviceMajor:0 DeviceMinor:954 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1064 DeviceMajor:0 DeviceMinor:1064 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-90 DeviceMajor:0 DeviceMinor:90 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/df3c13cf240e0897eb807bf8a91255037c9e2476ca44bc2ccd1aad2e71463498/userdata/shm DeviceMajor:0 DeviceMinor:1065 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/4edb3e1a-9082-4fc2-ae6f-99d49c078a34/volumes/kubernetes.io~secret/ovn-node-metrics-cert DeviceMajor:0 DeviceMinor:128 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/f9b713fb-64ce-4a01-951c-1f31df62e1ae/volumes/kubernetes.io~projected/kube-api-access-4chtg DeviceMajor:0 DeviceMinor:253 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8f2613fc06a65ee0e558a2b7e31f18cf27ca9c4c8e8d8a194c2a8dcc4466dc64/userdata/shm DeviceMajor:0 DeviceMinor:259 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1125 DeviceMajor:0 DeviceMinor:1125 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1130 DeviceMajor:0 DeviceMinor:1130 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-790 DeviceMajor:0 DeviceMinor:790 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:overlay_0-600 DeviceMajor:0 DeviceMinor:600 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/cd7cca05-3da7-42cf-af64-6e94050e58c0/volumes/kubernetes.io~secret/openshift-state-metrics-tls DeviceMajor:0 DeviceMinor:1118 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-741 DeviceMajor:0 DeviceMinor:741 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/2937cbe2-3125-4c3f-96f8-2febeb5942cc/volumes/kubernetes.io~projected/kube-api-access-spxfj DeviceMajor:0 DeviceMinor:110 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/486c7e33-3dd8-4a98-87e3-8216ee2e05ef/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:232 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-513 DeviceMajor:0 DeviceMinor:513 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/30f7537e-93ed-466b-ba24-78141d004b2f/volumes/kubernetes.io~secret/kube-state-metrics-kube-rbac-proxy-config DeviceMajor:0 DeviceMinor:1006 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71/volumes/kubernetes.io~projected/kube-api-access-bz7v9 DeviceMajor:0 DeviceMinor:220 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/95d4e785-6663-417d-b380-6905773613c8/volumes/kubernetes.io~projected/kube-api-access-nhcll DeviceMajor:0 DeviceMinor:225 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/21cbea73-f779-43e4-b5ba-d6fa06275d34/volumes/kubernetes.io~projected/kube-api-access-wg54c DeviceMajor:0 DeviceMinor:256 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-461 DeviceMajor:0 DeviceMinor:461 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/b4d8f238475f0e2a34d731e501507dcca53539022959bc9fece8f8ff1323377c/userdata/shm DeviceMajor:0 DeviceMinor:805 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-285 DeviceMajor:0 DeviceMinor:285 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/2c4c579b-0643-47ac-a729-017c326b0ecc/volumes/kubernetes.io~secret/catalogserver-certs DeviceMajor:0 DeviceMinor:602 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1114 DeviceMajor:0 DeviceMinor:1114 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/616c3040128a0d65e3ebb99fac58cf591c2a39e4fb92682249d6c1a457260134/userdata/shm DeviceMajor:0 DeviceMinor:688 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-912 DeviceMajor:0 DeviceMinor:912 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-900 DeviceMajor:0 DeviceMinor:900 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/33dfdc31-54a4-4249-99ae-a15180514659/volumes/kubernetes.io~projected/kube-api-access-898lt DeviceMajor:0 DeviceMinor:958 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/54c97778ddb99650ecad3d1658417118e7e1ab09d346fbb4a70157b6a2cd1822/userdata/shm DeviceMajor:0 DeviceMinor:713 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/cada5bf2-e208-4fd8-bdf5-de8cad31a665/volumes/kubernetes.io~projected/kube-api-access-929r9 DeviceMajor:0 DeviceMinor:314 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-800 DeviceMajor:0 DeviceMinor:800 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ac2a4c90-32db-4464-8c47-acbcafbcd5d0/volumes/kubernetes.io~projected/kube-api-access-sw5hs DeviceMajor:0 
DeviceMinor:1034 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/4edb3e1a-9082-4fc2-ae6f-99d49c078a34/volume-subpaths/run-systemd/ovnkube-controller/6 DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:overlay_0-154 DeviceMajor:0 DeviceMinor:154 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3d12917c6787f42e0bad6cb6459212ad6bf9ab1345cc33bb530c5d6e353e83a3/userdata/shm DeviceMajor:0 DeviceMinor:964 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0072057986d1a9c35e19db3f7ab2650e875b4c3fecae35f046b875511fe06154/userdata/shm DeviceMajor:0 DeviceMinor:60 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-454 DeviceMajor:0 DeviceMinor:454 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/93871019-3d0c-4081-9afe-19b6dd108ec6/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:791 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1053 DeviceMajor:0 DeviceMinor:1053 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/26154ef4eac655efbc62eb7de132a97eb19bcf76904f56cfb67a890cba3eae81/userdata/shm DeviceMajor:0 DeviceMinor:266 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/916d9fc9-388b-4506-a17c-36a7f626356a/volumes/kubernetes.io~projected/kube-api-access-jg7x6 DeviceMajor:0 DeviceMinor:221 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-204 DeviceMajor:0 DeviceMinor:204 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/78d2cd80-23b9-426d-a7ac-1daa27668a47/volumes/kubernetes.io~projected/kube-api-access-mxctn DeviceMajor:0 DeviceMinor:230 Capacity:32475529216 Type:vfs 
Inodes:4108170 HasInodes:true} {Device:overlay_0-377 DeviceMajor:0 DeviceMinor:377 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-793 DeviceMajor:0 DeviceMinor:793 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-795 DeviceMajor:0 DeviceMinor:795 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-63 DeviceMajor:0 DeviceMinor:63 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1003 DeviceMajor:0 DeviceMinor:1003 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d56480e0-0885-41e5-a1fc-931a068fbadb/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:238 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/37ae98838b10d25174f8c8b9b4755c11a0c75ddb83ec17a66bff3690d741da86/userdata/shm DeviceMajor:0 DeviceMinor:477 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-778 DeviceMajor:0 DeviceMinor:778 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/cada5bf2-e208-4fd8-bdf5-de8cad31a665/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls DeviceMajor:0 DeviceMinor:311 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-448 DeviceMajor:0 DeviceMinor:448 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/9db888f0-51b6-43cf-8337-69d2d5cc2b0a/volumes/kubernetes.io~secret/secret-metrics-server-tls DeviceMajor:0 DeviceMinor:89 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/6a5ab1d5-dabd-45e7-a688-71a282f61e67/volumes/kubernetes.io~empty-dir/etc-tuned DeviceMajor:0 DeviceMinor:572 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-957 DeviceMajor:0 DeviceMinor:957 Capacity:214143315968 Type:vfs Inodes:104594880 
HasInodes:true} {Device:/var/lib/kubelet/pods/cd7cca05-3da7-42cf-af64-6e94050e58c0/volumes/kubernetes.io~secret/openshift-state-metrics-kube-rbac-proxy-config DeviceMajor:0 DeviceMinor:1011 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-723 DeviceMajor:0 DeviceMinor:723 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1070 DeviceMajor:0 DeviceMinor:1070 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/4738c93d-62e6-44ce-a289-e646b9302e71/volumes/kubernetes.io~projected/kube-api-access-9gt69 DeviceMajor:0 DeviceMinor:118 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a9462e2e-728d-4076-a876-31dbbd637581/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:304 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/d23bbaec-b635-4649-b26e-2829f32d21f0/volumes/kubernetes.io~projected/kube-api-access-szx9m DeviceMajor:0 DeviceMinor:873 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/1308fba1-a50d-48b3-b272-7bef44727b7f/volumes/kubernetes.io~projected/kube-api-access-zv2rb DeviceMajor:0 DeviceMinor:125 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3f750f4eaadd11866936791933f7a3cbf786b838bf1e7a9f9142487b42787b0b/userdata/shm DeviceMajor:0 DeviceMinor:473 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-488 DeviceMajor:0 DeviceMinor:488 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6a5ab1d5-dabd-45e7-a688-71a282f61e67/volumes/kubernetes.io~empty-dir/tmp DeviceMajor:0 DeviceMinor:573 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-605 DeviceMajor:0 DeviceMinor:605 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-163 DeviceMajor:0 
DeviceMinor:163 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ca2fa86b-a966-49dc-8577-d2b54b111d14/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert DeviceMajor:0 DeviceMinor:235 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ae44526f-5858-42a0-ba77-3a22f171456f/volumes/kubernetes.io~projected/kube-api-access-mz8jz DeviceMajor:0 DeviceMinor:270 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1055 DeviceMajor:0 DeviceMinor:1055 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3/volumes/kubernetes.io~projected/ca-certs DeviceMajor:0 DeviceMinor:499 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/738ebdcd-b78b-495a-b8f2-84af11a7d35c/volumes/kubernetes.io~projected/kube-api-access-tl6k6 DeviceMajor:0 DeviceMinor:613 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-420 DeviceMajor:0 DeviceMinor:420 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2b8d1e7059727645025fa018d52de63bdf9a901809577c6a277333b99385dad9/userdata/shm DeviceMajor:0 DeviceMinor:105 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4a82c4d1f4dd0703b5ca166acc233f980e0154f3140fea6bcb51b2baef68cc9c/userdata/shm DeviceMajor:0 DeviceMinor:991 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ca2fa86b-a966-49dc-8577-d2b54b111d14/volumes/kubernetes.io~projected/kube-api-access-gwm5w DeviceMajor:0 DeviceMinor:243 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-486 DeviceMajor:0 DeviceMinor:486 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-323 DeviceMajor:0 DeviceMinor:323 Capacity:214143315968 
Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-922 DeviceMajor:0 DeviceMinor:922 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1075 DeviceMajor:0 DeviceMinor:1075 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/bfc49699-9428-4bff-804d-da0e60551759/volumes/kubernetes.io~projected/kube-api-access-zdzjn DeviceMajor:0 DeviceMinor:98 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/34d6500d42674d3ac28ad1da03d31ad6fc07a588196014c4a73a86965dd9deb9/userdata/shm DeviceMajor:0 DeviceMinor:798 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1066 DeviceMajor:0 DeviceMinor:1066 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6a5ab1d5-dabd-45e7-a688-71a282f61e67/volumes/kubernetes.io~projected/kube-api-access-lpvtc DeviceMajor:0 DeviceMinor:575 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-627 DeviceMajor:0 DeviceMinor:627 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/90e6e63d-3cf2-4bb5-883f-6219a0b52c3a/volumes/kubernetes.io~projected/kube-api-access-6hhwp DeviceMajor:0 DeviceMinor:799 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a/volumes/kubernetes.io~secret/node-exporter-kube-rbac-proxy-config DeviceMajor:0 DeviceMinor:1005 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:overlay_0-189 DeviceMajor:0 DeviceMinor:189 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2ea2257b817f7a593cf8a5bc18fd54c7de892a301e19617876be4cc31d01237b/userdata/shm DeviceMajor:0 DeviceMinor:331 Capacity:67108864 Type:vfs 
Inodes:4108170 HasInodes:true} {Device:overlay_0-690 DeviceMajor:0 DeviceMinor:690 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/039acb44-a9b3-4ad6-a091-be4d18edc34f/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:216 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/57eb2020-1560-4352-8b86-76db59de933a/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:502 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/61a9c110238dfe2ea95596a01db7cf5d6ef10b0b43d6a9827c08c09d64d82e79/userdata/shm DeviceMajor:0 DeviceMinor:475 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/3d2e7338-a6d6-4872-ab72-a4e631075ab3/volumes/kubernetes.io~projected/kube-api-access-vkgvg DeviceMajor:0 DeviceMinor:421 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/2ce47660-f7cc-4669-a00d-83422f0f6d55/volumes/kubernetes.io~secret/apiservice-cert DeviceMajor:0 DeviceMinor:856 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9abb90df1fb36f7d743ddb849ea400a46f15eae6ffadde3a44f5e1ad0528227b/userdata/shm DeviceMajor:0 DeviceMinor:69 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5f0a23a29ec1be227442f950c7b43af141e31a2152ab46cc286a5229950b1bae/userdata/shm DeviceMajor:0 DeviceMinor:1031 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-555 DeviceMajor:0 DeviceMinor:555 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/916d9fc9-388b-4506-a17c-36a7f626356a/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:209 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1096 DeviceMajor:0 DeviceMinor:1096 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e2b5ad07-fa01-4330-9dce-6da3444657ab/volumes/kubernetes.io~secret/prometheus-operator-tls DeviceMajor:0 DeviceMinor:1077 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/13fac7b0-ce55-467d-9d0c-6a122d87cb3c/volumes/kubernetes.io~projected/kube-api-access-wh2bv DeviceMajor:0 DeviceMinor:262 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c50effcf2f5fc891493cebf4edbffd60b3e47884c10225842c0382c940f83e36/userdata/shm DeviceMajor:0 DeviceMinor:950 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-657 DeviceMajor:0 DeviceMinor:657 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-74 DeviceMajor:0 DeviceMinor:74 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/93871019-3d0c-4081-9afe-19b6dd108ec6/volumes/kubernetes.io~projected/kube-api-access-s9nfk DeviceMajor:0 DeviceMinor:809 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-150 DeviceMajor:0 DeviceMinor:150 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/21cbea73-f779-43e4-b5ba-d6fa06275d34/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:233 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/d56480e0-0885-41e5-a1fc-931a068fbadb/volumes/kubernetes.io~projected/kube-api-access-fppkf DeviceMajor:0 DeviceMinor:263 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/da44d750-31e5-46f4-b3ef-dd4384c22aaf/volumes/kubernetes.io~secret/signing-key DeviceMajor:0 DeviceMinor:412 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-851 DeviceMajor:0 DeviceMinor:851 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/e2b5ad07-fa01-4330-9dce-6da3444657ab/volumes/kubernetes.io~secret/prometheus-operator-kube-rbac-proxy-config DeviceMajor:0 DeviceMinor:1074 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/738ebdcd-b78b-495a-b8f2-84af11a7d35c/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:611 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-483 DeviceMajor:0 DeviceMinor:483 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-836 DeviceMajor:0 DeviceMinor:836 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/71ba48ea25a9442b7ddaf6a81ce73c503ec65916615ad4703b66f198c2ddd8c0/userdata/shm DeviceMajor:0 DeviceMinor:119 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f7ec163ebceef5cfb5ab7329a8164e22ce5f45f24019f15e580acb9f1392f8cb/userdata/shm DeviceMajor:0 DeviceMinor:130 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a98b9ab8613fc067f4fb6c5ed8cb15effc604a80e170f3f4752b7a8241625877/userdata/shm DeviceMajor:0 DeviceMinor:440 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-849 DeviceMajor:0 DeviceMinor:849 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-92 DeviceMajor:0 DeviceMinor:92 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0671fdd0-b358-40f9-ae49-2c5a9004edb3/volumes/kubernetes.io~secret/metrics-certs DeviceMajor:0 DeviceMinor:1026 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-78 DeviceMajor:0 DeviceMinor:78 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-383 DeviceMajor:0 DeviceMinor:383 Capacity:214143315968 Type:vfs
Inodes:104594880 HasInodes:true} {Device:overlay_0-548 DeviceMajor:0 DeviceMinor:548 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b19253ead78138ad6ad5306377a9183f46463e03e94ad36454e95491cb2e6272/userdata/shm DeviceMajor:0 DeviceMinor:453 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-116 DeviceMajor:0 DeviceMinor:116 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-607 DeviceMajor:0 DeviceMinor:607 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/7f35cc1e-3376-4dbd-b215-2a32bf62cc71/volumes/kubernetes.io~secret/srv-cert DeviceMajor:0 DeviceMinor:699 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-727 DeviceMajor:0 DeviceMinor:727 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2caee3a1af6093e4a6a8dc6eb703af3a181ab48dbf5acd65ef8e61a08e467534/userdata/shm DeviceMajor:0 DeviceMinor:254 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-492 DeviceMajor:0 DeviceMinor:492 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3ac90b7e141885c73870d9744a9126cb8648da1eed1822b13844b812ecb6dc82/userdata/shm DeviceMajor:0 DeviceMinor:948 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-50 DeviceMajor:0 DeviceMinor:50 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-340 DeviceMajor:0 DeviceMinor:340 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/6cd94cfc20909e4d75ca589ee9ae860bb4dc06f0ab09921539ad1f3f2923f207/userdata/shm DeviceMajor:0 DeviceMinor:251 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-385 DeviceMajor:0 DeviceMinor:385 Capacity:214143315968 Type:vfs 
Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e5956ebf-01e4-4d4c-ae6d-b0995905c6d3/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:678 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-738 DeviceMajor:0 DeviceMinor:738 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3/volumes/kubernetes.io~projected/kube-api-access-6v79j DeviceMajor:0 DeviceMinor:498 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-584 DeviceMajor:0 DeviceMinor:584 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-577 DeviceMajor:0 DeviceMinor:577 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-807 DeviceMajor:0 DeviceMinor:807 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-712 DeviceMajor:0 DeviceMinor:712 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-898 DeviceMajor:0 DeviceMinor:898 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-86 DeviceMajor:0 DeviceMinor:86 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:217 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/85149f21-7ba8-4891-82ef-0fef3d5d7863/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:43 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/f97819d0-2840-4352-a435-19ef1a8c22c9/volumes/kubernetes.io~projected/bound-sa-token 
DeviceMajor:0 DeviceMinor:218 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/22587300-2448-4862-9fd8-68197d17a9f2/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:229 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-295 DeviceMajor:0 DeviceMinor:295 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e68ab3cb-c372-45d9-a758-beaf4c213714/volumes/kubernetes.io~secret/metrics-certs DeviceMajor:0 DeviceMinor:703 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4a7eca1172ea3bfa17610dd111962352ef959bb3d01e721ed6e19fcbca116334/userdata/shm DeviceMajor:0 DeviceMinor:840 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-966 DeviceMajor:0 DeviceMinor:966 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1162 DeviceMajor:0 DeviceMinor:1162 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b018527cc19e60b658984a3b2cf8d02fa83e221b23e0763c86d4b53c72e80c7e/userdata/shm DeviceMajor:0 DeviceMinor:619 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls DeviceMajor:0 DeviceMinor:693 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/0d4e6150-432c-4a11-b5a6-4d62dd701fc8/volumes/kubernetes.io~secret/package-server-manager-serving-cert DeviceMajor:0 DeviceMinor:698 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1042 DeviceMajor:0 DeviceMinor:1042 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-56 DeviceMajor:0 DeviceMinor:56 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/89ed96ec0c29ed024929723b7fe3a674507469f0ef8e3cce0e32599fd800079d/userdata/shm DeviceMajor:0 DeviceMinor:99 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/d5456c8b-3c98-4824-8700-a04e9c12fb2e/volumes/kubernetes.io~projected/kube-api-access-mnxgm DeviceMajor:0 DeviceMinor:309 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4aad5a88dbaf47dc9a34673f221901d51a030ad11786e8c666462ce349290777/userdata/shm DeviceMajor:0 DeviceMinor:126 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/4976e608-07a0-4cef-8fdd-7cec3324b4b5/volumes/kubernetes.io~projected/kube-api-access-w2m48 DeviceMajor:0 DeviceMinor:227 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-876 DeviceMajor:0 DeviceMinor:876 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/24c5ca2ad81d656ed30391cef58568917e68e001311f4004a9e7bd98e34738e8/userdata/shm DeviceMajor:0 DeviceMinor:1116 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1016 DeviceMajor:0 DeviceMinor:1016 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-287 DeviceMajor:0 DeviceMinor:287 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-887 DeviceMajor:0 DeviceMinor:887 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-864 DeviceMajor:0 DeviceMinor:864 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-970 DeviceMajor:0 DeviceMinor:970 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/9db888f0-51b6-43cf-8337-69d2d5cc2b0a/volumes/kubernetes.io~projected/kube-api-access-8gj56 DeviceMajor:0 DeviceMinor:96 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/cf319252fb389a321939570359e80da282cec58b5fa4e03fa5a5ea1c1c063fe4/userdata/shm DeviceMajor:0 DeviceMinor:272 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-307 DeviceMajor:0 DeviceMinor:307 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-729 DeviceMajor:0 DeviceMinor:729 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/35121dd9298456a2fd716c2169a2e2eb4131993976aa53fb1bfd36bc3158f01e/userdata/shm DeviceMajor:0 DeviceMinor:679 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/2ce47660-f7cc-4669-a00d-83422f0f6d55/volumes/kubernetes.io~projected/kube-api-access-d6frm DeviceMajor:0 DeviceMinor:869 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1038 DeviceMajor:0 DeviceMinor:1038 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b70fd8156b9269ea1d32e5bd6b505f43cc5c2cda9055f9eab294a1ae160205e2/userdata/shm DeviceMajor:0 DeviceMinor:684 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f8dd90e4b919a4750dacd366cb8ce8129d02c4f3f75302771450ca85e994151e/userdata/shm DeviceMajor:0 DeviceMinor:316 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-731 DeviceMajor:0 DeviceMinor:731 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1040 DeviceMajor:0 DeviceMinor:1040 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1001 DeviceMajor:0 DeviceMinor:1001 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/4c5174b9-ca9e-4917-ab3a-ca403ce4f017/volumes/kubernetes.io~secret/apiservice-cert DeviceMajor:0 DeviceMinor:470 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:overlay_0-345 DeviceMajor:0 DeviceMinor:345 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-527 DeviceMajor:0 DeviceMinor:527 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/cb94d42a599d6d70b76d1a519031e7bc4f9c33d2aa5f5a649a375c7a7bded9ac/userdata/shm DeviceMajor:0 DeviceMinor:842 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1050 DeviceMajor:0 DeviceMinor:1050 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/9db888f0-51b6-43cf-8337-69d2d5cc2b0a/volumes/kubernetes.io~secret/secret-metrics-client-certs DeviceMajor:0 DeviceMinor:95 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-103 DeviceMajor:0 DeviceMinor:103 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/58035e42-37d8-48f6-9861-9b4ce6014119/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:226 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-766 DeviceMajor:0 DeviceMinor:766 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9fa79744aaaa4051964ccef7d38d65ee7510ee60bd1d8d59df2b9df1c6c707da/userdata/shm DeviceMajor:0 DeviceMinor:68 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-621 DeviceMajor:0 DeviceMinor:621 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-101 DeviceMajor:0 DeviceMinor:101 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a561a1d1-b20f-45fd-9e0c-ee4399a1d31b/volumes/kubernetes.io~projected/kube-api-access-7gmkr DeviceMajor:0 DeviceMinor:835 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-517 DeviceMajor:0 DeviceMinor:517 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:overlay_0-508 DeviceMajor:0 DeviceMinor:508 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e8a8f6655cd50f04721c79f72a49d0a15534cbdaa2b73fedfb36318faac24e0e/userdata/shm DeviceMajor:0 DeviceMinor:476 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/94e09a1dd75367c606e9c0f6209f6e945683271c1483d15ae30d37382e33a6c7/userdata/shm DeviceMajor:0 DeviceMinor:848 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/48375ae2-d4b4-4db4-b832-3e3db1834fb9/volumes/kubernetes.io~projected/kube-api-access-q7mn4 DeviceMajor:0 DeviceMinor:1032 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/339cc6449a0020231eef0158a934d4ae19f59a10f226d56a246c3dc49a8eebbe/userdata/shm DeviceMajor:0 DeviceMinor:705 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ebf338e6-9725-47d9-8c7f-adbf11a44406/volumes/kubernetes.io~projected/kube-api-access-ltxpc DeviceMajor:0 DeviceMinor:815 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/33dfdc31-54a4-4249-99ae-a15180514659/volumes/kubernetes.io~secret/machine-approver-tls DeviceMajor:0 DeviceMinor:956 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/4976e608-07a0-4cef-8fdd-7cec3324b4b5/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:700 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-739 DeviceMajor:0 DeviceMinor:739 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-349 DeviceMajor:0 DeviceMinor:349 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7ad0a5d9a8c967edc56620551d2d496053e4222b70097c6bc30b98a2ef86a101/userdata/shm DeviceMajor:0 DeviceMinor:275 
Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/58405741-598c-4bf5-bbc8-1ca8e3f10995/volumes/kubernetes.io~projected/kube-api-access-6qlks DeviceMajor:0 DeviceMinor:588 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e850e3b221564f84a8bf622e78e98b3c458e66820fad70083408ddbec82a1cd3/userdata/shm DeviceMajor:0 DeviceMinor:84 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-41 DeviceMajor:0 DeviceMinor:41 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/bfc49699-9428-4bff-804d-da0e60551759/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:94 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/58035e42-37d8-48f6-9861-9b4ce6014119/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:213 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-494 DeviceMajor:0 DeviceMinor:494 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/22587300-2448-4862-9fd8-68197d17a9f2/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:214 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e4f5650e90a0b9cd7d74ca56ce88b46d396150848fd499e2751500a89710ed92/userdata/shm DeviceMajor:0 DeviceMinor:271 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-283 DeviceMajor:0 DeviceMinor:283 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/738ebdcd-b78b-495a-b8f2-84af11a7d35c/volumes/kubernetes.io~secret/encryption-config DeviceMajor:0 DeviceMinor:610 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-686 DeviceMajor:0 DeviceMinor:686 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-817 
DeviceMajor:0 DeviceMinor:817 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d0d7ba4bdbd45759b508d00f36d1e06281f843bb6e1de6ed64932952a8078e77/userdata/shm DeviceMajor:0 DeviceMinor:414 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-568 DeviceMajor:0 DeviceMinor:568 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-757 DeviceMajor:0 DeviceMinor:757 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-155 DeviceMajor:0 DeviceMinor:155 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-928 DeviceMajor:0 DeviceMinor:928 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1072 DeviceMajor:0 DeviceMinor:1072 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-277 DeviceMajor:0 DeviceMinor:277 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/2c4c579b-0643-47ac-a729-017c326b0ecc/volumes/kubernetes.io~projected/kube-api-access-g44dw DeviceMajor:0 DeviceMinor:497 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/57eb2020-1560-4352-8b86-76db59de933a/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:501 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/bdd5af34bfad236139e626fcebfb16719c123b2551b988ca1c04bcedf0b2fdb1/userdata/shm DeviceMajor:0 DeviceMinor:506 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/edda0d03-fdb2-4130-8f73-8057efd5815c/volumes/kubernetes.io~projected/kube-api-access-h5gmv DeviceMajor:0 DeviceMinor:1063 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-780 DeviceMajor:0 DeviceMinor:780 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/609f306bf915c2fd338574a4571df8d99042d408bdae6c3187fb345e69136829/userdata/shm DeviceMajor:0 DeviceMinor:114 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-134 DeviceMajor:0 DeviceMinor:134 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-129 DeviceMajor:0 DeviceMinor:129 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-199 DeviceMajor:0 DeviceMinor:199 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-993 DeviceMajor:0 DeviceMinor:993 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-194 DeviceMajor:0 DeviceMinor:194 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-299 DeviceMajor:0 DeviceMinor:299 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8c6bf2d5-1881-4b63-b247-7e7426707fa1/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls DeviceMajor:0 DeviceMinor:467 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/f97819d0-2840-4352-a435-19ef1a8c22c9/volumes/kubernetes.io~secret/image-registry-operator-tls DeviceMajor:0 DeviceMinor:468 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a/volumes/kubernetes.io~projected/kube-api-access-pwxhc DeviceMajor:0 DeviceMinor:1018 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/var/lib/kubelet/pods/61fb4b86-f978-4ae1-80bc-18d2f386cbc2/volumes/kubernetes.io~projected/kube-api-access-lrf2s DeviceMajor:0 DeviceMinor:258 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-511 DeviceMajor:0 DeviceMinor:511 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/77fd9062-0f7d-4255-92ca-7e4325daeddd/volumes/kubernetes.io~projected/kube-api-access-vtqjr DeviceMajor:0 DeviceMinor:872 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a9c7c6a4-4f5b-4807-932c-1b0f53ceed22/volumes/kubernetes.io~projected/kube-api-access-wwnml DeviceMajor:0 DeviceMinor:838 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/2679c6e1-11c1-450c-b03a-30d7ee59ff6f/volumes/kubernetes.io~secret/webhook-certs DeviceMajor:0 DeviceMinor:400 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/0671fdd0-b358-40f9-ae49-2c5a9004edb3/volumes/kubernetes.io~projected/kube-api-access-ftn5x DeviceMajor:0 DeviceMinor:1035 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/cd7cca05-3da7-42cf-af64-6e94050e58c0/volumes/kubernetes.io~projected/kube-api-access-gx8zl DeviceMajor:0 DeviceMinor:1027 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/d1153bb3-30dd-458f-b0a4-c05358a8b3f8/volumes/kubernetes.io~projected/kube-api-access-srlst DeviceMajor:0 DeviceMinor:138 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-755 DeviceMajor:0 DeviceMinor:755 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-318 DeviceMajor:0 DeviceMinor:318 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7fc3fe55a4a51d95eea4a3bf4856b535210e68f1c5b298f7c9a69ccfa72f28c1/userdata/shm DeviceMajor:0 DeviceMinor:1112 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-490 DeviceMajor:0 DeviceMinor:490 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-540 DeviceMajor:0 DeviceMinor:540 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-952 DeviceMajor:0 DeviceMinor:952 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4a78398e61786f95c561c9e0a3fad101b528e41e0057edc052306118db4ece14/userdata/shm DeviceMajor:0 DeviceMinor:268 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-978 DeviceMajor:0 DeviceMinor:978 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d278ed70-786c-4b6c-9f04-f08ede704569/volumes/kubernetes.io~secret/cert DeviceMajor:0 DeviceMinor:874 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/0d4e6150-432c-4a11-b5a6-4d62dd701fc8/volumes/kubernetes.io~projected/kube-api-access-gn6w7 DeviceMajor:0 DeviceMinor:224 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-291 DeviceMajor:0 DeviceMinor:291 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-335 DeviceMajor:0 DeviceMinor:335 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/9db888f0-51b6-43cf-8337-69d2d5cc2b0a/volumes/kubernetes.io~secret/client-ca-bundle DeviceMajor:0 DeviceMinor:93 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1173 DeviceMajor:0 DeviceMinor:1173 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:04292570efd4dfe MacAddress:16:59:b7:71:08:e6 Speed:10000 Mtu:8900} {Name:16477c5f389a1fd MacAddress:b2:b3:ee:4f:71:cb Speed:10000 Mtu:8900} {Name:19ce38027ae9e3d MacAddress:8a:e3:01:c7:7f:fc Speed:10000 Mtu:8900} {Name:24c5ca2ad81d656 MacAddress:ca:ab:37:ec:a3:b1 
Speed:10000 Mtu:8900} {Name:26154ef4eac655e MacAddress:42:9b:05:82:7b:ca Speed:10000 Mtu:8900} {Name:2b8d1e705972764 MacAddress:9e:9f:71:44:37:ed Speed:10000 Mtu:8900} {Name:2caee3a1af6093e MacAddress:16:86:a1:25:11:70 Speed:10000 Mtu:8900} {Name:2df89949d75d6b3 MacAddress:b2:b2:76:3d:8e:71 Speed:10000 Mtu:8900} {Name:2ea2257b817f7a5 MacAddress:42:fb:c7:e7:84:01 Speed:10000 Mtu:8900} {Name:339cc6449a00202 MacAddress:52:c7:bd:f2:43:ff Speed:10000 Mtu:8900} {Name:34d6500d42674d3 MacAddress:56:47:dd:a4:71:ba Speed:10000 Mtu:8900} {Name:35121dd9298456a MacAddress:26:e5:24:be:33:35 Speed:10000 Mtu:8900} {Name:37ae98838b10d25 MacAddress:2e:f3:b9:4c:44:1b Speed:10000 Mtu:8900} {Name:3ac90b7e141885c MacAddress:7a:22:b5:74:95:52 Speed:10000 Mtu:8900} {Name:3f750f4eaadd118 MacAddress:26:d6:db:e9:05:c1 Speed:10000 Mtu:8900} {Name:4a78398e61786f9 MacAddress:8a:20:ae:fe:3e:31 Speed:10000 Mtu:8900} {Name:4a7eca1172ea3bf MacAddress:86:03:03:e5:77:d5 Speed:10000 Mtu:8900} {Name:4a82c4d1f4dd070 MacAddress:4a:ba:79:90:39:9e Speed:10000 Mtu:8900} {Name:50afa0bafdfdadd MacAddress:ae:ba:77:d5:cd:29 Speed:10000 Mtu:8900} {Name:54c97778ddb9965 MacAddress:d6:12:25:78:0e:84 Speed:10000 Mtu:8900} {Name:57d04e10dfd8def MacAddress:be:bb:7c:a2:ce:1e Speed:10000 Mtu:8900} {Name:5f0a23a29ec1be2 MacAddress:8e:18:cd:2c:16:42 Speed:10000 Mtu:8900} {Name:6c434288b6cb191 MacAddress:a2:18:27:87:e3:af Speed:10000 Mtu:8900} {Name:6cd94cfc20909e4 MacAddress:4a:8f:26:15:21:a5 Speed:10000 Mtu:8900} {Name:6f8519eea623420 MacAddress:f2:73:55:50:b8:18 Speed:10000 Mtu:8900} {Name:7d309f2fa26be03 MacAddress:9e:10:0d:8c:a4:95 Speed:10000 Mtu:8900} {Name:7d8cfecb961af50 MacAddress:0e:8a:4a:5d:43:c8 Speed:10000 Mtu:8900} {Name:8f2613fc06a65ee MacAddress:be:16:84:bb:f4:93 Speed:10000 Mtu:8900} {Name:94e09a1dd75367c MacAddress:ba:74:3f:31:05:8e Speed:10000 Mtu:8900} {Name:9abb90df1fb36f7 MacAddress:7e:41:58:59:2c:14 Speed:10000 Mtu:8900} {Name:9fa79744aaaa405 MacAddress:0e:a3:d1:5c:e7:48 Speed:10000 Mtu:8900} 
{Name:a60b4d927a33db7 MacAddress:52:fa:c3:96:9b:ce Speed:10000 Mtu:8900} {Name:a628e92ac4f34b6 MacAddress:4a:2c:c1:ae:3f:9f Speed:10000 Mtu:8900} {Name:a98b9ab8613fc06 MacAddress:2a:84:a5:79:09:a1 Speed:10000 Mtu:8900} {Name:ab94900114a9122 MacAddress:56:61:fc:c4:3e:2c Speed:10000 Mtu:8900} {Name:b018527cc19e60b MacAddress:f2:64:a6:bf:d0:75 Speed:10000 Mtu:8900} {Name:b26809d00df2d88 MacAddress:ce:14:7d:37:f3:7d Speed:10000 Mtu:8900} {Name:b4d8f238475f0e2 MacAddress:56:1d:b7:6b:b1:90 Speed:10000 Mtu:8900} {Name:b53f7152f43c94f MacAddress:da:c1:9e:10:56:38 Speed:10000 Mtu:8900} {Name:b656a6f1c70c3ed MacAddress:ba:23:95:18:26:5f Speed:10000 Mtu:8900} {Name:b6d8f47788f03e5 MacAddress:c2:3a:ff:00:3a:4b Speed:10000 Mtu:8900} {Name:bdd5af34bfad236 MacAddress:6a:da:e7:86:bf:2d Speed:10000 Mtu:8900} {Name:be414d776e8ab0a MacAddress:0a:63:e7:e7:5e:63 Speed:10000 Mtu:8900} {Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:br-int MacAddress:ee:3d:4b:66:b5:5f Speed:0 Mtu:8900} {Name:c50effcf2f5fc89 MacAddress:52:62:52:25:a9:7b Speed:10000 Mtu:8900} {Name:cb94d42a599d6d7 MacAddress:a2:d4:31:f5:93:bb Speed:10000 Mtu:8900} {Name:cf319252fb389a3 MacAddress:a6:f9:b8:c0:f0:85 Speed:10000 Mtu:8900} {Name:d0d7ba4bdbd4575 MacAddress:b2:6d:ed:fb:a8:0a Speed:10000 Mtu:8900} {Name:d50b6a32815a2be MacAddress:32:7b:ff:ce:42:98 Speed:10000 Mtu:8900} {Name:d87d8f00b3827a6 MacAddress:12:be:43:ce:a0:81 Speed:10000 Mtu:8900} {Name:e2f745a0f2d01e6 MacAddress:3e:ec:0e:50:50:2b Speed:10000 Mtu:8900} {Name:e4f5650e90a0b9c MacAddress:d6:bb:53:5c:4d:76 Speed:10000 Mtu:8900} {Name:e8a8f6655cd50f0 MacAddress:b6:33:83:3d:2c:ea Speed:10000 Mtu:8900} {Name:eeedfbb568950a2 MacAddress:26:67:8a:14:5f:dc Speed:10000 Mtu:8900} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:7e:ba:68 Speed:-1 Mtu:9000} {Name:eth2 MacAddress:fa:16:3e:e0:00:01 Speed:-1 Mtu:9000} {Name:f8dd90e4b919a47 MacAddress:32:35:79:9f:1d:39 Speed:10000 Mtu:8900} {Name:fdd7225d6e1fca0 
MacAddress:c6:08:30:9a:d8:ba Speed:10000 Mtu:8900} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:80:00:02 Speed:0 Mtu:8900} {Name:ovs-system MacAddress:26:df:a4:ac:f6:fc Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} 
{Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 13 01:18:59.310751 master-0 kubenswrapper[19170]: I0313 01:18:59.309668 19170 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 13 01:18:59.310751 master-0 kubenswrapper[19170]: I0313 01:18:59.309729 19170 manager.go:233] Version: {KernelVersion:5.14.0-427.111.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202602172219-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 13 01:18:59.310751 master-0 kubenswrapper[19170]: I0313 01:18:59.310106 19170 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 13 01:18:59.310751 master-0 kubenswrapper[19170]: I0313 01:18:59.310493 19170 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 13 01:18:59.311072 master-0 kubenswrapper[19170]: I0313 01:18:59.310536 19170 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"P
ercentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 13 01:18:59.311072 master-0 kubenswrapper[19170]: I0313 01:18:59.310906 19170 topology_manager.go:138] "Creating topology manager with none policy" Mar 13 01:18:59.311072 master-0 kubenswrapper[19170]: I0313 01:18:59.310921 19170 container_manager_linux.go:303] "Creating device plugin manager" Mar 13 01:18:59.311072 master-0 kubenswrapper[19170]: I0313 01:18:59.310932 19170 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 13 01:18:59.311072 master-0 kubenswrapper[19170]: I0313 01:18:59.310957 19170 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 13 01:18:59.311072 master-0 kubenswrapper[19170]: I0313 01:18:59.311019 19170 state_mem.go:36] "Initialized new in-memory state store" Mar 13 01:18:59.311232 master-0 kubenswrapper[19170]: I0313 01:18:59.311139 19170 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 13 01:18:59.311269 master-0 kubenswrapper[19170]: I0313 01:18:59.311243 19170 kubelet.go:418] "Attempting to sync node with API server" Mar 13 01:18:59.311269 master-0 kubenswrapper[19170]: I0313 01:18:59.311261 19170 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 13 01:18:59.311326 master-0 kubenswrapper[19170]: I0313 01:18:59.311279 19170 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 13 01:18:59.311326 master-0 kubenswrapper[19170]: I0313 01:18:59.311299 19170 kubelet.go:324] "Adding apiserver pod source" Mar 
13 01:18:59.311326 master-0 kubenswrapper[19170]: I0313 01:18:59.311312 19170 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 13 01:18:59.313192 master-0 kubenswrapper[19170]: I0313 01:18:59.313153 19170 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-8.rhaos4.18.gitd78977c.el9" apiVersion="v1" Mar 13 01:18:59.313321 master-0 kubenswrapper[19170]: I0313 01:18:59.313294 19170 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Mar 13 01:18:59.313540 master-0 kubenswrapper[19170]: I0313 01:18:59.313512 19170 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 13 01:18:59.313655 master-0 kubenswrapper[19170]: I0313 01:18:59.313624 19170 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 13 01:18:59.313655 master-0 kubenswrapper[19170]: I0313 01:18:59.313655 19170 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 13 01:18:59.313754 master-0 kubenswrapper[19170]: I0313 01:18:59.313664 19170 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 13 01:18:59.313754 master-0 kubenswrapper[19170]: I0313 01:18:59.313671 19170 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 13 01:18:59.313754 master-0 kubenswrapper[19170]: I0313 01:18:59.313677 19170 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 13 01:18:59.313754 master-0 kubenswrapper[19170]: I0313 01:18:59.313698 19170 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 13 01:18:59.313754 master-0 kubenswrapper[19170]: I0313 01:18:59.313704 19170 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 13 01:18:59.313754 master-0 kubenswrapper[19170]: I0313 01:18:59.313710 19170 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/downward-api" Mar 13 01:18:59.313754 master-0 kubenswrapper[19170]: I0313 01:18:59.313717 19170 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 13 01:18:59.313754 master-0 kubenswrapper[19170]: I0313 01:18:59.313723 19170 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 13 01:18:59.313754 master-0 kubenswrapper[19170]: I0313 01:18:59.313744 19170 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 13 01:18:59.313754 master-0 kubenswrapper[19170]: I0313 01:18:59.313755 19170 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 13 01:18:59.314111 master-0 kubenswrapper[19170]: I0313 01:18:59.313805 19170 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 13 01:18:59.315665 master-0 kubenswrapper[19170]: I0313 01:18:59.315189 19170 server.go:1280] "Started kubelet" Mar 13 01:18:59.316228 master-0 systemd[1]: Started Kubernetes Kubelet. 
Mar 13 01:18:59.322832 master-0 kubenswrapper[19170]: I0313 01:18:59.322106 19170 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 13 01:18:59.322832 master-0 kubenswrapper[19170]: I0313 01:18:59.322171 19170 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 13 01:18:59.322832 master-0 kubenswrapper[19170]: I0313 01:18:59.322302 19170 server_v1.go:47] "podresources" method="list" useActivePods=true Mar 13 01:18:59.323916 master-0 kubenswrapper[19170]: I0313 01:18:59.322984 19170 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 13 01:18:59.331667 master-0 kubenswrapper[19170]: I0313 01:18:59.331595 19170 server.go:449] "Adding debug handlers to kubelet server" Mar 13 01:18:59.334571 master-0 kubenswrapper[19170]: I0313 01:18:59.334528 19170 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 13 01:18:59.340106 master-0 kubenswrapper[19170]: I0313 01:18:59.340044 19170 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 13 01:18:59.343712 master-0 kubenswrapper[19170]: E0313 01:18:59.343478 19170 kubelet.go:1495] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Mar 13 01:18:59.352980 master-0 kubenswrapper[19170]: I0313 01:18:59.346744 19170 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 13 01:18:59.352980 master-0 kubenswrapper[19170]: I0313 01:18:59.346814 19170 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 13 01:18:59.352980 master-0 kubenswrapper[19170]: I0313 01:18:59.346896 19170 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-14 01:03:12 +0000 UTC, rotation deadline is 2026-03-13 18:42:50.682652116 +0000 UTC Mar 13 01:18:59.352980 master-0 kubenswrapper[19170]: I0313 01:18:59.346919 19170 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 17h23m51.335735989s for next certificate rotation Mar 13 01:18:59.352980 master-0 kubenswrapper[19170]: I0313 01:18:59.347143 19170 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 13 01:18:59.352980 master-0 kubenswrapper[19170]: I0313 01:18:59.347152 19170 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 13 01:18:59.352980 master-0 kubenswrapper[19170]: I0313 01:18:59.347318 19170 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376216 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bfc49699-9428-4bff-804d-da0e60551759" volumeName="kubernetes.io/projected/bfc49699-9428-4bff-804d-da0e60551759-kube-api-access-zdzjn" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376272 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e2b5ad07-fa01-4330-9dce-6da3444657ab" volumeName="kubernetes.io/projected/e2b5ad07-fa01-4330-9dce-6da3444657ab-kube-api-access-rtrb2" 
seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376283 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="85149f21-7ba8-4891-82ef-0fef3d5d7863" volumeName="kubernetes.io/secret/85149f21-7ba8-4891-82ef-0fef3d5d7863-serving-cert" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376293 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a561a1d1-b20f-45fd-9e0c-ee4399a1d31b" volumeName="kubernetes.io/empty-dir/a561a1d1-b20f-45fd-9e0c-ee4399a1d31b-utilities" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376302 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a5ab1d5-dabd-45e7-a688-71a282f61e67" volumeName="kubernetes.io/empty-dir/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-tuned" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376311 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ac2a4c90-32db-4464-8c47-acbcafbcd5d0" volumeName="kubernetes.io/configmap/ac2a4c90-32db-4464-8c47-acbcafbcd5d0-cni-sysctl-allowlist" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376320 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35" volumeName="kubernetes.io/configmap/b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35-config" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376328 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d81bcb58-efe3-4577-8e88-67f92c645f6f" volumeName="kubernetes.io/projected/d81bcb58-efe3-4577-8e88-67f92c645f6f-kube-api-access-k7wj9" 
seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376338 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f9b713fb-64ce-4a01-951c-1f31df62e1ae" volumeName="kubernetes.io/configmap/f9b713fb-64ce-4a01-951c-1f31df62e1ae-config" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376348 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="33dfdc31-54a4-4249-99ae-a15180514659" volumeName="kubernetes.io/configmap/33dfdc31-54a4-4249-99ae-a15180514659-config" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376358 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f9728b4-4e1e-4165-a276-3daa00e95839" volumeName="kubernetes.io/empty-dir/3f9728b4-4e1e-4165-a276-3daa00e95839-utilities" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376366 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8c6bf2d5-1881-4b63-b247-7e7426707fa1" volumeName="kubernetes.io/projected/8c6bf2d5-1881-4b63-b247-7e7426707fa1-kube-api-access-vxf58" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376374 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d1153bb3-30dd-458f-b0a4-c05358a8b3f8" volumeName="kubernetes.io/secret/d1153bb3-30dd-458f-b0a4-c05358a8b3f8-webhook-cert" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376383 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="30f7537e-93ed-466b-ba24-78141d004b2f" volumeName="kubernetes.io/projected/30f7537e-93ed-466b-ba24-78141d004b2f-kube-api-access-jkjrm" 
seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376395 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4edb3e1a-9082-4fc2-ae6f-99d49c078a34" volumeName="kubernetes.io/configmap/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-ovnkube-script-lib" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376402 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a9462e2e-728d-4076-a876-31dbbd637581" volumeName="kubernetes.io/secret/a9462e2e-728d-4076-a876-31dbbd637581-serving-cert" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376423 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="77fd9062-0f7d-4255-92ca-7e4325daeddd" volumeName="kubernetes.io/configmap/77fd9062-0f7d-4255-92ca-7e4325daeddd-trusted-ca-bundle" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376433 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="85149f21-7ba8-4891-82ef-0fef3d5d7863" volumeName="kubernetes.io/projected/85149f21-7ba8-4891-82ef-0fef3d5d7863-kube-api-access" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376443 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4c5174b9-ca9e-4917-ab3a-ca403ce4f017" volumeName="kubernetes.io/projected/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-kube-api-access-mmvs5" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376452 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4c5174b9-ca9e-4917-ab3a-ca403ce4f017" 
volumeName="kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-node-tuning-operator-tls" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376462 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ca2fa86b-a966-49dc-8577-d2b54b111d14" volumeName="kubernetes.io/projected/ca2fa86b-a966-49dc-8577-d2b54b111d14-kube-api-access-gwm5w" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376470 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4738c93d-62e6-44ce-a289-e646b9302e71" volumeName="kubernetes.io/configmap/4738c93d-62e6-44ce-a289-e646b9302e71-cni-sysctl-allowlist" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376479 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ac2a4c90-32db-4464-8c47-acbcafbcd5d0" volumeName="kubernetes.io/projected/ac2a4c90-32db-4464-8c47-acbcafbcd5d0-kube-api-access-sw5hs" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376487 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4976e608-07a0-4cef-8fdd-7cec3324b4b5" volumeName="kubernetes.io/configmap/4976e608-07a0-4cef-8fdd-7cec3324b4b5-auth-proxy-config" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376508 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="631f5719-2083-4c99-92cb-2ddc04022d86" volumeName="kubernetes.io/secret/631f5719-2083-4c99-92cb-2ddc04022d86-serving-cert" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376516 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="cd7cca05-3da7-42cf-af64-6e94050e58c0" volumeName="kubernetes.io/configmap/cd7cca05-3da7-42cf-af64-6e94050e58c0-metrics-client-ca" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376527 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f97819d0-2840-4352-a435-19ef1a8c22c9" volumeName="kubernetes.io/projected/f97819d0-2840-4352-a435-19ef1a8c22c9-bound-sa-token" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376537 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="30f7537e-93ed-466b-ba24-78141d004b2f" volumeName="kubernetes.io/secret/30f7537e-93ed-466b-ba24-78141d004b2f-kube-state-metrics-kube-rbac-proxy-config" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376546 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="33dfdc31-54a4-4249-99ae-a15180514659" volumeName="kubernetes.io/secret/33dfdc31-54a4-4249-99ae-a15180514659-machine-approver-tls" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376554 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cada5bf2-e208-4fd8-bdf5-de8cad31a665" volumeName="kubernetes.io/secret/cada5bf2-e208-4fd8-bdf5-de8cad31a665-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376563 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="48375ae2-d4b4-4db4-b832-3e3db1834fb9" volumeName="kubernetes.io/projected/48375ae2-d4b4-4db4-b832-3e3db1834fb9-kube-api-access-q7mn4" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376570 19170 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="58035e42-37d8-48f6-9861-9b4ce6014119" volumeName="kubernetes.io/secret/58035e42-37d8-48f6-9861-9b4ce6014119-serving-cert" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376579 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="04470d64-c6eb-4a62-ae75-2a1d3dfdd53a" volumeName="kubernetes.io/configmap/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-metrics-client-ca" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376592 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1c24e17c-8bd9-4c23-9876-6f31c9da5cd1" volumeName="kubernetes.io/configmap/1c24e17c-8bd9-4c23-9876-6f31c9da5cd1-images" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376601 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a9c7c6a4-4f5b-4807-932c-1b0f53ceed22" volumeName="kubernetes.io/empty-dir/a9c7c6a4-4f5b-4807-932c-1b0f53ceed22-utilities" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376610 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4edb3e1a-9082-4fc2-ae6f-99d49c078a34" volumeName="kubernetes.io/configmap/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-ovnkube-config" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376619 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="631f5719-2083-4c99-92cb-2ddc04022d86" volumeName="kubernetes.io/projected/631f5719-2083-4c99-92cb-2ddc04022d86-kube-api-access-jvn9n" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376642 19170 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="d56480e0-0885-41e5-a1fc-931a068fbadb" volumeName="kubernetes.io/empty-dir/d56480e0-0885-41e5-a1fc-931a068fbadb-available-featuregates" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376653 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="46662e51-44af-4732-83a1-9509a579b373" volumeName="kubernetes.io/projected/46662e51-44af-4732-83a1-9509a579b373-kube-api-access-m5n7m" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376662 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bfc49699-9428-4bff-804d-da0e60551759" volumeName="kubernetes.io/secret/bfc49699-9428-4bff-804d-da0e60551759-metrics-tls" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376671 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e68ab3cb-c372-45d9-a758-beaf4c213714" volumeName="kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376679 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4c5174b9-ca9e-4917-ab3a-ca403ce4f017" volumeName="kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-apiservice-cert" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376689 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="738ebdcd-b78b-495a-b8f2-84af11a7d35c" volumeName="kubernetes.io/secret/738ebdcd-b78b-495a-b8f2-84af11a7d35c-etcd-client" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376696 19170 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="a561a1d1-b20f-45fd-9e0c-ee4399a1d31b" volumeName="kubernetes.io/projected/a561a1d1-b20f-45fd-9e0c-ee4399a1d31b-kube-api-access-7gmkr" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376704 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="30f7537e-93ed-466b-ba24-78141d004b2f" volumeName="kubernetes.io/configmap/30f7537e-93ed-466b-ba24-78141d004b2f-kube-state-metrics-custom-resource-state-configmap" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376738 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6c88187c-d011-4043-a6d3-4a8a7ec4e204" volumeName="kubernetes.io/projected/6c88187c-d011-4043-a6d3-4a8a7ec4e204-kube-api-access-7mnf5" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376750 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3d2e7338-a6d6-4872-ab72-a4e631075ab3" volumeName="kubernetes.io/projected/3d2e7338-a6d6-4872-ab72-a4e631075ab3-kube-api-access-vkgvg" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376760 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="85149f21-7ba8-4891-82ef-0fef3d5d7863" volumeName="kubernetes.io/configmap/85149f21-7ba8-4891-82ef-0fef3d5d7863-service-ca" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376773 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e2b5ad07-fa01-4330-9dce-6da3444657ab" volumeName="kubernetes.io/secret/e2b5ad07-fa01-4330-9dce-6da3444657ab-prometheus-operator-kube-rbac-proxy-config" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376786 
19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22587300-2448-4862-9fd8-68197d17a9f2" volumeName="kubernetes.io/secret/22587300-2448-4862-9fd8-68197d17a9f2-serving-cert" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376795 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2c4c579b-0643-47ac-a729-017c326b0ecc" volumeName="kubernetes.io/projected/2c4c579b-0643-47ac-a729-017c326b0ecc-ca-certs" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376803 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57eb2020-1560-4352-8b86-76db59de933a" volumeName="kubernetes.io/configmap/57eb2020-1560-4352-8b86-76db59de933a-audit-policies" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376817 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="631f5719-2083-4c99-92cb-2ddc04022d86" volumeName="kubernetes.io/configmap/631f5719-2083-4c99-92cb-2ddc04022d86-client-ca" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376826 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8c6bf2d5-1881-4b63-b247-7e7426707fa1" volumeName="kubernetes.io/configmap/8c6bf2d5-1881-4b63-b247-7e7426707fa1-images" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376834 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8c6bf2d5-1881-4b63-b247-7e7426707fa1" volumeName="kubernetes.io/configmap/8c6bf2d5-1881-4b63-b247-7e7426707fa1-config" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376845 19170 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="916d9fc9-388b-4506-a17c-36a7f626356a" volumeName="kubernetes.io/projected/916d9fc9-388b-4506-a17c-36a7f626356a-kube-api-access-jg7x6" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376859 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a9462e2e-728d-4076-a876-31dbbd637581" volumeName="kubernetes.io/configmap/a9462e2e-728d-4076-a876-31dbbd637581-config" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376870 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4738c93d-62e6-44ce-a289-e646b9302e71" volumeName="kubernetes.io/configmap/4738c93d-62e6-44ce-a289-e646b9302e71-cni-binary-copy" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376880 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4d313ee4-3bb9-44a9-ad80-8e00540ef1e7" volumeName="kubernetes.io/secret/4d313ee4-3bb9-44a9-ad80-8e00540ef1e7-tls-certificates" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376891 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="da44d750-31e5-46f4-b3ef-dd4384c22aaf" volumeName="kubernetes.io/projected/da44d750-31e5-46f4-b3ef-dd4384c22aaf-kube-api-access-n4rfg" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376901 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57eb2020-1560-4352-8b86-76db59de933a" volumeName="kubernetes.io/projected/57eb2020-1560-4352-8b86-76db59de933a-kube-api-access-kd849" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376913 19170 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="0671fdd0-b358-40f9-ae49-2c5a9004edb3" volumeName="kubernetes.io/secret/0671fdd0-b358-40f9-ae49-2c5a9004edb3-metrics-certs" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376927 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="46662e51-44af-4732-83a1-9509a579b373" volumeName="kubernetes.io/configmap/46662e51-44af-4732-83a1-9509a579b373-iptables-alerter-script" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376937 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2937cbe2-3125-4c3f-96f8-2febeb5942cc" volumeName="kubernetes.io/configmap/2937cbe2-3125-4c3f-96f8-2febeb5942cc-multus-daemon-config" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376948 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="33dfdc31-54a4-4249-99ae-a15180514659" volumeName="kubernetes.io/configmap/33dfdc31-54a4-4249-99ae-a15180514659-auth-proxy-config" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376963 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a5ab1d5-dabd-45e7-a688-71a282f61e67" volumeName="kubernetes.io/empty-dir/6a5ab1d5-dabd-45e7-a688-71a282f61e67-tmp" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.376989 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9db888f0-51b6-43cf-8337-69d2d5cc2b0a" volumeName="kubernetes.io/configmap/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-metrics-server-audit-profiles" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377002 19170 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="039acb44-a9b3-4ad6-a091-be4d18edc34f" volumeName="kubernetes.io/secret/039acb44-a9b3-4ad6-a091-be4d18edc34f-serving-cert" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377013 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="21cbea73-f779-43e4-b5ba-d6fa06275d34" volumeName="kubernetes.io/configmap/21cbea73-f779-43e4-b5ba-d6fa06275d34-config" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377045 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d23bbaec-b635-4649-b26e-2829f32d21f0" volumeName="kubernetes.io/empty-dir/d23bbaec-b635-4649-b26e-2829f32d21f0-utilities" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377056 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2679c6e1-11c1-450c-b03a-30d7ee59ff6f" volumeName="kubernetes.io/projected/2679c6e1-11c1-450c-b03a-30d7ee59ff6f-kube-api-access-4n9fb" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377066 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="78d2cd80-23b9-426d-a7ac-1daa27668a47" volumeName="kubernetes.io/configmap/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-trusted-ca" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377077 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e5956ebf-01e4-4d4c-ae6d-b0995905c6d3" volumeName="kubernetes.io/secret/e5956ebf-01e4-4d4c-ae6d-b0995905c6d3-proxy-tls" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377087 19170 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4976e608-07a0-4cef-8fdd-7cec3324b4b5" volumeName="kubernetes.io/projected/4976e608-07a0-4cef-8fdd-7cec3324b4b5-kube-api-access-w2m48" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377097 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="95d4e785-6663-417d-b380-6905773613c8" volumeName="kubernetes.io/configmap/95d4e785-6663-417d-b380-6905773613c8-config" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377108 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="30f7537e-93ed-466b-ba24-78141d004b2f" volumeName="kubernetes.io/empty-dir/30f7537e-93ed-466b-ba24-78141d004b2f-volume-directive-shadow" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377117 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71" volumeName="kubernetes.io/configmap/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-telemetry-config" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377126 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="95d4e785-6663-417d-b380-6905773613c8" volumeName="kubernetes.io/secret/95d4e785-6663-417d-b380-6905773613c8-serving-cert" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377136 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0d4e6150-432c-4a11-b5a6-4d62dd701fc8" volumeName="kubernetes.io/projected/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-kube-api-access-gn6w7" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377147 
19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3" volumeName="kubernetes.io/projected/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3-kube-api-access-6v79j" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377158 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ebf338e6-9725-47d9-8c7f-adbf11a44406" volumeName="kubernetes.io/projected/ebf338e6-9725-47d9-8c7f-adbf11a44406-kube-api-access-ltxpc" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377169 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2937cbe2-3125-4c3f-96f8-2febeb5942cc" volumeName="kubernetes.io/configmap/2937cbe2-3125-4c3f-96f8-2febeb5942cc-cni-binary-copy" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377179 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4c5174b9-ca9e-4917-ab3a-ca403ce4f017" volumeName="kubernetes.io/configmap/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-trusted-ca" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377187 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="61fb4b86-f978-4ae1-80bc-18d2f386cbc2" volumeName="kubernetes.io/secret/61fb4b86-f978-4ae1-80bc-18d2f386cbc2-cluster-olm-operator-serving-cert" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377196 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="70c097a1-90d9-4344-b0ae-5a59ec2ad8ad" volumeName="kubernetes.io/projected/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-kube-api-access-29w76" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 
kubenswrapper[19170]: I0313 01:18:59.377207 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="738ebdcd-b78b-495a-b8f2-84af11a7d35c" volumeName="kubernetes.io/projected/738ebdcd-b78b-495a-b8f2-84af11a7d35c-kube-api-access-tl6k6" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377217 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1308fba1-a50d-48b3-b272-7bef44727b7f" volumeName="kubernetes.io/configmap/1308fba1-a50d-48b3-b272-7bef44727b7f-ovnkube-config" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377228 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="58405741-598c-4bf5-bbc8-1ca8e3f10995" volumeName="kubernetes.io/projected/58405741-598c-4bf5-bbc8-1ca8e3f10995-kube-api-access-6qlks" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377239 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4738c93d-62e6-44ce-a289-e646b9302e71" volumeName="kubernetes.io/projected/4738c93d-62e6-44ce-a289-e646b9302e71-kube-api-access-9gt69" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377251 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="738ebdcd-b78b-495a-b8f2-84af11a7d35c" volumeName="kubernetes.io/configmap/738ebdcd-b78b-495a-b8f2-84af11a7d35c-audit" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377261 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9db888f0-51b6-43cf-8337-69d2d5cc2b0a" volumeName="kubernetes.io/empty-dir/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-audit-log" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 
kubenswrapper[19170]: I0313 01:18:59.377272 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e5956ebf-01e4-4d4c-ae6d-b0995905c6d3" volumeName="kubernetes.io/projected/e5956ebf-01e4-4d4c-ae6d-b0995905c6d3-kube-api-access-jt79p" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377283 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="edda0d03-fdb2-4130-8f73-8057efd5815c" volumeName="kubernetes.io/secret/edda0d03-fdb2-4130-8f73-8057efd5815c-node-bootstrap-token" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377296 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="13fac7b0-ce55-467d-9d0c-6a122d87cb3c" volumeName="kubernetes.io/secret/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-metrics-tls" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377307 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3" volumeName="kubernetes.io/empty-dir/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3-cache" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377317 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="93871019-3d0c-4081-9afe-19b6dd108ec6" volumeName="kubernetes.io/projected/93871019-3d0c-4081-9afe-19b6dd108ec6-kube-api-access-s9nfk" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377328 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d1153bb3-30dd-458f-b0a4-c05358a8b3f8" volumeName="kubernetes.io/projected/d1153bb3-30dd-458f-b0a4-c05358a8b3f8-kube-api-access-srlst" seLinuxMountContext="" Mar 13 01:18:59.378199 
master-0 kubenswrapper[19170]: I0313 01:18:59.377346 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22587300-2448-4862-9fd8-68197d17a9f2" volumeName="kubernetes.io/projected/22587300-2448-4862-9fd8-68197d17a9f2-kube-api-access" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377357 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="738ebdcd-b78b-495a-b8f2-84af11a7d35c" volumeName="kubernetes.io/secret/738ebdcd-b78b-495a-b8f2-84af11a7d35c-encryption-config" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377368 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f9b713fb-64ce-4a01-951c-1f31df62e1ae" volumeName="kubernetes.io/secret/f9b713fb-64ce-4a01-951c-1f31df62e1ae-serving-cert" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377378 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="70c097a1-90d9-4344-b0ae-5a59ec2ad8ad" volumeName="kubernetes.io/configmap/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-trusted-ca" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377390 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a9c7c6a4-4f5b-4807-932c-1b0f53ceed22" volumeName="kubernetes.io/projected/a9c7c6a4-4f5b-4807-932c-1b0f53ceed22-kube-api-access-wwnml" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377402 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ae44526f-5858-42a0-ba77-3a22f171456f" volumeName="kubernetes.io/projected/ae44526f-5858-42a0-ba77-3a22f171456f-kube-api-access-mz8jz" seLinuxMountContext="" Mar 13 01:18:59.378199 
master-0 kubenswrapper[19170]: I0313 01:18:59.377414 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2ce47660-f7cc-4669-a00d-83422f0f6d55" volumeName="kubernetes.io/empty-dir/2ce47660-f7cc-4669-a00d-83422f0f6d55-tmpfs" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377432 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a9c7c6a4-4f5b-4807-932c-1b0f53ceed22" volumeName="kubernetes.io/empty-dir/a9c7c6a4-4f5b-4807-932c-1b0f53ceed22-catalog-content" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377444 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ca2fa86b-a966-49dc-8577-d2b54b111d14" volumeName="kubernetes.io/secret/ca2fa86b-a966-49dc-8577-d2b54b111d14-cluster-storage-operator-serving-cert" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377457 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d278ed70-786c-4b6c-9f04-f08ede704569" volumeName="kubernetes.io/secret/d278ed70-786c-4b6c-9f04-f08ede704569-cert" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377468 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e2b5ad07-fa01-4330-9dce-6da3444657ab" volumeName="kubernetes.io/configmap/e2b5ad07-fa01-4330-9dce-6da3444657ab-metrics-client-ca" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377482 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="21cbea73-f779-43e4-b5ba-d6fa06275d34" volumeName="kubernetes.io/projected/21cbea73-f779-43e4-b5ba-d6fa06275d34-kube-api-access-wg54c" seLinuxMountContext="" Mar 13 
01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377495 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2c4c579b-0643-47ac-a729-017c326b0ecc" volumeName="kubernetes.io/empty-dir/2c4c579b-0643-47ac-a729-017c326b0ecc-cache" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377507 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0d4e6150-432c-4a11-b5a6-4d62dd701fc8" volumeName="kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377519 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="21cbea73-f779-43e4-b5ba-d6fa06275d34" volumeName="kubernetes.io/secret/21cbea73-f779-43e4-b5ba-d6fa06275d34-etcd-client" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377531 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71" volumeName="kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377542 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9db888f0-51b6-43cf-8337-69d2d5cc2b0a" volumeName="kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-client-ca-bundle" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377553 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b3a9c0f6-cfde-4ae8-952a-00e2fb862482" volumeName="kubernetes.io/configmap/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-auth-proxy-config" 
seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377564 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4738c93d-62e6-44ce-a289-e646b9302e71" volumeName="kubernetes.io/configmap/4738c93d-62e6-44ce-a289-e646b9302e71-whereabouts-configmap" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377575 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="631f5719-2083-4c99-92cb-2ddc04022d86" volumeName="kubernetes.io/configmap/631f5719-2083-4c99-92cb-2ddc04022d86-config" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377586 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d1153bb3-30dd-458f-b0a4-c05358a8b3f8" volumeName="kubernetes.io/configmap/d1153bb3-30dd-458f-b0a4-c05358a8b3f8-env-overrides" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377599 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d56480e0-0885-41e5-a1fc-931a068fbadb" volumeName="kubernetes.io/projected/d56480e0-0885-41e5-a1fc-931a068fbadb-kube-api-access-fppkf" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377610 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="58035e42-37d8-48f6-9861-9b4ce6014119" volumeName="kubernetes.io/projected/58035e42-37d8-48f6-9861-9b4ce6014119-kube-api-access" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377624 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="738ebdcd-b78b-495a-b8f2-84af11a7d35c" 
volumeName="kubernetes.io/configmap/738ebdcd-b78b-495a-b8f2-84af11a7d35c-trusted-ca-bundle" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377671 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="916d9fc9-388b-4506-a17c-36a7f626356a" volumeName="kubernetes.io/configmap/916d9fc9-388b-4506-a17c-36a7f626356a-config" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377685 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="33dfdc31-54a4-4249-99ae-a15180514659" volumeName="kubernetes.io/projected/33dfdc31-54a4-4249-99ae-a15180514659-kube-api-access-898lt" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377697 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="90e6e63d-3cf2-4bb5-883f-6219a0b52c3a" volumeName="kubernetes.io/configmap/90e6e63d-3cf2-4bb5-883f-6219a0b52c3a-cco-trusted-ca" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377708 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="da44d750-31e5-46f4-b3ef-dd4384c22aaf" volumeName="kubernetes.io/secret/da44d750-31e5-46f4-b3ef-dd4384c22aaf-signing-key" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377720 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="edda0d03-fdb2-4130-8f73-8057efd5815c" volumeName="kubernetes.io/secret/edda0d03-fdb2-4130-8f73-8057efd5815c-certs" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377730 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="77fd9062-0f7d-4255-92ca-7e4325daeddd" 
volumeName="kubernetes.io/empty-dir/77fd9062-0f7d-4255-92ca-7e4325daeddd-snapshots" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377741 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="93871019-3d0c-4081-9afe-19b6dd108ec6" volumeName="kubernetes.io/configmap/93871019-3d0c-4081-9afe-19b6dd108ec6-mcc-auth-proxy-config" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377752 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f9b713fb-64ce-4a01-951c-1f31df62e1ae" volumeName="kubernetes.io/configmap/f9b713fb-64ce-4a01-951c-1f31df62e1ae-service-ca-bundle" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377765 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0671fdd0-b358-40f9-ae49-2c5a9004edb3" volumeName="kubernetes.io/secret/0671fdd0-b358-40f9-ae49-2c5a9004edb3-stats-auth" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377776 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="64477504-5cb6-42dc-a7eb-662981daec4a" volumeName="kubernetes.io/projected/64477504-5cb6-42dc-a7eb-662981daec4a-kube-api-access-gg77t" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377786 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9db888f0-51b6-43cf-8337-69d2d5cc2b0a" volumeName="kubernetes.io/configmap/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-configmap-kubelet-serving-ca-bundle" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377795 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="cd7cca05-3da7-42cf-af64-6e94050e58c0" volumeName="kubernetes.io/projected/cd7cca05-3da7-42cf-af64-6e94050e58c0-kube-api-access-gx8zl" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377805 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd7cca05-3da7-42cf-af64-6e94050e58c0" volumeName="kubernetes.io/secret/cd7cca05-3da7-42cf-af64-6e94050e58c0-openshift-state-metrics-kube-rbac-proxy-config" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377814 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1308fba1-a50d-48b3-b272-7bef44727b7f" volumeName="kubernetes.io/projected/1308fba1-a50d-48b3-b272-7bef44727b7f-kube-api-access-zv2rb" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377824 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4edb3e1a-9082-4fc2-ae6f-99d49c078a34" volumeName="kubernetes.io/configmap/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-env-overrides" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377836 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="58405741-598c-4bf5-bbc8-1ca8e3f10995" volumeName="kubernetes.io/configmap/58405741-598c-4bf5-bbc8-1ca8e3f10995-config-volume" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377846 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="70c097a1-90d9-4344-b0ae-5a59ec2ad8ad" volumeName="kubernetes.io/projected/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-bound-sa-token" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377856 19170 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="93871019-3d0c-4081-9afe-19b6dd108ec6" volumeName="kubernetes.io/secret/93871019-3d0c-4081-9afe-19b6dd108ec6-proxy-tls" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377864 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e2b5ad07-fa01-4330-9dce-6da3444657ab" volumeName="kubernetes.io/secret/e2b5ad07-fa01-4330-9dce-6da3444657ab-prometheus-operator-tls" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377874 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="04470d64-c6eb-4a62-ae75-2a1d3dfdd53a" volumeName="kubernetes.io/secret/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-node-exporter-kube-rbac-proxy-config" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377884 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1c24e17c-8bd9-4c23-9876-6f31c9da5cd1" volumeName="kubernetes.io/secret/1c24e17c-8bd9-4c23-9876-6f31c9da5cd1-machine-api-operator-tls" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377893 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7f35cc1e-3376-4dbd-b215-2a32bf62cc71" volumeName="kubernetes.io/projected/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-kube-api-access-5zd92" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377903 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f9b713fb-64ce-4a01-951c-1f31df62e1ae" volumeName="kubernetes.io/configmap/f9b713fb-64ce-4a01-951c-1f31df62e1ae-trusted-ca-bundle" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377914 19170 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="738ebdcd-b78b-495a-b8f2-84af11a7d35c" volumeName="kubernetes.io/configmap/738ebdcd-b78b-495a-b8f2-84af11a7d35c-etcd-serving-ca" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377926 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="77fd9062-0f7d-4255-92ca-7e4325daeddd" volumeName="kubernetes.io/projected/77fd9062-0f7d-4255-92ca-7e4325daeddd-kube-api-access-vtqjr" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377937 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="486c7e33-3dd8-4a98-87e3-8216ee2e05ef" volumeName="kubernetes.io/projected/486c7e33-3dd8-4a98-87e3-8216ee2e05ef-kube-api-access-74tvv" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377947 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d23bbaec-b635-4649-b26e-2829f32d21f0" volumeName="kubernetes.io/empty-dir/d23bbaec-b635-4649-b26e-2829f32d21f0-catalog-content" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377957 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="039acb44-a9b3-4ad6-a091-be4d18edc34f" volumeName="kubernetes.io/configmap/039acb44-a9b3-4ad6-a091-be4d18edc34f-config" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377968 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1308fba1-a50d-48b3-b272-7bef44727b7f" volumeName="kubernetes.io/secret/1308fba1-a50d-48b3-b272-7bef44727b7f-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377978 19170 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ebf338e6-9725-47d9-8c7f-adbf11a44406" volumeName="kubernetes.io/secret/ebf338e6-9725-47d9-8c7f-adbf11a44406-samples-operator-tls" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.377989 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2ce47660-f7cc-4669-a00d-83422f0f6d55" volumeName="kubernetes.io/secret/2ce47660-f7cc-4669-a00d-83422f0f6d55-webhook-cert" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.378001 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a9462e2e-728d-4076-a876-31dbbd637581" volumeName="kubernetes.io/configmap/a9462e2e-728d-4076-a876-31dbbd637581-client-ca" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.378011 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd7cca05-3da7-42cf-af64-6e94050e58c0" volumeName="kubernetes.io/secret/cd7cca05-3da7-42cf-af64-6e94050e58c0-openshift-state-metrics-tls" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.378022 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57eb2020-1560-4352-8b86-76db59de933a" volumeName="kubernetes.io/configmap/57eb2020-1560-4352-8b86-76db59de933a-etcd-serving-ca" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.378032 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35" volumeName="kubernetes.io/secret/b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35-serving-cert" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.378044 19170 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2ce47660-f7cc-4669-a00d-83422f0f6d55" volumeName="kubernetes.io/projected/2ce47660-f7cc-4669-a00d-83422f0f6d55-kube-api-access-d6frm" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.378054 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f97819d0-2840-4352-a435-19ef1a8c22c9" volumeName="kubernetes.io/configmap/f97819d0-2840-4352-a435-19ef1a8c22c9-trusted-ca" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.378064 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="58405741-598c-4bf5-bbc8-1ca8e3f10995" volumeName="kubernetes.io/secret/58405741-598c-4bf5-bbc8-1ca8e3f10995-metrics-tls" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.378075 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="61fb4b86-f978-4ae1-80bc-18d2f386cbc2" volumeName="kubernetes.io/empty-dir/61fb4b86-f978-4ae1-80bc-18d2f386cbc2-operand-assets" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.378084 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a5ab1d5-dabd-45e7-a688-71a282f61e67" volumeName="kubernetes.io/projected/6a5ab1d5-dabd-45e7-a688-71a282f61e67-kube-api-access-lpvtc" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.378096 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71" volumeName="kubernetes.io/projected/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-kube-api-access-bz7v9" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.378106 
19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7f35cc1e-3376-4dbd-b215-2a32bf62cc71" volumeName="kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.378115 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a561a1d1-b20f-45fd-9e0c-ee4399a1d31b" volumeName="kubernetes.io/empty-dir/a561a1d1-b20f-45fd-9e0c-ee4399a1d31b-catalog-content" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.378124 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="21cbea73-f779-43e4-b5ba-d6fa06275d34" volumeName="kubernetes.io/secret/21cbea73-f779-43e4-b5ba-d6fa06275d34-serving-cert" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.378134 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4976e608-07a0-4cef-8fdd-7cec3324b4b5" volumeName="kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.378143 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e5956ebf-01e4-4d4c-ae6d-b0995905c6d3" volumeName="kubernetes.io/configmap/e5956ebf-01e4-4d4c-ae6d-b0995905c6d3-mcd-auth-proxy-config" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.378152 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="edda0d03-fdb2-4130-8f73-8057efd5815c" volumeName="kubernetes.io/projected/edda0d03-fdb2-4130-8f73-8057efd5815c-kube-api-access-h5gmv" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.378162 19170 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e68ab3cb-c372-45d9-a758-beaf4c213714" volumeName="kubernetes.io/projected/e68ab3cb-c372-45d9-a758-beaf4c213714-kube-api-access-zjhjj" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.378171 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f9728b4-4e1e-4165-a276-3daa00e95839" volumeName="kubernetes.io/projected/3f9728b4-4e1e-4165-a276-3daa00e95839-kube-api-access-xr6vn" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.378179 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="95d4e785-6663-417d-b380-6905773613c8" volumeName="kubernetes.io/projected/95d4e785-6663-417d-b380-6905773613c8-kube-api-access-nhcll" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.378189 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="78d2cd80-23b9-426d-a7ac-1daa27668a47" volumeName="kubernetes.io/projected/78d2cd80-23b9-426d-a7ac-1daa27668a47-kube-api-access-mxctn" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.378199 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8c6bf2d5-1881-4b63-b247-7e7426707fa1" volumeName="kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cert" seLinuxMountContext="" Mar 13 01:18:59.378199 master-0 kubenswrapper[19170]: I0313 01:18:59.378209 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2c4c579b-0643-47ac-a729-017c326b0ecc" volumeName="kubernetes.io/secret/2c4c579b-0643-47ac-a729-017c326b0ecc-catalogserver-certs" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 
01:18:59.378221 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="58035e42-37d8-48f6-9861-9b4ce6014119" volumeName="kubernetes.io/configmap/58035e42-37d8-48f6-9861-9b4ce6014119-config" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378231 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="90e6e63d-3cf2-4bb5-883f-6219a0b52c3a" volumeName="kubernetes.io/projected/90e6e63d-3cf2-4bb5-883f-6219a0b52c3a-kube-api-access-6hhwp" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378241 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d56480e0-0885-41e5-a1fc-931a068fbadb" volumeName="kubernetes.io/secret/d56480e0-0885-41e5-a1fc-931a068fbadb-serving-cert" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378251 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="039acb44-a9b3-4ad6-a091-be4d18edc34f" volumeName="kubernetes.io/projected/039acb44-a9b3-4ad6-a091-be4d18edc34f-kube-api-access-kkdbm" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378261 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="04470d64-c6eb-4a62-ae75-2a1d3dfdd53a" volumeName="kubernetes.io/empty-dir/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-node-exporter-textfile" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378271 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22587300-2448-4862-9fd8-68197d17a9f2" volumeName="kubernetes.io/configmap/22587300-2448-4862-9fd8-68197d17a9f2-config" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 
01:18:59.378280 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35" volumeName="kubernetes.io/projected/b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35-kube-api-access" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378291 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f97819d0-2840-4352-a435-19ef1a8c22c9" volumeName="kubernetes.io/projected/f97819d0-2840-4352-a435-19ef1a8c22c9-kube-api-access-fzklz" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378301 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2937cbe2-3125-4c3f-96f8-2febeb5942cc" volumeName="kubernetes.io/projected/2937cbe2-3125-4c3f-96f8-2febeb5942cc-kube-api-access-spxfj" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378311 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ac2a4c90-32db-4464-8c47-acbcafbcd5d0" volumeName="kubernetes.io/empty-dir/ac2a4c90-32db-4464-8c47-acbcafbcd5d0-ready" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378322 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="631f5719-2083-4c99-92cb-2ddc04022d86" volumeName="kubernetes.io/configmap/631f5719-2083-4c99-92cb-2ddc04022d86-proxy-ca-bundles" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378333 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3" volumeName="kubernetes.io/projected/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3-ca-certs" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 
01:18:59.378344 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4edb3e1a-9082-4fc2-ae6f-99d49c078a34" volumeName="kubernetes.io/secret/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-ovn-node-metrics-cert" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378353 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="738ebdcd-b78b-495a-b8f2-84af11a7d35c" volumeName="kubernetes.io/secret/738ebdcd-b78b-495a-b8f2-84af11a7d35c-serving-cert" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378364 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="21cbea73-f779-43e4-b5ba-d6fa06275d34" volumeName="kubernetes.io/configmap/21cbea73-f779-43e4-b5ba-d6fa06275d34-etcd-ca" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378375 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="738ebdcd-b78b-495a-b8f2-84af11a7d35c" volumeName="kubernetes.io/configmap/738ebdcd-b78b-495a-b8f2-84af11a7d35c-image-import-ca" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378385 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d1153bb3-30dd-458f-b0a4-c05358a8b3f8" volumeName="kubernetes.io/configmap/d1153bb3-30dd-458f-b0a4-c05358a8b3f8-ovnkube-identity-cm" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378395 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="13fac7b0-ce55-467d-9d0c-6a122d87cb3c" volumeName="kubernetes.io/projected/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-kube-api-access-wh2bv" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 
01:18:59.378405 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a9462e2e-728d-4076-a876-31dbbd637581" volumeName="kubernetes.io/projected/a9462e2e-728d-4076-a876-31dbbd637581-kube-api-access-sddd9" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378417 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57eb2020-1560-4352-8b86-76db59de933a" volumeName="kubernetes.io/secret/57eb2020-1560-4352-8b86-76db59de933a-etcd-client" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378427 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="70c097a1-90d9-4344-b0ae-5a59ec2ad8ad" volumeName="kubernetes.io/secret/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-metrics-tls" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378437 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="738ebdcd-b78b-495a-b8f2-84af11a7d35c" volumeName="kubernetes.io/configmap/738ebdcd-b78b-495a-b8f2-84af11a7d35c-config" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378446 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9db888f0-51b6-43cf-8337-69d2d5cc2b0a" volumeName="kubernetes.io/projected/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-kube-api-access-8gj56" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378455 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cada5bf2-e208-4fd8-bdf5-de8cad31a665" volumeName="kubernetes.io/projected/cada5bf2-e208-4fd8-bdf5-de8cad31a665-kube-api-access-929r9" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 
01:18:59.378465 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d278ed70-786c-4b6c-9f04-f08ede704569" volumeName="kubernetes.io/projected/d278ed70-786c-4b6c-9f04-f08ede704569-kube-api-access-7f82n" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378474 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="04470d64-c6eb-4a62-ae75-2a1d3dfdd53a" volumeName="kubernetes.io/secret/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-node-exporter-tls" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378483 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f9728b4-4e1e-4165-a276-3daa00e95839" volumeName="kubernetes.io/empty-dir/3f9728b4-4e1e-4165-a276-3daa00e95839-catalog-content" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378492 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="486c7e33-3dd8-4a98-87e3-8216ee2e05ef" volumeName="kubernetes.io/configmap/486c7e33-3dd8-4a98-87e3-8216ee2e05ef-config" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378501 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b3a9c0f6-cfde-4ae8-952a-00e2fb862482" volumeName="kubernetes.io/projected/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-kube-api-access-42sqh" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378509 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="77fd9062-0f7d-4255-92ca-7e4325daeddd" volumeName="kubernetes.io/configmap/77fd9062-0f7d-4255-92ca-7e4325daeddd-service-ca-bundle" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: 
I0313 01:18:59.378519 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="916d9fc9-388b-4506-a17c-36a7f626356a" volumeName="kubernetes.io/secret/916d9fc9-388b-4506-a17c-36a7f626356a-serving-cert" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378528 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0671fdd0-b358-40f9-ae49-2c5a9004edb3" volumeName="kubernetes.io/projected/0671fdd0-b358-40f9-ae49-2c5a9004edb3-kube-api-access-ftn5x" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378537 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0671fdd0-b358-40f9-ae49-2c5a9004edb3" volumeName="kubernetes.io/secret/0671fdd0-b358-40f9-ae49-2c5a9004edb3-default-certificate" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378546 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d278ed70-786c-4b6c-9f04-f08ede704569" volumeName="kubernetes.io/configmap/d278ed70-786c-4b6c-9f04-f08ede704569-auth-proxy-config" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378561 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2c4c579b-0643-47ac-a729-017c326b0ecc" volumeName="kubernetes.io/projected/2c4c579b-0643-47ac-a729-017c326b0ecc-kube-api-access-g44dw" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378572 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9db888f0-51b6-43cf-8337-69d2d5cc2b0a" volumeName="kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-secret-metrics-client-certs" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 
kubenswrapper[19170]: I0313 01:18:59.378582 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b3a9c0f6-cfde-4ae8-952a-00e2fb862482" volumeName="kubernetes.io/secret/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-cloud-controller-manager-operator-tls" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378592 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="04470d64-c6eb-4a62-ae75-2a1d3dfdd53a" volumeName="kubernetes.io/projected/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-kube-api-access-pwxhc" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378602 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0671fdd0-b358-40f9-ae49-2c5a9004edb3" volumeName="kubernetes.io/configmap/0671fdd0-b358-40f9-ae49-2c5a9004edb3-service-ca-bundle" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378611 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="78d2cd80-23b9-426d-a7ac-1daa27668a47" volumeName="kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378621 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8c6bf2d5-1881-4b63-b247-7e7426707fa1" volumeName="kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cluster-baremetal-operator-tls" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378678 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d23bbaec-b635-4649-b26e-2829f32d21f0" volumeName="kubernetes.io/projected/d23bbaec-b635-4649-b26e-2829f32d21f0-kube-api-access-szx9m" 
seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378696 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1308fba1-a50d-48b3-b272-7bef44727b7f" volumeName="kubernetes.io/configmap/1308fba1-a50d-48b3-b272-7bef44727b7f-env-overrides" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378708 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="486c7e33-3dd8-4a98-87e3-8216ee2e05ef" volumeName="kubernetes.io/secret/486c7e33-3dd8-4a98-87e3-8216ee2e05ef-serving-cert" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378721 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="21cbea73-f779-43e4-b5ba-d6fa06275d34" volumeName="kubernetes.io/configmap/21cbea73-f779-43e4-b5ba-d6fa06275d34-etcd-service-ca" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378734 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="77fd9062-0f7d-4255-92ca-7e4325daeddd" volumeName="kubernetes.io/secret/77fd9062-0f7d-4255-92ca-7e4325daeddd-serving-cert" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378748 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b3a9c0f6-cfde-4ae8-952a-00e2fb862482" volumeName="kubernetes.io/configmap/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-images" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378761 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4976e608-07a0-4cef-8fdd-7cec3324b4b5" volumeName="kubernetes.io/configmap/4976e608-07a0-4cef-8fdd-7cec3324b4b5-images" seLinuxMountContext="" Mar 13 
01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378774 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4edb3e1a-9082-4fc2-ae6f-99d49c078a34" volumeName="kubernetes.io/projected/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-kube-api-access-6tfdv" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378787 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6c88187c-d011-4043-a6d3-4a8a7ec4e204" volumeName="kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378799 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9db888f0-51b6-43cf-8337-69d2d5cc2b0a" volumeName="kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-secret-metrics-server-tls" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378814 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="da44d750-31e5-46f4-b3ef-dd4384c22aaf" volumeName="kubernetes.io/configmap/da44d750-31e5-46f4-b3ef-dd4384c22aaf-signing-cabundle" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378828 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2ce47660-f7cc-4669-a00d-83422f0f6d55" volumeName="kubernetes.io/secret/2ce47660-f7cc-4669-a00d-83422f0f6d55-apiservice-cert" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378841 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="30f7537e-93ed-466b-ba24-78141d004b2f" volumeName="kubernetes.io/secret/30f7537e-93ed-466b-ba24-78141d004b2f-kube-state-metrics-tls" seLinuxMountContext="" Mar 
13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378853 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2679c6e1-11c1-450c-b03a-30d7ee59ff6f" volumeName="kubernetes.io/secret/2679c6e1-11c1-450c-b03a-30d7ee59ff6f-webhook-certs" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378866 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d5456c8b-3c98-4824-8700-a04e9c12fb2e" volumeName="kubernetes.io/projected/d5456c8b-3c98-4824-8700-a04e9c12fb2e-kube-api-access-mnxgm" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378883 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57eb2020-1560-4352-8b86-76db59de933a" volumeName="kubernetes.io/secret/57eb2020-1560-4352-8b86-76db59de933a-encryption-config" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378895 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="61fb4b86-f978-4ae1-80bc-18d2f386cbc2" volumeName="kubernetes.io/projected/61fb4b86-f978-4ae1-80bc-18d2f386cbc2-kube-api-access-lrf2s" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378907 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="90e6e63d-3cf2-4bb5-883f-6219a0b52c3a" volumeName="kubernetes.io/secret/90e6e63d-3cf2-4bb5-883f-6219a0b52c3a-cloud-credential-operator-serving-cert" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378919 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f97819d0-2840-4352-a435-19ef1a8c22c9" 
volumeName="kubernetes.io/secret/f97819d0-2840-4352-a435-19ef1a8c22c9-image-registry-operator-tls" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378932 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="30f7537e-93ed-466b-ba24-78141d004b2f" volumeName="kubernetes.io/configmap/30f7537e-93ed-466b-ba24-78141d004b2f-metrics-client-ca" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378948 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57eb2020-1560-4352-8b86-76db59de933a" volumeName="kubernetes.io/secret/57eb2020-1560-4352-8b86-76db59de933a-serving-cert" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378960 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1c24e17c-8bd9-4c23-9876-6f31c9da5cd1" volumeName="kubernetes.io/projected/1c24e17c-8bd9-4c23-9876-6f31c9da5cd1-kube-api-access-dqsdm" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378971 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f9b713fb-64ce-4a01-951c-1f31df62e1ae" volumeName="kubernetes.io/projected/f9b713fb-64ce-4a01-951c-1f31df62e1ae-kube-api-access-4chtg" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378983 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1c24e17c-8bd9-4c23-9876-6f31c9da5cd1" volumeName="kubernetes.io/configmap/1c24e17c-8bd9-4c23-9876-6f31c9da5cd1-config" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.378995 19170 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="57eb2020-1560-4352-8b86-76db59de933a" volumeName="kubernetes.io/configmap/57eb2020-1560-4352-8b86-76db59de933a-trusted-ca-bundle" seLinuxMountContext="" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.379007 19170 reconstruct.go:97] "Volume reconstruction finished" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.379016 19170 reconciler.go:26] "Reconciler: start to sync state" Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.350493 19170 factory.go:55] Registering systemd factory Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.379505 19170 factory.go:221] Registration of the systemd container factory successfully Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.379843 19170 factory.go:153] Registering CRI-O factory Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.379860 19170 factory.go:221] Registration of the crio container factory successfully Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.379952 19170 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.379975 19170 factory.go:103] Registering Raw factory Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.379993 19170 manager.go:1196] Started watching for new ooms in manager Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.380450 19170 manager.go:319] Starting recovery of all containers Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.380875 19170 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 13 01:18:59.392507 master-0 kubenswrapper[19170]: I0313 01:18:59.385160 19170 reconstruct.go:205] 
"DevicePaths of reconstructed volumes updated" Mar 13 01:18:59.416278 master-0 kubenswrapper[19170]: I0313 01:18:59.416198 19170 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 13 01:18:59.418061 master-0 kubenswrapper[19170]: I0313 01:18:59.418014 19170 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 13 01:18:59.418119 master-0 kubenswrapper[19170]: I0313 01:18:59.418096 19170 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 13 01:18:59.418184 master-0 kubenswrapper[19170]: I0313 01:18:59.418122 19170 kubelet.go:2335] "Starting kubelet main sync loop" Mar 13 01:18:59.418286 master-0 kubenswrapper[19170]: E0313 01:18:59.418245 19170 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 13 01:18:59.420127 master-0 kubenswrapper[19170]: I0313 01:18:59.420098 19170 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 13 01:18:59.436753 master-0 kubenswrapper[19170]: I0313 01:18:59.436702 19170 generic.go:334] "Generic (PLEG): container finished" podID="36b2d6ee-3ae7-444b-b327-f024a8a06ab7" containerID="e8004c606441f404a88824ad0f391a1736bc2bfc8d968e181bc7750c3498d909" exitCode=0 Mar 13 01:18:59.443405 master-0 kubenswrapper[19170]: I0313 01:18:59.441267 19170 generic.go:334] "Generic (PLEG): container finished" podID="95d4e785-6663-417d-b380-6905773613c8" containerID="4f1d391f9ccf9712ce599023f3ef26e7463c6ad87dcaaba9b59f13a56ea3cd24" exitCode=0 Mar 13 01:18:59.444876 master-0 kubenswrapper[19170]: I0313 01:18:59.444843 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/2.log" Mar 13 01:18:59.445200 master-0 kubenswrapper[19170]: I0313 01:18:59.445169 19170 generic.go:334] "Generic (PLEG): 
container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="66d7d7db2b542e86a40857789246d4abcbc4fca4d74d92d6366f9e8e1aa401b5" exitCode=1 Mar 13 01:18:59.445200 master-0 kubenswrapper[19170]: I0313 01:18:59.445193 19170 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="ef509e1db76d33faa725783e475ec527d0db77ed62b28bc3717960d043c585e4" exitCode=0 Mar 13 01:18:59.464768 master-0 kubenswrapper[19170]: I0313 01:18:59.464707 19170 generic.go:334] "Generic (PLEG): container finished" podID="a561a1d1-b20f-45fd-9e0c-ee4399a1d31b" containerID="28ba9aa037bbe7233c4f1b03cbf6e53b0a8d15d86593c5a55e6b347f9655f21f" exitCode=0 Mar 13 01:18:59.464768 master-0 kubenswrapper[19170]: I0313 01:18:59.464742 19170 generic.go:334] "Generic (PLEG): container finished" podID="a561a1d1-b20f-45fd-9e0c-ee4399a1d31b" containerID="1516e86a1559523ee925b4efda9712253b30e34e1c1002f5e1e5d03874fe6d41" exitCode=0 Mar 13 01:18:59.467256 master-0 kubenswrapper[19170]: I0313 01:18:59.467193 19170 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="ae1c74ac713339ebe951cea485ddb317986dccb644eb4d3021ce0d21c709fe41" exitCode=0 Mar 13 01:18:59.467256 master-0 kubenswrapper[19170]: I0313 01:18:59.467221 19170 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="be52d87237e2c88231046564bd2dfcdbd780faa45f3647245e1d0a9837eb7182" exitCode=0 Mar 13 01:18:59.482775 master-0 kubenswrapper[19170]: I0313 01:18:59.482251 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7c6989d6c4-bxqp2_f9b713fb-64ce-4a01-951c-1f31df62e1ae/authentication-operator/1.log" Mar 13 01:18:59.482775 master-0 kubenswrapper[19170]: I0313 01:18:59.482292 19170 generic.go:334] "Generic (PLEG): container finished" podID="f9b713fb-64ce-4a01-951c-1f31df62e1ae" 
containerID="70de036e29496f4383b5254fadb7f8de5e3a8b0ccfbf1cbf424f334a6aaf519d" exitCode=255 Mar 13 01:18:59.502662 master-0 kubenswrapper[19170]: I0313 01:18:59.502536 19170 generic.go:334] "Generic (PLEG): container finished" podID="78d2cd80-23b9-426d-a7ac-1daa27668a47" containerID="a7777dcfd1f119b7ebee2758548cf5f87c3fa74673968a6a77a0056cee905b05" exitCode=0 Mar 13 01:18:59.505271 master-0 kubenswrapper[19170]: I0313 01:18:59.505224 19170 generic.go:334] "Generic (PLEG): container finished" podID="bfc49699-9428-4bff-804d-da0e60551759" containerID="ea2d52d5a4050a1c6648d96fe621d92a024e84b0306d25332de51586b15ec9dd" exitCode=0 Mar 13 01:18:59.518434 master-0 kubenswrapper[19170]: E0313 01:18:59.518402 19170 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 13 01:18:59.535727 master-0 kubenswrapper[19170]: I0313 01:18:59.535688 19170 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="b828b9875ee5a4c86417590e983cd65cc09701fdf10c008a5d18f75844d13abc" exitCode=0 Mar 13 01:18:59.536532 master-0 kubenswrapper[19170]: I0313 01:18:59.536514 19170 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="77c3585c80eaa5d7f0873a20062967927c8ada28f17899fd783cafcf3b80dbee" exitCode=0 Mar 13 01:18:59.536752 master-0 kubenswrapper[19170]: I0313 01:18:59.536734 19170 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="3acba3e036ccbe49f5868539abac9a81fe12130cf6faa99400b25eb27b4d707a" exitCode=0 Mar 13 01:18:59.545788 master-0 kubenswrapper[19170]: I0313 01:18:59.545742 19170 generic.go:334] "Generic (PLEG): container finished" podID="916d9fc9-388b-4506-a17c-36a7f626356a" containerID="a41bcaf653995a95790b4be685f8a8f91dff8546aa69d956c2d939af740c0286" exitCode=0 Mar 13 01:18:59.548497 master-0 kubenswrapper[19170]: I0313 01:18:59.548469 19170 generic.go:334] 
"Generic (PLEG): container finished" podID="57eb2020-1560-4352-8b86-76db59de933a" containerID="408fa86e57d7c0ed7566e66a9206de42b73c3a8d5d5b9b39423211e50e66920f" exitCode=0 Mar 13 01:18:59.554441 master-0 kubenswrapper[19170]: I0313 01:18:59.554422 19170 generic.go:334] "Generic (PLEG): container finished" podID="21cbea73-f779-43e4-b5ba-d6fa06275d34" containerID="d3c0f89339f815b6350fac22cc030760b6b90e8219eb2eb8f1fd3b1e19f0b649" exitCode=0 Mar 13 01:18:59.557426 master-0 kubenswrapper[19170]: I0313 01:18:59.557413 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-64488f9d78-bqmmf_d56480e0-0885-41e5-a1fc-931a068fbadb/openshift-config-operator/2.log" Mar 13 01:18:59.557869 master-0 kubenswrapper[19170]: I0313 01:18:59.557853 19170 generic.go:334] "Generic (PLEG): container finished" podID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerID="09062ce5a8bfb21fa147439adb4aa83615da8c3397079fc199132224f09f0600" exitCode=255 Mar 13 01:18:59.557945 master-0 kubenswrapper[19170]: I0313 01:18:59.557926 19170 generic.go:334] "Generic (PLEG): container finished" podID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerID="77742db3e18b710fed8057a5ff63f6e99d45794674fb37f85d739e62dd3a751e" exitCode=0 Mar 13 01:18:59.561981 master-0 kubenswrapper[19170]: I0313 01:18:59.561949 19170 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerID="6504d7e301bf358b754655b91d4ab759cf0c2f8f3e948397175f8d84bd27b624" exitCode=0 Mar 13 01:18:59.564109 master-0 kubenswrapper[19170]: I0313 01:18:59.564091 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-8d675b596-tq7n6_2bd94289-7109-4419-9a51-bd289082b9f5/multus-admission-controller/0.log" Mar 13 01:18:59.564171 master-0 kubenswrapper[19170]: I0313 01:18:59.564119 19170 generic.go:334] "Generic (PLEG): container finished" podID="2bd94289-7109-4419-9a51-bd289082b9f5" 
containerID="2feba70148d30ccf8b16cda1bcbd40be5871412538af7a3c70d1cbdb9b96ea4e" exitCode=0 Mar 13 01:18:59.564171 master-0 kubenswrapper[19170]: I0313 01:18:59.564130 19170 generic.go:334] "Generic (PLEG): container finished" podID="2bd94289-7109-4419-9a51-bd289082b9f5" containerID="cabc0d0daac0ff5b74f3e06882a4fbae2aaadefec9cc5e2009027b89d0897c41" exitCode=137 Mar 13 01:18:59.568960 master-0 kubenswrapper[19170]: I0313 01:18:59.568936 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-47sjr_8c6bf2d5-1881-4b63-b247-7e7426707fa1/cluster-baremetal-operator/0.log" Mar 13 01:18:59.569069 master-0 kubenswrapper[19170]: I0313 01:18:59.569049 19170 generic.go:334] "Generic (PLEG): container finished" podID="8c6bf2d5-1881-4b63-b247-7e7426707fa1" containerID="4f6218cc31120287103af84277d2f84084c2bfa51936dccb9cf930aa141d948d" exitCode=1 Mar 13 01:18:59.571040 master-0 kubenswrapper[19170]: I0313 01:18:59.571020 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-znqwc_d1153bb3-30dd-458f-b0a4-c05358a8b3f8/approver/0.log" Mar 13 01:18:59.571356 master-0 kubenswrapper[19170]: I0313 01:18:59.571340 19170 generic.go:334] "Generic (PLEG): container finished" podID="d1153bb3-30dd-458f-b0a4-c05358a8b3f8" containerID="8a645f3b183df337e2bb471f99ef18a88ef8e03f78dccc126ecc0415b40abdab" exitCode=1 Mar 13 01:18:59.587607 master-0 kubenswrapper[19170]: I0313 01:18:59.587369 19170 generic.go:334] "Generic (PLEG): container finished" podID="58035e42-37d8-48f6-9861-9b4ce6014119" containerID="662b7543988e07c43f9b30d00fca727f77728c7aa21bd39d21414f56d158c6c9" exitCode=0 Mar 13 01:18:59.606703 master-0 kubenswrapper[19170]: I0313 01:18:59.604899 19170 generic.go:334] "Generic (PLEG): container finished" podID="22587300-2448-4862-9fd8-68197d17a9f2" containerID="0e6fdad2e1926f784b1c498cd01186eeb32850cc4a0f69925bc0668ef060c2a8" exitCode=0 Mar 13 01:18:59.625888 
master-0 kubenswrapper[19170]: I0313 01:18:59.620721 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-6-master-0_0b3a64f4-e94f-4916-8c91-a255d987735d/installer/0.log" Mar 13 01:18:59.625888 master-0 kubenswrapper[19170]: I0313 01:18:59.620757 19170 generic.go:334] "Generic (PLEG): container finished" podID="0b3a64f4-e94f-4916-8c91-a255d987735d" containerID="6b91817f5b3f7a0651a092d44f47916346942c3944860ca84cb9f688537c7ce3" exitCode=1 Mar 13 01:18:59.631810 master-0 kubenswrapper[19170]: I0313 01:18:59.631745 19170 generic.go:334] "Generic (PLEG): container finished" podID="3f9728b4-4e1e-4165-a276-3daa00e95839" containerID="e632cd30baff456f3e4dd542f0b7173c07418706f4cc94eff880efa35261e3c0" exitCode=0 Mar 13 01:18:59.631810 master-0 kubenswrapper[19170]: I0313 01:18:59.631803 19170 generic.go:334] "Generic (PLEG): container finished" podID="3f9728b4-4e1e-4165-a276-3daa00e95839" containerID="aa38a63f384b7da874941350b67401ef32b738eda3d617175799f0520f0661d5" exitCode=0 Mar 13 01:18:59.639890 master-0 kubenswrapper[19170]: I0313 01:18:59.639855 19170 generic.go:334] "Generic (PLEG): container finished" podID="d23bbaec-b635-4649-b26e-2829f32d21f0" containerID="2a303738597489cb37ad3f267fff17f1f61cb2e88b4598e4de81e45d9cdb8d55" exitCode=0 Mar 13 01:18:59.639890 master-0 kubenswrapper[19170]: I0313 01:18:59.639885 19170 generic.go:334] "Generic (PLEG): container finished" podID="d23bbaec-b635-4649-b26e-2829f32d21f0" containerID="682be9d920ceec9bf69789866cf8eedebb71157fd9c01901ddaedd2fde2be709" exitCode=0 Mar 13 01:18:59.645385 master-0 kubenswrapper[19170]: I0313 01:18:59.645354 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-2slj5_3d2e7338-a6d6-4872-ab72-a4e631075ab3/snapshot-controller/1.log" Mar 13 01:18:59.645494 master-0 kubenswrapper[19170]: I0313 01:18:59.645404 19170 generic.go:334] "Generic (PLEG): container finished" 
podID="3d2e7338-a6d6-4872-ab72-a4e631075ab3" containerID="63acc12abb985dff806c0ddcd9c27446b325df44a94150a2e781b1a28ec52869" exitCode=1 Mar 13 01:18:59.652084 master-0 kubenswrapper[19170]: I0313 01:18:59.652040 19170 generic.go:334] "Generic (PLEG): container finished" podID="9c9c1c81-eae9-4481-9870-b598deb1dcac" containerID="90f1b6677182d94c10ac2334ab8747391e467bc270fab2bd3e7e7b1c8a3cd1c7" exitCode=0 Mar 13 01:18:59.654054 master-0 kubenswrapper[19170]: I0313 01:18:59.654017 19170 generic.go:334] "Generic (PLEG): container finished" podID="b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35" containerID="2703e40a08051a608961078f9b2c331b07b8ffa237b00eb643f4e928fb008663" exitCode=0 Mar 13 01:18:59.664772 master-0 kubenswrapper[19170]: I0313 01:18:59.664302 19170 generic.go:334] "Generic (PLEG): container finished" podID="728435e4-9fdb-4fea-9f5b-eb5ff5444da0" containerID="8803ea2eb582c5693311e889e291d05e3059cc337f89e85079fab8e693f3beb8" exitCode=0 Mar 13 01:18:59.667386 master-0 kubenswrapper[19170]: I0313 01:18:59.667351 19170 generic.go:334] "Generic (PLEG): container finished" podID="61fb4b86-f978-4ae1-80bc-18d2f386cbc2" containerID="85752463126f89fa0e5e1418516974da87fce8b92150573ae7e0d2915937dc43" exitCode=0 Mar 13 01:18:59.667386 master-0 kubenswrapper[19170]: I0313 01:18:59.667380 19170 generic.go:334] "Generic (PLEG): container finished" podID="61fb4b86-f978-4ae1-80bc-18d2f386cbc2" containerID="4d990b5e894ae9b6e30a48b23a8c0721805ad4a05730ef2a8a80e7f39e6f738b" exitCode=0 Mar 13 01:18:59.667386 master-0 kubenswrapper[19170]: I0313 01:18:59.667387 19170 generic.go:334] "Generic (PLEG): container finished" podID="61fb4b86-f978-4ae1-80bc-18d2f386cbc2" containerID="e25bc60853f66a5d6c7e1021efdd8403103d53c529624ce5e308b8d3dfb44aaf" exitCode=0 Mar 13 01:18:59.687395 master-0 kubenswrapper[19170]: I0313 01:18:59.687357 19170 generic.go:334] "Generic (PLEG): container finished" podID="4edb3e1a-9082-4fc2-ae6f-99d49c078a34" 
containerID="d00fb05f88d59786ab92f821f00f790d94c0eeac3280854affdf40137d7e87d0" exitCode=0 Mar 13 01:18:59.697208 master-0 kubenswrapper[19170]: I0313 01:18:59.697160 19170 generic.go:334] "Generic (PLEG): container finished" podID="690f916b-6f87-42d9-8168-392a9177bee9" containerID="f3c19acecbccf7bd6716e7d44a9b0fc9bb63ca007ca5d04b416b934ef2cbe52c" exitCode=0 Mar 13 01:18:59.703046 master-0 kubenswrapper[19170]: I0313 01:18:59.701311 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-6598bfb6c4-2wh5w_30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3/manager/0.log" Mar 13 01:18:59.703046 master-0 kubenswrapper[19170]: I0313 01:18:59.701341 19170 generic.go:334] "Generic (PLEG): container finished" podID="30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3" containerID="5c71af40ce92aac85f827794e9d207c4ed4bc300599f6601dc56da9d0a8b0d8b" exitCode=1 Mar 13 01:18:59.703998 master-0 kubenswrapper[19170]: I0313 01:18:59.703977 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_47806631-9d60-4658-832d-f160f93f42ea/installer/0.log" Mar 13 01:18:59.704073 master-0 kubenswrapper[19170]: I0313 01:18:59.704005 19170 generic.go:334] "Generic (PLEG): container finished" podID="47806631-9d60-4658-832d-f160f93f42ea" containerID="71b98806c78a21853872bf216fdc04280da7bf4d8777bb06b2a922047a6a9e8c" exitCode=1 Mar 13 01:18:59.719713 master-0 kubenswrapper[19170]: E0313 01:18:59.718780 19170 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 13 01:18:59.720063 master-0 kubenswrapper[19170]: I0313 01:18:59.720042 19170 generic.go:334] "Generic (PLEG): container finished" podID="738ebdcd-b78b-495a-b8f2-84af11a7d35c" containerID="ec2e5e0e9f2f0d0bb48be3bbf455c597567e5ab58c590a1a48ffa8bb7da7c8c1" exitCode=0 Mar 13 01:18:59.727883 master-0 kubenswrapper[19170]: I0313 01:18:59.727862 19170 generic.go:334] "Generic (PLEG): 
container finished" podID="1308fba1-a50d-48b3-b272-7bef44727b7f" containerID="0743267ed603ff9ac6856c0cefcbee7aeda537c3eb2e7c92d9ca09c61963bcad" exitCode=0 Mar 13 01:18:59.732522 master-0 kubenswrapper[19170]: I0313 01:18:59.732495 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-3-master-0_29e096ea-ca9d-477b-b0aa-1d10244d51d9/installer/0.log" Mar 13 01:18:59.732583 master-0 kubenswrapper[19170]: I0313 01:18:59.732532 19170 generic.go:334] "Generic (PLEG): container finished" podID="29e096ea-ca9d-477b-b0aa-1d10244d51d9" containerID="167e9a0418be9c64d38402cc471015911f91f7d101628f86049fb49485d8495a" exitCode=1 Mar 13 01:18:59.736241 master-0 kubenswrapper[19170]: I0313 01:18:59.736222 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8565d84698-sslxh_039acb44-a9b3-4ad6-a091-be4d18edc34f/openshift-controller-manager-operator/0.log" Mar 13 01:18:59.737725 master-0 kubenswrapper[19170]: I0313 01:18:59.737706 19170 generic.go:334] "Generic (PLEG): container finished" podID="039acb44-a9b3-4ad6-a091-be4d18edc34f" containerID="757884fd1e54a4728f490aa384fee80c41466484bac2e993d7373c9b6d19ad0a" exitCode=1 Mar 13 01:18:59.743154 master-0 kubenswrapper[19170]: I0313 01:18:59.743080 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-kdn2l_70c097a1-90d9-4344-b0ae-5a59ec2ad8ad/ingress-operator/0.log" Mar 13 01:18:59.743227 master-0 kubenswrapper[19170]: I0313 01:18:59.743147 19170 generic.go:334] "Generic (PLEG): container finished" podID="70c097a1-90d9-4344-b0ae-5a59ec2ad8ad" containerID="bb00bc21a2b9f11b41d1750186297e3a5ca651c2efe8531d5b69fd560b0ba268" exitCode=1 Mar 13 01:18:59.774937 master-0 kubenswrapper[19170]: I0313 01:18:59.774830 19170 generic.go:334] "Generic (PLEG): container finished" podID="04470d64-c6eb-4a62-ae75-2a1d3dfdd53a" 
containerID="b73ff7b10ef47505ecf38b484d44b39c71e1d9b7b2e9d15b9f215185c43203db" exitCode=0 Mar 13 01:18:59.781470 master-0 kubenswrapper[19170]: I0313 01:18:59.781435 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7f8b8b6f4c-7fc8j_2c4c579b-0643-47ac-a729-017c326b0ecc/manager/0.log" Mar 13 01:18:59.781995 master-0 kubenswrapper[19170]: I0313 01:18:59.781956 19170 generic.go:334] "Generic (PLEG): container finished" podID="2c4c579b-0643-47ac-a729-017c326b0ecc" containerID="123227ec95c803688ea3b73355951521f6c2e9af5b64fd2cfda9372aa3bb75d9" exitCode=1 Mar 13 01:18:59.785085 master-0 kubenswrapper[19170]: I0313 01:18:59.784598 19170 generic.go:334] "Generic (PLEG): container finished" podID="abf2ead5-b97d-4160-8120-28cb8a3d843e" containerID="7c07bc771c953fa9d34f82960a8b9fd12b63e9a86c930f999ffe77b37e0a74ef" exitCode=0 Mar 13 01:18:59.788774 master-0 kubenswrapper[19170]: I0313 01:18:59.788619 19170 generic.go:334] "Generic (PLEG): container finished" podID="631f5719-2083-4c99-92cb-2ddc04022d86" containerID="06ac9b51646581f1c80a1fe6c8f1464c3cd508ac9e8d30abbb2fa8a9cb8c167d" exitCode=0 Mar 13 01:18:59.791307 master-0 kubenswrapper[19170]: I0313 01:18:59.791274 19170 generic.go:334] "Generic (PLEG): container finished" podID="ca2fa86b-a966-49dc-8577-d2b54b111d14" containerID="18882e60a1d7cca045d564f7abc68da51216b8e9104fca3062ca7eec99d17c5e" exitCode=0 Mar 13 01:18:59.809131 master-0 kubenswrapper[19170]: I0313 01:18:59.809081 19170 generic.go:334] "Generic (PLEG): container finished" podID="4738c93d-62e6-44ce-a289-e646b9302e71" containerID="f64c75ed084248ad496cb98f6981ac7735f162ce7e7121ef5597b4e213d85ac5" exitCode=0 Mar 13 01:18:59.809131 master-0 kubenswrapper[19170]: I0313 01:18:59.809118 19170 generic.go:334] "Generic (PLEG): container finished" podID="4738c93d-62e6-44ce-a289-e646b9302e71" containerID="24a3795ab99401f37571431134fd1c761aa6f3ef1ba4c747faa0a5ee28b9f796" exitCode=0 Mar 13 01:18:59.809131 master-0 
kubenswrapper[19170]: I0313 01:18:59.809125 19170 generic.go:334] "Generic (PLEG): container finished" podID="4738c93d-62e6-44ce-a289-e646b9302e71" containerID="ac0c969b95b64c22e84de07c2976566813a316f1d691a27df3a1f4621768e238" exitCode=0 Mar 13 01:18:59.809131 master-0 kubenswrapper[19170]: I0313 01:18:59.809133 19170 generic.go:334] "Generic (PLEG): container finished" podID="4738c93d-62e6-44ce-a289-e646b9302e71" containerID="569d90d03ceda29b5f6ff80b99725d90e6a4f9724473ba5d3146ac49efbbe232" exitCode=0 Mar 13 01:18:59.809131 master-0 kubenswrapper[19170]: I0313 01:18:59.809140 19170 generic.go:334] "Generic (PLEG): container finished" podID="4738c93d-62e6-44ce-a289-e646b9302e71" containerID="03b1433799f1c9507de93fbd689d37d0b962300c0b8274b036071bcf3cc09941" exitCode=0 Mar 13 01:18:59.809131 master-0 kubenswrapper[19170]: I0313 01:18:59.809148 19170 generic.go:334] "Generic (PLEG): container finished" podID="4738c93d-62e6-44ce-a289-e646b9302e71" containerID="f1844314bd4c14c44c294275c228ee201df2f8be5daa877db9d32b69fb506d82" exitCode=0 Mar 13 01:18:59.820819 master-0 kubenswrapper[19170]: I0313 01:18:59.820779 19170 generic.go:334] "Generic (PLEG): container finished" podID="a1a56802af72ce1aac6b5077f1695ac0" containerID="d17962d23b6425584e2488789adc4521ea90d5d79405cef63024d09b7df17fd9" exitCode=1 Mar 13 01:18:59.827780 master-0 kubenswrapper[19170]: I0313 01:18:59.827723 19170 generic.go:334] "Generic (PLEG): container finished" podID="486c7e33-3dd8-4a98-87e3-8216ee2e05ef" containerID="451217a595b413aec4246a9b014bde1e3a621bb8bc794b9a2470a8f43c1c8d3b" exitCode=0 Mar 13 01:18:59.830555 master-0 kubenswrapper[19170]: I0313 01:18:59.830518 19170 generic.go:334] "Generic (PLEG): container finished" podID="a9c7c6a4-4f5b-4807-932c-1b0f53ceed22" containerID="c90d5f8b2e62149395ea03f276ff599cb9a6c656f64c9d31908bac0077615d31" exitCode=0 Mar 13 01:18:59.830555 master-0 kubenswrapper[19170]: I0313 01:18:59.830548 19170 generic.go:334] "Generic (PLEG): container finished" 
podID="a9c7c6a4-4f5b-4807-932c-1b0f53ceed22" containerID="76c90bf296df85a0e3ed051135157af3e8cd81617b8acdff6c18242b0b74f386" exitCode=0 Mar 13 01:19:00.069016 master-0 kubenswrapper[19170]: I0313 01:19:00.068930 19170 manager.go:324] Recovery completed Mar 13 01:19:00.119071 master-0 kubenswrapper[19170]: E0313 01:19:00.119004 19170 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 13 01:19:00.172709 master-0 kubenswrapper[19170]: I0313 01:19:00.172667 19170 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 13 01:19:00.172709 master-0 kubenswrapper[19170]: I0313 01:19:00.172694 19170 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 13 01:19:00.172709 master-0 kubenswrapper[19170]: I0313 01:19:00.172727 19170 state_mem.go:36] "Initialized new in-memory state store" Mar 13 01:19:00.173012 master-0 kubenswrapper[19170]: I0313 01:19:00.172897 19170 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 13 01:19:00.173012 master-0 kubenswrapper[19170]: I0313 01:19:00.172908 19170 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 13 01:19:00.173012 master-0 kubenswrapper[19170]: I0313 01:19:00.172926 19170 state_checkpoint.go:136] "State checkpoint: restored state from checkpoint" Mar 13 01:19:00.173012 master-0 kubenswrapper[19170]: I0313 01:19:00.172932 19170 state_checkpoint.go:137] "State checkpoint: defaultCPUSet" defaultCpuSet="" Mar 13 01:19:00.173012 master-0 kubenswrapper[19170]: I0313 01:19:00.172939 19170 policy_none.go:49] "None policy: Start" Mar 13 01:19:00.176702 master-0 kubenswrapper[19170]: I0313 01:19:00.176623 19170 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 13 01:19:00.176702 master-0 kubenswrapper[19170]: I0313 01:19:00.176689 19170 state_mem.go:35] "Initializing new in-memory state store" Mar 13 01:19:00.176939 master-0 kubenswrapper[19170]: I0313 01:19:00.176897 19170 state_mem.go:75] "Updated machine memory state" Mar 13 
01:19:00.176939 master-0 kubenswrapper[19170]: I0313 01:19:00.176910 19170 state_checkpoint.go:82] "State checkpoint: restored state from checkpoint" Mar 13 01:19:00.201132 master-0 kubenswrapper[19170]: I0313 01:19:00.201090 19170 manager.go:334] "Starting Device Plugin manager" Mar 13 01:19:00.201295 master-0 kubenswrapper[19170]: I0313 01:19:00.201168 19170 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 13 01:19:00.201295 master-0 kubenswrapper[19170]: I0313 01:19:00.201196 19170 server.go:79] "Starting device plugin registration server" Mar 13 01:19:00.201587 master-0 kubenswrapper[19170]: I0313 01:19:00.201567 19170 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 13 01:19:00.201662 master-0 kubenswrapper[19170]: I0313 01:19:00.201584 19170 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 13 01:19:00.202005 master-0 kubenswrapper[19170]: I0313 01:19:00.201970 19170 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 13 01:19:00.202464 master-0 kubenswrapper[19170]: I0313 01:19:00.202314 19170 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 13 01:19:00.202464 master-0 kubenswrapper[19170]: I0313 01:19:00.202466 19170 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 13 01:19:00.302765 master-0 kubenswrapper[19170]: I0313 01:19:00.302684 19170 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 01:19:00.307879 master-0 kubenswrapper[19170]: I0313 01:19:00.307839 19170 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 01:19:00.307879 master-0 kubenswrapper[19170]: I0313 01:19:00.307878 19170 kubelet_node_status.go:724] "Recording event message for node" node="master-0" 
event="NodeHasNoDiskPressure" Mar 13 01:19:00.307879 master-0 kubenswrapper[19170]: I0313 01:19:00.307886 19170 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 01:19:00.308081 master-0 kubenswrapper[19170]: I0313 01:19:00.307971 19170 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 13 01:19:00.313653 master-0 kubenswrapper[19170]: I0313 01:19:00.313148 19170 apiserver.go:52] "Watching apiserver" Mar 13 01:19:00.357717 master-0 kubenswrapper[19170]: E0313 01:19:00.352114 19170 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"master-0\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="master-0" Mar 13 01:19:00.379666 master-0 kubenswrapper[19170]: I0313 01:19:00.372239 19170 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 13 01:19:00.552426 master-0 kubenswrapper[19170]: I0313 01:19:00.552380 19170 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 01:19:00.554742 master-0 kubenswrapper[19170]: I0313 01:19:00.554700 19170 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 01:19:00.554812 master-0 kubenswrapper[19170]: I0313 01:19:00.554757 19170 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 01:19:00.554812 master-0 kubenswrapper[19170]: I0313 01:19:00.554771 19170 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 01:19:00.554913 master-0 kubenswrapper[19170]: I0313 01:19:00.554892 19170 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 13 01:19:00.591169 master-0 kubenswrapper[19170]: I0313 01:19:00.591121 19170 kubelet_node_status.go:115] "Node was previously registered" 
node="master-0" Mar 13 01:19:00.591445 master-0 kubenswrapper[19170]: I0313 01:19:00.591228 19170 kubelet_node_status.go:79] "Successfully registered node" node="master-0" Mar 13 01:19:00.919815 master-0 kubenswrapper[19170]: I0313 01:19:00.919617 19170 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0","openshift-kube-controller-manager/kube-controller-manager-master-0","kube-system/bootstrap-kube-scheduler-master-0","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0","openshift-kube-apiserver/kube-apiserver-master-0"] Mar 13 01:19:00.921377 master-0 kubenswrapper[19170]: I0313 01:19:00.921325 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecaa68379bdd11e9423cf73d3d2c095307d5c8cb6abca3a8566ead3a1f24f111" Mar 13 01:19:00.921474 master-0 kubenswrapper[19170]: I0313 01:19:00.921389 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-79f8cd6fdd-cnrhm","openshift-marketplace/redhat-operators-k52lh","openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-kjwgg","openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-v9nfg","assisted-installer/assisted-installer-controller-qpxft","openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf","openshift-kube-controller-manager/installer-3-retry-1-master-0","openshift-machine-api/control-plane-machine-set-operator-6686554ddc-w6qs7","openshift-machine-config-operator/machine-config-daemon-pmkpj","openshift-marketplace/redhat-marketplace-z254g","openshift-network-operator/network-operator-7c649bf6d4-bdc4j","openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-2slj5","openshift-cluster-version/cluster-version-operator-8c9c967c7-4d6fw","openshift-kube-apiserver/installer-1-retry-1-master-0","openshift-kube-controller-manager-operator/kube-controller-manag
er-operator-86d7cdfdfb-v9pv6","openshift-kube-scheduler/installer-6-master-0","openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g","openshift-monitoring/metrics-server-5575f756f4-hqr5q","openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mlslx","openshift-kube-storage-version-migrator/migrator-57ccdf9b5-kxxzc","openshift-marketplace/community-operators-bbptx","openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-wbmqn","openshift-machine-config-operator/machine-config-server-4gpcz","openshift-multus/multus-admission-controller-7769569c45-zm2jl","openshift-service-ca-operator/service-ca-operator-69b6fc6b88-2v42g","openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2","openshift-kube-apiserver/kube-apiserver-master-0","openshift-kube-controller-manager/kube-controller-manager-master-0","openshift-monitoring/prometheus-operator-5ff8674d55-6fh8b","openshift-multus/multus-additional-cni-plugins-xn5t5","openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n","openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9","openshift-cluster-node-tuning-operator/tuned-9vzj5","openshift-network-operator/iptables-alerter-qclwv","openshift-apiserver/apiserver-69c74d8d69-jpj8z","openshift-marketplace/marketplace-operator-64bf9778cb-dszg5","openshift-monitoring/openshift-state-metrics-74cc79fd76-6btfg","openshift-kube-controller-manager/installer-3-master-0","openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd","openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-ck7rt","openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-7nstm","openshift-etcd/etcd-master-0","openshift-insights/insights-operator-8f89dfddd-6k2t7","openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb","openshift-machine-api/machine-api-operator-84bf6db4f9-zt229","openshift-machine-config-operator/kube-rb
ac-proxy-crio-master-0","openshift-machine-config-operator/machine-config-controller-ff46b7bdf-g7wfh","openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j","openshift-dns-operator/dns-operator-589895fbb7-qvl2k","openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj","openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj","openshift-multus/multus-rvt5h","openshift-oauth-apiserver/apiserver-78885b775b-jrrjv","openshift-dns/dns-default-26mfw","openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0","openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd","openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w","openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qslvf","openshift-controller-manager/controller-manager-757fb68448-cj9p5","openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8","openshift-network-node-identity/network-node-identity-znqwc","openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m","openshift-service-ca/service-ca-84bfdbbb7f-qr9tk","kube-system/bootstrap-kube-scheduler-master-0","openshift-kube-apiserver/installer-1-master-0","openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr","openshift-multus/network-metrics-daemon-zh5fh","openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz","openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq","openshift-ovn-kubernetes/ovnkube-node-v56ct","openshift-cluster-machine-approver/machine-approver-754bdc9f9d-knlw8","openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-sslxh","openshift-multus/cni-sysctl-allowlist-ds-hdx2d","openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-c2xl8","openshift-etcd/installer-1-master-0","openshift-monitoring/node-exporter-2hgwj","openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8","openshift-ingress-operator-677db989d6-kdn2l","openshift-kube-scheduler/revision-pruner-6-master-0","openshift-marketplace/certified-operators-9zvz2","openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t","openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-mvmt2","openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-2xfpz","openshift-dns/node-resolver-lw6xm","openshift-network-diagnostics/network-check-source-7c67b67d47-5fv6h","openshift-network-diagnostics/network-check-target-xs8pt"]
Mar 13 01:19:00.921875 master-0 kubenswrapper[19170]: I0313 01:19:00.921830 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-qpxft"
Mar 13 01:19:00.929218 master-0 kubenswrapper[19170]: I0313 01:19:00.929159 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 13 01:19:00.942896 master-0 kubenswrapper[19170]: I0313 01:19:00.942844 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt"
Mar 13 01:19:00.943137 master-0 kubenswrapper[19170]: I0313 01:19:00.943092 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 13 01:19:00.943187 master-0 kubenswrapper[19170]: I0313 01:19:00.943135 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 13 01:19:00.943391 master-0 kubenswrapper[19170]: I0313 01:19:00.943359 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 13 01:19:00.943750 master-0 kubenswrapper[19170]: I0313 01:19:00.943702 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Mar 13 01:19:00.943934 master-0 kubenswrapper[19170]: I0313 01:19:00.943912 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 13 01:19:00.944532 master-0 kubenswrapper[19170]: I0313 01:19:00.944482 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 13 01:19:00.944802 master-0 kubenswrapper[19170]: I0313 01:19:00.944762 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 13 01:19:00.945050 master-0 kubenswrapper[19170]: I0313 01:19:00.945020 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 13 01:19:00.951891 master-0 kubenswrapper[19170]: I0313 01:19:00.951861 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 13 01:19:00.952021 master-0 kubenswrapper[19170]: I0313 01:19:00.951989 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 13 01:19:00.952325 master-0 kubenswrapper[19170]: I0313 01:19:00.952297 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 13 01:19:00.952953 master-0 kubenswrapper[19170]: I0313 01:19:00.952928 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy"
Mar 13 01:19:00.953015 master-0 kubenswrapper[19170]: I0313 01:19:00.952946 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 13 01:19:00.953049 master-0 kubenswrapper[19170]: I0313 01:19:00.953035 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images"
Mar 13 01:19:00.954902 master-0 kubenswrapper[19170]: I0313 01:19:00.953890 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt"
Mar 13 01:19:00.954902 master-0 kubenswrapper[19170]: I0313 01:19:00.953956 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 13 01:19:00.954902 master-0 kubenswrapper[19170]: I0313 01:19:00.954084 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0"
Mar 13 01:19:00.954902 master-0 kubenswrapper[19170]: I0313 01:19:00.954298 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0"
Mar 13 01:19:00.955154 master-0 kubenswrapper[19170]: I0313 01:19:00.955134 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 13 01:19:00.956882 master-0 kubenswrapper[19170]: I0313 01:19:00.955620 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 13 01:19:00.956882 master-0 kubenswrapper[19170]: I0313 01:19:00.956741 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-6-master-0"
Mar 13 01:19:00.957050 master-0 kubenswrapper[19170]: I0313 01:19:00.957001 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-6-master-0"
Mar 13 01:19:00.959511 master-0 kubenswrapper[19170]: I0313 01:19:00.959459 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-retry-1-master-0"
Mar 13 01:19:00.960608 master-0 kubenswrapper[19170]: I0313 01:19:00.960339 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 13 01:19:00.961277 master-0 kubenswrapper[19170]: I0313 01:19:00.960800 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 13 01:19:00.961277 master-0 kubenswrapper[19170]: I0313 01:19:00.961117 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 13 01:19:00.965916 master-0 kubenswrapper[19170]: I0313 01:19:00.965900 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 13 01:19:00.966110 master-0 kubenswrapper[19170]: I0313 01:19:00.966098 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 13 01:19:00.966224 master-0 kubenswrapper[19170]: I0313 01:19:00.965920 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-2v42g" event={"ID":"95d4e785-6663-417d-b380-6905773613c8","Type":"ContainerStarted","Data":"bc64b6329645b722c1cc45a7bfce3843288d247124cc2d19dde983135ddcc23b"}
Mar 13 01:19:00.966314 master-0 kubenswrapper[19170]: I0313 01:19:00.966291 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-2v42g" event={"ID":"95d4e785-6663-417d-b380-6905773613c8","Type":"ContainerDied","Data":"4f1d391f9ccf9712ce599023f3ef26e7463c6ad87dcaaba9b59f13a56ea3cd24"}
Mar 13 01:19:00.966388 master-0 kubenswrapper[19170]: I0313 01:19:00.966369 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-2v42g" event={"ID":"95d4e785-6663-417d-b380-6905773613c8","Type":"ContainerStarted","Data":"2caee3a1af6093e4a6a8dc6eb703af3a181ab48dbf5acd65ef8e61a08e467534"}
Mar 13 01:19:00.966469 master-0 kubenswrapper[19170]: I0313 01:19:00.966454 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"c03513c285f73dedbe67dedcdcf65f9b9e4e6c146f0a64e7433f278ca1844469"}
Mar 13 01:19:00.966536 master-0 kubenswrapper[19170]: I0313 01:19:00.966524 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"66d7d7db2b542e86a40857789246d4abcbc4fca4d74d92d6366f9e8e1aa401b5"}
Mar 13 01:19:00.966598 master-0 kubenswrapper[19170]: I0313 01:19:00.966586 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"ef509e1db76d33faa725783e475ec527d0db77ed62b28bc3717960d043c585e4"}
Mar 13 01:19:00.966695 master-0 kubenswrapper[19170]: I0313 01:19:00.966682 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"9984727ecfd751906a7163096c1bf7f2f5c369a40b443c586dd90122b84df23e"}
Mar 13 01:19:00.966785 master-0 kubenswrapper[19170]: I0313 01:19:00.966771 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz" event={"ID":"7f35cc1e-3376-4dbd-b215-2a32bf62cc71","Type":"ContainerStarted","Data":"32e6a9ef39eb23d211cd1a76164dad7d4bb127d13bda96645831dee3624336c5"}
Mar 13 01:19:00.966871 master-0 kubenswrapper[19170]: I0313 01:19:00.966858 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz" event={"ID":"7f35cc1e-3376-4dbd-b215-2a32bf62cc71","Type":"ContainerStarted","Data":"d87d8f00b3827a6cc0d679f67563557686bd72d154906a3035b8f36d3110e48e"}
Mar 13 01:19:00.966980 master-0 kubenswrapper[19170]: I0313 01:19:00.966950 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m" event={"ID":"a9462e2e-728d-4076-a876-31dbbd637581","Type":"ContainerStarted","Data":"b84f9a3482a3526292f0e31987fb336bbf05fad12c589b42051a3df366361ed6"}
Mar 13 01:19:00.967068 master-0 kubenswrapper[19170]: I0313 01:19:00.967055 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m" event={"ID":"a9462e2e-728d-4076-a876-31dbbd637581","Type":"ContainerStarted","Data":"35121dd9298456a2fd716c2169a2e2eb4131993976aa53fb1bfd36bc3158f01e"}
Mar 13 01:19:00.967155 master-0 kubenswrapper[19170]: I0313 01:19:00.967141 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-2xfpz" event={"ID":"ebf338e6-9725-47d9-8c7f-adbf11a44406","Type":"ContainerStarted","Data":"fc0e1683bdab69600d9345034d7196f0df59bd227fe522be458cd66a46352f2e"}
Mar 13 01:19:00.967243 master-0 kubenswrapper[19170]: I0313 01:19:00.967230 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-2xfpz" event={"ID":"ebf338e6-9725-47d9-8c7f-adbf11a44406","Type":"ContainerStarted","Data":"3120df40532f6b363d2622fdff9ab5bcc403e2dfd52885248c3514a1f9c6afff"}
Mar 13 01:19:00.967330 master-0 kubenswrapper[19170]: I0313 01:19:00.967314 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-2xfpz" event={"ID":"ebf338e6-9725-47d9-8c7f-adbf11a44406","Type":"ContainerStarted","Data":"34d6500d42674d3ac28ad1da03d31ad6fc07a588196014c4a73a86965dd9deb9"}
Mar 13 01:19:00.967420 master-0 kubenswrapper[19170]: I0313 01:19:00.967406 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"f417e14665db2ffffa887ce21c9ff0ed","Type":"ContainerStarted","Data":"488ba7a7a8dd3c0fa5c308fff5e919aa8bf6afd116d2c4ea7906bc2734b837de"}
Mar 13 01:19:00.967484 master-0 kubenswrapper[19170]: I0313 01:19:00.967473 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"f417e14665db2ffffa887ce21c9ff0ed","Type":"ContainerStarted","Data":"92c3b2f339c88995c70507230cfa25808d3c4399c710b2454c51839f6048ccf5"}
Mar 13 01:19:00.967554 master-0 kubenswrapper[19170]: I0313 01:19:00.967540 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bbptx" event={"ID":"a561a1d1-b20f-45fd-9e0c-ee4399a1d31b","Type":"ContainerStarted","Data":"b93822c35ff1bf1be734ea3687c5cb02996a6c6f05c19e51ce529ef4bb707376"}
Mar 13 01:19:00.967619 master-0 kubenswrapper[19170]: I0313 01:19:00.967606 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bbptx" event={"ID":"a561a1d1-b20f-45fd-9e0c-ee4399a1d31b","Type":"ContainerDied","Data":"28ba9aa037bbe7233c4f1b03cbf6e53b0a8d15d86593c5a55e6b347f9655f21f"}
Mar 13 01:19:00.967722 master-0 kubenswrapper[19170]: I0313 01:19:00.967704 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bbptx" event={"ID":"a561a1d1-b20f-45fd-9e0c-ee4399a1d31b","Type":"ContainerDied","Data":"1516e86a1559523ee925b4efda9712253b30e34e1c1002f5e1e5d03874fe6d41"}
Mar 13 01:19:00.967811 master-0 kubenswrapper[19170]: I0313 01:19:00.967374 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 13 01:19:00.967855 master-0 kubenswrapper[19170]: I0313 01:19:00.967821 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 13 01:19:00.967886 master-0 kubenswrapper[19170]: I0313 01:19:00.967847 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 13 01:19:00.967886 master-0 kubenswrapper[19170]: I0313 01:19:00.967855 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert"
Mar 13 01:19:00.967943 master-0 kubenswrapper[19170]: I0313 01:19:00.967779 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bbptx" event={"ID":"a561a1d1-b20f-45fd-9e0c-ee4399a1d31b","Type":"ContainerStarted","Data":"94e09a1dd75367c606e9c0f6209f6e945683271c1483d15ae30d37382e33a6c7"}
Mar 13 01:19:00.967943 master-0 kubenswrapper[19170]: E0313 01:19:00.967916 19170 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 13 01:19:00.968145 master-0 kubenswrapper[19170]: I0313 01:19:00.967696 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 13 01:19:00.968145 master-0 kubenswrapper[19170]: I0313 01:19:00.967994 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 13 01:19:00.968145 master-0 kubenswrapper[19170]: E0313 01:19:00.968021 19170 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-scheduler-master-0\" already exists" pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 13 01:19:00.968145 master-0 kubenswrapper[19170]: I0313 01:19:00.967946 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89451c54a480759544b3c9bb8a4009d254601dcef79b111432b9e309f8ee32df"
Mar 13 01:19:00.968145 master-0 kubenswrapper[19170]: I0313 01:19:00.968025 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 13 01:19:00.968145 master-0 kubenswrapper[19170]: I0313 01:19:00.968080 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-6btfg" event={"ID":"cd7cca05-3da7-42cf-af64-6e94050e58c0","Type":"ContainerStarted","Data":"f0fd861fec3dc2f0ce9169e8cf3c411b63bb224d503bc3cc3463cc4f3e8118f2"}
Mar 13 01:19:00.968145 master-0 kubenswrapper[19170]: I0313 01:19:00.968119 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-6btfg" event={"ID":"cd7cca05-3da7-42cf-af64-6e94050e58c0","Type":"ContainerStarted","Data":"ea64d3b4313780d7d13a3dab7935308441248f41376a68c4808300e2ebba56b2"}
Mar 13 01:19:00.968145 master-0 kubenswrapper[19170]: I0313 01:19:00.968136 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-6btfg" event={"ID":"cd7cca05-3da7-42cf-af64-6e94050e58c0","Type":"ContainerStarted","Data":"78bc15b632613235d2195bdf740a5aaab2a5677c4f8c20084e1234dbfa6c8a91"}
Mar 13 01:19:00.968145 master-0 kubenswrapper[19170]: I0313 01:19:00.968150 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-6btfg" event={"ID":"cd7cca05-3da7-42cf-af64-6e94050e58c0","Type":"ContainerStarted","Data":"b26809d00df2d88f0387eef7498f3d90150a196ebaa102f4f43bf51209c487a9"}
Mar 13 01:19:00.968428 master-0 kubenswrapper[19170]: I0313 01:19:00.968164 19170 kubelet.go:2453]
"SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" event={"ID":"f9b713fb-64ce-4a01-951c-1f31df62e1ae","Type":"ContainerStarted","Data":"a45dea6369e1284e4d6cd6145c197a33c24d0024d01b01a27e517f9663b1c0a5"}
Mar 13 01:19:00.968428 master-0 kubenswrapper[19170]: I0313 01:19:00.968178 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" event={"ID":"f9b713fb-64ce-4a01-951c-1f31df62e1ae","Type":"ContainerDied","Data":"70de036e29496f4383b5254fadb7f8de5e3a8b0ccfbf1cbf424f334a6aaf519d"}
Mar 13 01:19:00.968428 master-0 kubenswrapper[19170]: I0313 01:19:00.968193 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" event={"ID":"f9b713fb-64ce-4a01-951c-1f31df62e1ae","Type":"ContainerStarted","Data":"8f2613fc06a65ee0e558a2b7e31f18cf27ca9c4c8e8d8a194c2a8dcc4466dc64"}
Mar 13 01:19:00.968428 master-0 kubenswrapper[19170]: I0313 01:19:00.968206 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-4d6fw" event={"ID":"85149f21-7ba8-4891-82ef-0fef3d5d7863","Type":"ContainerStarted","Data":"2e8d9bdcd6f94bc5d59dda8365233249d91bac104c7683389de5c7d81691e53d"}
Mar 13 01:19:00.968428 master-0 kubenswrapper[19170]: I0313 01:19:00.968219 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-4d6fw" event={"ID":"85149f21-7ba8-4891-82ef-0fef3d5d7863","Type":"ContainerStarted","Data":"61a9c110238dfe2ea95596a01db7cf5d6ef10b0b43d6a9827c08c09d64d82e79"}
Mar 13 01:19:00.968428 master-0 kubenswrapper[19170]: I0313 01:19:00.968232 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-w6qs7" event={"ID":"cada5bf2-e208-4fd8-bdf5-de8cad31a665","Type":"ContainerStarted","Data":"304e8e0944434514608c776ed75bc07cb5d1c2603e8ab5214e26636517baa5e9"}
Mar 13 01:19:00.968428 master-0 kubenswrapper[19170]: I0313 01:19:00.968246 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-w6qs7" event={"ID":"cada5bf2-e208-4fd8-bdf5-de8cad31a665","Type":"ContainerStarted","Data":"fdd7225d6e1fca05e0159dd0bc6d0e1e7a5fef522b8f6ef01baea69ca9c582b3"}
Mar 13 01:19:00.968428 master-0 kubenswrapper[19170]: I0313 01:19:00.968260 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xs8pt" event={"ID":"d5456c8b-3c98-4824-8700-a04e9c12fb2e","Type":"ContainerStarted","Data":"4e35903107c5db9cc7e1ad31b326fabe4f51fe882ba6656d7c5cc78d9dd54e9b"}
Mar 13 01:19:00.968428 master-0 kubenswrapper[19170]: I0313 01:19:00.968274 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xs8pt" event={"ID":"d5456c8b-3c98-4824-8700-a04e9c12fb2e","Type":"ContainerStarted","Data":"f8dd90e4b919a4750dacd366cb8ce8129d02c4f3f75302771450ca85e994151e"}
Mar 13 01:19:00.968428 master-0 kubenswrapper[19170]: I0313 01:19:00.968287 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" event={"ID":"78d2cd80-23b9-426d-a7ac-1daa27668a47","Type":"ContainerStarted","Data":"7d069f7cf40ce00e10c1f0f6baa994ec7a0d37d154f8f16c691fae327fe2644d"}
Mar 13 01:19:00.968428 master-0 kubenswrapper[19170]: I0313 01:19:00.968301 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" event={"ID":"78d2cd80-23b9-426d-a7ac-1daa27668a47","Type":"ContainerDied","Data":"a7777dcfd1f119b7ebee2758548cf5f87c3fa74673968a6a77a0056cee905b05"}
Mar 13 01:19:00.968428 master-0 kubenswrapper[19170]: I0313 01:19:00.968317 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" event={"ID":"78d2cd80-23b9-426d-a7ac-1daa27668a47","Type":"ContainerStarted","Data":"b656a6f1c70c3edb8a88d273e10ec19afe3e617046ee184903275fabe65867b3"}
Mar 13 01:19:00.968428 master-0 kubenswrapper[19170]: I0313 01:19:00.968329 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-bdc4j" event={"ID":"bfc49699-9428-4bff-804d-da0e60551759","Type":"ContainerStarted","Data":"045cf1f0774d199db7359b9c5ba3caaa20fbbe198d9372303441cd3a9663f259"}
Mar 13 01:19:00.968428 master-0 kubenswrapper[19170]: I0313 01:19:00.968343 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-bdc4j" event={"ID":"bfc49699-9428-4bff-804d-da0e60551759","Type":"ContainerDied","Data":"ea2d52d5a4050a1c6648d96fe621d92a024e84b0306d25332de51586b15ec9dd"}
Mar 13 01:19:00.968428 master-0 kubenswrapper[19170]: I0313 01:19:00.968382 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-bdc4j" event={"ID":"bfc49699-9428-4bff-804d-da0e60551759","Type":"ContainerStarted","Data":"89ed96ec0c29ed024929723b7fe3a674507469f0ef8e3cce0e32599fd800079d"}
Mar 13 01:19:00.968428 master-0 kubenswrapper[19170]: I0313 01:19:00.968402 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qslvf" event={"ID":"90e6e63d-3cf2-4bb5-883f-6219a0b52c3a","Type":"ContainerStarted","Data":"7064752da8b9ab2c25c9d26191f5ff198db26d98b3ebd28b1a794ad9c42435a8"}
Mar 13 01:19:00.968428 master-0 kubenswrapper[19170]: I0313 01:19:00.968417 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qslvf" event={"ID":"90e6e63d-3cf2-4bb5-883f-6219a0b52c3a","Type":"ContainerStarted","Data":"a6d1854232de9168b75b3d01a0fa6c20901eb4deee5086ea18aafc73d6104cff"}
Mar 13 01:19:00.968428 master-0 kubenswrapper[19170]: I0313 01:19:00.968432 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qslvf" event={"ID":"90e6e63d-3cf2-4bb5-883f-6219a0b52c3a","Type":"ContainerStarted","Data":"b4d8f238475f0e2a34d731e501507dcca53539022959bc9fece8f8ff1323377c"}
Mar 13 01:19:00.968428 master-0 kubenswrapper[19170]: I0313 01:19:00.968444 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" event={"ID":"6a5ab1d5-dabd-45e7-a688-71a282f61e67","Type":"ContainerStarted","Data":"01455eeb058734a90460549b36e203977c395814caa7b919b04ec224f499fd04"}
Mar 13 01:19:00.968973 master-0 kubenswrapper[19170]: I0313 01:19:00.968458 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" event={"ID":"6a5ab1d5-dabd-45e7-a688-71a282f61e67","Type":"ContainerStarted","Data":"36c90a63dbc503a0e102d326d7589d3b1f80da94a54e8faa945f71ea63acc949"}
Mar 13 01:19:00.968973 master-0 kubenswrapper[19170]: I0313 01:19:00.968471 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb" event={"ID":"d278ed70-786c-4b6c-9f04-f08ede704569","Type":"ContainerStarted","Data":"85a02a7779e2a01415e0d12cd3a306e827d6428398eea489dff1c9d2909a65c4"}
Mar 13 01:19:00.968973 master-0 kubenswrapper[19170]: I0313 01:19:00.968483 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb" event={"ID":"d278ed70-786c-4b6c-9f04-f08ede704569","Type":"ContainerStarted","Data":"e1abc99b61310b8078d12894917e6a71003139ee52ec277667d925e2d84f6589"}
Mar 13 01:19:00.968973 master-0 kubenswrapper[19170]: I0313 01:19:00.968496 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb" event={"ID":"d278ed70-786c-4b6c-9f04-f08ede704569","Type":"ContainerStarted","Data":"9fa79744aaaa4051964ccef7d38d65ee7510ee60bd1d8d59df2b9df1c6c707da"}
Mar 13 01:19:00.968973 master-0 kubenswrapper[19170]: I0313 01:19:00.968511 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"945467a15df4a8d4dc173e105fd4e0e21428d428ca18f15c847be9161b3b730b"}
Mar 13 01:19:00.968973 master-0 kubenswrapper[19170]: I0313 01:19:00.968527 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"5686b7e065921e4a505694fd8d39d3cf4e50087c1b65d84bd736ed81d6e1a8bd"}
Mar 13 01:19:00.968973 master-0 kubenswrapper[19170]: I0313 01:19:00.968097 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 13 01:19:00.968973 master-0 kubenswrapper[19170]: I0313 01:19:00.968557 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"74e7eddcdd11f244988198476d048b74bb9bbeb97aa14b3f52b59d34eb6ac111"}
Mar 13 01:19:00.968973 master-0 kubenswrapper[19170]: I0313 01:19:00.968617 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 13 01:19:00.968973 master-0 kubenswrapper[19170]: I0313 01:19:00.968619 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"102131ace48d9559e7bd10df51cf1337602d058abc022e98bf79c9ff37b7ad8e"}
Mar 13 01:19:00.968973 master-0 kubenswrapper[19170]: I0313 01:19:00.968660 19170
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"094f4a86fe59b920ac10166d9e8ac01032d74617f321831a2a2f931f8b6e5719"}
Mar 13 01:19:00.968973 master-0 kubenswrapper[19170]: I0313 01:19:00.968673 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerDied","Data":"b828b9875ee5a4c86417590e983cd65cc09701fdf10c008a5d18f75844d13abc"}
Mar 13 01:19:00.968973 master-0 kubenswrapper[19170]: I0313 01:19:00.967785 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 13 01:19:00.968973 master-0 kubenswrapper[19170]: I0313 01:19:00.968689 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerDied","Data":"77c3585c80eaa5d7f0873a20062967927c8ada28f17899fd783cafcf3b80dbee"}
Mar 13 01:19:00.968973 master-0 kubenswrapper[19170]: I0313 01:19:00.968705 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerDied","Data":"3acba3e036ccbe49f5868539abac9a81fe12130cf6faa99400b25eb27b4d707a"}
Mar 13 01:19:00.968973 master-0 kubenswrapper[19170]: I0313 01:19:00.968716 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"e850e3b221564f84a8bf622e78e98b3c458e66820fad70083408ddbec82a1cd3"}
Mar 13 01:19:00.968973 master-0 kubenswrapper[19170]: I0313 01:19:00.968728 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 13 01:19:00.968973 master-0 kubenswrapper[19170]: I0313 01:19:00.968123 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 13 01:19:00.968973 master-0 kubenswrapper[19170]: I0313 01:19:00.968749 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt"
Mar 13 01:19:00.968973 master-0 kubenswrapper[19170]: I0313 01:19:00.968728 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-kxxzc" event={"ID":"64477504-5cb6-42dc-a7eb-662981daec4a","Type":"ContainerStarted","Data":"64cf82ec26e7e72865d114924f8764655e701499c4313b25c26bbe5acc40878c"}
Mar 13 01:19:00.968973 master-0 kubenswrapper[19170]: I0313 01:19:00.968802 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-kxxzc" event={"ID":"64477504-5cb6-42dc-a7eb-662981daec4a","Type":"ContainerStarted","Data":"4f49595589f1bf5fa753cf5b619410e098b50fc20413b400a546391cb2022bf0"}
Mar 13 01:19:00.968973 master-0 kubenswrapper[19170]: I0313 01:19:00.968821 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-kxxzc" event={"ID":"64477504-5cb6-42dc-a7eb-662981daec4a","Type":"ContainerStarted","Data":"a60b4d927a33db793635462ea60ad60607d5d184924a1912214647b401b2b973"}
Mar 13 01:19:00.968973 master-0 kubenswrapper[19170]: I0313 01:19:00.968831 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-v9nfg" event={"ID":"916d9fc9-388b-4506-a17c-36a7f626356a","Type":"ContainerStarted","Data":"4f15414965bd82c8ca70ef81dc14cd39c996705ef15d04f56d1a5564c3ede50b"}
Mar 13 01:19:00.968973 master-0 kubenswrapper[19170]: I0313 01:19:00.968844 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-v9nfg" event={"ID":"916d9fc9-388b-4506-a17c-36a7f626356a","Type":"ContainerDied","Data":"a41bcaf653995a95790b4be685f8a8f91dff8546aa69d956c2d939af740c0286"}
Mar 13 01:19:00.968973 master-0 kubenswrapper[19170]: I0313 01:19:00.968857 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-v9nfg" event={"ID":"916d9fc9-388b-4506-a17c-36a7f626356a","Type":"ContainerStarted","Data":"6cd94cfc20909e4d75ca589ee9ae860bb4dc06f0ab09921539ad1f3f2923f207"}
Mar 13 01:19:00.968973 master-0 kubenswrapper[19170]: I0313 01:19:00.968869 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" event={"ID":"57eb2020-1560-4352-8b86-76db59de933a","Type":"ContainerStarted","Data":"caed5b6f3c3e0f496672dafebe4cf87db5086e2ed7b7df39114a4e1a8f3fa33f"}
Mar 13 01:19:00.968973 master-0 kubenswrapper[19170]: I0313 01:19:00.968882 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" event={"ID":"57eb2020-1560-4352-8b86-76db59de933a","Type":"ContainerDied","Data":"408fa86e57d7c0ed7566e66a9206de42b73c3a8d5d5b9b39423211e50e66920f"}
Mar 13 01:19:00.968973 master-0 kubenswrapper[19170]: I0313 01:19:00.968895 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" event={"ID":"57eb2020-1560-4352-8b86-76db59de933a","Type":"ContainerStarted","Data":"bdd5af34bfad236139e626fcebfb16719c123b2551b988ca1c04bcedf0b2fdb1"}
Mar 13 01:19:00.968973 master-0 kubenswrapper[19170]: I0313 01:19:00.968906 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-hdx2d" event={"ID":"ac2a4c90-32db-4464-8c47-acbcafbcd5d0","Type":"ContainerStarted","Data":"3c6ff0fe1111a981e0f82680b651025845befe1e63f28e04da68d060f2f82f77"}
Mar 13 01:19:00.968973 master-0 kubenswrapper[19170]: I0313 01:19:00.968921 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-hdx2d" event={"ID":"ac2a4c90-32db-4464-8c47-acbcafbcd5d0","Type":"ContainerStarted","Data":"8e6b36b6fc7f96835dacfb4827ca38e8971ee7d14058842c077463b70715d284"}
Mar 13 01:19:00.968973 master-0 kubenswrapper[19170]: I0313 01:19:00.968933 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj" event={"ID":"21cbea73-f779-43e4-b5ba-d6fa06275d34","Type":"ContainerStarted","Data":"62d2b92f5805220707a2be14d18659481c24419c7f112e9e794398a7182f05dd"}
Mar 13 01:19:00.968973 master-0 kubenswrapper[19170]: I0313 01:19:00.968776 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert"
Mar 13 01:19:00.968973 master-0 kubenswrapper[19170]: I0313 01:19:00.968961 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 13 01:19:00.968973 master-0 kubenswrapper[19170]: I0313 01:19:00.968947 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj" event={"ID":"21cbea73-f779-43e4-b5ba-d6fa06275d34","Type":"ContainerDied","Data":"d3c0f89339f815b6350fac22cc030760b6b90e8219eb2eb8f1fd3b1e19f0b649"}
Mar 13 01:19:00.968973 master-0 kubenswrapper[19170]: I0313 01:19:00.968137 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.968195 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.969018 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj" event={"ID":"21cbea73-f779-43e4-b5ba-d6fa06275d34","Type":"ContainerStarted","Data":"cf319252fb389a321939570359e80da282cec58b5fa4e03fa5a5ea1c1c063fe4"}
Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.969056 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" event={"ID":"d56480e0-0885-41e5-a1fc-931a068fbadb","Type":"ContainerStarted","Data":"d06826ec4775f3b8e7bc8c8c50364ebde1bf5d008653ef1eefd79f82a03cb948"}
Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.969079 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" event={"ID":"d56480e0-0885-41e5-a1fc-931a068fbadb","Type":"ContainerDied","Data":"09062ce5a8bfb21fa147439adb4aa83615da8c3397079fc199132224f09f0600"}
Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.969100 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" event={"ID":"d56480e0-0885-41e5-a1fc-931a068fbadb","Type":"ContainerDied","Data":"77742db3e18b710fed8057a5ff63f6e99d45794674fb37f85d739e62dd3a751e"}
Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.969115 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" event={"ID":"d56480e0-0885-41e5-a1fc-931a068fbadb","Type":"ContainerStarted","Data":"6c434288b6cb191d119d6c19e3587bb5a67d5e3c45645324bb8a04e648bb9b70"}
Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.969133 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"e5dcff415884960d9218f44f945518a8b20cab930c49510412686808bb43a09d"} Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.969149 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"2463d467448269901f4fcbebf08a1f5694b763780f210069e6a1981a0f7ca79a"} Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.968116 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.969165 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"208cb89282bf1ed747c2d5d200ffda0b8e2d1d13703537447889ef8c2f67cf1a"} Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.969184 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"f494a23b4d1580cff31d9268d6fdc726455248c951c34158f5755806baa34285"} Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.969198 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"f8cf1a80a339c43b2198f0c2a2cdc38630309c5c4e5c0430f6bc0266b483c766"} Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.969212 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" 
event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerDied","Data":"6504d7e301bf358b754655b91d4ab759cf0c2f8f3e948397175f8d84bd27b624"} Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.969233 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"616c3040128a0d65e3ebb99fac58cf591c2a39e4fb92682249d6c1a457260134"} Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.968194 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.969257 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae486c0fd97e5367a80e37dacc128e4883d8d7004786d375adff859d49d1bf07" Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.969273 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8" event={"ID":"f97819d0-2840-4352-a435-19ef1a8c22c9","Type":"ContainerStarted","Data":"4c5fb20dad21e9e3e37f291a5be6622a0a622dd0a6d9ba5a22b729aeb465b9cc"} Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.969284 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.968235 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.969289 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8" 
event={"ID":"f97819d0-2840-4352-a435-19ef1a8c22c9","Type":"ContainerStarted","Data":"3f750f4eaadd11866936791933f7a3cbf786b838bf1e7a9f9142487b42787b0b"} Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.969346 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" event={"ID":"2ce47660-f7cc-4669-a00d-83422f0f6d55","Type":"ContainerStarted","Data":"4b6a4d1e81b6f153adb00d6cd286fed3da85b58d04c7b67cb99e1e1a37cd143a"} Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.968254 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.969367 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" event={"ID":"2ce47660-f7cc-4669-a00d-83422f0f6d55","Type":"ContainerStarted","Data":"50afa0bafdfdadd430cb50b2aa81b0c11200da9c802e7cb966b1902e4941db5a"} Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.969382 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" event={"ID":"8c6bf2d5-1881-4b63-b247-7e7426707fa1","Type":"ContainerStarted","Data":"3ae103f8a6f884755c35a6a16f2094f5be91f7ff8edc4b19322c844eaa733963"} Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.969395 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" event={"ID":"8c6bf2d5-1881-4b63-b247-7e7426707fa1","Type":"ContainerStarted","Data":"81594a611904e6b6b1a33993523b7420d7c605395323b0ff7a70b4475e6f0b5c"} Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.968283 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Mar 13 
01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.969406 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" event={"ID":"8c6bf2d5-1881-4b63-b247-7e7426707fa1","Type":"ContainerDied","Data":"4f6218cc31120287103af84277d2f84084c2bfa51936dccb9cf930aa141d948d"} Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.969451 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" event={"ID":"8c6bf2d5-1881-4b63-b247-7e7426707fa1","Type":"ContainerStarted","Data":"eeedfbb568950a2005b49c940a6eb5e45d4af2d8ddb401839d8110cff9f9ae07"} Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.968288 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.969471 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-znqwc" event={"ID":"d1153bb3-30dd-458f-b0a4-c05358a8b3f8","Type":"ContainerStarted","Data":"99dc9aacbe53f2463fbb1d6c45782c44f72e7b13c67642bb7d0b4839b16638fe"} Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.969493 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-znqwc" event={"ID":"d1153bb3-30dd-458f-b0a4-c05358a8b3f8","Type":"ContainerDied","Data":"8a645f3b183df337e2bb471f99ef18a88ef8e03f78dccc126ecc0415b40abdab"} Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.969510 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-znqwc" event={"ID":"d1153bb3-30dd-458f-b0a4-c05358a8b3f8","Type":"ContainerStarted","Data":"38fb6af13de86e0d88a936d042a22e81831b9af5722fa376d1a0a0fe523b846b"} Mar 13 01:19:00.970321 master-0 
kubenswrapper[19170]: I0313 01:19:00.968291 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.969525 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-znqwc" event={"ID":"d1153bb3-30dd-458f-b0a4-c05358a8b3f8","Type":"ContainerStarted","Data":"a4456327b2e0b6b9539a6903b6632e2c8ba74468c5ee5a6acc2e786114ebf53d"} Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.968324 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.969540 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmkpj" event={"ID":"e5956ebf-01e4-4d4c-ae6d-b0995905c6d3","Type":"ContainerStarted","Data":"884a792f041730857627043a59c1f19997609417d4c5da58d81f2f5237f075b1"} Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.969601 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmkpj" event={"ID":"e5956ebf-01e4-4d4c-ae6d-b0995905c6d3","Type":"ContainerStarted","Data":"ecb5a113058bb964fe9c155a5ae981bb13b52682de0601fdacf2ecfeb3ca0ddc"} Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.968325 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.968335 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.968406 19170 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.968413 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.968441 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.969616 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-pmkpj" event={"ID":"e5956ebf-01e4-4d4c-ae6d-b0995905c6d3","Type":"ContainerStarted","Data":"ba1138e615d45d30d4c79c1da4c10f696c22641dfec7a0ee3bb68226581ee820"} Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.969881 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-7nstm" event={"ID":"ae44526f-5858-42a0-ba77-3a22f171456f","Type":"ContainerStarted","Data":"250a0e47e1f825144c66cdf0edf6dd832a93865a0f2ebf659c116a8fb949ff67"} Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.969895 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.968474 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.969897 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-7nstm" event={"ID":"ae44526f-5858-42a0-ba77-3a22f171456f","Type":"ContainerStarted","Data":"b53f7152f43c94f2398d1b740ffecfeaa1c0a37491b92964a127cf3cd4f8b71f"} Mar 13 
01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.968482 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.969943 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-7769569c45-zm2jl" event={"ID":"2679c6e1-11c1-450c-b03a-30d7ee59ff6f","Type":"ContainerStarted","Data":"c75774122f86364cbb2037bd356bbbad5d4c638ff9b0ea91ed1c5645b4b0a5e2"} Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.969963 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-7769569c45-zm2jl" event={"ID":"2679c6e1-11c1-450c-b03a-30d7ee59ff6f","Type":"ContainerStarted","Data":"c8bfd0ac311b1657eee9f6e460d76c3b97867545a733bd30edc441dfb4a82394"} Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.968501 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.969977 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-7769569c45-zm2jl" event={"ID":"2679c6e1-11c1-450c-b03a-30d7ee59ff6f","Type":"ContainerStarted","Data":"3ac90b7e141885c73870d9744a9126cb8648da1eed1822b13844b812ecb6dc82"} Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.969992 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-v9pv6" event={"ID":"58035e42-37d8-48f6-9861-9b4ce6014119","Type":"ContainerStarted","Data":"ac8bc98fe2e8dc99665ece6e4cbb170176bcf297370531768be4fdddc77674cc"} Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.970007 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-v9pv6" event={"ID":"58035e42-37d8-48f6-9861-9b4ce6014119","Type":"ContainerDied","Data":"662b7543988e07c43f9b30d00fca727f77728c7aa21bd39d21414f56d158c6c9"} Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.970023 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-v9pv6" event={"ID":"58035e42-37d8-48f6-9861-9b4ce6014119","Type":"ContainerStarted","Data":"ab94900114a9122f23168251a131a59bc4f99b487faa4afb3e3bd743e5f9a00c"} Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.970036 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" event={"ID":"0671fdd0-b358-40f9-ae49-2c5a9004edb3","Type":"ContainerStarted","Data":"1e0fcce31d2e2166ce78dfb55bc928ed4de8e1614fc84db1f62c529d76a3c284"} Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.970050 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" event={"ID":"0671fdd0-b358-40f9-ae49-2c5a9004edb3","Type":"ContainerStarted","Data":"7d7ebfb8fe2ccbcaa9d3a1c3f2519c36849a66ccd8f79f4fea8a8ff50785f679"} Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.970066 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-84bfdbbb7f-qr9tk" event={"ID":"da44d750-31e5-46f4-b3ef-dd4384c22aaf","Type":"ContainerStarted","Data":"3745bbc5e84b13f752a8050e8fc01499f2fd5e37e8cd7566db3715cd3974a077"} Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.970079 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-84bfdbbb7f-qr9tk" event={"ID":"da44d750-31e5-46f4-b3ef-dd4384c22aaf","Type":"ContainerStarted","Data":"d0d7ba4bdbd45759b508d00f36d1e06281f843bb6e1de6ed64932952a8078e77"} Mar 13 
01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.970089 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-wbmqn" event={"ID":"22587300-2448-4862-9fd8-68197d17a9f2","Type":"ContainerStarted","Data":"291d7f234c91da53973dfeea878525a20e6b8e9491a00bb5aa5e9bac339b437f"} Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.968686 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.970101 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-wbmqn" event={"ID":"22587300-2448-4862-9fd8-68197d17a9f2","Type":"ContainerDied","Data":"0e6fdad2e1926f784b1c498cd01186eeb32850cc4a0f69925bc0668ef060c2a8"} Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.970121 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-wbmqn" event={"ID":"22587300-2448-4862-9fd8-68197d17a9f2","Type":"ContainerStarted","Data":"7d8cfecb961af50ca95aeb8f7e1e1b3b55dbc52640073a438412ecf24f225a00"} Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.970137 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"b8addabb4549d659b9d15764990d3747","Type":"ContainerStarted","Data":"9d118311f33a13bf01d58b99e2e28890870103c6d6d9e80b3f327feb4a6e5c10"} Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.970152 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" 
event={"ID":"b8addabb4549d659b9d15764990d3747","Type":"ContainerStarted","Data":"7146d2748a69888b0e230f968d6a455dc052e3a4f925338980f5ac24afb23fd4"} Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.968806 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.970161 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"b8addabb4549d659b9d15764990d3747","Type":"ContainerStarted","Data":"6ecf1cf9a4925a48c1305c992f6b26c6dc5493f27b0413a75a2a0cbd559a27b9"} Mar 13 01:19:00.970321 master-0 kubenswrapper[19170]: I0313 01:19:00.968844 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 13 01:19:00.974188 master-0 kubenswrapper[19170]: I0313 01:19:00.973130 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 13 01:19:00.974188 master-0 kubenswrapper[19170]: I0313 01:19:00.973330 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 13 01:19:00.974188 master-0 kubenswrapper[19170]: I0313 01:19:00.973360 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Mar 13 01:19:00.974188 master-0 kubenswrapper[19170]: I0313 01:19:00.968851 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 13 01:19:00.974188 master-0 kubenswrapper[19170]: I0313 01:19:00.968849 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Mar 13 01:19:00.974188 master-0 
kubenswrapper[19170]: I0313 01:19:00.968863 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 13 01:19:00.974188 master-0 kubenswrapper[19170]: I0313 01:19:00.974055 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 13 01:19:00.974188 master-0 kubenswrapper[19170]: I0313 01:19:00.968890 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 13 01:19:00.974537 master-0 kubenswrapper[19170]: I0313 01:19:00.968893 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 13 01:19:00.974806 master-0 kubenswrapper[19170]: I0313 01:19:00.974740 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 13 01:19:00.974852 master-0 kubenswrapper[19170]: I0313 01:19:00.974825 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"b8addabb4549d659b9d15764990d3747","Type":"ContainerStarted","Data":"1f339c4da756baa27443d470023f6e1410367639df127c26cecf3952f778ca16"} Mar 13 01:19:00.974911 master-0 kubenswrapper[19170]: I0313 01:19:00.974858 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"b8addabb4549d659b9d15764990d3747","Type":"ContainerStarted","Data":"0072057986d1a9c35e19db3f7ab2650e875b4c3fecae35f046b875511fe06154"} Mar 13 01:19:00.974911 master-0 kubenswrapper[19170]: I0313 01:19:00.974874 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj" 
event={"ID":"0d4e6150-432c-4a11-b5a6-4d62dd701fc8","Type":"ContainerStarted","Data":"ee09247c905e7ec574cda5fc1232f0742111775db623db5f36b780f4329bda02"} Mar 13 01:19:00.975105 master-0 kubenswrapper[19170]: I0313 01:19:00.974981 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj" event={"ID":"0d4e6150-432c-4a11-b5a6-4d62dd701fc8","Type":"ContainerStarted","Data":"7c8d8dec582874bddc8ecb01a398186bbbb8f8957e5e30a3464ac033e65b39ee"} Mar 13 01:19:00.975105 master-0 kubenswrapper[19170]: I0313 01:19:00.975002 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj" event={"ID":"0d4e6150-432c-4a11-b5a6-4d62dd701fc8","Type":"ContainerStarted","Data":"04292570efd4dfe3d1e939e8bffe6e36601999e7f873882bc75b098b25f400d8"} Mar 13 01:19:00.975105 master-0 kubenswrapper[19170]: I0313 01:19:00.975018 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-6-master-0" event={"ID":"0b3a64f4-e94f-4916-8c91-a255d987735d","Type":"ContainerDied","Data":"6b91817f5b3f7a0651a092d44f47916346942c3944860ca84cb9f688537c7ce3"} Mar 13 01:19:00.975105 master-0 kubenswrapper[19170]: I0313 01:19:00.975051 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-6-master-0" event={"ID":"0b3a64f4-e94f-4916-8c91-a255d987735d","Type":"ContainerDied","Data":"555967ac1c8966a10024222c10bd15df837fc12752f180fd00e26584a6a7eadd"} Mar 13 01:19:00.975105 master-0 kubenswrapper[19170]: I0313 01:19:00.975065 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="555967ac1c8966a10024222c10bd15df837fc12752f180fd00e26584a6a7eadd" Mar 13 01:19:00.975105 master-0 kubenswrapper[19170]: I0313 01:19:00.975079 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qclwv" 
event={"ID":"46662e51-44af-4732-83a1-9509a579b373","Type":"ContainerStarted","Data":"e5e42cc233087fd83bca82d1a9f888115d5c204c4f6708e02a100dc0a15fb91c"} Mar 13 01:19:00.975105 master-0 kubenswrapper[19170]: I0313 01:19:00.975092 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qclwv" event={"ID":"46662e51-44af-4732-83a1-9509a579b373","Type":"ContainerStarted","Data":"7ad0a5d9a8c967edc56620551d2d496053e4222b70097c6bc30b98a2ef86a101"} Mar 13 01:19:00.975105 master-0 kubenswrapper[19170]: I0313 01:19:00.975107 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq" event={"ID":"6c88187c-d011-4043-a6d3-4a8a7ec4e204","Type":"ContainerStarted","Data":"9b9e910c886ae717b561817a7ea8bb0a6f52815840a3145454514557347be4d2"} Mar 13 01:19:00.975105 master-0 kubenswrapper[19170]: I0313 01:19:00.975121 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq" event={"ID":"6c88187c-d011-4043-a6d3-4a8a7ec4e204","Type":"ContainerStarted","Data":"a628e92ac4f34b60f238b76d4fc08c8cab73f3dfd7d9d1150c95d95292472f21"} Mar 13 01:19:00.975484 master-0 kubenswrapper[19170]: I0313 01:19:00.975136 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zh5fh" event={"ID":"e68ab3cb-c372-45d9-a758-beaf4c213714","Type":"ContainerStarted","Data":"296a827430c5ddc3613e12467f9f67e5f7d7d6b28473db1f835af9250c8da399"} Mar 13 01:19:00.975484 master-0 kubenswrapper[19170]: I0313 01:19:00.975163 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 13 01:19:00.975567 master-0 kubenswrapper[19170]: I0313 01:19:00.975485 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 13 01:19:00.975567 master-0 kubenswrapper[19170]: I0313 
01:19:00.975152 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zh5fh" event={"ID":"e68ab3cb-c372-45d9-a758-beaf4c213714","Type":"ContainerStarted","Data":"ec1033057b9e888b0f7503a93c72842a5c6f60d6f3a4f15b0b1a235b091ecfbd"} Mar 13 01:19:00.975703 master-0 kubenswrapper[19170]: I0313 01:19:00.975585 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zh5fh" event={"ID":"e68ab3cb-c372-45d9-a758-beaf4c213714","Type":"ContainerStarted","Data":"b6d8f47788f03e55d3eeba0c1c7d2d37374cfceb0c113268fcb5709d2ce6f28c"} Mar 13 01:19:00.975703 master-0 kubenswrapper[19170]: I0313 01:19:00.975657 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 13 01:19:00.975793 master-0 kubenswrapper[19170]: I0313 01:19:00.975606 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k52lh" event={"ID":"3f9728b4-4e1e-4165-a276-3daa00e95839","Type":"ContainerStarted","Data":"3119424a083f353a4b6183c0dad15c0796902de158ccd0a6a3f2774dc5ffa101"} Mar 13 01:19:00.975793 master-0 kubenswrapper[19170]: I0313 01:19:00.975773 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k52lh" event={"ID":"3f9728b4-4e1e-4165-a276-3daa00e95839","Type":"ContainerDied","Data":"e632cd30baff456f3e4dd542f0b7173c07418706f4cc94eff880efa35261e3c0"} Mar 13 01:19:00.975793 master-0 kubenswrapper[19170]: I0313 01:19:00.975789 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k52lh" event={"ID":"3f9728b4-4e1e-4165-a276-3daa00e95839","Type":"ContainerDied","Data":"aa38a63f384b7da874941350b67401ef32b738eda3d617175799f0520f0661d5"} Mar 13 01:19:00.975918 master-0 kubenswrapper[19170]: I0313 01:19:00.975806 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k52lh" 
event={"ID":"3f9728b4-4e1e-4165-a276-3daa00e95839","Type":"ContainerStarted","Data":"4a7eca1172ea3bfa17610dd111962352ef959bb3d01e721ed6e19fcbca116334"} Mar 13 01:19:00.975918 master-0 kubenswrapper[19170]: I0313 01:19:00.975820 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rvt5h" event={"ID":"2937cbe2-3125-4c3f-96f8-2febeb5942cc","Type":"ContainerStarted","Data":"af254e609bac0d7dd38fb4c0fff04b08f4827ca415e3d02f5dce013a9d0ee8c7"} Mar 13 01:19:00.975918 master-0 kubenswrapper[19170]: I0313 01:19:00.975836 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rvt5h" event={"ID":"2937cbe2-3125-4c3f-96f8-2febeb5942cc","Type":"ContainerStarted","Data":"609f306bf915c2fd338574a4571df8d99042d408bdae6c3187fb345e69136829"} Mar 13 01:19:00.975918 master-0 kubenswrapper[19170]: I0313 01:19:00.975850 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" event={"ID":"77fd9062-0f7d-4255-92ca-7e4325daeddd","Type":"ContainerStarted","Data":"56e23dc047d0c9af7251a6f497704d60dfa2828d26b6a71ca2f42af20d7203ee"} Mar 13 01:19:00.975918 master-0 kubenswrapper[19170]: I0313 01:19:00.975864 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" event={"ID":"77fd9062-0f7d-4255-92ca-7e4325daeddd","Type":"ContainerStarted","Data":"9abb90df1fb36f7d743ddb849ea400a46f15eae6ffadde3a44f5e1ad0528227b"} Mar 13 01:19:00.975918 master-0 kubenswrapper[19170]: I0313 01:19:00.975885 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-g7wfh" event={"ID":"93871019-3d0c-4081-9afe-19b6dd108ec6","Type":"ContainerStarted","Data":"fd1d18f6baa95b22bed3d37f8927776b5c5d98b2e99e7637fd5820559ef6427b"} Mar 13 01:19:00.975918 master-0 kubenswrapper[19170]: I0313 01:19:00.975899 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-g7wfh" event={"ID":"93871019-3d0c-4081-9afe-19b6dd108ec6","Type":"ContainerStarted","Data":"3a10917547442a55f0af92439b344052e9b73ef7d7bdf470aada2ad5959830bc"} Mar 13 01:19:00.975918 master-0 kubenswrapper[19170]: I0313 01:19:00.975910 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-g7wfh" event={"ID":"93871019-3d0c-4081-9afe-19b6dd108ec6","Type":"ContainerStarted","Data":"4a82c4d1f4dd0703b5ca166acc233f980e0154f3140fea6bcb51b2baef68cc9c"} Mar 13 01:19:00.976162 master-0 kubenswrapper[19170]: I0313 01:19:00.975924 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9zvz2" event={"ID":"d23bbaec-b635-4649-b26e-2829f32d21f0","Type":"ContainerStarted","Data":"63e66c703747bc64e98b5e738eda042e23712aeb8b127f226dc9a93942823bdc"} Mar 13 01:19:00.976162 master-0 kubenswrapper[19170]: I0313 01:19:00.975939 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9zvz2" event={"ID":"d23bbaec-b635-4649-b26e-2829f32d21f0","Type":"ContainerDied","Data":"2a303738597489cb37ad3f267fff17f1f61cb2e88b4598e4de81e45d9cdb8d55"} Mar 13 01:19:00.976162 master-0 kubenswrapper[19170]: I0313 01:19:00.975954 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9zvz2" event={"ID":"d23bbaec-b635-4649-b26e-2829f32d21f0","Type":"ContainerDied","Data":"682be9d920ceec9bf69789866cf8eedebb71157fd9c01901ddaedd2fde2be709"} Mar 13 01:19:00.976237 master-0 kubenswrapper[19170]: I0313 01:19:00.975973 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9zvz2" event={"ID":"d23bbaec-b635-4649-b26e-2829f32d21f0","Type":"ContainerStarted","Data":"2b8d1e7059727645025fa018d52de63bdf9a901809577c6a277333b99385dad9"} Mar 13 01:19:00.976237 master-0 kubenswrapper[19170]: 
I0313 01:19:00.976205 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-589895fbb7-qvl2k" event={"ID":"13fac7b0-ce55-467d-9d0c-6a122d87cb3c","Type":"ContainerStarted","Data":"e57f50a96016f374be74d1fdff3ad902d70b80cfc4848a5c9d8694184d265ad5"} Mar 13 01:19:00.976237 master-0 kubenswrapper[19170]: I0313 01:19:00.976224 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-589895fbb7-qvl2k" event={"ID":"13fac7b0-ce55-467d-9d0c-6a122d87cb3c","Type":"ContainerStarted","Data":"7519572e1e1852f1b86cc906ba51b3ab5a510ff419648327c2c8d44723336143"} Mar 13 01:19:00.976322 master-0 kubenswrapper[19170]: I0313 01:19:00.976238 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-589895fbb7-qvl2k" event={"ID":"13fac7b0-ce55-467d-9d0c-6a122d87cb3c","Type":"ContainerStarted","Data":"e2f745a0f2d01e632b68dc10f033e64587a1682e72141091b5e270ae9c9ebd96"} Mar 13 01:19:00.976322 master-0 kubenswrapper[19170]: I0313 01:19:00.976279 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-2slj5" event={"ID":"3d2e7338-a6d6-4872-ab72-a4e631075ab3","Type":"ContainerStarted","Data":"9c9bcd7c0bc2c0a8bdfb783091d79282cb414b9c8007f3f521bcaba6d62d5459"} Mar 13 01:19:00.976322 master-0 kubenswrapper[19170]: I0313 01:19:00.976293 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-2slj5" event={"ID":"3d2e7338-a6d6-4872-ab72-a4e631075ab3","Type":"ContainerDied","Data":"63acc12abb985dff806c0ddcd9c27446b325df44a94150a2e781b1a28ec52869"} Mar 13 01:19:00.976322 master-0 kubenswrapper[19170]: I0313 01:19:00.976317 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-2slj5" 
event={"ID":"3d2e7338-a6d6-4872-ab72-a4e631075ab3","Type":"ContainerStarted","Data":"a98b9ab8613fc067f4fb6c5ed8cb15effc604a80e170f3f4752b7a8241625877"} Mar 13 01:19:00.976465 master-0 kubenswrapper[19170]: I0313 01:19:00.976332 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" event={"ID":"30f7537e-93ed-466b-ba24-78141d004b2f","Type":"ContainerStarted","Data":"5facc13367bd2fec27a111e6734950591c5fb3c40b9c12943601a819d288d978"} Mar 13 01:19:00.976465 master-0 kubenswrapper[19170]: I0313 01:19:00.976382 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" event={"ID":"30f7537e-93ed-466b-ba24-78141d004b2f","Type":"ContainerStarted","Data":"d65bf4cd73d878c1128e2da34864164f7258ebf2fe36dafd0cbc33e6915ed700"} Mar 13 01:19:00.976465 master-0 kubenswrapper[19170]: I0313 01:19:00.976398 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" event={"ID":"30f7537e-93ed-466b-ba24-78141d004b2f","Type":"ContainerStarted","Data":"47c4abc061aa37ed56eb936e84b7d539b1fd1e8cec9bc0cb2e371456dc167bdc"} Mar 13 01:19:00.976465 master-0 kubenswrapper[19170]: I0313 01:19:00.976410 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" event={"ID":"30f7537e-93ed-466b-ba24-78141d004b2f","Type":"ContainerStarted","Data":"24c5ca2ad81d656ed30391cef58568917e68e001311f4004a9e7bd98e34738e8"} Mar 13 01:19:00.976465 master-0 kubenswrapper[19170]: I0313 01:19:00.976422 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/revision-pruner-6-master-0" event={"ID":"9c9c1c81-eae9-4481-9870-b598deb1dcac","Type":"ContainerDied","Data":"90f1b6677182d94c10ac2334ab8747391e467bc270fab2bd3e7e7b1c8a3cd1c7"} Mar 13 01:19:00.976465 master-0 kubenswrapper[19170]: I0313 01:19:00.976438 19170 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-scheduler/revision-pruner-6-master-0" event={"ID":"9c9c1c81-eae9-4481-9870-b598deb1dcac","Type":"ContainerDied","Data":"39a55e527445a96c03fa72bdbedc444549a950a2be802dd370d0f86349876a95"} Mar 13 01:19:00.976465 master-0 kubenswrapper[19170]: I0313 01:19:00.976459 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39a55e527445a96c03fa72bdbedc444549a950a2be802dd370d0f86349876a95" Mar 13 01:19:00.976741 master-0 kubenswrapper[19170]: I0313 01:19:00.976473 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mlslx" event={"ID":"b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35","Type":"ContainerStarted","Data":"37ec0a7d20e51b2b48319f3f798e03141088e89b283f96688c4513f6cdc84e01"} Mar 13 01:19:00.976741 master-0 kubenswrapper[19170]: I0313 01:19:00.976516 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mlslx" event={"ID":"b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35","Type":"ContainerDied","Data":"2703e40a08051a608961078f9b2c331b07b8ffa237b00eb643f4e928fb008663"} Mar 13 01:19:00.976741 master-0 kubenswrapper[19170]: I0313 01:19:00.976533 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mlslx" event={"ID":"b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35","Type":"ContainerStarted","Data":"e4f5650e90a0b9cd7d74ca56ce88b46d396150848fd499e2751500a89710ed92"} Mar 13 01:19:00.976741 master-0 kubenswrapper[19170]: I0313 01:19:00.976548 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-retry-1-master-0" event={"ID":"728435e4-9fdb-4fea-9f5b-eb5ff5444da0","Type":"ContainerDied","Data":"8803ea2eb582c5693311e889e291d05e3059cc337f89e85079fab8e693f3beb8"} Mar 13 01:19:00.976741 master-0 kubenswrapper[19170]: I0313 01:19:00.976569 19170 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-retry-1-master-0" event={"ID":"728435e4-9fdb-4fea-9f5b-eb5ff5444da0","Type":"ContainerDied","Data":"2a77f8e58ee6b4b9d8e8d1a5c1202e86b111b4dbd37bf30068295cac4daecf86"} Mar 13 01:19:00.976741 master-0 kubenswrapper[19170]: I0313 01:19:00.976584 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a77f8e58ee6b4b9d8e8d1a5c1202e86b111b4dbd37bf30068295cac4daecf86" Mar 13 01:19:00.976741 master-0 kubenswrapper[19170]: I0313 01:19:00.976604 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-ck7rt" event={"ID":"61fb4b86-f978-4ae1-80bc-18d2f386cbc2","Type":"ContainerStarted","Data":"d3bb469eb3f63fc3e5e3d196d0736a8219372d03c2b49b9f26be6e3281573d4c"} Mar 13 01:19:00.976741 master-0 kubenswrapper[19170]: I0313 01:19:00.976622 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-ck7rt" event={"ID":"61fb4b86-f978-4ae1-80bc-18d2f386cbc2","Type":"ContainerDied","Data":"85752463126f89fa0e5e1418516974da87fce8b92150573ae7e0d2915937dc43"} Mar 13 01:19:00.976741 master-0 kubenswrapper[19170]: I0313 01:19:00.976652 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-ck7rt" event={"ID":"61fb4b86-f978-4ae1-80bc-18d2f386cbc2","Type":"ContainerDied","Data":"4d990b5e894ae9b6e30a48b23a8c0721805ad4a05730ef2a8a80e7f39e6f738b"} Mar 13 01:19:00.976741 master-0 kubenswrapper[19170]: I0313 01:19:00.976667 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-ck7rt" event={"ID":"61fb4b86-f978-4ae1-80bc-18d2f386cbc2","Type":"ContainerDied","Data":"e25bc60853f66a5d6c7e1021efdd8403103d53c529624ce5e308b8d3dfb44aaf"} Mar 13 01:19:00.976741 master-0 kubenswrapper[19170]: I0313 01:19:00.976681 19170 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-ck7rt" event={"ID":"61fb4b86-f978-4ae1-80bc-18d2f386cbc2","Type":"ContainerStarted","Data":"26154ef4eac655efbc62eb7de132a97eb19bcf76904f56cfb67a890cba3eae81"} Mar 13 01:19:00.976741 master-0 kubenswrapper[19170]: I0313 01:19:00.976694 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-26mfw" event={"ID":"58405741-598c-4bf5-bbc8-1ca8e3f10995","Type":"ContainerStarted","Data":"9633e444d804baebdb37261934de23f2bd534d4b2872dabf85f686f775c2846b"} Mar 13 01:19:00.976741 master-0 kubenswrapper[19170]: I0313 01:19:00.976707 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-26mfw" event={"ID":"58405741-598c-4bf5-bbc8-1ca8e3f10995","Type":"ContainerStarted","Data":"a2974c6f36c1f49a4f8394ff7b23640dbad58229e8102c2955715ea80ccef7a7"} Mar 13 01:19:00.976741 master-0 kubenswrapper[19170]: I0313 01:19:00.976727 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-26mfw" event={"ID":"58405741-598c-4bf5-bbc8-1ca8e3f10995","Type":"ContainerStarted","Data":"2ea2257b817f7a593cf8a5bc18fd54c7de892a301e19617876be4cc31d01237b"} Mar 13 01:19:00.976741 master-0 kubenswrapper[19170]: I0313 01:19:00.976742 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfc87a91992c83de0be19f6bf8022aaa5b7c2b27eb42c5753a91971d10b3622e" Mar 13 01:19:00.976741 master-0 kubenswrapper[19170]: I0313 01:19:00.976755 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" event={"ID":"4edb3e1a-9082-4fc2-ae6f-99d49c078a34","Type":"ContainerStarted","Data":"a6f73296b53d2256d9223dd0adbce2bbf7733e0e109d3be3cdcc7fc586852e7f"} Mar 13 01:19:00.977457 master-0 kubenswrapper[19170]: I0313 01:19:00.976841 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" 
event={"ID":"4edb3e1a-9082-4fc2-ae6f-99d49c078a34","Type":"ContainerStarted","Data":"30b9d5e645090bb14ab86716310f41fee385833f79403de6a30e627d0d0e329a"} Mar 13 01:19:00.977457 master-0 kubenswrapper[19170]: I0313 01:19:00.976857 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" event={"ID":"4edb3e1a-9082-4fc2-ae6f-99d49c078a34","Type":"ContainerStarted","Data":"6f3c125406dd049ae146bd63c19d6c7751af1aaba13f654faef7c93feda70502"} Mar 13 01:19:00.977457 master-0 kubenswrapper[19170]: I0313 01:19:00.976870 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" event={"ID":"4edb3e1a-9082-4fc2-ae6f-99d49c078a34","Type":"ContainerStarted","Data":"058d442f8a7b28b68d71c3cb585c141212a638e67e2da68b2c8ad34aca404bce"} Mar 13 01:19:00.977457 master-0 kubenswrapper[19170]: I0313 01:19:00.976882 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" event={"ID":"4edb3e1a-9082-4fc2-ae6f-99d49c078a34","Type":"ContainerStarted","Data":"af9d5d0f91bc0e656e6c91ebb49055a49afe9bcd74078a12229c0c2fefb58c67"} Mar 13 01:19:00.977457 master-0 kubenswrapper[19170]: I0313 01:19:00.976899 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" event={"ID":"4edb3e1a-9082-4fc2-ae6f-99d49c078a34","Type":"ContainerStarted","Data":"2beb02912bdaac5a89bd9c0a35f1a0ecbbb1b7712fbf4f2d4b727635965b2220"} Mar 13 01:19:00.977457 master-0 kubenswrapper[19170]: I0313 01:19:00.976911 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" event={"ID":"4edb3e1a-9082-4fc2-ae6f-99d49c078a34","Type":"ContainerStarted","Data":"304c6fa572f5a2a3d04dd53d579bc265c0916e17447bcb96ea06ac71632cf34e"} Mar 13 01:19:00.977457 master-0 kubenswrapper[19170]: I0313 01:19:00.976925 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" event={"ID":"4edb3e1a-9082-4fc2-ae6f-99d49c078a34","Type":"ContainerStarted","Data":"e9580dbc7c8945af7e90dbb4d0ac5dd7e5416bc26b81c8682f1990f08179f549"} Mar 13 01:19:00.977457 master-0 kubenswrapper[19170]: I0313 01:19:00.976938 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" event={"ID":"4edb3e1a-9082-4fc2-ae6f-99d49c078a34","Type":"ContainerDied","Data":"d00fb05f88d59786ab92f821f00f790d94c0eeac3280854affdf40137d7e87d0"} Mar 13 01:19:00.977457 master-0 kubenswrapper[19170]: I0313 01:19:00.976959 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" event={"ID":"4edb3e1a-9082-4fc2-ae6f-99d49c078a34","Type":"ContainerStarted","Data":"4aad5a88dbaf47dc9a34673f221901d51a030ad11786e8c666462ce349290777"} Mar 13 01:19:00.977457 master-0 kubenswrapper[19170]: I0313 01:19:00.976973 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-kjwgg" event={"ID":"4d313ee4-3bb9-44a9-ad80-8e00540ef1e7","Type":"ContainerStarted","Data":"a0beff0548dc89e3eddfbc4d73bdac22ebf75fa5e75296d06029066e2708e943"} Mar 13 01:19:00.977457 master-0 kubenswrapper[19170]: I0313 01:19:00.976992 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-kjwgg" event={"ID":"4d313ee4-3bb9-44a9-ad80-8e00540ef1e7","Type":"ContainerStarted","Data":"5f0a23a29ec1be227442f950c7b43af141e31a2152ab46cc286a5229950b1bae"} Mar 13 01:19:00.977457 master-0 kubenswrapper[19170]: I0313 01:19:00.977005 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" event={"ID":"4c5174b9-ca9e-4917-ab3a-ca403ce4f017","Type":"ContainerStarted","Data":"4d358f5b4f38fb2c37cd9308f0839588cdfa1ee1a0977394634ae2dfe045b4b7"} Mar 13 01:19:00.977457 
master-0 kubenswrapper[19170]: I0313 01:19:00.977021 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" event={"ID":"4c5174b9-ca9e-4917-ab3a-ca403ce4f017","Type":"ContainerStarted","Data":"37ae98838b10d25174f8c8b9b4755c11a0c75ddb83ec17a66bff3690d741da86"} Mar 13 01:19:00.977457 master-0 kubenswrapper[19170]: I0313 01:19:00.977033 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" event={"ID":"690f916b-6f87-42d9-8168-392a9177bee9","Type":"ContainerDied","Data":"f3c19acecbccf7bd6716e7d44a9b0fc9bb63ca007ca5d04b416b934ef2cbe52c"} Mar 13 01:19:00.977457 master-0 kubenswrapper[19170]: I0313 01:19:00.977048 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" event={"ID":"690f916b-6f87-42d9-8168-392a9177bee9","Type":"ContainerDied","Data":"e1a9f8a9b9f4f12cdb02f0899fd6c6ca89ad08ee7dce767d1b7f5c4f67fb87f9"} Mar 13 01:19:00.977457 master-0 kubenswrapper[19170]: I0313 01:19:00.977059 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1a9f8a9b9f4f12cdb02f0899fd6c6ca89ad08ee7dce767d1b7f5c4f67fb87f9" Mar 13 01:19:00.977457 master-0 kubenswrapper[19170]: I0313 01:19:00.977069 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" event={"ID":"30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3","Type":"ContainerStarted","Data":"7ef67c4dbd8426a3e3af7aa349a5cbaefed2fde80e4c7f48ba81fe002ea31f34"} Mar 13 01:19:00.977457 master-0 kubenswrapper[19170]: I0313 01:19:00.977088 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" 
event={"ID":"30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3","Type":"ContainerStarted","Data":"c51cdc52a7907953fa7b2f33b8d5256b59068c681441b3e967443332162a6acd"} Mar 13 01:19:00.977457 master-0 kubenswrapper[19170]: I0313 01:19:00.977100 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" event={"ID":"30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3","Type":"ContainerDied","Data":"5c71af40ce92aac85f827794e9d207c4ed4bc300599f6601dc56da9d0a8b0d8b"} Mar 13 01:19:00.977457 master-0 kubenswrapper[19170]: I0313 01:19:00.977112 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" event={"ID":"30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3","Type":"ContainerStarted","Data":"57d04e10dfd8defea76ad2a3d813dd852bfb7b6eaab6dec0d628dfff952603f9"} Mar 13 01:19:00.977457 master-0 kubenswrapper[19170]: I0313 01:19:00.977126 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87a4e35d8f4be284a1dfb072b64498826ea9b5084519871759d3cc21b8a8778d" Mar 13 01:19:00.977457 master-0 kubenswrapper[19170]: I0313 01:19:00.977136 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"47806631-9d60-4658-832d-f160f93f42ea","Type":"ContainerDied","Data":"71b98806c78a21853872bf216fdc04280da7bf4d8777bb06b2a922047a6a9e8c"} Mar 13 01:19:00.977457 master-0 kubenswrapper[19170]: I0313 01:19:00.977149 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"47806631-9d60-4658-832d-f160f93f42ea","Type":"ContainerDied","Data":"13bf43dd31255e64913f1edd8b9b049ea7f9baf74595bd3516213a0e530b536c"} Mar 13 01:19:00.977457 master-0 kubenswrapper[19170]: I0313 01:19:00.977159 19170 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="13bf43dd31255e64913f1edd8b9b049ea7f9baf74595bd3516213a0e530b536c" Mar 13 01:19:00.977457 master-0 kubenswrapper[19170]: I0313 01:19:00.977177 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lw6xm" event={"ID":"d81bcb58-efe3-4577-8e88-67f92c645f6f","Type":"ContainerStarted","Data":"097e6bfa9c001716d816897e7296052e0bc1aaa96a6b992e354d945a460533bc"} Mar 13 01:19:00.977457 master-0 kubenswrapper[19170]: I0313 01:19:00.977189 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lw6xm" event={"ID":"d81bcb58-efe3-4577-8e88-67f92c645f6f","Type":"ContainerStarted","Data":"b19253ead78138ad6ad5306377a9183f46463e03e94ad36454e95491cb2e6272"} Mar 13 01:19:00.977457 master-0 kubenswrapper[19170]: I0313 01:19:00.977201 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8" event={"ID":"4976e608-07a0-4cef-8fdd-7cec3324b4b5","Type":"ContainerStarted","Data":"f24100e4c2fb97c1fbe480478718c8a70d101cd014aa2d5b877af732099a71ef"} Mar 13 01:19:00.977457 master-0 kubenswrapper[19170]: I0313 01:19:00.977217 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8" event={"ID":"4976e608-07a0-4cef-8fdd-7cec3324b4b5","Type":"ContainerStarted","Data":"deb7cab1203b4a1419f7e0b1b9f09a289fe9cf31f3d2c0d970bf2d1a0aef7884"} Mar 13 01:19:00.977457 master-0 kubenswrapper[19170]: I0313 01:19:00.977230 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8" event={"ID":"4976e608-07a0-4cef-8fdd-7cec3324b4b5","Type":"ContainerStarted","Data":"54c97778ddb99650ecad3d1658417118e7e1ab09d346fbb4a70157b6a2cd1822"} Mar 13 01:19:00.977457 master-0 kubenswrapper[19170]: I0313 01:19:00.977271 19170 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-monitoring"/"telemetry-config" Mar 13 01:19:00.977457 master-0 kubenswrapper[19170]: I0313 01:19:00.977244 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-4gpcz" event={"ID":"edda0d03-fdb2-4130-8f73-8057efd5815c","Type":"ContainerStarted","Data":"581f4518b2025f29d1124b712492b62d765adabd9251be91282ecbf98e44533f"} Mar 13 01:19:00.977457 master-0 kubenswrapper[19170]: I0313 01:19:00.977454 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-4gpcz" event={"ID":"edda0d03-fdb2-4130-8f73-8057efd5815c","Type":"ContainerStarted","Data":"df3c13cf240e0897eb807bf8a91255037c9e2476ca44bc2ccd1aad2e71463498"} Mar 13 01:19:00.977457 master-0 kubenswrapper[19170]: I0313 01:19:00.977478 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-knlw8" event={"ID":"33dfdc31-54a4-4249-99ae-a15180514659","Type":"ContainerStarted","Data":"d4128612049d2903866c89ea3ac616fb89c5c7677c3ff52ca9d870714f95087e"} Mar 13 01:19:00.977457 master-0 kubenswrapper[19170]: I0313 01:19:00.977494 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-knlw8" event={"ID":"33dfdc31-54a4-4249-99ae-a15180514659","Type":"ContainerStarted","Data":"4c32d9e4412752c87d1ee4cd650cdb2cb23242fbddb597c5152f445f6b017bdc"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.977506 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-knlw8" event={"ID":"33dfdc31-54a4-4249-99ae-a15180514659","Type":"ContainerStarted","Data":"3d12917c6787f42e0bad6cb6459212ad6bf9ab1345cc33bb530c5d6e353e83a3"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.977519 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" event={"ID":"738ebdcd-b78b-495a-b8f2-84af11a7d35c","Type":"ContainerStarted","Data":"bd55753a9d9c25937c1f18505f16f833ab2a86de0b3ecfe2d2a1fb87bd966bf3"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.977532 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" event={"ID":"738ebdcd-b78b-495a-b8f2-84af11a7d35c","Type":"ContainerStarted","Data":"f9738749750d4de58ccf5e1bc39bada13e04d80581cbfe197867d27ee7a8ad9f"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.977544 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" event={"ID":"738ebdcd-b78b-495a-b8f2-84af11a7d35c","Type":"ContainerDied","Data":"ec2e5e0e9f2f0d0bb48be3bbf455c597567e5ab58c590a1a48ffa8bb7da7c8c1"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.977565 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" event={"ID":"738ebdcd-b78b-495a-b8f2-84af11a7d35c","Type":"ContainerStarted","Data":"b018527cc19e60b658984a3b2cf8d02fa83e221b23e0763c86d4b53c72e80c7e"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.977579 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd" event={"ID":"1308fba1-a50d-48b3-b272-7bef44727b7f","Type":"ContainerStarted","Data":"f3373c04dfe5b06ce5689672c5fa9716a2e2ff1f88c17517721cb216726a9cc3"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.977594 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd" event={"ID":"1308fba1-a50d-48b3-b272-7bef44727b7f","Type":"ContainerDied","Data":"0743267ed603ff9ac6856c0cefcbee7aeda537c3eb2e7c92d9ca09c61963bcad"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.977606 19170 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd" event={"ID":"1308fba1-a50d-48b3-b272-7bef44727b7f","Type":"ContainerStarted","Data":"b1a3f0aaa5bfa331e385a253d06037bc576451a7d72651b9001c746da55121ba"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.977618 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd" event={"ID":"1308fba1-a50d-48b3-b272-7bef44727b7f","Type":"ContainerStarted","Data":"f7ec163ebceef5cfb5ab7329a8164e22ce5f45f24019f15e580acb9f1392f8cb"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.977646 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"29e096ea-ca9d-477b-b0aa-1d10244d51d9","Type":"ContainerDied","Data":"167e9a0418be9c64d38402cc471015911f91f7d101628f86049fb49485d8495a"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.977667 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"29e096ea-ca9d-477b-b0aa-1d10244d51d9","Type":"ContainerDied","Data":"6b3d1f96b7eda0842ce0b60c494ed28d5b1988f57c59ae6dc2d45944467711cc"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.977680 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b3d1f96b7eda0842ce0b60c494ed28d5b1988f57c59ae6dc2d45944467711cc" Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.977698 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-sslxh" event={"ID":"039acb44-a9b3-4ad6-a091-be4d18edc34f","Type":"ContainerStarted","Data":"3ad76288e23214748b66d20552278056bf77691ab910aa1096214002b1b63ee0"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.977713 
19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-sslxh" event={"ID":"039acb44-a9b3-4ad6-a091-be4d18edc34f","Type":"ContainerDied","Data":"757884fd1e54a4728f490aa384fee80c41466484bac2e993d7373c9b6d19ad0a"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.977726 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-sslxh" event={"ID":"039acb44-a9b3-4ad6-a091-be4d18edc34f","Type":"ContainerStarted","Data":"7d309f2fa26be03ebd9a5013c3e9be2f5c2e833ce9081aa3f25580dc684568db"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.977739 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l" event={"ID":"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad","Type":"ContainerStarted","Data":"71f9d1850bad5c508f54de1c0bfe33c2b618025214e8583b970eda19de8409dc"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.977754 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l" event={"ID":"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad","Type":"ContainerStarted","Data":"e7ab97f2a4c561db63cb663455e603d6ca1f98998fa007e41050f6e9e2778659"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.977773 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l" event={"ID":"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad","Type":"ContainerDied","Data":"bb00bc21a2b9f11b41d1750186297e3a5ca651c2efe8531d5b69fd560b0ba268"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.977786 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l" 
event={"ID":"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad","Type":"ContainerStarted","Data":"e8a8f6655cd50f04721c79f72a49d0a15534cbdaa2b73fedfb36318faac24e0e"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.977800 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t" event={"ID":"7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71","Type":"ContainerStarted","Data":"4b247a210c27022b1561eb41d301780fd2c27c79755cbefbf94d558d94963294"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.977814 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t" event={"ID":"7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71","Type":"ContainerStarted","Data":"339cc6449a0020231eef0158a934d4ae19f59a10f226d56a246c3dc49a8eebbe"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.977827 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" event={"ID":"9db888f0-51b6-43cf-8337-69d2d5cc2b0a","Type":"ContainerStarted","Data":"2df89949d75d6b332f6e6e3505de7ef88eb15830a7e3e30680b63b8f7fc0d9ff"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.977841 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" event={"ID":"b3a9c0f6-cfde-4ae8-952a-00e2fb862482","Type":"ContainerStarted","Data":"9c4066cdb45897ed4f69fcb12c6e6463de2070bbf91b47b272774c2582e358fc"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.977855 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" event={"ID":"b3a9c0f6-cfde-4ae8-952a-00e2fb862482","Type":"ContainerStarted","Data":"705f3b1f8f6a29f9d66d96e7e64284c86692ae92fafef78a3e7d5b5411f4c2b9"} Mar 13 
01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.977908 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" event={"ID":"b3a9c0f6-cfde-4ae8-952a-00e2fb862482","Type":"ContainerStarted","Data":"c20b7880d0e62c91ace04a400f15380d02a7f587227b0e579de54f8b6b881459"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.977923 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" event={"ID":"b3a9c0f6-cfde-4ae8-952a-00e2fb862482","Type":"ContainerStarted","Data":"b70fd8156b9269ea1d32e5bd6b505f43cc5c2cda9055f9eab294a1ae160205e2"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.977936 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2hgwj" event={"ID":"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a","Type":"ContainerStarted","Data":"f0ef6b6be1bf6464d80a4ed0d8027b70cb9fbd6888ed521a07d3f244cf4ef4f1"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.977951 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2hgwj" event={"ID":"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a","Type":"ContainerStarted","Data":"a8891c3de0ea3f8634f05d0f83839b839fc776ecaca857a79f015b6bf51d787a"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.977964 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2hgwj" event={"ID":"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a","Type":"ContainerDied","Data":"b73ff7b10ef47505ecf38b484d44b39c71e1d9b7b2e9d15b9f215185c43203db"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.977981 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2hgwj" 
event={"ID":"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a","Type":"ContainerStarted","Data":"7fc3fe55a4a51d95eea4a3bf4856b535210e68f1c5b298f7c9a69ccfa72f28c1"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.978024 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" event={"ID":"2c4c579b-0643-47ac-a729-017c326b0ecc","Type":"ContainerStarted","Data":"dbffdb32298050e3d786bea05b0e0e1b7922cd3d84a8dd8e9be8f2f907195c49"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.978042 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" event={"ID":"2c4c579b-0643-47ac-a729-017c326b0ecc","Type":"ContainerDied","Data":"123227ec95c803688ea3b73355951521f6c2e9af5b64fd2cfda9372aa3bb75d9"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.978056 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" event={"ID":"2c4c579b-0643-47ac-a729-017c326b0ecc","Type":"ContainerStarted","Data":"dafd5daeb5af37ce7ca2009d2230447bde3ffe690f871d4715ff864e7f41bbd6"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.978069 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" event={"ID":"2c4c579b-0643-47ac-a729-017c326b0ecc","Type":"ContainerStarted","Data":"6f8519eea623420c40da808c6cfff53da6452162ecb364a1c82aa4dfe3545fe2"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.978132 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-qpxft" event={"ID":"abf2ead5-b97d-4160-8120-28cb8a3d843e","Type":"ContainerDied","Data":"7c07bc771c953fa9d34f82960a8b9fd12b63e9a86c930f999ffe77b37e0a74ef"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.978149 19170 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-qpxft" event={"ID":"abf2ead5-b97d-4160-8120-28cb8a3d843e","Type":"ContainerDied","Data":"9f9a8a98efb949846e16cd93dedb9e2d6c0bd9dc4b4fccd2e67d0c286bfd9dcb"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.978163 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f9a8a98efb949846e16cd93dedb9e2d6c0bd9dc4b4fccd2e67d0c286bfd9dcb" Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.978180 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" event={"ID":"631f5719-2083-4c99-92cb-2ddc04022d86","Type":"ContainerStarted","Data":"e14a5c9268dcf9e208f9ac3a76c232808e5f1b8857d2cf23ad9b91be0a6dacf8"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.978231 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" event={"ID":"631f5719-2083-4c99-92cb-2ddc04022d86","Type":"ContainerDied","Data":"06ac9b51646581f1c80a1fe6c8f1464c3cd508ac9e8d30abbb2fa8a9cb8c167d"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.978247 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" event={"ID":"631f5719-2083-4c99-92cb-2ddc04022d86","Type":"ContainerStarted","Data":"16477c5f389a1fdfcf2af6bfe8b7efe63c0f62df56e3f2ed990e9acc1a597b7d"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.978260 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-c2xl8" event={"ID":"ca2fa86b-a966-49dc-8577-d2b54b111d14","Type":"ContainerStarted","Data":"eeb920b84acc3688525f08752f9228e88cce15d298099682713138fc0275698d"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.978274 
19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-c2xl8" event={"ID":"ca2fa86b-a966-49dc-8577-d2b54b111d14","Type":"ContainerDied","Data":"18882e60a1d7cca045d564f7abc68da51216b8e9104fca3062ca7eec99d17c5e"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.978288 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-c2xl8" event={"ID":"ca2fa86b-a966-49dc-8577-d2b54b111d14","Type":"ContainerStarted","Data":"d50b6a32815a2be8ff70be8b06ebf59a9e49fcf3be49561e0d64e6a0e5b76848"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.978326 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-5fv6h" event={"ID":"48375ae2-d4b4-4db4-b832-3e3db1834fb9","Type":"ContainerStarted","Data":"413763d6750693e98f10ee26b7d5a65b72db3afb864d04cf8231231c2007edea"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.978346 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-5fv6h" event={"ID":"48375ae2-d4b4-4db4-b832-3e3db1834fb9","Type":"ContainerStarted","Data":"19ce38027ae9e3d0076b8c83191fabde1e4e81b393c760835578ba3bc36b41b2"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.978456 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xn5t5" event={"ID":"4738c93d-62e6-44ce-a289-e646b9302e71","Type":"ContainerStarted","Data":"8eb67e42af9d810a3c1ea7a782f9e3f142578934d85597656291d9067249b1cf"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.978475 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xn5t5" 
event={"ID":"4738c93d-62e6-44ce-a289-e646b9302e71","Type":"ContainerDied","Data":"f64c75ed084248ad496cb98f6981ac7735f162ce7e7121ef5597b4e213d85ac5"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.978490 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xn5t5" event={"ID":"4738c93d-62e6-44ce-a289-e646b9302e71","Type":"ContainerDied","Data":"24a3795ab99401f37571431134fd1c761aa6f3ef1ba4c747faa0a5ee28b9f796"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.978503 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xn5t5" event={"ID":"4738c93d-62e6-44ce-a289-e646b9302e71","Type":"ContainerDied","Data":"ac0c969b95b64c22e84de07c2976566813a316f1d691a27df3a1f4621768e238"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.978518 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xn5t5" event={"ID":"4738c93d-62e6-44ce-a289-e646b9302e71","Type":"ContainerDied","Data":"569d90d03ceda29b5f6ff80b99725d90e6a4f9724473ba5d3146ac49efbbe232"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.978535 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xn5t5" event={"ID":"4738c93d-62e6-44ce-a289-e646b9302e71","Type":"ContainerDied","Data":"03b1433799f1c9507de93fbd689d37d0b962300c0b8274b036071bcf3cc09941"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.978548 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xn5t5" event={"ID":"4738c93d-62e6-44ce-a289-e646b9302e71","Type":"ContainerDied","Data":"f1844314bd4c14c44c294275c228ee201df2f8be5daa877db9d32b69fb506d82"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.978562 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-xn5t5" event={"ID":"4738c93d-62e6-44ce-a289-e646b9302e71","Type":"ContainerStarted","Data":"71ba48ea25a9442b7ddaf6a81ce73c503ec65916615ad4703b66f198c2ddd8c0"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.978575 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"d71b1cc6761a45082d16174057a8ba98300f0e56574d0c45301dacf3ba713ba9"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.978588 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerDied","Data":"d17962d23b6425584e2488789adc4521ea90d5d79405cef63024d09b7df17fd9"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.978602 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"331ce433a4bf4a47427394a7b370083a4bac521b9ca9a23033c6c4de736ca40b"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.978624 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-zt229" event={"ID":"1c24e17c-8bd9-4c23-9876-6f31c9da5cd1","Type":"ContainerStarted","Data":"9920e6a0f414f5524ee15784a68a40cecc25e4656d0ff55f7b9dbec55000a82e"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.978652 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-zt229" event={"ID":"1c24e17c-8bd9-4c23-9876-6f31c9da5cd1","Type":"ContainerStarted","Data":"1848ae0dc4f507d3dbd623fb664c63190062c00de5b6fa781f847cd37341986e"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.978672 19170 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-zt229" event={"ID":"1c24e17c-8bd9-4c23-9876-6f31c9da5cd1","Type":"ContainerStarted","Data":"c50effcf2f5fc891493cebf4edbffd60b3e47884c10225842c0382c940f83e36"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.978686 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-mvmt2" event={"ID":"486c7e33-3dd8-4a98-87e3-8216ee2e05ef","Type":"ContainerStarted","Data":"3386a3e179b760a480294045f0ff532d50506e45395937cbdf6059ad9ea50ed9"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.978700 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-mvmt2" event={"ID":"486c7e33-3dd8-4a98-87e3-8216ee2e05ef","Type":"ContainerDied","Data":"451217a595b413aec4246a9b014bde1e3a621bb8bc794b9a2470a8f43c1c8d3b"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.978714 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-mvmt2" event={"ID":"486c7e33-3dd8-4a98-87e3-8216ee2e05ef","Type":"ContainerStarted","Data":"4a78398e61786f95c561c9e0a3fad101b528e41e0057edc052306118db4ece14"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.978735 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z254g" event={"ID":"a9c7c6a4-4f5b-4807-932c-1b0f53ceed22","Type":"ContainerStarted","Data":"7b9a1c53b30b114fafac1d461dd3e21daf6901eba384382a18dce7f6c90a33b2"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.978749 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z254g" 
event={"ID":"a9c7c6a4-4f5b-4807-932c-1b0f53ceed22","Type":"ContainerDied","Data":"c90d5f8b2e62149395ea03f276ff599cb9a6c656f64c9d31908bac0077615d31"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.978769 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z254g" event={"ID":"a9c7c6a4-4f5b-4807-932c-1b0f53ceed22","Type":"ContainerDied","Data":"76c90bf296df85a0e3ed051135157af3e8cd81617b8acdff6c18242b0b74f386"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.978781 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z254g" event={"ID":"a9c7c6a4-4f5b-4807-932c-1b0f53ceed22","Type":"ContainerStarted","Data":"cb94d42a599d6d70b76d1a519031e7bc4f9c33d2aa5f5a649a375c7a7bded9ac"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.978796 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5ff8674d55-6fh8b" event={"ID":"e2b5ad07-fa01-4330-9dce-6da3444657ab","Type":"ContainerStarted","Data":"0e4f55e2b8073de5d9074b681c317fe1d3f5790a0689fd003d0ff5fc7da43c76"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.978809 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5ff8674d55-6fh8b" event={"ID":"e2b5ad07-fa01-4330-9dce-6da3444657ab","Type":"ContainerStarted","Data":"b46f9770418285643ca5f7f35c50eeaf7e01ac86d1e69c23bd8c42ec6872497c"} Mar 13 01:19:00.979592 master-0 kubenswrapper[19170]: I0313 01:19:00.978822 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5ff8674d55-6fh8b" event={"ID":"e2b5ad07-fa01-4330-9dce-6da3444657ab","Type":"ContainerStarted","Data":"be414d776e8ab0a4d034e6fb89d00100a9499bbf4d15ddb731d27f9835e7bf82"} Mar 13 01:19:00.986908 master-0 kubenswrapper[19170]: I0313 01:19:00.986536 19170 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 13 01:19:00.986908 master-0 kubenswrapper[19170]: I0313 01:19:00.986736 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 13 01:19:00.986908 master-0 kubenswrapper[19170]: I0313 01:19:00.986861 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 13 01:19:00.987138 master-0 kubenswrapper[19170]: I0313 01:19:00.986947 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 13 01:19:00.987213 master-0 kubenswrapper[19170]: I0313 01:19:00.987166 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 13 01:19:00.987384 master-0 kubenswrapper[19170]: I0313 01:19:00.987364 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 13 01:19:00.987485 master-0 kubenswrapper[19170]: I0313 01:19:00.987462 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls" Mar 13 01:19:00.987663 master-0 kubenswrapper[19170]: I0313 01:19:00.987626 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 13 01:19:00.987943 master-0 kubenswrapper[19170]: I0313 01:19:00.987911 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 13 01:19:00.987982 master-0 kubenswrapper[19170]: I0313 01:19:00.987963 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 13 01:19:00.988013 master-0 kubenswrapper[19170]: I0313 01:19:00.987986 19170 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"kube-root-ca.crt" Mar 13 01:19:00.988085 master-0 kubenswrapper[19170]: I0313 01:19:00.986969 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Mar 13 01:19:00.988243 master-0 kubenswrapper[19170]: I0313 01:19:00.988206 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Mar 13 01:19:00.988380 master-0 kubenswrapper[19170]: I0313 01:19:00.987923 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 13 01:19:00.988689 master-0 kubenswrapper[19170]: I0313 01:19:00.988662 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 13 01:19:00.988745 master-0 kubenswrapper[19170]: I0313 01:19:00.988725 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 13 01:19:00.989065 master-0 kubenswrapper[19170]: I0313 01:19:00.989038 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 13 01:19:00.989137 master-0 kubenswrapper[19170]: I0313 01:19:00.989116 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 13 01:19:00.997715 master-0 kubenswrapper[19170]: I0313 01:19:00.997622 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config" Mar 13 01:19:01.000818 master-0 kubenswrapper[19170]: I0313 01:19:01.000776 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Mar 13 01:19:01.001066 master-0 kubenswrapper[19170]: I0313 01:19:01.000995 19170 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 13 01:19:01.002289 master-0 kubenswrapper[19170]: I0313 01:19:01.002260 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 13 01:19:01.006355 master-0 kubenswrapper[19170]: I0313 01:19:01.006306 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-hostroot\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:19:01.006419 master-0 kubenswrapper[19170]: I0313 01:19:01.006365 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7mn4\" (UniqueName: \"kubernetes.io/projected/48375ae2-d4b4-4db4-b832-3e3db1834fb9-kube-api-access-q7mn4\") pod \"network-check-source-7c67b67d47-5fv6h\" (UID: \"48375ae2-d4b4-4db4-b832-3e3db1834fb9\") " pod="openshift-network-diagnostics/network-check-source-7c67b67d47-5fv6h" Mar 13 01:19:01.006419 master-0 kubenswrapper[19170]: I0313 01:19:01.006396 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f97819d0-2840-4352-a435-19ef1a8c22c9-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-jjdk8\" (UID: \"f97819d0-2840-4352-a435-19ef1a8c22c9\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8" Mar 13 01:19:01.006493 master-0 kubenswrapper[19170]: I0313 01:19:01.006424 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzklz\" (UniqueName: \"kubernetes.io/projected/f97819d0-2840-4352-a435-19ef1a8c22c9-kube-api-access-fzklz\") pod \"cluster-image-registry-operator-86d6d77c7c-jjdk8\" (UID: \"f97819d0-2840-4352-a435-19ef1a8c22c9\") " 
pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8" Mar 13 01:19:01.006493 master-0 kubenswrapper[19170]: I0313 01:19:01.006452 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-kubelet\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.006493 master-0 kubenswrapper[19170]: I0313 01:19:01.006482 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21cbea73-f779-43e4-b5ba-d6fa06275d34-config\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: \"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj" Mar 13 01:19:01.006585 master-0 kubenswrapper[19170]: I0313 01:19:01.006505 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" Mar 13 01:19:01.006585 master-0 kubenswrapper[19170]: I0313 01:19:01.006531 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhcll\" (UniqueName: \"kubernetes.io/projected/95d4e785-6663-417d-b380-6905773613c8-kube-api-access-nhcll\") pod \"service-ca-operator-69b6fc6b88-2v42g\" (UID: \"95d4e785-6663-417d-b380-6905773613c8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-2v42g" Mar 13 01:19:01.006585 master-0 kubenswrapper[19170]: I0313 01:19:01.006557 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/916d9fc9-388b-4506-a17c-36a7f626356a-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-v9nfg\" (UID: \"916d9fc9-388b-4506-a17c-36a7f626356a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-v9nfg" Mar 13 01:19:01.006585 master-0 kubenswrapper[19170]: I0313 01:19:01.006581 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" Mar 13 01:19:01.006728 master-0 kubenswrapper[19170]: I0313 01:19:01.006603 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmvs5\" (UniqueName: \"kubernetes.io/projected/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-kube-api-access-mmvs5\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" Mar 13 01:19:01.006728 master-0 kubenswrapper[19170]: I0313 01:19:01.006645 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 01:19:01.006728 master-0 kubenswrapper[19170]: I0313 01:19:01.006689 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/bfc49699-9428-4bff-804d-da0e60551759-host-etc-kube\") pod \"network-operator-7c649bf6d4-bdc4j\" (UID: 
\"bfc49699-9428-4bff-804d-da0e60551759\") " pod="openshift-network-operator/network-operator-7c649bf6d4-bdc4j" Mar 13 01:19:01.006728 master-0 kubenswrapper[19170]: I0313 01:19:01.006713 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/738ebdcd-b78b-495a-b8f2-84af11a7d35c-etcd-client\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:19:01.006832 master-0 kubenswrapper[19170]: I0313 01:19:01.006734 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/738ebdcd-b78b-495a-b8f2-84af11a7d35c-etcd-serving-ca\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:19:01.006832 master-0 kubenswrapper[19170]: I0313 01:19:01.006771 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8c6bf2d5-1881-4b63-b247-7e7426707fa1-images\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" Mar 13 01:19:01.006832 master-0 kubenswrapper[19170]: I0313 01:19:01.006796 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35-serving-cert\") pod \"kube-apiserver-operator-68bd585b-mlslx\" (UID: \"b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mlslx" Mar 13 01:19:01.006832 master-0 kubenswrapper[19170]: I0313 01:19:01.006821 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-run-netns\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:19:01.006939 master-0 kubenswrapper[19170]: I0313 01:19:01.006843 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4738c93d-62e6-44ce-a289-e646b9302e71-cni-binary-copy\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5" Mar 13 01:19:01.006939 master-0 kubenswrapper[19170]: I0313 01:19:01.006867 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/21cbea73-f779-43e4-b5ba-d6fa06275d34-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: \"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj" Mar 13 01:19:01.006939 master-0 kubenswrapper[19170]: I0313 01:19:01.006887 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/21cbea73-f779-43e4-b5ba-d6fa06275d34-etcd-client\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: \"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj" Mar 13 01:19:01.006939 master-0 kubenswrapper[19170]: I0313 01:19:01.006910 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/da44d750-31e5-46f4-b3ef-dd4384c22aaf-signing-cabundle\") pod \"service-ca-84bfdbbb7f-qr9tk\" (UID: \"da44d750-31e5-46f4-b3ef-dd4384c22aaf\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-qr9tk" Mar 13 01:19:01.007804 master-0 kubenswrapper[19170]: I0313 01:19:01.007502 19170 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21cbea73-f779-43e4-b5ba-d6fa06275d34-config\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: \"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj" Mar 13 01:19:01.007804 master-0 kubenswrapper[19170]: I0313 01:19:01.007778 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" Mar 13 01:19:01.008114 master-0 kubenswrapper[19170]: I0313 01:19:01.008080 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/916d9fc9-388b-4506-a17c-36a7f626356a-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-v9nfg\" (UID: \"916d9fc9-388b-4506-a17c-36a7f626356a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-v9nfg" Mar 13 01:19:01.008434 master-0 kubenswrapper[19170]: I0313 01:19:01.008404 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" Mar 13 01:19:01.008616 master-0 kubenswrapper[19170]: I0313 01:19:01.008588 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8c6bf2d5-1881-4b63-b247-7e7426707fa1-images\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: 
\"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" Mar 13 01:19:01.008957 master-0 kubenswrapper[19170]: I0313 01:19:01.008928 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4738c93d-62e6-44ce-a289-e646b9302e71-cni-binary-copy\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5" Mar 13 01:19:01.008990 master-0 kubenswrapper[19170]: I0313 01:19:01.008950 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/21cbea73-f779-43e4-b5ba-d6fa06275d34-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: \"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj" Mar 13 01:19:01.009301 master-0 kubenswrapper[19170]: I0313 01:19:01.009150 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35-serving-cert\") pod \"kube-apiserver-operator-68bd585b-mlslx\" (UID: \"b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mlslx" Mar 13 01:19:01.009301 master-0 kubenswrapper[19170]: I0313 01:19:01.009181 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/21cbea73-f779-43e4-b5ba-d6fa06275d34-etcd-client\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: \"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj" Mar 13 01:19:01.009301 master-0 kubenswrapper[19170]: I0313 01:19:01.009204 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-cni-netd\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.009312 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.009358 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-dszg5\" (UID: \"78d2cd80-23b9-426d-a7ac-1daa27668a47\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.009379 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/916d9fc9-388b-4506-a17c-36a7f626356a-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-v9nfg\" (UID: \"916d9fc9-388b-4506-a17c-36a7f626356a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-v9nfg" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.009402 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9b713fb-64ce-4a01-951c-1f31df62e1ae-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-bxqp2\" (UID: \"f9b713fb-64ce-4a01-951c-1f31df62e1ae\") " 
pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.009423 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-run-ovn\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.009439 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-trusted-ca\") pod \"ingress-operator-677db989d6-kdn2l\" (UID: \"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.009459 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-2tr2t\" (UID: \"7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.009477 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-system-cni-dir\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.009493 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-run-systemd\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.009510 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/738ebdcd-b78b-495a-b8f2-84af11a7d35c-audit-dir\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.009528 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg54c\" (UniqueName: \"kubernetes.io/projected/21cbea73-f779-43e4-b5ba-d6fa06275d34-kube-api-access-wg54c\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: \"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.009544 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-os-release\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.009560 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.009575 19170 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.009566 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/916d9fc9-388b-4506-a17c-36a7f626356a-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-v9nfg\" (UID: \"916d9fc9-388b-4506-a17c-36a7f626356a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-v9nfg" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.009594 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/486c7e33-3dd8-4a98-87e3-8216ee2e05ef-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-mvmt2\" (UID: \"486c7e33-3dd8-4a98-87e3-8216ee2e05ef\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-mvmt2" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.009611 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-var-lib-cni-bin\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.009629 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2937cbe2-3125-4c3f-96f8-2febeb5942cc-multus-daemon-config\") pod \"multus-rvt5h\" (UID: 
\"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.009660 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spxfj\" (UniqueName: \"kubernetes.io/projected/2937cbe2-3125-4c3f-96f8-2febeb5942cc-kube-api-access-spxfj\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.009678 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.009678 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-dszg5\" (UID: \"78d2cd80-23b9-426d-a7ac-1daa27668a47\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.009695 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs\") pod \"network-metrics-daemon-zh5fh\" (UID: \"e68ab3cb-c372-45d9-a758-beaf4c213714\") " pod="openshift-multus/network-metrics-daemon-zh5fh" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.009714 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: 
\"kubernetes.io/configmap/d1153bb3-30dd-458f-b0a4-c05358a8b3f8-ovnkube-identity-cm\") pod \"network-node-identity-znqwc\" (UID: \"d1153bb3-30dd-458f-b0a4-c05358a8b3f8\") " pod="openshift-network-node-identity/network-node-identity-znqwc" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.009735 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg7x6\" (UniqueName: \"kubernetes.io/projected/916d9fc9-388b-4506-a17c-36a7f626356a-kube-api-access-jg7x6\") pod \"kube-storage-version-migrator-operator-7f65c457f5-v9nfg\" (UID: \"916d9fc9-388b-4506-a17c-36a7f626356a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-v9nfg" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.009755 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4976e608-07a0-4cef-8fdd-7cec3324b4b5-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-6slg8\" (UID: \"4976e608-07a0-4cef-8fdd-7cec3324b4b5\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.009771 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.009788 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-run-openvswitch\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.009804 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/738ebdcd-b78b-495a-b8f2-84af11a7d35c-audit\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.009823 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/486c7e33-3dd8-4a98-87e3-8216ee2e05ef-config\") pod \"openshift-apiserver-operator-799b6db4d7-mvmt2\" (UID: \"486c7e33-3dd8-4a98-87e3-8216ee2e05ef\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-mvmt2" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.009841 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwm5w\" (UniqueName: \"kubernetes.io/projected/ca2fa86b-a966-49dc-8577-d2b54b111d14-kube-api-access-gwm5w\") pod \"cluster-storage-operator-6fbfc8dc8f-c2xl8\" (UID: \"ca2fa86b-a966-49dc-8577-d2b54b111d14\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-c2xl8" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.010070 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-slash\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.010084 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/f9b713fb-64ce-4a01-951c-1f31df62e1ae-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-bxqp2\" (UID: \"f9b713fb-64ce-4a01-951c-1f31df62e1ae\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.010116 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca2fa86b-a966-49dc-8577-d2b54b111d14-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-6fbfc8dc8f-c2xl8\" (UID: \"ca2fa86b-a966-49dc-8577-d2b54b111d14\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-c2xl8" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.010131 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e68ab3cb-c372-45d9-a758-beaf4c213714-metrics-certs\") pod \"network-metrics-daemon-zh5fh\" (UID: \"e68ab3cb-c372-45d9-a758-beaf4c213714\") " pod="openshift-multus/network-metrics-daemon-zh5fh" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.010144 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4738c93d-62e6-44ce-a289-e646b9302e71-cnibin\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.010176 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-mlslx\" (UID: \"b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mlslx" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.010193 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2937cbe2-3125-4c3f-96f8-2febeb5942cc-multus-daemon-config\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.010121 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-trusted-ca\") pod \"ingress-operator-677db989d6-kdn2l\" (UID: \"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.010199 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/58035e42-37d8-48f6-9861-9b4ce6014119-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-v9pv6\" (UID: \"58035e42-37d8-48f6-9861-9b4ce6014119\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-v9pv6" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.010266 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.010280 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/4976e608-07a0-4cef-8fdd-7cec3324b4b5-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-6slg8\" (UID: \"4976e608-07a0-4cef-8fdd-7cec3324b4b5\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.010282 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.010306 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-2tr2t\" (UID: \"7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.010325 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnxgm\" (UniqueName: \"kubernetes.io/projected/d5456c8b-3c98-4824-8700-a04e9c12fb2e-kube-api-access-mnxgm\") pod \"network-check-target-xs8pt\" (UID: \"d5456c8b-3c98-4824-8700-a04e9c12fb2e\") " pod="openshift-network-diagnostics/network-check-target-xs8pt" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.010365 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-systemd-units\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 
01:19:01.010387 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/46662e51-44af-4732-83a1-9509a579b373-host-slash\") pod \"iptables-alerter-qclwv\" (UID: \"46662e51-44af-4732-83a1-9509a579b373\") " pod="openshift-network-operator/iptables-alerter-qclwv" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.010410 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/486c7e33-3dd8-4a98-87e3-8216ee2e05ef-config\") pod \"openshift-apiserver-operator-799b6db4d7-mvmt2\" (UID: \"486c7e33-3dd8-4a98-87e3-8216ee2e05ef\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-mvmt2" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.010411 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert\") pod \"olm-operator-d64cfc9db-8l7kq\" (UID: \"6c88187c-d011-4043-a6d3-4a8a7ec4e204\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.010469 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/486c7e33-3dd8-4a98-87e3-8216ee2e05ef-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-mvmt2\" (UID: \"486c7e33-3dd8-4a98-87e3-8216ee2e05ef\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-mvmt2" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.010516 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zd92\" (UniqueName: \"kubernetes.io/projected/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-kube-api-access-5zd92\") pod \"catalog-operator-7d9c49f57b-h46pz\" (UID: 
\"7f35cc1e-3376-4dbd-b215-2a32bf62cc71\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.010546 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-env-overrides\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.010571 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.010593 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca2fa86b-a966-49dc-8577-d2b54b111d14-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-6fbfc8dc8f-c2xl8\" (UID: \"ca2fa86b-a966-49dc-8577-d2b54b111d14\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-c2xl8" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.010620 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6c88187c-d011-4043-a6d3-4a8a7ec4e204-srv-cert\") pod \"olm-operator-d64cfc9db-8l7kq\" (UID: \"6c88187c-d011-4043-a6d3-4a8a7ec4e204\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.010617 19170 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-bz7v9\" (UniqueName: \"kubernetes.io/projected/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-kube-api-access-bz7v9\") pod \"cluster-monitoring-operator-674cbfbd9d-2tr2t\" (UID: \"7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.010677 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9b713fb-64ce-4a01-951c-1f31df62e1ae-config\") pod \"authentication-operator-7c6989d6c4-bxqp2\" (UID: \"f9b713fb-64ce-4a01-951c-1f31df62e1ae\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.010728 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4976e608-07a0-4cef-8fdd-7cec3324b4b5-images\") pod \"machine-config-operator-fdb5c78b5-6slg8\" (UID: \"4976e608-07a0-4cef-8fdd-7cec3324b4b5\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8" Mar 13 01:19:01.010695 master-0 kubenswrapper[19170]: I0313 01:19:01.010751 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfc49699-9428-4bff-804d-da0e60551759-metrics-tls\") pod \"network-operator-7c649bf6d4-bdc4j\" (UID: \"bfc49699-9428-4bff-804d-da0e60551759\") " pod="openshift-network-operator/network-operator-7c649bf6d4-bdc4j" Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.010792 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-node-log\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.010851 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxf58\" (UniqueName: \"kubernetes.io/projected/8c6bf2d5-1881-4b63-b247-7e7426707fa1-kube-api-access-vxf58\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.010895 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh2bv\" (UniqueName: \"kubernetes.io/projected/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-kube-api-access-wh2bv\") pod \"dns-operator-589895fbb7-qvl2k\" (UID: \"13fac7b0-ce55-467d-9d0c-6a122d87cb3c\") " pod="openshift-dns-operator/dns-operator-589895fbb7-qvl2k" Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.010942 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-var-lib-kubelet\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.010958 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4976e608-07a0-4cef-8fdd-7cec3324b4b5-images\") pod \"machine-config-operator-fdb5c78b5-6slg8\" (UID: \"4976e608-07a0-4cef-8fdd-7cec3324b4b5\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8" Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.010974 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-etc-kubernetes\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.011006 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkgvg\" (UniqueName: \"kubernetes.io/projected/3d2e7338-a6d6-4872-ab72-a4e631075ab3-kube-api-access-vkgvg\") pod \"csi-snapshot-controller-7577d6f48-2slj5\" (UID: \"3d2e7338-a6d6-4872-ab72-a4e631075ab3\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-2slj5" Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.011010 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9b713fb-64ce-4a01-951c-1f31df62e1ae-config\") pod \"authentication-operator-7c6989d6c4-bxqp2\" (UID: \"f9b713fb-64ce-4a01-951c-1f31df62e1ae\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.011035 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfc49699-9428-4bff-804d-da0e60551759-metrics-tls\") pod \"network-operator-7c649bf6d4-bdc4j\" (UID: \"bfc49699-9428-4bff-804d-da0e60551759\") " pod="openshift-network-operator/network-operator-7c649bf6d4-bdc4j" Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.011041 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5n7m\" (UniqueName: \"kubernetes.io/projected/46662e51-44af-4732-83a1-9509a579b373-kube-api-access-m5n7m\") pod \"iptables-alerter-qclwv\" (UID: \"46662e51-44af-4732-83a1-9509a579b373\") " pod="openshift-network-operator/iptables-alerter-qclwv" Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.011076 19170 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.011157 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-multus-cni-dir\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.011200 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/738ebdcd-b78b-495a-b8f2-84af11a7d35c-image-import-ca\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.011226 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22587300-2448-4862-9fd8-68197d17a9f2-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-wbmqn\" (UID: \"22587300-2448-4862-9fd8-68197d17a9f2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-wbmqn" Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.011251 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert\") pod \"catalog-operator-7d9c49f57b-h46pz\" (UID: 
\"7f35cc1e-3376-4dbd-b215-2a32bf62cc71\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz"
Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.011278 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/738ebdcd-b78b-495a-b8f2-84af11a7d35c-encryption-config\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z"
Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.011298 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9"
Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.011304 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-bound-sa-token\") pod \"ingress-operator-677db989d6-kdn2l\" (UID: \"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l"
Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.011337 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz8jz\" (UniqueName: \"kubernetes.io/projected/ae44526f-5858-42a0-ba77-3a22f171456f-kube-api-access-mz8jz\") pod \"csi-snapshot-controller-operator-5685fbc7d-7nstm\" (UID: \"ae44526f-5858-42a0-ba77-3a22f171456f\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-7nstm"
Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.011365 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74tvv\" (UniqueName: \"kubernetes.io/projected/486c7e33-3dd8-4a98-87e3-8216ee2e05ef-kube-api-access-74tvv\") pod \"openshift-apiserver-operator-799b6db4d7-mvmt2\" (UID: \"486c7e33-3dd8-4a98-87e3-8216ee2e05ef\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-mvmt2"
Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.011390 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxctn\" (UniqueName: \"kubernetes.io/projected/78d2cd80-23b9-426d-a7ac-1daa27668a47-kube-api-access-mxctn\") pod \"marketplace-operator-64bf9778cb-dszg5\" (UID: \"78d2cd80-23b9-426d-a7ac-1daa27668a47\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5"
Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.011415 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95d4e785-6663-417d-b380-6905773613c8-config\") pod \"service-ca-operator-69b6fc6b88-2v42g\" (UID: \"95d4e785-6663-417d-b380-6905773613c8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-2v42g"
Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.011423 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22587300-2448-4862-9fd8-68197d17a9f2-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-wbmqn\" (UID: \"22587300-2448-4862-9fd8-68197d17a9f2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-wbmqn"
Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.011437 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35-config\") pod \"kube-apiserver-operator-68bd585b-mlslx\" (UID: \"b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mlslx"
Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.011463 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-srv-cert\") pod \"catalog-operator-7d9c49f57b-h46pz\" (UID: \"7f35cc1e-3376-4dbd-b215-2a32bf62cc71\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz"
Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.011498 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d1153bb3-30dd-458f-b0a4-c05358a8b3f8-env-overrides\") pod \"network-node-identity-znqwc\" (UID: \"d1153bb3-30dd-458f-b0a4-c05358a8b3f8\") " pod="openshift-network-node-identity/network-node-identity-znqwc"
Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.011531 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct"
Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.011560 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4rfg\" (UniqueName: \"kubernetes.io/projected/da44d750-31e5-46f4-b3ef-dd4384c22aaf-kube-api-access-n4rfg\") pod \"service-ca-84bfdbbb7f-qr9tk\" (UID: \"da44d750-31e5-46f4-b3ef-dd4384c22aaf\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-qr9tk"
Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.011600 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95d4e785-6663-417d-b380-6905773613c8-config\") pod \"service-ca-operator-69b6fc6b88-2v42g\" (UID: \"95d4e785-6663-417d-b380-6905773613c8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-2v42g"
Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.011607 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35-config\") pod \"kube-apiserver-operator-68bd585b-mlslx\" (UID: \"b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mlslx"
Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.011650 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.011678 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/039acb44-a9b3-4ad6-a091-be4d18edc34f-config\") pod \"openshift-controller-manager-operator-8565d84698-sslxh\" (UID: \"039acb44-a9b3-4ad6-a091-be4d18edc34f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-sslxh"
Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.011704 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkdbm\" (UniqueName: \"kubernetes.io/projected/039acb44-a9b3-4ad6-a091-be4d18edc34f-kube-api-access-kkdbm\") pod \"openshift-controller-manager-operator-8565d84698-sslxh\" (UID: \"039acb44-a9b3-4ad6-a091-be4d18edc34f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-sslxh"
Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.011729 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-multus-conf-dir\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h"
Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.011753 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-var-lib-openvswitch\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct"
Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.011779 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-6slg8\" (UID: \"4976e608-07a0-4cef-8fdd-7cec3324b4b5\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8"
Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.011800 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-run-multus-certs\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h"
Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.011822 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-cni-bin\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct"
Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.011845 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95d4e785-6663-417d-b380-6905773613c8-serving-cert\") pod \"service-ca-operator-69b6fc6b88-2v42g\" (UID: \"95d4e785-6663-417d-b380-6905773613c8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-2v42g"
Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.011867 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21cbea73-f779-43e4-b5ba-d6fa06275d34-serving-cert\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: \"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj"
Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.011888 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c6bf2d5-1881-4b63-b247-7e7426707fa1-config\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr"
Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.011909 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1308fba1-a50d-48b3-b272-7bef44727b7f-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-cjmvd\" (UID: \"1308fba1-a50d-48b3-b272-7bef44727b7f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd"
Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.011932 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.011952 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2937cbe2-3125-4c3f-96f8-2febeb5942cc-cni-binary-copy\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h"
Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.011976 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/039acb44-a9b3-4ad6-a091-be4d18edc34f-config\") pod \"openshift-controller-manager-operator-8565d84698-sslxh\" (UID: \"039acb44-a9b3-4ad6-a091-be4d18edc34f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-sslxh"
Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.012021 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4976e608-07a0-4cef-8fdd-7cec3324b4b5-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-6slg8\" (UID: \"4976e608-07a0-4cef-8fdd-7cec3324b4b5\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8"
Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.012061 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/61fb4b86-f978-4ae1-80bc-18d2f386cbc2-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-ck7rt\" (UID: \"61fb4b86-f978-4ae1-80bc-18d2f386cbc2\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-ck7rt"
Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.012090 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/738ebdcd-b78b-495a-b8f2-84af11a7d35c-serving-cert\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z"
Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.012094 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95d4e785-6663-417d-b380-6905773613c8-serving-cert\") pod \"service-ca-operator-69b6fc6b88-2v42g\" (UID: \"95d4e785-6663-417d-b380-6905773613c8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-2v42g"
Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.012117 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/4738c93d-62e6-44ce-a289-e646b9302e71-whereabouts-configmap\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5"
Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.012134 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21cbea73-f779-43e4-b5ba-d6fa06275d34-serving-cert\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: \"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj"
Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.012148 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdzjn\" (UniqueName: \"kubernetes.io/projected/bfc49699-9428-4bff-804d-da0e60551759-kube-api-access-zdzjn\") pod \"network-operator-7c649bf6d4-bdc4j\" (UID: \"bfc49699-9428-4bff-804d-da0e60551759\") " pod="openshift-network-operator/network-operator-7c649bf6d4-bdc4j"
Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.012173 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-ovnkube-config\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct"
Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.012197 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-ovnkube-script-lib\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct"
Mar 13 01:19:01.012272 master-0 kubenswrapper[19170]: I0313 01:19:01.012326 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/61fb4b86-f978-4ae1-80bc-18d2f386cbc2-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-ck7rt\" (UID: \"61fb4b86-f978-4ae1-80bc-18d2f386cbc2\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-ck7rt"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.012327 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/4738c93d-62e6-44ce-a289-e646b9302e71-whereabouts-configmap\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.012400 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-metrics-tls\") pod \"ingress-operator-677db989d6-kdn2l\" (UID: \"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.012430 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjhjj\" (UniqueName: \"kubernetes.io/projected/e68ab3cb-c372-45d9-a758-beaf4c213714-kube-api-access-zjhjj\") pod \"network-metrics-daemon-zh5fh\" (UID: \"e68ab3cb-c372-45d9-a758-beaf4c213714\") " pod="openshift-multus/network-metrics-daemon-zh5fh"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.012453 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.012473 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/039acb44-a9b3-4ad6-a091-be4d18edc34f-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-sslxh\" (UID: \"039acb44-a9b3-4ad6-a091-be4d18edc34f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-sslxh"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.012491 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2937cbe2-3125-4c3f-96f8-2febeb5942cc-cni-binary-copy\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.012555 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-var-lib-cni-multus\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.012586 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-run-ovn-kubernetes\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.012606 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-metrics-tls\") pod \"ingress-operator-677db989d6-kdn2l\" (UID: \"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.012613 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-nrzpj\" (UID: \"0d4e6150-432c-4a11-b5a6-4d62dd701fc8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.012660 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4738c93d-62e6-44ce-a289-e646b9302e71-os-release\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.012665 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c6bf2d5-1881-4b63-b247-7e7426707fa1-config\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.012687 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mnf5\" (UniqueName: \"kubernetes.io/projected/6c88187c-d011-4043-a6d3-4a8a7ec4e204-kube-api-access-7mnf5\") pod \"olm-operator-d64cfc9db-8l7kq\" (UID: \"6c88187c-d011-4043-a6d3-4a8a7ec4e204\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.012689 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/039acb44-a9b3-4ad6-a091-be4d18edc34f-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-sslxh\" (UID: \"039acb44-a9b3-4ad6-a091-be4d18edc34f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-sslxh"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.012714 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9b713fb-64ce-4a01-951c-1f31df62e1ae-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-bxqp2\" (UID: \"f9b713fb-64ce-4a01-951c-1f31df62e1ae\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.012809 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-nrzpj\" (UID: \"0d4e6150-432c-4a11-b5a6-4d62dd701fc8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.012829 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/61fb4b86-f978-4ae1-80bc-18d2f386cbc2-operand-assets\") pod \"cluster-olm-operator-77899cf6d-ck7rt\" (UID: \"61fb4b86-f978-4ae1-80bc-18d2f386cbc2\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-ck7rt"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.012854 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/738ebdcd-b78b-495a-b8f2-84af11a7d35c-config\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.012879 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/46662e51-44af-4732-83a1-9509a579b373-iptables-alerter-script\") pod \"iptables-alerter-qclwv\" (UID: \"46662e51-44af-4732-83a1-9509a579b373\") " pod="openshift-network-operator/iptables-alerter-qclwv"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.012917 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/61fb4b86-f978-4ae1-80bc-18d2f386cbc2-operand-assets\") pod \"cluster-olm-operator-77899cf6d-ck7rt\" (UID: \"61fb4b86-f978-4ae1-80bc-18d2f386cbc2\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-ck7rt"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.012968 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d56480e0-0885-41e5-a1fc-931a068fbadb-serving-cert\") pod \"openshift-config-operator-64488f9d78-bqmmf\" (UID: \"d56480e0-0885-41e5-a1fc-931a068fbadb\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.012966 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9b713fb-64ce-4a01-951c-1f31df62e1ae-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-bxqp2\" (UID: \"f9b713fb-64ce-4a01-951c-1f31df62e1ae\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.013004 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.013132 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.013161 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22587300-2448-4862-9fd8-68197d17a9f2-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-wbmqn\" (UID: \"22587300-2448-4862-9fd8-68197d17a9f2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-wbmqn"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.013189 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22587300-2448-4862-9fd8-68197d17a9f2-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-wbmqn\" (UID: \"22587300-2448-4862-9fd8-68197d17a9f2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-wbmqn"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.013216 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f97819d0-2840-4352-a435-19ef1a8c22c9-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-jjdk8\" (UID: \"f97819d0-2840-4352-a435-19ef1a8c22c9\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.013239 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-run-netns\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.013260 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4738c93d-62e6-44ce-a289-e646b9302e71-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.013283 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/da44d750-31e5-46f4-b3ef-dd4384c22aaf-signing-key\") pod \"service-ca-84bfdbbb7f-qr9tk\" (UID: \"da44d750-31e5-46f4-b3ef-dd4384c22aaf\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-qr9tk"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.013307 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-multus-socket-dir-parent\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.013330 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.013353 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4738c93d-62e6-44ce-a289-e646b9302e71-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.013378 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-dszg5\" (UID: \"78d2cd80-23b9-426d-a7ac-1daa27668a47\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.013408 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d1153bb3-30dd-458f-b0a4-c05358a8b3f8-webhook-cert\") pod \"network-node-identity-znqwc\" (UID: \"d1153bb3-30dd-458f-b0a4-c05358a8b3f8\") " pod="openshift-network-node-identity/network-node-identity-znqwc"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.013429 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-run-k8s-cni-cncf-io\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.013451 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-etc-openvswitch\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.013494 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22587300-2448-4862-9fd8-68197d17a9f2-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-wbmqn\" (UID: \"22587300-2448-4862-9fd8-68197d17a9f2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-wbmqn"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.013518 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-ovn-node-metrics-cert\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.013539 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f97819d0-2840-4352-a435-19ef1a8c22c9-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-jjdk8\" (UID: \"f97819d0-2840-4352-a435-19ef1a8c22c9\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.013546 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29w76\" (UniqueName: \"kubernetes.io/projected/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-kube-api-access-29w76\") pod \"ingress-operator-677db989d6-kdn2l\" (UID: \"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.013569 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4738c93d-62e6-44ce-a289-e646b9302e71-system-cni-dir\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.013599 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/21cbea73-f779-43e4-b5ba-d6fa06275d34-etcd-ca\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: \"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.013611 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/78d2cd80-23b9-426d-a7ac-1daa27668a47-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-dszg5\" (UID: \"78d2cd80-23b9-426d-a7ac-1daa27668a47\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.013622 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv2rb\" (UniqueName: \"kubernetes.io/projected/1308fba1-a50d-48b3-b272-7bef44727b7f-kube-api-access-zv2rb\") pod \"ovnkube-control-plane-66b55d57d-cjmvd\" (UID: \"1308fba1-a50d-48b3-b272-7bef44727b7f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.013677 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-2tr2t\" (UID: \"7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.013703 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.013625 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4738c93d-62e6-44ce-a289-e646b9302e71-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.013726 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-cnibin\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.013753 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrf2s\" (UniqueName: \"kubernetes.io/projected/61fb4b86-f978-4ae1-80bc-18d2f386cbc2-kube-api-access-lrf2s\") pod \"cluster-olm-operator-77899cf6d-ck7rt\" (UID: \"61fb4b86-f978-4ae1-80bc-18d2f386cbc2\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-ck7rt"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.013777 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg77t\" (UniqueName: \"kubernetes.io/projected/64477504-5cb6-42dc-a7eb-662981daec4a-kube-api-access-gg77t\") pod \"migrator-57ccdf9b5-kxxzc\" (UID: \"64477504-5cb6-42dc-a7eb-662981daec4a\") " pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-kxxzc"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.013799 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.013797 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/21cbea73-f779-43e4-b5ba-d6fa06275d34-etcd-ca\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: \"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.013846 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-metrics-tls\") pod \"dns-operator-589895fbb7-qvl2k\" (UID: \"13fac7b0-ce55-467d-9d0c-6a122d87cb3c\") " pod="openshift-dns-operator/dns-operator-589895fbb7-qvl2k"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.013869 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tfdv\" (UniqueName: \"kubernetes.io/projected/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-kube-api-access-6tfdv\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct"
Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.013881 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d56480e0-0885-41e5-a1fc-931a068fbadb-serving-cert\") pod \"openshift-config-operator-64488f9d78-bqmmf\" (UID: \"d56480e0-0885-41e5-a1fc-931a068fbadb\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf"
Mar 13
01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.013897 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-2tr2t\" (UID: \"7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t" Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.013890 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl6k6\" (UniqueName: \"kubernetes.io/projected/738ebdcd-b78b-495a-b8f2-84af11a7d35c-kube-api-access-tl6k6\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.013952 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gt69\" (UniqueName: \"kubernetes.io/projected/4738c93d-62e6-44ce-a289-e646b9302e71-kube-api-access-9gt69\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5" Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.013985 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fppkf\" (UniqueName: \"kubernetes.io/projected/d56480e0-0885-41e5-a1fc-931a068fbadb-kube-api-access-fppkf\") pod \"openshift-config-operator-64488f9d78-bqmmf\" (UID: \"d56480e0-0885-41e5-a1fc-931a068fbadb\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.014014 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/1308fba1-a50d-48b3-b272-7bef44727b7f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-cjmvd\" (UID: \"1308fba1-a50d-48b3-b272-7bef44727b7f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd" Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.013987 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9" Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.014088 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-metrics-tls\") pod \"dns-operator-589895fbb7-qvl2k\" (UID: \"13fac7b0-ce55-467d-9d0c-6a122d87cb3c\") " pod="openshift-dns-operator/dns-operator-589895fbb7-qvl2k" Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.014127 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9b713fb-64ce-4a01-951c-1f31df62e1ae-serving-cert\") pod \"authentication-operator-7c6989d6c4-bxqp2\" (UID: \"f9b713fb-64ce-4a01-951c-1f31df62e1ae\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.014156 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2m48\" (UniqueName: \"kubernetes.io/projected/4976e608-07a0-4cef-8fdd-7cec3324b4b5-kube-api-access-w2m48\") pod \"machine-config-operator-fdb5c78b5-6slg8\" (UID: \"4976e608-07a0-4cef-8fdd-7cec3324b4b5\") " 
pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8" Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.014179 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-log-socket\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.014212 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/738ebdcd-b78b-495a-b8f2-84af11a7d35c-node-pullsecrets\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.014228 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b8addabb4549d659b9d15764990d3747-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"b8addabb4549d659b9d15764990d3747\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.014251 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srlst\" (UniqueName: \"kubernetes.io/projected/d1153bb3-30dd-458f-b0a4-c05358a8b3f8-kube-api-access-srlst\") pod \"network-node-identity-znqwc\" (UID: \"d1153bb3-30dd-458f-b0a4-c05358a8b3f8\") " pod="openshift-network-node-identity/network-node-identity-znqwc" Mar 13 01:19:01.014225 master-0 kubenswrapper[19170]: I0313 01:19:01.014269 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/58035e42-37d8-48f6-9861-9b4ce6014119-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-v9pv6\" (UID: \"58035e42-37d8-48f6-9861-9b4ce6014119\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-v9pv6" Mar 13 01:19:01.017342 master-0 kubenswrapper[19170]: I0313 01:19:01.014354 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58035e42-37d8-48f6-9861-9b4ce6014119-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-v9pv6\" (UID: \"58035e42-37d8-48f6-9861-9b4ce6014119\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-v9pv6" Mar 13 01:19:01.017342 master-0 kubenswrapper[19170]: I0313 01:19:01.014399 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9b713fb-64ce-4a01-951c-1f31df62e1ae-serving-cert\") pod \"authentication-operator-7c6989d6c4-bxqp2\" (UID: \"f9b713fb-64ce-4a01-951c-1f31df62e1ae\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" Mar 13 01:19:01.017342 master-0 kubenswrapper[19170]: I0313 01:19:01.014404 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 01:19:01.017342 master-0 kubenswrapper[19170]: I0313 01:19:01.014504 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d56480e0-0885-41e5-a1fc-931a068fbadb-available-featuregates\") pod \"openshift-config-operator-64488f9d78-bqmmf\" (UID: \"d56480e0-0885-41e5-a1fc-931a068fbadb\") " 
pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" Mar 13 01:19:01.017342 master-0 kubenswrapper[19170]: I0313 01:19:01.014516 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58035e42-37d8-48f6-9861-9b4ce6014119-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-v9pv6\" (UID: \"58035e42-37d8-48f6-9861-9b4ce6014119\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-v9pv6" Mar 13 01:19:01.017342 master-0 kubenswrapper[19170]: I0313 01:19:01.014544 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 13 01:19:01.017342 master-0 kubenswrapper[19170]: I0313 01:19:01.014607 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d56480e0-0885-41e5-a1fc-931a068fbadb-available-featuregates\") pod \"openshift-config-operator-64488f9d78-bqmmf\" (UID: \"d56480e0-0885-41e5-a1fc-931a068fbadb\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" Mar 13 01:19:01.017342 master-0 kubenswrapper[19170]: I0313 01:19:01.014615 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" Mar 13 01:19:01.017342 master-0 kubenswrapper[19170]: I0313 01:19:01.014610 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/58035e42-37d8-48f6-9861-9b4ce6014119-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-v9pv6\" (UID: \"58035e42-37d8-48f6-9861-9b4ce6014119\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-v9pv6" Mar 13 01:19:01.017342 master-0 kubenswrapper[19170]: I0313 01:19:01.014723 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1308fba1-a50d-48b3-b272-7bef44727b7f-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-cjmvd\" (UID: \"1308fba1-a50d-48b3-b272-7bef44727b7f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd" Mar 13 01:19:01.017342 master-0 kubenswrapper[19170]: I0313 01:19:01.014799 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4chtg\" (UniqueName: \"kubernetes.io/projected/f9b713fb-64ce-4a01-951c-1f31df62e1ae-kube-api-access-4chtg\") pod \"authentication-operator-7c6989d6c4-bxqp2\" (UID: \"f9b713fb-64ce-4a01-951c-1f31df62e1ae\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" Mar 13 01:19:01.017342 master-0 kubenswrapper[19170]: I0313 01:19:01.014818 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c6bf2d5-1881-4b63-b247-7e7426707fa1-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" Mar 13 01:19:01.017342 master-0 kubenswrapper[19170]: I0313 01:19:01.014825 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f97819d0-2840-4352-a435-19ef1a8c22c9-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-jjdk8\" (UID: \"f97819d0-2840-4352-a435-19ef1a8c22c9\") " 
pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8" Mar 13 01:19:01.017342 master-0 kubenswrapper[19170]: I0313 01:19:01.014870 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn6w7\" (UniqueName: \"kubernetes.io/projected/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-kube-api-access-gn6w7\") pod \"package-server-manager-854648ff6d-nrzpj\" (UID: \"0d4e6150-432c-4a11-b5a6-4d62dd701fc8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj" Mar 13 01:19:01.017342 master-0 kubenswrapper[19170]: I0313 01:19:01.014912 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/738ebdcd-b78b-495a-b8f2-84af11a7d35c-trusted-ca-bundle\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:19:01.017342 master-0 kubenswrapper[19170]: I0313 01:19:01.014935 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b8addabb4549d659b9d15764990d3747-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"b8addabb4549d659b9d15764990d3747\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:19:01.017342 master-0 kubenswrapper[19170]: I0313 01:19:01.015070 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f97819d0-2840-4352-a435-19ef1a8c22c9-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-jjdk8\" (UID: \"f97819d0-2840-4352-a435-19ef1a8c22c9\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8" Mar 13 01:19:01.023120 master-0 kubenswrapper[19170]: I0313 01:19:01.023087 19170 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 13 01:19:01.033019 master-0 kubenswrapper[19170]: I0313 01:19:01.032992 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-ovnkube-config\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.033141 master-0 kubenswrapper[19170]: I0313 01:19:01.033120 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1308fba1-a50d-48b3-b272-7bef44727b7f-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-cjmvd\" (UID: \"1308fba1-a50d-48b3-b272-7bef44727b7f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd" Mar 13 01:19:01.037413 master-0 kubenswrapper[19170]: E0313 01:19:01.036904 19170 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-startup-monitor-master-0\" already exists" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:19:01.037413 master-0 kubenswrapper[19170]: E0313 01:19:01.037301 19170 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-master-0\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:19:01.038186 master-0 kubenswrapper[19170]: E0313 01:19:01.037902 19170 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-master-0\" already exists" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:19:01.038186 master-0 kubenswrapper[19170]: E0313 01:19:01.037992 19170 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0\" already exists" pod="openshift-etcd/etcd-master-0" Mar 13 01:19:01.042694 master-0 kubenswrapper[19170]: I0313 01:19:01.042494 19170 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 13 01:19:01.045019 master-0 kubenswrapper[19170]: I0313 01:19:01.044980 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1308fba1-a50d-48b3-b272-7bef44727b7f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-cjmvd\" (UID: \"1308fba1-a50d-48b3-b272-7bef44727b7f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd" Mar 13 01:19:01.053921 master-0 kubenswrapper[19170]: I0313 01:19:01.053531 19170 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Mar 13 01:19:01.063432 master-0 kubenswrapper[19170]: I0313 01:19:01.063381 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 13 01:19:01.066006 master-0 kubenswrapper[19170]: I0313 01:19:01.065968 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1308fba1-a50d-48b3-b272-7bef44727b7f-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-cjmvd\" (UID: \"1308fba1-a50d-48b3-b272-7bef44727b7f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd" Mar 13 01:19:01.071218 master-0 kubenswrapper[19170]: I0313 01:19:01.071172 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-env-overrides\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.082491 master-0 kubenswrapper[19170]: I0313 01:19:01.082459 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 13 01:19:01.103012 master-0 kubenswrapper[19170]: I0313 
01:19:01.102981 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 13 01:19:01.115808 master-0 kubenswrapper[19170]: I0313 01:19:01.115715 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n\" (UID: \"b3a9c0f6-cfde-4ae8-952a-00e2fb862482\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" Mar 13 01:19:01.116050 master-0 kubenswrapper[19170]: I0313 01:19:01.115828 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/4d313ee4-3bb9-44a9-ad80-8e00540ef1e7-tls-certificates\") pod \"prometheus-operator-admission-webhook-8464df8497-kjwgg\" (UID: \"4d313ee4-3bb9-44a9-ad80-8e00540ef1e7\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-kjwgg" Mar 13 01:19:01.116050 master-0 kubenswrapper[19170]: I0313 01:19:01.115866 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g44dw\" (UniqueName: \"kubernetes.io/projected/2c4c579b-0643-47ac-a729-017c326b0ecc-kube-api-access-g44dw\") pod \"catalogd-controller-manager-7f8b8b6f4c-7fc8j\" (UID: \"2c4c579b-0643-47ac-a729-017c326b0ecc\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:19:01.116134 master-0 kubenswrapper[19170]: I0313 01:19:01.116064 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-cnibin\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:19:01.117056 master-0 kubenswrapper[19170]: I0313 
01:19:01.117006 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5gmv\" (UniqueName: \"kubernetes.io/projected/edda0d03-fdb2-4130-8f73-8057efd5815c-kube-api-access-h5gmv\") pod \"machine-config-server-4gpcz\" (UID: \"edda0d03-fdb2-4130-8f73-8057efd5815c\") " pod="openshift-machine-config-operator/machine-config-server-4gpcz" Mar 13 01:19:01.117113 master-0 kubenswrapper[19170]: I0313 01:19:01.117058 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-log-socket\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.117113 master-0 kubenswrapper[19170]: I0313 01:19:01.117080 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b8addabb4549d659b9d15764990d3747-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"b8addabb4549d659b9d15764990d3747\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:19:01.117208 master-0 kubenswrapper[19170]: I0313 01:19:01.117115 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33dfdc31-54a4-4249-99ae-a15180514659-config\") pod \"machine-approver-754bdc9f9d-knlw8\" (UID: \"33dfdc31-54a4-4249-99ae-a15180514659\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-knlw8" Mar 13 01:19:01.117208 master-0 kubenswrapper[19170]: I0313 01:19:01.117148 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2c4c579b-0643-47ac-a729-017c326b0ecc-cache\") pod \"catalogd-controller-manager-7f8b8b6f4c-7fc8j\" (UID: \"2c4c579b-0643-47ac-a729-017c326b0ecc\") " 
pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:19:01.117208 master-0 kubenswrapper[19170]: I0313 01:19:01.117169 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a561a1d1-b20f-45fd-9e0c-ee4399a1d31b-catalog-content\") pod \"community-operators-bbptx\" (UID: \"a561a1d1-b20f-45fd-9e0c-ee4399a1d31b\") " pod="openshift-marketplace/community-operators-bbptx" Mar 13 01:19:01.117208 master-0 kubenswrapper[19170]: I0313 01:19:01.117198 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ac2a4c90-32db-4464-8c47-acbcafbcd5d0-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-hdx2d\" (UID: \"ac2a4c90-32db-4464-8c47-acbcafbcd5d0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-hdx2d" Mar 13 01:19:01.117367 master-0 kubenswrapper[19170]: I0313 01:19:01.117219 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/85149f21-7ba8-4891-82ef-0fef3d5d7863-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-4d6fw\" (UID: \"85149f21-7ba8-4891-82ef-0fef3d5d7863\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-4d6fw" Mar 13 01:19:01.117367 master-0 kubenswrapper[19170]: I0313 01:19:01.117247 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/631f5719-2083-4c99-92cb-2ddc04022d86-client-ca\") pod \"controller-manager-757fb68448-cj9p5\" (UID: \"631f5719-2083-4c99-92cb-2ddc04022d86\") " pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" Mar 13 01:19:01.117367 master-0 kubenswrapper[19170]: I0313 01:19:01.117268 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a9462e2e-728d-4076-a876-31dbbd637581-config\") pod \"route-controller-manager-5dc55b5d9c-nlg6m\" (UID: \"a9462e2e-728d-4076-a876-31dbbd637581\") " pod="openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m" Mar 13 01:19:01.117367 master-0 kubenswrapper[19170]: I0313 01:19:01.117289 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77fd9062-0f7d-4255-92ca-7e4325daeddd-serving-cert\") pod \"insights-operator-8f89dfddd-6k2t7\" (UID: \"77fd9062-0f7d-4255-92ca-7e4325daeddd\") " pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" Mar 13 01:19:01.117367 master-0 kubenswrapper[19170]: I0313 01:19:01.117309 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/58405741-598c-4bf5-bbc8-1ca8e3f10995-metrics-tls\") pod \"dns-default-26mfw\" (UID: \"58405741-598c-4bf5-bbc8-1ca8e3f10995\") " pod="openshift-dns/dns-default-26mfw" Mar 13 01:19:01.117367 master-0 kubenswrapper[19170]: I0313 01:19:01.117343 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt79p\" (UniqueName: \"kubernetes.io/projected/e5956ebf-01e4-4d4c-ae6d-b0995905c6d3-kube-api-access-jt79p\") pod \"machine-config-daemon-pmkpj\" (UID: \"e5956ebf-01e4-4d4c-ae6d-b0995905c6d3\") " pod="openshift-machine-config-operator/machine-config-daemon-pmkpj" Mar 13 01:19:01.117367 master-0 kubenswrapper[19170]: I0313 01:19:01.117371 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b8addabb4549d659b9d15764990d3747-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"b8addabb4549d659b9d15764990d3747\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:19:01.117726 master-0 kubenswrapper[19170]: I0313 01:19:01.117391 19170 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/33dfdc31-54a4-4249-99ae-a15180514659-machine-approver-tls\") pod \"machine-approver-754bdc9f9d-knlw8\" (UID: \"33dfdc31-54a4-4249-99ae-a15180514659\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-knlw8" Mar 13 01:19:01.117726 master-0 kubenswrapper[19170]: I0313 01:19:01.117413 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkjrm\" (UniqueName: \"kubernetes.io/projected/30f7537e-93ed-466b-ba24-78141d004b2f-kube-api-access-jkjrm\") pod \"kube-state-metrics-68b88f8cb5-plwwd\" (UID: \"30f7537e-93ed-466b-ba24-78141d004b2f\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" Mar 13 01:19:01.117726 master-0 kubenswrapper[19170]: I0313 01:19:01.117437 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-2wh5w\" (UID: \"30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" Mar 13 01:19:01.117726 master-0 kubenswrapper[19170]: I0313 01:19:01.117464 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-node-exporter-textfile\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:19:01.117726 master-0 kubenswrapper[19170]: I0313 01:19:01.117486 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/30f7537e-93ed-466b-ba24-78141d004b2f-kube-state-metrics-tls\") 
pod \"kube-state-metrics-68b88f8cb5-plwwd\" (UID: \"30f7537e-93ed-466b-ba24-78141d004b2f\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" Mar 13 01:19:01.117726 master-0 kubenswrapper[19170]: I0313 01:19:01.117509 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-lib-modules\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:19:01.117726 master-0 kubenswrapper[19170]: I0313 01:19:01.117531 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx8zl\" (UniqueName: \"kubernetes.io/projected/cd7cca05-3da7-42cf-af64-6e94050e58c0-kube-api-access-gx8zl\") pod \"openshift-state-metrics-74cc79fd76-6btfg\" (UID: \"cd7cca05-3da7-42cf-af64-6e94050e58c0\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-6btfg" Mar 13 01:19:01.117726 master-0 kubenswrapper[19170]: I0313 01:19:01.117569 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e2b5ad07-fa01-4330-9dce-6da3444657ab-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5ff8674d55-6fh8b\" (UID: \"e2b5ad07-fa01-4330-9dce-6da3444657ab\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-6fh8b" Mar 13 01:19:01.117726 master-0 kubenswrapper[19170]: I0313 01:19:01.117594 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-kubernetes\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:19:01.117726 master-0 kubenswrapper[19170]: I0313 01:19:01.117613 
19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-var-lib-kubelet\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:19:01.117726 master-0 kubenswrapper[19170]: I0313 01:19:01.117656 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/57eb2020-1560-4352-8b86-76db59de933a-audit-policies\") pod \"apiserver-78885b775b-jrrjv\" (UID: \"57eb2020-1560-4352-8b86-76db59de933a\") " pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:19:01.117726 master-0 kubenswrapper[19170]: I0313 01:19:01.117680 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/690f916b-6f87-42d9-8168-392a9177bee9-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"690f916b-6f87-42d9-8168-392a9177bee9\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 13 01:19:01.117726 master-0 kubenswrapper[19170]: I0313 01:19:01.117703 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-run-netns\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:19:01.117726 master-0 kubenswrapper[19170]: I0313 01:19:01.117724 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/85149f21-7ba8-4891-82ef-0fef3d5d7863-kube-api-access\") pod \"cluster-version-operator-8c9c967c7-4d6fw\" (UID: \"85149f21-7ba8-4891-82ef-0fef3d5d7863\") " 
pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-4d6fw" Mar 13 01:19:01.118219 master-0 kubenswrapper[19170]: I0313 01:19:01.117753 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-cni-netd\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.118219 master-0 kubenswrapper[19170]: I0313 01:19:01.117774 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:19:01.118219 master-0 kubenswrapper[19170]: I0313 01:19:01.117796 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/90e6e63d-3cf2-4bb5-883f-6219a0b52c3a-cco-trusted-ca\") pod \"cloud-credential-operator-55d85b7b47-qslvf\" (UID: \"90e6e63d-3cf2-4bb5-883f-6219a0b52c3a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qslvf" Mar 13 01:19:01.118219 master-0 kubenswrapper[19170]: I0313 01:19:01.117902 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b8addabb4549d659b9d15764990d3747-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"b8addabb4549d659b9d15764990d3747\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:19:01.118219 master-0 kubenswrapper[19170]: I0313 01:19:01.117962 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-cni-netd\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.118219 master-0 kubenswrapper[19170]: I0313 01:19:01.118008 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:19:01.118219 master-0 kubenswrapper[19170]: I0313 01:19:01.118029 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-run-netns\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:19:01.118219 master-0 kubenswrapper[19170]: I0313 01:19:01.118115 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b8addabb4549d659b9d15764990d3747-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"b8addabb4549d659b9d15764990d3747\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:19:01.118219 master-0 kubenswrapper[19170]: I0313 01:19:01.118133 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-sysctl-d\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:19:01.118538 master-0 kubenswrapper[19170]: I0313 01:19:01.118259 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-log-socket\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.118538 master-0 kubenswrapper[19170]: I0313 01:19:01.118283 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2c4c579b-0643-47ac-a729-017c326b0ecc-cache\") pod \"catalogd-controller-manager-7f8b8b6f4c-7fc8j\" (UID: \"2c4c579b-0643-47ac-a729-017c326b0ecc\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:19:01.118538 master-0 kubenswrapper[19170]: I0313 01:19:01.118493 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-node-exporter-textfile\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:19:01.118731 master-0 kubenswrapper[19170]: I0313 01:19:01.118690 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-cnibin\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:19:01.118800 master-0 kubenswrapper[19170]: I0313 01:19:01.118775 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0671fdd0-b358-40f9-ae49-2c5a9004edb3-default-certificate\") pod \"router-default-79f8cd6fdd-cnrhm\" (UID: \"0671fdd0-b358-40f9-ae49-2c5a9004edb3\") " pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" Mar 13 01:19:01.118852 master-0 kubenswrapper[19170]: I0313 01:19:01.118840 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" 
(UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-run-systemd\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.118890 master-0 kubenswrapper[19170]: I0313 01:19:01.118872 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/58405741-598c-4bf5-bbc8-1ca8e3f10995-config-volume\") pod \"dns-default-26mfw\" (UID: \"58405741-598c-4bf5-bbc8-1ca8e3f10995\") " pod="openshift-dns/dns-default-26mfw" Mar 13 01:19:01.118969 master-0 kubenswrapper[19170]: I0313 01:19:01.118935 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:19:01.119015 master-0 kubenswrapper[19170]: I0313 01:19:01.118977 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d81bcb58-efe3-4577-8e88-67f92c645f6f-hosts-file\") pod \"node-resolver-lw6xm\" (UID: \"d81bcb58-efe3-4577-8e88-67f92c645f6f\") " pod="openshift-dns/node-resolver-lw6xm" Mar 13 01:19:01.119052 master-0 kubenswrapper[19170]: I0313 01:19:01.119017 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gj56\" (UniqueName: \"kubernetes.io/projected/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-kube-api-access-8gj56\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:19:01.119091 master-0 kubenswrapper[19170]: I0313 01:19:01.119056 19170 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-rtrb2\" (UniqueName: \"kubernetes.io/projected/e2b5ad07-fa01-4330-9dce-6da3444657ab-kube-api-access-rtrb2\") pod \"prometheus-operator-5ff8674d55-6fh8b\" (UID: \"e2b5ad07-fa01-4330-9dce-6da3444657ab\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-6fh8b" Mar 13 01:19:01.119153 master-0 kubenswrapper[19170]: I0313 01:19:01.119128 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-sysconfig\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:19:01.119195 master-0 kubenswrapper[19170]: I0313 01:19:01.119172 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e5956ebf-01e4-4d4c-ae6d-b0995905c6d3-proxy-tls\") pod \"machine-config-daemon-pmkpj\" (UID: \"e5956ebf-01e4-4d4c-ae6d-b0995905c6d3\") " pod="openshift-machine-config-operator/machine-config-daemon-pmkpj" Mar 13 01:19:01.119241 master-0 kubenswrapper[19170]: I0313 01:19:01.119205 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a561a1d1-b20f-45fd-9e0c-ee4399a1d31b-utilities\") pod \"community-operators-bbptx\" (UID: \"a561a1d1-b20f-45fd-9e0c-ee4399a1d31b\") " pod="openshift-marketplace/community-operators-bbptx" Mar 13 01:19:01.119285 master-0 kubenswrapper[19170]: I0313 01:19:01.119251 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:19:01.119326 master-0 
kubenswrapper[19170]: I0313 01:19:01.119286 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtqjr\" (UniqueName: \"kubernetes.io/projected/77fd9062-0f7d-4255-92ca-7e4325daeddd-kube-api-access-vtqjr\") pod \"insights-operator-8f89dfddd-6k2t7\" (UID: \"77fd9062-0f7d-4255-92ca-7e4325daeddd\") " pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" Mar 13 01:19:01.119364 master-0 kubenswrapper[19170]: I0313 01:19:01.119320 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szx9m\" (UniqueName: \"kubernetes.io/projected/d23bbaec-b635-4649-b26e-2829f32d21f0-kube-api-access-szx9m\") pod \"certified-operators-9zvz2\" (UID: \"d23bbaec-b635-4649-b26e-2829f32d21f0\") " pod="openshift-marketplace/certified-operators-9zvz2" Mar 13 01:19:01.119485 master-0 kubenswrapper[19170]: I0313 01:19:01.119445 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0671fdd0-b358-40f9-ae49-2c5a9004edb3-service-ca-bundle\") pod \"router-default-79f8cd6fdd-cnrhm\" (UID: \"0671fdd0-b358-40f9-ae49-2c5a9004edb3\") " pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" Mar 13 01:19:01.119537 master-0 kubenswrapper[19170]: I0313 01:19:01.119491 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/2c4c579b-0643-47ac-a729-017c326b0ecc-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-7fc8j\" (UID: \"2c4c579b-0643-47ac-a729-017c326b0ecc\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:19:01.119588 master-0 kubenswrapper[19170]: I0313 01:19:01.119530 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n9fb\" (UniqueName: \"kubernetes.io/projected/2679c6e1-11c1-450c-b03a-30d7ee59ff6f-kube-api-access-4n9fb\") pod 
\"multus-admission-controller-7769569c45-zm2jl\" (UID: \"2679c6e1-11c1-450c-b03a-30d7ee59ff6f\") " pod="openshift-multus/multus-admission-controller-7769569c45-zm2jl" Mar 13 01:19:01.119588 master-0 kubenswrapper[19170]: I0313 01:19:01.119565 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 01:19:01.119699 master-0 kubenswrapper[19170]: I0313 01:19:01.119596 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-run-openvswitch\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.119699 master-0 kubenswrapper[19170]: I0313 01:19:01.119663 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cd7cca05-3da7-42cf-af64-6e94050e58c0-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-74cc79fd76-6btfg\" (UID: \"cd7cca05-3da7-42cf-af64-6e94050e58c0\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-6btfg" Mar 13 01:19:01.119778 master-0 kubenswrapper[19170]: I0313 01:19:01.119701 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/85149f21-7ba8-4891-82ef-0fef3d5d7863-service-ca\") pod \"cluster-version-operator-8c9c967c7-4d6fw\" (UID: \"85149f21-7ba8-4891-82ef-0fef3d5d7863\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-4d6fw" Mar 13 01:19:01.119778 master-0 kubenswrapper[19170]: I0313 01:19:01.119730 19170 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-slash\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.119778 master-0 kubenswrapper[19170]: I0313 01:19:01.119768 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6a5ab1d5-dabd-45e7-a688-71a282f61e67-tmp\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:19:01.119891 master-0 kubenswrapper[19170]: I0313 01:19:01.119814 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/46662e51-44af-4732-83a1-9509a579b373-host-slash\") pod \"iptables-alerter-qclwv\" (UID: \"46662e51-44af-4732-83a1-9509a579b373\") " pod="openshift-network-operator/iptables-alerter-qclwv" Mar 13 01:19:01.119891 master-0 kubenswrapper[19170]: I0313 01:19:01.119849 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/cada5bf2-e208-4fd8-bdf5-de8cad31a665-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6686554ddc-w6qs7\" (UID: \"cada5bf2-e208-4fd8-bdf5-de8cad31a665\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-w6qs7" Mar 13 01:19:01.120096 master-0 kubenswrapper[19170]: I0313 01:19:01.120057 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 01:19:01.120142 master-0 kubenswrapper[19170]: I0313 01:19:01.120101 
19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 01:19:01.120189 master-0 kubenswrapper[19170]: I0313 01:19:01.120136 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qlks\" (UniqueName: \"kubernetes.io/projected/58405741-598c-4bf5-bbc8-1ca8e3f10995-kube-api-access-6qlks\") pod \"dns-default-26mfw\" (UID: \"58405741-598c-4bf5-bbc8-1ca8e3f10995\") " pod="openshift-dns/dns-default-26mfw" Mar 13 01:19:01.120189 master-0 kubenswrapper[19170]: I0313 01:19:01.120174 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-images\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n\" (UID: \"b3a9c0f6-cfde-4ae8-952a-00e2fb862482\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" Mar 13 01:19:01.120283 master-0 kubenswrapper[19170]: I0313 01:19:01.120208 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd849\" (UniqueName: \"kubernetes.io/projected/57eb2020-1560-4352-8b86-76db59de933a-kube-api-access-kd849\") pod \"apiserver-78885b775b-jrrjv\" (UID: \"57eb2020-1560-4352-8b86-76db59de933a\") " pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:19:01.120283 master-0 kubenswrapper[19170]: I0313 01:19:01.120252 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ac2a4c90-32db-4464-8c47-acbcafbcd5d0-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-hdx2d\" (UID: \"ac2a4c90-32db-4464-8c47-acbcafbcd5d0\") " 
pod="openshift-multus/cni-sysctl-allowlist-ds-hdx2d" Mar 13 01:19:01.120364 master-0 kubenswrapper[19170]: I0313 01:19:01.120284 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-metrics-server-audit-profiles\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:19:01.120364 master-0 kubenswrapper[19170]: I0313 01:19:01.120339 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77fd9062-0f7d-4255-92ca-7e4325daeddd-trusted-ca-bundle\") pod \"insights-operator-8f89dfddd-6k2t7\" (UID: \"77fd9062-0f7d-4255-92ca-7e4325daeddd\") " pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" Mar 13 01:19:01.120501 master-0 kubenswrapper[19170]: I0313 01:19:01.120374 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xr6vn\" (UniqueName: \"kubernetes.io/projected/3f9728b4-4e1e-4165-a276-3daa00e95839-kube-api-access-xr6vn\") pod \"redhat-operators-k52lh\" (UID: \"3f9728b4-4e1e-4165-a276-3daa00e95839\") " pod="openshift-marketplace/redhat-operators-k52lh" Mar 13 01:19:01.120501 master-0 kubenswrapper[19170]: I0313 01:19:01.120407 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sddd9\" (UniqueName: \"kubernetes.io/projected/a9462e2e-728d-4076-a876-31dbbd637581-kube-api-access-sddd9\") pod \"route-controller-manager-5dc55b5d9c-nlg6m\" (UID: \"a9462e2e-728d-4076-a876-31dbbd637581\") " pod="openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m" Mar 13 01:19:01.120501 master-0 kubenswrapper[19170]: I0313 01:19:01.120481 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-var-lib-kubelet\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:19:01.120613 master-0 kubenswrapper[19170]: I0313 01:19:01.120527 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-secret-metrics-server-tls\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:19:01.120613 master-0 kubenswrapper[19170]: I0313 01:19:01.120559 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/2c4c579b-0643-47ac-a729-017c326b0ecc-ca-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-7fc8j\" (UID: \"2c4c579b-0643-47ac-a729-017c326b0ecc\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:19:01.120613 master-0 kubenswrapper[19170]: I0313 01:19:01.120592 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d23bbaec-b635-4649-b26e-2829f32d21f0-utilities\") pod \"certified-operators-9zvz2\" (UID: \"d23bbaec-b635-4649-b26e-2829f32d21f0\") " pod="openshift-marketplace/certified-operators-9zvz2" Mar 13 01:19:01.120750 master-0 kubenswrapper[19170]: I0313 01:19:01.120623 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/cd7cca05-3da7-42cf-af64-6e94050e58c0-openshift-state-metrics-tls\") pod \"openshift-state-metrics-74cc79fd76-6btfg\" (UID: \"cd7cca05-3da7-42cf-af64-6e94050e58c0\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-6btfg" Mar 13 01:19:01.120750 master-0 kubenswrapper[19170]: I0313 
01:19:01.120679 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/30f7537e-93ed-466b-ba24-78141d004b2f-volume-directive-shadow\") pod \"kube-state-metrics-68b88f8cb5-plwwd\" (UID: \"30f7537e-93ed-466b-ba24-78141d004b2f\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" Mar 13 01:19:01.121045 master-0 kubenswrapper[19170]: I0313 01:19:01.121016 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57eb2020-1560-4352-8b86-76db59de933a-serving-cert\") pod \"apiserver-78885b775b-jrrjv\" (UID: \"57eb2020-1560-4352-8b86-76db59de933a\") " pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:19:01.121119 master-0 kubenswrapper[19170]: I0313 01:19:01.121059 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:19:01.121119 master-0 kubenswrapper[19170]: I0313 01:19:01.121101 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-var-lib-openvswitch\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.121209 master-0 kubenswrapper[19170]: I0313 01:19:01.121134 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-cni-bin\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 
01:19:01.121209 master-0 kubenswrapper[19170]: I0313 01:19:01.121152 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a561a1d1-b20f-45fd-9e0c-ee4399a1d31b-catalog-content\") pod \"community-operators-bbptx\" (UID: \"a561a1d1-b20f-45fd-9e0c-ee4399a1d31b\") " pod="openshift-marketplace/community-operators-bbptx" Mar 13 01:19:01.121209 master-0 kubenswrapper[19170]: I0313 01:19:01.121171 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-sysctl-conf\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:19:01.121333 master-0 kubenswrapper[19170]: I0313 01:19:01.121226 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n\" (UID: \"b3a9c0f6-cfde-4ae8-952a-00e2fb862482\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" Mar 13 01:19:01.121333 master-0 kubenswrapper[19170]: I0313 01:19:01.121257 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebf338e6-9725-47d9-8c7f-adbf11a44406-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-2xfpz\" (UID: \"ebf338e6-9725-47d9-8c7f-adbf11a44406\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-2xfpz" Mar 13 01:19:01.121333 master-0 kubenswrapper[19170]: I0313 01:19:01.121282 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: 
\"kubernetes.io/empty-dir/30f7537e-93ed-466b-ba24-78141d004b2f-volume-directive-shadow\") pod \"kube-state-metrics-68b88f8cb5-plwwd\" (UID: \"30f7537e-93ed-466b-ba24-78141d004b2f\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" Mar 13 01:19:01.121333 master-0 kubenswrapper[19170]: I0313 01:19:01.121284 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f82n\" (UniqueName: \"kubernetes.io/projected/d278ed70-786c-4b6c-9f04-f08ede704569-kube-api-access-7f82n\") pod \"cluster-autoscaler-operator-69576476f7-2q4qb\" (UID: \"d278ed70-786c-4b6c-9f04-f08ede704569\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb" Mar 13 01:19:01.121493 master-0 kubenswrapper[19170]: I0313 01:19:01.121339 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6frm\" (UniqueName: \"kubernetes.io/projected/2ce47660-f7cc-4669-a00d-83422f0f6d55-kube-api-access-d6frm\") pod \"packageserver-68f6795949-v9w8g\" (UID: \"2ce47660-f7cc-4669-a00d-83422f0f6d55\") " pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" Mar 13 01:19:01.121493 master-0 kubenswrapper[19170]: I0313 01:19:01.121387 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-modprobe-d\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:19:01.121493 master-0 kubenswrapper[19170]: I0313 01:19:01.121440 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-node-exporter-wtmp\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 
13 01:19:01.121493 master-0 kubenswrapper[19170]: I0313 01:19:01.121470 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:19:01.121681 master-0 kubenswrapper[19170]: I0313 01:19:01.121506 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-var-lib-cni-multus\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:19:01.121681 master-0 kubenswrapper[19170]: I0313 01:19:01.121553 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42sqh\" (UniqueName: \"kubernetes.io/projected/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-kube-api-access-42sqh\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n\" (UID: \"b3a9c0f6-cfde-4ae8-952a-00e2fb862482\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" Mar 13 01:19:01.121681 master-0 kubenswrapper[19170]: I0313 01:19:01.121614 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-run-systemd\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.121681 master-0 kubenswrapper[19170]: I0313 01:19:01.121659 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: 
\"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:19:01.121844 master-0 kubenswrapper[19170]: I0313 01:19:01.121756 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/46662e51-44af-4732-83a1-9509a579b373-host-slash\") pod \"iptables-alerter-qclwv\" (UID: \"46662e51-44af-4732-83a1-9509a579b373\") " pod="openshift-network-operator/iptables-alerter-qclwv" Mar 13 01:19:01.121844 master-0 kubenswrapper[19170]: I0313 01:19:01.121792 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 01:19:01.121844 master-0 kubenswrapper[19170]: I0313 01:19:01.121816 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 01:19:01.121972 master-0 kubenswrapper[19170]: I0313 01:19:01.121914 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a561a1d1-b20f-45fd-9e0c-ee4399a1d31b-utilities\") pod \"community-operators-bbptx\" (UID: \"a561a1d1-b20f-45fd-9e0c-ee4399a1d31b\") " pod="openshift-marketplace/community-operators-bbptx" Mar 13 01:19:01.121972 master-0 kubenswrapper[19170]: I0313 01:19:01.121944 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:19:01.122122 master-0 kubenswrapper[19170]: I0313 01:19:01.122077 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 01:19:01.122122 master-0 kubenswrapper[19170]: I0313 01:19:01.122107 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-run-openvswitch\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.122226 master-0 kubenswrapper[19170]: I0313 01:19:01.122119 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-var-lib-kubelet\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:19:01.122226 master-0 kubenswrapper[19170]: I0313 01:19:01.122178 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-slash\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.122226 master-0 kubenswrapper[19170]: I0313 01:19:01.122210 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-metrics-client-ca\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 
01:19:01.122356 master-0 kubenswrapper[19170]: I0313 01:19:01.122236 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3-cache\") pod \"operator-controller-controller-manager-6598bfb6c4-2wh5w\" (UID: \"30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" Mar 13 01:19:01.122356 master-0 kubenswrapper[19170]: I0313 01:19:01.122300 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-var-lib-openvswitch\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.122356 master-0 kubenswrapper[19170]: I0313 01:19:01.122333 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d23bbaec-b635-4649-b26e-2829f32d21f0-utilities\") pod \"certified-operators-9zvz2\" (UID: \"d23bbaec-b635-4649-b26e-2829f32d21f0\") " pod="openshift-marketplace/certified-operators-9zvz2" Mar 13 01:19:01.122356 master-0 kubenswrapper[19170]: I0313 01:19:01.122348 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-cni-bin\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.122501 master-0 kubenswrapper[19170]: I0313 01:19:01.122466 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3-cache\") pod \"operator-controller-controller-manager-6598bfb6c4-2wh5w\" (UID: \"30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3\") " 
pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" Mar 13 01:19:01.122501 master-0 kubenswrapper[19170]: I0313 01:19:01.122476 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:19:01.122582 master-0 kubenswrapper[19170]: I0313 01:19:01.122540 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6a5ab1d5-dabd-45e7-a688-71a282f61e67-tmp\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:19:01.122582 master-0 kubenswrapper[19170]: I0313 01:19:01.122580 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:19:01.122686 master-0 kubenswrapper[19170]: I0313 01:19:01.122591 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/93871019-3d0c-4081-9afe-19b6dd108ec6-mcc-auth-proxy-config\") pod \"machine-config-controller-ff46b7bdf-g7wfh\" (UID: \"93871019-3d0c-4081-9afe-19b6dd108ec6\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-g7wfh" Mar 13 01:19:01.122686 master-0 kubenswrapper[19170]: I0313 01:19:01.122606 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-var-lib-cni-multus\") pod \"multus-rvt5h\" (UID: 
\"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:19:01.122766 master-0 kubenswrapper[19170]: I0313 01:19:01.122684 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/ac2a4c90-32db-4464-8c47-acbcafbcd5d0-ready\") pod \"cni-sysctl-allowlist-ds-hdx2d\" (UID: \"ac2a4c90-32db-4464-8c47-acbcafbcd5d0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-hdx2d" Mar 13 01:19:01.122766 master-0 kubenswrapper[19170]: I0313 01:19:01.122717 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwnml\" (UniqueName: \"kubernetes.io/projected/a9c7c6a4-4f5b-4807-932c-1b0f53ceed22-kube-api-access-wwnml\") pod \"redhat-marketplace-z254g\" (UID: \"a9c7c6a4-4f5b-4807-932c-1b0f53ceed22\") " pod="openshift-marketplace/redhat-marketplace-z254g" Mar 13 01:19:01.122766 master-0 kubenswrapper[19170]: I0313 01:19:01.122761 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4738c93d-62e6-44ce-a289-e646b9302e71-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5" Mar 13 01:19:01.122930 master-0 kubenswrapper[19170]: I0313 01:19:01.122900 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 13 01:19:01.122980 master-0 kubenswrapper[19170]: I0313 01:19:01.122950 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/ac2a4c90-32db-4464-8c47-acbcafbcd5d0-ready\") pod \"cni-sysctl-allowlist-ds-hdx2d\" 
(UID: \"ac2a4c90-32db-4464-8c47-acbcafbcd5d0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-hdx2d" Mar 13 01:19:01.123026 master-0 kubenswrapper[19170]: I0313 01:19:01.122997 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4738c93d-62e6-44ce-a289-e646b9302e71-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5" Mar 13 01:19:01.123242 master-0 kubenswrapper[19170]: I0313 01:19:01.123198 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 13 01:19:01.124156 master-0 kubenswrapper[19170]: I0313 01:19:01.124053 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85149f21-7ba8-4891-82ef-0fef3d5d7863-serving-cert\") pod \"cluster-version-operator-8c9c967c7-4d6fw\" (UID: \"85149f21-7ba8-4891-82ef-0fef3d5d7863\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-4d6fw" Mar 13 01:19:01.124223 master-0 kubenswrapper[19170]: I0313 01:19:01.124186 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-multus-socket-dir-parent\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:19:01.124223 master-0 kubenswrapper[19170]: I0313 01:19:01.124211 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a9c7c6a4-4f5b-4807-932c-1b0f53ceed22-catalog-content\") pod \"redhat-marketplace-z254g\" (UID: \"a9c7c6a4-4f5b-4807-932c-1b0f53ceed22\") " pod="openshift-marketplace/redhat-marketplace-z254g" Mar 13 01:19:01.124223 master-0 kubenswrapper[19170]: I0313 01:19:01.124217 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 13 01:19:01.124223 master-0 kubenswrapper[19170]: I0313 01:19:01.124234 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4738c93d-62e6-44ce-a289-e646b9302e71-system-cni-dir\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5" Mar 13 01:19:01.124398 master-0 kubenswrapper[19170]: I0313 01:19:01.124257 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2679c6e1-11c1-450c-b03a-30d7ee59ff6f-webhook-certs\") pod \"multus-admission-controller-7769569c45-zm2jl\" (UID: \"2679c6e1-11c1-450c-b03a-30d7ee59ff6f\") " pod="openshift-multus/multus-admission-controller-7769569c45-zm2jl" Mar 13 01:19:01.124398 master-0 kubenswrapper[19170]: I0313 01:19:01.124279 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-run-k8s-cni-cncf-io\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:19:01.124398 master-0 kubenswrapper[19170]: I0313 01:19:01.124304 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-etc-openvswitch\") pod \"ovnkube-node-v56ct\" (UID: 
\"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.124398 master-0 kubenswrapper[19170]: I0313 01:19:01.124340 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1c24e17c-8bd9-4c23-9876-6f31c9da5cd1-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-zt229\" (UID: \"1c24e17c-8bd9-4c23-9876-6f31c9da5cd1\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-zt229" Mar 13 01:19:01.124398 master-0 kubenswrapper[19170]: I0313 01:19:01.124364 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/30f7537e-93ed-466b-ba24-78141d004b2f-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-68b88f8cb5-plwwd\" (UID: \"30f7537e-93ed-466b-ba24-78141d004b2f\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" Mar 13 01:19:01.124609 master-0 kubenswrapper[19170]: I0313 01:19:01.124404 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/57eb2020-1560-4352-8b86-76db59de933a-etcd-client\") pod \"apiserver-78885b775b-jrrjv\" (UID: \"57eb2020-1560-4352-8b86-76db59de933a\") " pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:19:01.124609 master-0 kubenswrapper[19170]: I0313 01:19:01.124551 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4738c93d-62e6-44ce-a289-e646b9302e71-system-cni-dir\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5" Mar 13 01:19:01.124711 master-0 kubenswrapper[19170]: I0313 01:19:01.124618 19170 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-multus-socket-dir-parent\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:19:01.124711 master-0 kubenswrapper[19170]: I0313 01:19:01.124697 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-run-k8s-cni-cncf-io\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:19:01.124790 master-0 kubenswrapper[19170]: I0313 01:19:01.124739 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-etc-openvswitch\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.124861 master-0 kubenswrapper[19170]: I0313 01:19:01.124836 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9c7c6a4-4f5b-4807-932c-1b0f53ceed22-catalog-content\") pod \"redhat-marketplace-z254g\" (UID: \"a9c7c6a4-4f5b-4807-932c-1b0f53ceed22\") " pod="openshift-marketplace/redhat-marketplace-z254g" Mar 13 01:19:01.125024 master-0 kubenswrapper[19170]: I0313 01:19:01.124857 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/edda0d03-fdb2-4130-8f73-8057efd5815c-node-bootstrap-token\") pod \"machine-config-server-4gpcz\" (UID: \"edda0d03-fdb2-4130-8f73-8057efd5815c\") " pod="openshift-machine-config-operator/machine-config-server-4gpcz" Mar 13 01:19:01.125024 master-0 kubenswrapper[19170]: I0313 01:19:01.124892 19170 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:19:01.125024 master-0 kubenswrapper[19170]: I0313 01:19:01.124928 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-sys\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:19:01.125024 master-0 kubenswrapper[19170]: I0313 01:19:01.124953 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/738ebdcd-b78b-495a-b8f2-84af11a7d35c-node-pullsecrets\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:19:01.125024 master-0 kubenswrapper[19170]: I0313 01:19:01.124960 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:19:01.125220 master-0 kubenswrapper[19170]: I0313 01:19:01.125053 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/93871019-3d0c-4081-9afe-19b6dd108ec6-mcc-auth-proxy-config\") pod \"machine-config-controller-ff46b7bdf-g7wfh\" (UID: \"93871019-3d0c-4081-9afe-19b6dd108ec6\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-g7wfh" Mar 13 01:19:01.125220 master-0 kubenswrapper[19170]: 
I0313 01:19:01.125113 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 13 01:19:01.125220 master-0 kubenswrapper[19170]: I0313 01:19:01.125127 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/738ebdcd-b78b-495a-b8f2-84af11a7d35c-node-pullsecrets\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:19:01.125220 master-0 kubenswrapper[19170]: I0313 01:19:01.125144 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwxhc\" (UniqueName: \"kubernetes.io/projected/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-kube-api-access-pwxhc\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:19:01.125220 master-0 kubenswrapper[19170]: I0313 01:19:01.125185 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 01:19:01.125220 master-0 kubenswrapper[19170]: I0313 01:19:01.125202 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 13 01:19:01.125220 master-0 kubenswrapper[19170]: I0313 01:19:01.125222 19170 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/631f5719-2083-4c99-92cb-2ddc04022d86-serving-cert\") pod \"controller-manager-757fb68448-cj9p5\" (UID: \"631f5719-2083-4c99-92cb-2ddc04022d86\") " pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" Mar 13 01:19:01.125463 master-0 kubenswrapper[19170]: I0313 01:19:01.125235 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 01:19:01.125463 master-0 kubenswrapper[19170]: I0313 01:19:01.125252 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77fd9062-0f7d-4255-92ca-7e4325daeddd-service-ca-bundle\") pod \"insights-operator-8f89dfddd-6k2t7\" (UID: \"77fd9062-0f7d-4255-92ca-7e4325daeddd\") " pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" Mar 13 01:19:01.125463 master-0 kubenswrapper[19170]: I0313 01:19:01.125281 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cd7cca05-3da7-42cf-af64-6e94050e58c0-metrics-client-ca\") pod \"openshift-state-metrics-74cc79fd76-6btfg\" (UID: \"cd7cca05-3da7-42cf-af64-6e94050e58c0\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-6btfg" Mar 13 01:19:01.125463 master-0 kubenswrapper[19170]: I0313 01:19:01.125307 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-audit-log\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 
01:19:01.125463 master-0 kubenswrapper[19170]: I0313 01:19:01.125332 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d278ed70-786c-4b6c-9f04-f08ede704569-cert\") pod \"cluster-autoscaler-operator-69576476f7-2q4qb\" (UID: \"d278ed70-786c-4b6c-9f04-f08ede704569\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb" Mar 13 01:19:01.125463 master-0 kubenswrapper[19170]: I0313 01:19:01.125387 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-audit-log\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:19:01.125463 master-0 kubenswrapper[19170]: I0313 01:19:01.125437 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-kubelet\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.125463 master-0 kubenswrapper[19170]: I0313 01:19:01.125461 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-node-exporter-tls\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:19:01.125774 master-0 kubenswrapper[19170]: I0313 01:19:01.125482 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2ce47660-f7cc-4669-a00d-83422f0f6d55-webhook-cert\") pod \"packageserver-68f6795949-v9w8g\" (UID: \"2ce47660-f7cc-4669-a00d-83422f0f6d55\") " 
pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" Mar 13 01:19:01.125774 master-0 kubenswrapper[19170]: I0313 01:19:01.125502 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-kubelet\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.125774 master-0 kubenswrapper[19170]: I0313 01:19:01.125511 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-hostroot\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:19:01.125774 master-0 kubenswrapper[19170]: I0313 01:19:01.125542 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-hostroot\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:19:01.125774 master-0 kubenswrapper[19170]: I0313 01:19:01.125609 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d23bbaec-b635-4649-b26e-2829f32d21f0-catalog-content\") pod \"certified-operators-9zvz2\" (UID: \"d23bbaec-b635-4649-b26e-2829f32d21f0\") " pod="openshift-marketplace/certified-operators-9zvz2" Mar 13 01:19:01.125774 master-0 kubenswrapper[19170]: I0313 01:19:01.125648 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/57eb2020-1560-4352-8b86-76db59de933a-etcd-serving-ca\") pod \"apiserver-78885b775b-jrrjv\" (UID: \"57eb2020-1560-4352-8b86-76db59de933a\") " 
pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:19:01.125774 master-0 kubenswrapper[19170]: I0313 01:19:01.125672 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/93871019-3d0c-4081-9afe-19b6dd108ec6-proxy-tls\") pod \"machine-config-controller-ff46b7bdf-g7wfh\" (UID: \"93871019-3d0c-4081-9afe-19b6dd108ec6\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-g7wfh" Mar 13 01:19:01.125774 master-0 kubenswrapper[19170]: I0313 01:19:01.125696 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/85149f21-7ba8-4891-82ef-0fef3d5d7863-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-4d6fw\" (UID: \"85149f21-7ba8-4891-82ef-0fef3d5d7863\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-4d6fw" Mar 13 01:19:01.125774 master-0 kubenswrapper[19170]: I0313 01:19:01.125725 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-929r9\" (UniqueName: \"kubernetes.io/projected/cada5bf2-e208-4fd8-bdf5-de8cad31a665-kube-api-access-929r9\") pod \"control-plane-machine-set-operator-6686554ddc-w6qs7\" (UID: \"cada5bf2-e208-4fd8-bdf5-de8cad31a665\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-w6qs7" Mar 13 01:19:01.125774 master-0 kubenswrapper[19170]: I0313 01:19:01.125733 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d23bbaec-b635-4649-b26e-2829f32d21f0-catalog-content\") pod \"certified-operators-9zvz2\" (UID: \"d23bbaec-b635-4649-b26e-2829f32d21f0\") " pod="openshift-marketplace/certified-operators-9zvz2" Mar 13 01:19:01.125774 master-0 kubenswrapper[19170]: I0313 01:19:01.125747 19170 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/bfc49699-9428-4bff-804d-da0e60551759-host-etc-kube\") pod \"network-operator-7c649bf6d4-bdc4j\" (UID: \"bfc49699-9428-4bff-804d-da0e60551759\") " pod="openshift-network-operator/network-operator-7c649bf6d4-bdc4j" Mar 13 01:19:01.125774 master-0 kubenswrapper[19170]: I0313 01:19:01.125780 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-client-ca-bundle\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:19:01.126250 master-0 kubenswrapper[19170]: I0313 01:19:01.125834 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/bfc49699-9428-4bff-804d-da0e60551759-host-etc-kube\") pod \"network-operator-7c649bf6d4-bdc4j\" (UID: \"bfc49699-9428-4bff-804d-da0e60551759\") " pod="openshift-network-operator/network-operator-7c649bf6d4-bdc4j" Mar 13 01:19:01.126250 master-0 kubenswrapper[19170]: I0313 01:19:01.125874 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 01:19:01.126250 master-0 kubenswrapper[19170]: I0313 01:19:01.125898 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0671fdd0-b358-40f9-ae49-2c5a9004edb3-metrics-certs\") pod \"router-default-79f8cd6fdd-cnrhm\" (UID: \"0671fdd0-b358-40f9-ae49-2c5a9004edb3\") " pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" Mar 13 01:19:01.126250 master-0 kubenswrapper[19170]: I0313 01:19:01.125921 19170 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-tuned\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:19:01.126250 master-0 kubenswrapper[19170]: I0313 01:19:01.125933 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 01:19:01.126250 master-0 kubenswrapper[19170]: I0313 01:19:01.125943 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/edda0d03-fdb2-4130-8f73-8057efd5815c-certs\") pod \"machine-config-server-4gpcz\" (UID: \"edda0d03-fdb2-4130-8f73-8057efd5815c\") " pod="openshift-machine-config-operator/machine-config-server-4gpcz" Mar 13 01:19:01.126250 master-0 kubenswrapper[19170]: I0313 01:19:01.125966 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvn9n\" (UniqueName: \"kubernetes.io/projected/631f5719-2083-4c99-92cb-2ddc04022d86-kube-api-access-jvn9n\") pod \"controller-manager-757fb68448-cj9p5\" (UID: \"631f5719-2083-4c99-92cb-2ddc04022d86\") " pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" Mar 13 01:19:01.126250 master-0 kubenswrapper[19170]: I0313 01:19:01.126024 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-2wh5w\" (UID: \"30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" 
Mar 13 01:19:01.126250 master-0 kubenswrapper[19170]: I0313 01:19:01.126052 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/57eb2020-1560-4352-8b86-76db59de933a-encryption-config\") pod \"apiserver-78885b775b-jrrjv\" (UID: \"57eb2020-1560-4352-8b86-76db59de933a\") " pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:19:01.126250 master-0 kubenswrapper[19170]: I0313 01:19:01.126028 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-tuned\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:19:01.126250 master-0 kubenswrapper[19170]: I0313 01:19:01.126076 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/90e6e63d-3cf2-4bb5-883f-6219a0b52c3a-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-55d85b7b47-qslvf\" (UID: \"90e6e63d-3cf2-4bb5-883f-6219a0b52c3a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qslvf" Mar 13 01:19:01.126250 master-0 kubenswrapper[19170]: I0313 01:19:01.126110 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gmkr\" (UniqueName: \"kubernetes.io/projected/a561a1d1-b20f-45fd-9e0c-ee4399a1d31b-kube-api-access-7gmkr\") pod \"community-operators-bbptx\" (UID: \"a561a1d1-b20f-45fd-9e0c-ee4399a1d31b\") " pod="openshift-marketplace/community-operators-bbptx" Mar 13 01:19:01.126250 master-0 kubenswrapper[19170]: I0313 01:19:01.126134 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v79j\" (UniqueName: 
\"kubernetes.io/projected/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3-kube-api-access-6v79j\") pod \"operator-controller-controller-manager-6598bfb6c4-2wh5w\" (UID: \"30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" Mar 13 01:19:01.126250 master-0 kubenswrapper[19170]: I0313 01:19:01.126159 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-run-ovn\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.126250 master-0 kubenswrapper[19170]: I0313 01:19:01.126189 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9c7c6a4-4f5b-4807-932c-1b0f53ceed22-utilities\") pod \"redhat-marketplace-z254g\" (UID: \"a9c7c6a4-4f5b-4807-932c-1b0f53ceed22\") " pod="openshift-marketplace/redhat-marketplace-z254g" Mar 13 01:19:01.126250 master-0 kubenswrapper[19170]: I0313 01:19:01.126218 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f9728b4-4e1e-4165-a276-3daa00e95839-utilities\") pod \"redhat-operators-k52lh\" (UID: \"3f9728b4-4e1e-4165-a276-3daa00e95839\") " pod="openshift-marketplace/redhat-operators-k52lh" Mar 13 01:19:01.126250 master-0 kubenswrapper[19170]: I0313 01:19:01.126243 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0671fdd0-b358-40f9-ae49-2c5a9004edb3-stats-auth\") pod \"router-default-79f8cd6fdd-cnrhm\" (UID: \"0671fdd0-b358-40f9-ae49-2c5a9004edb3\") " pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" Mar 13 01:19:01.126250 master-0 kubenswrapper[19170]: I0313 01:19:01.126264 19170 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-system-cni-dir\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:19:01.126883 master-0 kubenswrapper[19170]: I0313 01:19:01.126286 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/738ebdcd-b78b-495a-b8f2-84af11a7d35c-audit-dir\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:19:01.126883 master-0 kubenswrapper[19170]: I0313 01:19:01.126313 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9c7c6a4-4f5b-4807-932c-1b0f53ceed22-utilities\") pod \"redhat-marketplace-z254g\" (UID: \"a9c7c6a4-4f5b-4807-932c-1b0f53ceed22\") " pod="openshift-marketplace/redhat-marketplace-z254g" Mar 13 01:19:01.126883 master-0 kubenswrapper[19170]: I0313 01:19:01.126308 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e5956ebf-01e4-4d4c-ae6d-b0995905c6d3-rootfs\") pod \"machine-config-daemon-pmkpj\" (UID: \"e5956ebf-01e4-4d4c-ae6d-b0995905c6d3\") " pod="openshift-machine-config-operator/machine-config-daemon-pmkpj" Mar 13 01:19:01.126883 master-0 kubenswrapper[19170]: I0313 01:19:01.126379 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/2c4c579b-0643-47ac-a729-017c326b0ecc-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-7fc8j\" (UID: \"2c4c579b-0643-47ac-a729-017c326b0ecc\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:19:01.126883 master-0 kubenswrapper[19170]: I0313 
01:19:01.126431 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/690f916b-6f87-42d9-8168-392a9177bee9-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"690f916b-6f87-42d9-8168-392a9177bee9\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 13 01:19:01.126883 master-0 kubenswrapper[19170]: I0313 01:19:01.126441 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f9728b4-4e1e-4165-a276-3daa00e95839-utilities\") pod \"redhat-operators-k52lh\" (UID: \"3f9728b4-4e1e-4165-a276-3daa00e95839\") " pod="openshift-marketplace/redhat-operators-k52lh" Mar 13 01:19:01.126883 master-0 kubenswrapper[19170]: I0313 01:19:01.126466 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-os-release\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:19:01.126883 master-0 kubenswrapper[19170]: I0313 01:19:01.126501 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-system-cni-dir\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:19:01.126883 master-0 kubenswrapper[19170]: I0313 01:19:01.126515 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:19:01.126883 master-0 kubenswrapper[19170]: I0313 01:19:01.126533 19170 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/738ebdcd-b78b-495a-b8f2-84af11a7d35c-audit-dir\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:19:01.126883 master-0 kubenswrapper[19170]: I0313 01:19:01.126542 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e5956ebf-01e4-4d4c-ae6d-b0995905c6d3-mcd-auth-proxy-config\") pod \"machine-config-daemon-pmkpj\" (UID: \"e5956ebf-01e4-4d4c-ae6d-b0995905c6d3\") " pod="openshift-machine-config-operator/machine-config-daemon-pmkpj" Mar 13 01:19:01.126883 master-0 kubenswrapper[19170]: I0313 01:19:01.126349 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-run-ovn\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.126883 master-0 kubenswrapper[19170]: I0313 01:19:01.126589 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltxpc\" (UniqueName: \"kubernetes.io/projected/ebf338e6-9725-47d9-8c7f-adbf11a44406-kube-api-access-ltxpc\") pod \"cluster-samples-operator-664cb58b85-2xfpz\" (UID: \"ebf338e6-9725-47d9-8c7f-adbf11a44406\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-2xfpz" Mar 13 01:19:01.126883 master-0 kubenswrapper[19170]: I0313 01:19:01.126671 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftn5x\" (UniqueName: \"kubernetes.io/projected/0671fdd0-b358-40f9-ae49-2c5a9004edb3-kube-api-access-ftn5x\") pod \"router-default-79f8cd6fdd-cnrhm\" (UID: \"0671fdd0-b358-40f9-ae49-2c5a9004edb3\") " 
pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" Mar 13 01:19:01.126883 master-0 kubenswrapper[19170]: I0313 01:19:01.126704 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hhwp\" (UniqueName: \"kubernetes.io/projected/90e6e63d-3cf2-4bb5-883f-6219a0b52c3a-kube-api-access-6hhwp\") pod \"cloud-credential-operator-55d85b7b47-qslvf\" (UID: \"90e6e63d-3cf2-4bb5-883f-6219a0b52c3a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qslvf" Mar 13 01:19:01.126883 master-0 kubenswrapper[19170]: I0313 01:19:01.126758 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9nfk\" (UniqueName: \"kubernetes.io/projected/93871019-3d0c-4081-9afe-19b6dd108ec6-kube-api-access-s9nfk\") pod \"machine-config-controller-ff46b7bdf-g7wfh\" (UID: \"93871019-3d0c-4081-9afe-19b6dd108ec6\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-g7wfh" Mar 13 01:19:01.126883 master-0 kubenswrapper[19170]: I0313 01:19:01.126782 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-var-lib-cni-bin\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:19:01.126883 master-0 kubenswrapper[19170]: I0313 01:19:01.126850 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/631f5719-2083-4c99-92cb-2ddc04022d86-proxy-ca-bundles\") pod \"controller-manager-757fb68448-cj9p5\" (UID: \"631f5719-2083-4c99-92cb-2ddc04022d86\") " pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" Mar 13 01:19:01.127501 master-0 kubenswrapper[19170]: I0313 01:19:01.126909 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" 
(UniqueName: \"kubernetes.io/host-path/4738c93d-62e6-44ce-a289-e646b9302e71-cnibin\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5" Mar 13 01:19:01.127501 master-0 kubenswrapper[19170]: I0313 01:19:01.126933 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-systemd-units\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.127501 master-0 kubenswrapper[19170]: I0313 01:19:01.127003 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f9728b4-4e1e-4165-a276-3daa00e95839-catalog-content\") pod \"redhat-operators-k52lh\" (UID: \"3f9728b4-4e1e-4165-a276-3daa00e95839\") " pod="openshift-marketplace/redhat-operators-k52lh" Mar 13 01:19:01.127501 master-0 kubenswrapper[19170]: I0313 01:19:01.127025 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-systemd\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:19:01.127501 master-0 kubenswrapper[19170]: I0313 01:19:01.127080 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:19:01.127501 master-0 kubenswrapper[19170]: I0313 01:19:01.127115 19170 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-node-log\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.127501 master-0 kubenswrapper[19170]: I0313 01:19:01.127184 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-var-lib-cni-bin\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:19:01.127501 master-0 kubenswrapper[19170]: I0313 01:19:01.127201 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-host\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:19:01.127501 master-0 kubenswrapper[19170]: I0313 01:19:01.127228 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e5956ebf-01e4-4d4c-ae6d-b0995905c6d3-mcd-auth-proxy-config\") pod \"machine-config-daemon-pmkpj\" (UID: \"e5956ebf-01e4-4d4c-ae6d-b0995905c6d3\") " pod="openshift-machine-config-operator/machine-config-daemon-pmkpj" Mar 13 01:19:01.127501 master-0 kubenswrapper[19170]: I0313 01:19:01.127230 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n\" (UID: \"b3a9c0f6-cfde-4ae8-952a-00e2fb862482\") " 
pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" Mar 13 01:19:01.127501 master-0 kubenswrapper[19170]: I0313 01:19:01.127271 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/30f7537e-93ed-466b-ba24-78141d004b2f-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-68b88f8cb5-plwwd\" (UID: \"30f7537e-93ed-466b-ba24-78141d004b2f\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" Mar 13 01:19:01.127501 master-0 kubenswrapper[19170]: I0313 01:19:01.127129 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-os-release\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:19:01.127501 master-0 kubenswrapper[19170]: I0313 01:19:01.127160 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:19:01.127501 master-0 kubenswrapper[19170]: I0313 01:19:01.127342 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqsdm\" (UniqueName: \"kubernetes.io/projected/1c24e17c-8bd9-4c23-9876-6f31c9da5cd1-kube-api-access-dqsdm\") pod \"machine-api-operator-84bf6db4f9-zt229\" (UID: \"1c24e17c-8bd9-4c23-9876-6f31c9da5cd1\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-zt229" Mar 13 01:19:01.127501 master-0 kubenswrapper[19170]: I0313 01:19:01.127364 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/4738c93d-62e6-44ce-a289-e646b9302e71-cnibin\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5" Mar 13 01:19:01.127501 master-0 kubenswrapper[19170]: I0313 01:19:01.127393 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-systemd-units\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.127501 master-0 kubenswrapper[19170]: I0313 01:19:01.127400 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:19:01.127501 master-0 kubenswrapper[19170]: I0313 01:19:01.127452 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-node-log\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.127501 master-0 kubenswrapper[19170]: I0313 01:19:01.127490 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f9728b4-4e1e-4165-a276-3daa00e95839-catalog-content\") pod \"redhat-operators-k52lh\" (UID: \"3f9728b4-4e1e-4165-a276-3daa00e95839\") " pod="openshift-marketplace/redhat-operators-k52lh" Mar 13 01:19:01.127501 master-0 kubenswrapper[19170]: I0313 01:19:01.127493 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/57eb2020-1560-4352-8b86-76db59de933a-audit-dir\") pod \"apiserver-78885b775b-jrrjv\" (UID: \"57eb2020-1560-4352-8b86-76db59de933a\") " pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:19:01.128279 master-0 kubenswrapper[19170]: I0313 01:19:01.127541 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3-ca-certs\") pod \"operator-controller-controller-manager-6598bfb6c4-2wh5w\" (UID: \"30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" Mar 13 01:19:01.128279 master-0 kubenswrapper[19170]: I0313 01:19:01.127579 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-etc-kubernetes\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:19:01.128279 master-0 kubenswrapper[19170]: I0313 01:19:01.127655 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-run\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:19:01.128279 master-0 kubenswrapper[19170]: I0313 01:19:01.127685 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-etc-kubernetes\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:19:01.128279 master-0 kubenswrapper[19170]: I0313 01:19:01.127700 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e2b5ad07-fa01-4330-9dce-6da3444657ab-metrics-client-ca\") pod \"prometheus-operator-5ff8674d55-6fh8b\" (UID: \"e2b5ad07-fa01-4330-9dce-6da3444657ab\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-6fh8b" Mar 13 01:19:01.128279 master-0 kubenswrapper[19170]: I0313 01:19:01.127728 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpvtc\" (UniqueName: \"kubernetes.io/projected/6a5ab1d5-dabd-45e7-a688-71a282f61e67-kube-api-access-lpvtc\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:19:01.128279 master-0 kubenswrapper[19170]: I0313 01:19:01.127765 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-multus-cni-dir\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:19:01.128279 master-0 kubenswrapper[19170]: I0313 01:19:01.127802 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/77fd9062-0f7d-4255-92ca-7e4325daeddd-snapshots\") pod \"insights-operator-8f89dfddd-6k2t7\" (UID: \"77fd9062-0f7d-4255-92ca-7e4325daeddd\") " pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" Mar 13 01:19:01.128279 master-0 kubenswrapper[19170]: I0313 01:19:01.127824 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-multus-cni-dir\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:19:01.128279 master-0 kubenswrapper[19170]: I0313 01:19:01.127830 19170 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/30f7537e-93ed-466b-ba24-78141d004b2f-metrics-client-ca\") pod \"kube-state-metrics-68b88f8cb5-plwwd\" (UID: \"30f7537e-93ed-466b-ba24-78141d004b2f\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" Mar 13 01:19:01.128279 master-0 kubenswrapper[19170]: I0313 01:19:01.127861 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/2c4c579b-0643-47ac-a729-017c326b0ecc-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-7fc8j\" (UID: \"2c4c579b-0643-47ac-a729-017c326b0ecc\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:19:01.128279 master-0 kubenswrapper[19170]: I0313 01:19:01.127899 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7wj9\" (UniqueName: \"kubernetes.io/projected/d81bcb58-efe3-4577-8e88-67f92c645f6f-kube-api-access-k7wj9\") pod \"node-resolver-lw6xm\" (UID: \"d81bcb58-efe3-4577-8e88-67f92c645f6f\") " pod="openshift-dns/node-resolver-lw6xm" Mar 13 01:19:01.128279 master-0 kubenswrapper[19170]: I0313 01:19:01.127926 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.128279 master-0 kubenswrapper[19170]: I0313 01:19:01.127965 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.128279 master-0 kubenswrapper[19170]: I0313 01:19:01.127968 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-multus-conf-dir\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:19:01.128279 master-0 kubenswrapper[19170]: I0313 01:19:01.128003 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/e2b5ad07-fa01-4330-9dce-6da3444657ab-prometheus-operator-tls\") pod \"prometheus-operator-5ff8674d55-6fh8b\" (UID: \"e2b5ad07-fa01-4330-9dce-6da3444657ab\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-6fh8b" Mar 13 01:19:01.128279 master-0 kubenswrapper[19170]: I0313 01:19:01.128035 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-898lt\" (UniqueName: \"kubernetes.io/projected/33dfdc31-54a4-4249-99ae-a15180514659-kube-api-access-898lt\") pod \"machine-approver-754bdc9f9d-knlw8\" (UID: \"33dfdc31-54a4-4249-99ae-a15180514659\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-knlw8" Mar 13 01:19:01.128279 master-0 kubenswrapper[19170]: I0313 01:19:01.128060 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:19:01.128279 master-0 kubenswrapper[19170]: I0313 01:19:01.128129 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-run-multus-certs\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:19:01.128279 master-0 kubenswrapper[19170]: I0313 01:19:01.128159 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/631f5719-2083-4c99-92cb-2ddc04022d86-config\") pod \"controller-manager-757fb68448-cj9p5\" (UID: \"631f5719-2083-4c99-92cb-2ddc04022d86\") " pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" Mar 13 01:19:01.128279 master-0 kubenswrapper[19170]: I0313 01:19:01.128183 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c24e17c-8bd9-4c23-9876-6f31c9da5cd1-config\") pod \"machine-api-operator-84bf6db4f9-zt229\" (UID: \"1c24e17c-8bd9-4c23-9876-6f31c9da5cd1\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-zt229" Mar 13 01:19:01.128279 master-0 kubenswrapper[19170]: I0313 01:19:01.128007 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-multus-conf-dir\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:19:01.128279 master-0 kubenswrapper[19170]: I0313 01:19:01.128207 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2ce47660-f7cc-4669-a00d-83422f0f6d55-apiservice-cert\") pod \"packageserver-68f6795949-v9w8g\" (UID: \"2ce47660-f7cc-4669-a00d-83422f0f6d55\") " pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" Mar 13 01:19:01.128279 master-0 kubenswrapper[19170]: I0313 01:19:01.128229 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 13 01:19:01.128279 master-0 kubenswrapper[19170]: I0313 01:19:01.128254 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/33dfdc31-54a4-4249-99ae-a15180514659-auth-proxy-config\") pod \"machine-approver-754bdc9f9d-knlw8\" (UID: \"33dfdc31-54a4-4249-99ae-a15180514659\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-knlw8" Mar 13 01:19:01.128279 master-0 kubenswrapper[19170]: I0313 01:19:01.128264 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/77fd9062-0f7d-4255-92ca-7e4325daeddd-snapshots\") pod \"insights-operator-8f89dfddd-6k2t7\" (UID: \"77fd9062-0f7d-4255-92ca-7e4325daeddd\") " pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" Mar 13 01:19:01.129216 master-0 kubenswrapper[19170]: I0313 01:19:01.128327 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-sys\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:19:01.129216 master-0 kubenswrapper[19170]: I0313 01:19:01.128364 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-root\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:19:01.129216 master-0 kubenswrapper[19170]: I0313 01:19:01.128386 19170 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2937cbe2-3125-4c3f-96f8-2febeb5942cc-host-run-multus-certs\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h" Mar 13 01:19:01.129216 master-0 kubenswrapper[19170]: I0313 01:19:01.128403 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d278ed70-786c-4b6c-9f04-f08ede704569-auth-proxy-config\") pod \"cluster-autoscaler-operator-69576476f7-2q4qb\" (UID: \"d278ed70-786c-4b6c-9f04-f08ede704569\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb" Mar 13 01:19:01.129216 master-0 kubenswrapper[19170]: I0313 01:19:01.128493 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 13 01:19:01.129216 master-0 kubenswrapper[19170]: I0313 01:19:01.128480 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/690f916b-6f87-42d9-8168-392a9177bee9-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"690f916b-6f87-42d9-8168-392a9177bee9\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 13 01:19:01.129216 master-0 kubenswrapper[19170]: I0313 01:19:01.128539 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9462e2e-728d-4076-a876-31dbbd637581-client-ca\") pod \"route-controller-manager-5dc55b5d9c-nlg6m\" (UID: \"a9462e2e-728d-4076-a876-31dbbd637581\") " 
pod="openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m" Mar 13 01:19:01.129216 master-0 kubenswrapper[19170]: I0313 01:19:01.128582 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw5hs\" (UniqueName: \"kubernetes.io/projected/ac2a4c90-32db-4464-8c47-acbcafbcd5d0-kube-api-access-sw5hs\") pod \"cni-sysctl-allowlist-ds-hdx2d\" (UID: \"ac2a4c90-32db-4464-8c47-acbcafbcd5d0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-hdx2d" Mar 13 01:19:01.129216 master-0 kubenswrapper[19170]: I0313 01:19:01.128610 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1c24e17c-8bd9-4c23-9876-6f31c9da5cd1-images\") pod \"machine-api-operator-84bf6db4f9-zt229\" (UID: \"1c24e17c-8bd9-4c23-9876-6f31c9da5cd1\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-zt229" Mar 13 01:19:01.129216 master-0 kubenswrapper[19170]: I0313 01:19:01.128659 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/2ce47660-f7cc-4669-a00d-83422f0f6d55-tmpfs\") pod \"packageserver-68f6795949-v9w8g\" (UID: \"2ce47660-f7cc-4669-a00d-83422f0f6d55\") " pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" Mar 13 01:19:01.129216 master-0 kubenswrapper[19170]: I0313 01:19:01.128686 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-run-ovn-kubernetes\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.129216 master-0 kubenswrapper[19170]: I0313 01:19:01.128707 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/4738c93d-62e6-44ce-a289-e646b9302e71-os-release\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5" Mar 13 01:19:01.129216 master-0 kubenswrapper[19170]: I0313 01:19:01.128763 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/2ce47660-f7cc-4669-a00d-83422f0f6d55-tmpfs\") pod \"packageserver-68f6795949-v9w8g\" (UID: \"2ce47660-f7cc-4669-a00d-83422f0f6d55\") " pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" Mar 13 01:19:01.129216 master-0 kubenswrapper[19170]: I0313 01:19:01.128760 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9462e2e-728d-4076-a876-31dbbd637581-serving-cert\") pod \"route-controller-manager-5dc55b5d9c-nlg6m\" (UID: \"a9462e2e-728d-4076-a876-31dbbd637581\") " pod="openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m" Mar 13 01:19:01.129216 master-0 kubenswrapper[19170]: I0313 01:19:01.128800 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:19:01.129216 master-0 kubenswrapper[19170]: I0313 01:19:01.128822 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-run-netns\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.129216 master-0 kubenswrapper[19170]: 
I0313 01:19:01.128842 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-secret-metrics-client-certs\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:19:01.129216 master-0 kubenswrapper[19170]: I0313 01:19:01.128872 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-run-ovn-kubernetes\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.129216 master-0 kubenswrapper[19170]: I0313 01:19:01.128900 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 13 01:19:01.129216 master-0 kubenswrapper[19170]: I0313 01:19:01.128927 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-host-run-netns\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.129216 master-0 kubenswrapper[19170]: I0313 01:19:01.128937 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57eb2020-1560-4352-8b86-76db59de933a-trusted-ca-bundle\") pod \"apiserver-78885b775b-jrrjv\" (UID: \"57eb2020-1560-4352-8b86-76db59de933a\") " 
pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:19:01.129216 master-0 kubenswrapper[19170]: I0313 01:19:01.128960 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 01:19:01.129216 master-0 kubenswrapper[19170]: I0313 01:19:01.128996 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4738c93d-62e6-44ce-a289-e646b9302e71-os-release\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5" Mar 13 01:19:01.129216 master-0 kubenswrapper[19170]: I0313 01:19:01.129105 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 01:19:01.129216 master-0 kubenswrapper[19170]: I0313 01:19:01.129166 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 13 01:19:01.143450 master-0 kubenswrapper[19170]: I0313 01:19:01.143387 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 13 01:19:01.163272 master-0 kubenswrapper[19170]: I0313 01:19:01.163214 19170 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 13 01:19:01.164352 master-0 kubenswrapper[19170]: I0313 01:19:01.164301 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d1153bb3-30dd-458f-b0a4-c05358a8b3f8-webhook-cert\") pod \"network-node-identity-znqwc\" (UID: \"d1153bb3-30dd-458f-b0a4-c05358a8b3f8\") " pod="openshift-network-node-identity/network-node-identity-znqwc" Mar 13 01:19:01.183772 master-0 kubenswrapper[19170]: I0313 01:19:01.183598 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 13 01:19:01.192382 master-0 kubenswrapper[19170]: I0313 01:19:01.192319 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d1153bb3-30dd-458f-b0a4-c05358a8b3f8-env-overrides\") pod \"network-node-identity-znqwc\" (UID: \"d1153bb3-30dd-458f-b0a4-c05358a8b3f8\") " pod="openshift-network-node-identity/network-node-identity-znqwc" Mar 13 01:19:01.203424 master-0 kubenswrapper[19170]: I0313 01:19:01.203376 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 13 01:19:01.211912 master-0 kubenswrapper[19170]: I0313 01:19:01.211862 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/d1153bb3-30dd-458f-b0a4-c05358a8b3f8-ovnkube-identity-cm\") pod \"network-node-identity-znqwc\" (UID: \"d1153bb3-30dd-458f-b0a4-c05358a8b3f8\") " pod="openshift-network-node-identity/network-node-identity-znqwc" Mar 13 01:19:01.223005 master-0 kubenswrapper[19170]: I0313 01:19:01.222958 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 13 01:19:01.229737 master-0 kubenswrapper[19170]: I0313 01:19:01.229691 19170 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/2c4c579b-0643-47ac-a729-017c326b0ecc-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-7fc8j\" (UID: \"2c4c579b-0643-47ac-a729-017c326b0ecc\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:19:01.229856 master-0 kubenswrapper[19170]: I0313 01:19:01.229825 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/690f916b-6f87-42d9-8168-392a9177bee9-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"690f916b-6f87-42d9-8168-392a9177bee9\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 13 01:19:01.229959 master-0 kubenswrapper[19170]: I0313 01:19:01.229929 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/2c4c579b-0643-47ac-a729-017c326b0ecc-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-7fc8j\" (UID: \"2c4c579b-0643-47ac-a729-017c326b0ecc\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:19:01.230037 master-0 kubenswrapper[19170]: I0313 01:19:01.229994 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-sys\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:19:01.230085 master-0 kubenswrapper[19170]: I0313 01:19:01.230064 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-root\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:19:01.230120 master-0 
kubenswrapper[19170]: I0313 01:19:01.230069 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-sys\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:19:01.230172 master-0 kubenswrapper[19170]: I0313 01:19:01.230151 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-root\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:19:01.231335 master-0 kubenswrapper[19170]: I0313 01:19:01.231297 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ac2a4c90-32db-4464-8c47-acbcafbcd5d0-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-hdx2d\" (UID: \"ac2a4c90-32db-4464-8c47-acbcafbcd5d0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-hdx2d" Mar 13 01:19:01.231459 master-0 kubenswrapper[19170]: I0313 01:19:01.231426 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ac2a4c90-32db-4464-8c47-acbcafbcd5d0-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-hdx2d\" (UID: \"ac2a4c90-32db-4464-8c47-acbcafbcd5d0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-hdx2d" Mar 13 01:19:01.231553 master-0 kubenswrapper[19170]: I0313 01:19:01.231533 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/85149f21-7ba8-4891-82ef-0fef3d5d7863-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-4d6fw\" (UID: \"85149f21-7ba8-4891-82ef-0fef3d5d7863\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-4d6fw" Mar 13 01:19:01.231593 master-0 
kubenswrapper[19170]: I0313 01:19:01.231551 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/85149f21-7ba8-4891-82ef-0fef3d5d7863-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-4d6fw\" (UID: \"85149f21-7ba8-4891-82ef-0fef3d5d7863\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-4d6fw" Mar 13 01:19:01.231937 master-0 kubenswrapper[19170]: I0313 01:19:01.231898 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-2wh5w\" (UID: \"30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" Mar 13 01:19:01.232032 master-0 kubenswrapper[19170]: I0313 01:19:01.232001 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-lib-modules\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:19:01.232079 master-0 kubenswrapper[19170]: I0313 01:19:01.232051 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-2wh5w\" (UID: \"30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" Mar 13 01:19:01.232079 master-0 kubenswrapper[19170]: I0313 01:19:01.232070 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/690f916b-6f87-42d9-8168-392a9177bee9-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"690f916b-6f87-42d9-8168-392a9177bee9\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 13 01:19:01.232142 master-0 kubenswrapper[19170]: I0313 01:19:01.232125 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-kubernetes\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:19:01.232185 master-0 kubenswrapper[19170]: I0313 01:19:01.232164 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-var-lib-kubelet\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:19:01.232222 master-0 kubenswrapper[19170]: I0313 01:19:01.232181 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/690f916b-6f87-42d9-8168-392a9177bee9-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"690f916b-6f87-42d9-8168-392a9177bee9\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 13 01:19:01.232222 master-0 kubenswrapper[19170]: I0313 01:19:01.232210 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-lib-modules\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:19:01.232274 master-0 kubenswrapper[19170]: I0313 01:19:01.232230 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-kubernetes\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:19:01.232274 master-0 kubenswrapper[19170]: I0313 01:19:01.232262 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-sysctl-d\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:19:01.232329 master-0 kubenswrapper[19170]: I0313 01:19:01.232262 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-var-lib-kubelet\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:19:01.232464 master-0 kubenswrapper[19170]: I0313 01:19:01.232435 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d81bcb58-efe3-4577-8e88-67f92c645f6f-hosts-file\") pod \"node-resolver-lw6xm\" (UID: \"d81bcb58-efe3-4577-8e88-67f92c645f6f\") " pod="openshift-dns/node-resolver-lw6xm" Mar 13 01:19:01.232525 master-0 kubenswrapper[19170]: I0313 01:19:01.232428 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-sysctl-d\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:19:01.232672 master-0 kubenswrapper[19170]: I0313 01:19:01.232652 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/d81bcb58-efe3-4577-8e88-67f92c645f6f-hosts-file\") pod \"node-resolver-lw6xm\" (UID: \"d81bcb58-efe3-4577-8e88-67f92c645f6f\") " pod="openshift-dns/node-resolver-lw6xm" Mar 13 01:19:01.232751 master-0 kubenswrapper[19170]: I0313 01:19:01.232702 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-sysconfig\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:19:01.232789 master-0 kubenswrapper[19170]: I0313 01:19:01.232764 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-sysconfig\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:19:01.232827 master-0 kubenswrapper[19170]: I0313 01:19:01.232805 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/2c4c579b-0643-47ac-a729-017c326b0ecc-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-7fc8j\" (UID: \"2c4c579b-0643-47ac-a729-017c326b0ecc\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:19:01.232932 master-0 kubenswrapper[19170]: I0313 01:19:01.232906 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/2c4c579b-0643-47ac-a729-017c326b0ecc-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-7fc8j\" (UID: \"2c4c579b-0643-47ac-a729-017c326b0ecc\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:19:01.233361 master-0 kubenswrapper[19170]: I0313 01:19:01.233325 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-sysctl-conf\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:19:01.233458 master-0 kubenswrapper[19170]: I0313 01:19:01.233425 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n\" (UID: \"b3a9c0f6-cfde-4ae8-952a-00e2fb862482\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" Mar 13 01:19:01.233563 master-0 kubenswrapper[19170]: I0313 01:19:01.233477 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-sysctl-conf\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:19:01.233563 master-0 kubenswrapper[19170]: I0313 01:19:01.233531 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n\" (UID: \"b3a9c0f6-cfde-4ae8-952a-00e2fb862482\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" Mar 13 01:19:01.233563 master-0 kubenswrapper[19170]: I0313 01:19:01.233543 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-modprobe-d\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " 
pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:19:01.233674 master-0 kubenswrapper[19170]: I0313 01:19:01.233597 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-node-exporter-wtmp\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:19:01.233674 master-0 kubenswrapper[19170]: I0313 01:19:01.233611 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-modprobe-d\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:19:01.233779 master-0 kubenswrapper[19170]: I0313 01:19:01.233736 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-node-exporter-wtmp\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:19:01.233930 master-0 kubenswrapper[19170]: I0313 01:19:01.233890 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-sys\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:19:01.234138 master-0 kubenswrapper[19170]: I0313 01:19:01.234106 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-sys\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 
01:19:01.234332 master-0 kubenswrapper[19170]: I0313 01:19:01.234293 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/85149f21-7ba8-4891-82ef-0fef3d5d7863-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-4d6fw\" (UID: \"85149f21-7ba8-4891-82ef-0fef3d5d7863\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-4d6fw" Mar 13 01:19:01.234504 master-0 kubenswrapper[19170]: I0313 01:19:01.234477 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-2wh5w\" (UID: \"30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" Mar 13 01:19:01.234572 master-0 kubenswrapper[19170]: I0313 01:19:01.234549 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-2wh5w\" (UID: \"30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" Mar 13 01:19:01.234739 master-0 kubenswrapper[19170]: I0313 01:19:01.234714 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e5956ebf-01e4-4d4c-ae6d-b0995905c6d3-rootfs\") pod \"machine-config-daemon-pmkpj\" (UID: \"e5956ebf-01e4-4d4c-ae6d-b0995905c6d3\") " pod="openshift-machine-config-operator/machine-config-daemon-pmkpj" Mar 13 01:19:01.234778 master-0 kubenswrapper[19170]: I0313 01:19:01.234753 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/e5956ebf-01e4-4d4c-ae6d-b0995905c6d3-rootfs\") pod \"machine-config-daemon-pmkpj\" (UID: \"e5956ebf-01e4-4d4c-ae6d-b0995905c6d3\") " pod="openshift-machine-config-operator/machine-config-daemon-pmkpj" Mar 13 01:19:01.234808 master-0 kubenswrapper[19170]: I0313 01:19:01.234782 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/690f916b-6f87-42d9-8168-392a9177bee9-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"690f916b-6f87-42d9-8168-392a9177bee9\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 13 01:19:01.234896 master-0 kubenswrapper[19170]: I0313 01:19:01.234866 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/690f916b-6f87-42d9-8168-392a9177bee9-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"690f916b-6f87-42d9-8168-392a9177bee9\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 13 01:19:01.234961 master-0 kubenswrapper[19170]: I0313 01:19:01.234939 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-systemd\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:19:01.234996 master-0 kubenswrapper[19170]: I0313 01:19:01.234962 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-etc-systemd\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:19:01.235038 master-0 kubenswrapper[19170]: I0313 01:19:01.235014 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-host\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:19:01.235132 master-0 kubenswrapper[19170]: I0313 01:19:01.235106 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-run\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:19:01.235168 master-0 kubenswrapper[19170]: I0313 01:19:01.235060 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-host\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:19:01.235203 master-0 kubenswrapper[19170]: I0313 01:19:01.235174 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/57eb2020-1560-4352-8b86-76db59de933a-audit-dir\") pod \"apiserver-78885b775b-jrrjv\" (UID: \"57eb2020-1560-4352-8b86-76db59de933a\") " pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:19:01.235234 master-0 kubenswrapper[19170]: I0313 01:19:01.235203 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6a5ab1d5-dabd-45e7-a688-71a282f61e67-run\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:19:01.235234 master-0 kubenswrapper[19170]: I0313 01:19:01.235221 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/57eb2020-1560-4352-8b86-76db59de933a-audit-dir\") pod 
\"apiserver-78885b775b-jrrjv\" (UID: \"57eb2020-1560-4352-8b86-76db59de933a\") " pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:19:01.235562 master-0 kubenswrapper[19170]: I0313 01:19:01.235524 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/85149f21-7ba8-4891-82ef-0fef3d5d7863-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-4d6fw\" (UID: \"85149f21-7ba8-4891-82ef-0fef3d5d7863\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-4d6fw" Mar 13 01:19:01.243192 master-0 kubenswrapper[19170]: I0313 01:19:01.243149 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 13 01:19:01.263405 master-0 kubenswrapper[19170]: I0313 01:19:01.263338 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 13 01:19:01.264024 master-0 kubenswrapper[19170]: I0313 01:19:01.263984 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-ovn-node-metrics-cert\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:01.283772 master-0 kubenswrapper[19170]: I0313 01:19:01.283613 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 13 01:19:01.292837 master-0 kubenswrapper[19170]: I0313 01:19:01.292777 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-ovnkube-script-lib\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 
01:19:01.302252 master-0 kubenswrapper[19170]: I0313 01:19:01.302196 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 13 01:19:01.322594 master-0 kubenswrapper[19170]: I0313 01:19:01.322548 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 13 01:19:01.342173 master-0 kubenswrapper[19170]: I0313 01:19:01.342130 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 13 01:19:01.343473 master-0 kubenswrapper[19170]: I0313 01:19:01.343449 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/46662e51-44af-4732-83a1-9509a579b373-iptables-alerter-script\") pod \"iptables-alerter-qclwv\" (UID: \"46662e51-44af-4732-83a1-9509a579b373\") " pod="openshift-network-operator/iptables-alerter-qclwv" Mar 13 01:19:01.362646 master-0 kubenswrapper[19170]: I0313 01:19:01.362587 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 13 01:19:01.383134 master-0 kubenswrapper[19170]: I0313 01:19:01.383091 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 13 01:19:01.390138 master-0 kubenswrapper[19170]: I0313 01:19:01.390101 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/da44d750-31e5-46f4-b3ef-dd4384c22aaf-signing-cabundle\") pod \"service-ca-84bfdbbb7f-qr9tk\" (UID: \"da44d750-31e5-46f4-b3ef-dd4384c22aaf\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-qr9tk" Mar 13 01:19:01.402588 master-0 kubenswrapper[19170]: I0313 01:19:01.402546 19170 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca"/"signing-key" Mar 13 01:19:01.403774 master-0 kubenswrapper[19170]: I0313 01:19:01.403743 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/da44d750-31e5-46f4-b3ef-dd4384c22aaf-signing-key\") pod \"service-ca-84bfdbbb7f-qr9tk\" (UID: \"da44d750-31e5-46f4-b3ef-dd4384c22aaf\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-qr9tk" Mar 13 01:19:01.427569 master-0 kubenswrapper[19170]: I0313 01:19:01.427511 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 13 01:19:01.443162 master-0 kubenswrapper[19170]: I0313 01:19:01.443052 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 13 01:19:01.451044 master-0 kubenswrapper[19170]: I0313 01:19:01.451019 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/738ebdcd-b78b-495a-b8f2-84af11a7d35c-audit\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:19:01.463218 master-0 kubenswrapper[19170]: I0313 01:19:01.463171 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 13 01:19:01.484928 master-0 kubenswrapper[19170]: I0313 01:19:01.484706 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 13 01:19:01.488965 master-0 kubenswrapper[19170]: I0313 01:19:01.488929 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/738ebdcd-b78b-495a-b8f2-84af11a7d35c-etcd-client\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:19:01.503451 master-0 
kubenswrapper[19170]: I0313 01:19:01.503409 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 13 01:19:01.514553 master-0 kubenswrapper[19170]: I0313 01:19:01.514489 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/738ebdcd-b78b-495a-b8f2-84af11a7d35c-serving-cert\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:19:01.522758 master-0 kubenswrapper[19170]: I0313 01:19:01.522721 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 13 01:19:01.529817 master-0 kubenswrapper[19170]: I0313 01:19:01.529771 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/738ebdcd-b78b-495a-b8f2-84af11a7d35c-etcd-serving-ca\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:19:01.542524 master-0 kubenswrapper[19170]: I0313 01:19:01.542466 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 13 01:19:01.552093 master-0 kubenswrapper[19170]: I0313 01:19:01.552044 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/738ebdcd-b78b-495a-b8f2-84af11a7d35c-encryption-config\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:19:01.562063 master-0 kubenswrapper[19170]: I0313 01:19:01.562009 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 13 01:19:01.571691 master-0 kubenswrapper[19170]: I0313 01:19:01.571615 19170 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/738ebdcd-b78b-495a-b8f2-84af11a7d35c-image-import-ca\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:19:01.582524 master-0 kubenswrapper[19170]: I0313 01:19:01.582487 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 13 01:19:01.583883 master-0 kubenswrapper[19170]: I0313 01:19:01.583839 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/738ebdcd-b78b-495a-b8f2-84af11a7d35c-config\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:19:01.609099 master-0 kubenswrapper[19170]: I0313 01:19:01.609066 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 13 01:19:01.615697 master-0 kubenswrapper[19170]: I0313 01:19:01.615669 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/738ebdcd-b78b-495a-b8f2-84af11a7d35c-trusted-ca-bundle\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:19:01.623327 master-0 kubenswrapper[19170]: I0313 01:19:01.623285 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 13 01:19:01.642538 master-0 kubenswrapper[19170]: I0313 01:19:01.642497 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt" Mar 13 01:19:01.669835 master-0 kubenswrapper[19170]: I0313 01:19:01.669760 19170 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle" Mar 13 01:19:01.682837 master-0 kubenswrapper[19170]: I0313 01:19:01.682776 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt" Mar 13 01:19:01.688796 master-0 kubenswrapper[19170]: I0313 01:19:01.688761 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3-ca-certs\") pod \"operator-controller-controller-manager-6598bfb6c4-2wh5w\" (UID: \"30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" Mar 13 01:19:01.702933 master-0 kubenswrapper[19170]: I0313 01:19:01.702861 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Mar 13 01:19:01.707444 master-0 kubenswrapper[19170]: I0313 01:19:01.707416 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/2c4c579b-0643-47ac-a729-017c326b0ecc-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-7fc8j\" (UID: \"2c4c579b-0643-47ac-a729-017c326b0ecc\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:19:01.722466 master-0 kubenswrapper[19170]: I0313 01:19:01.722429 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Mar 13 01:19:01.755194 master-0 kubenswrapper[19170]: I0313 01:19:01.755149 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle" Mar 13 01:19:01.763174 master-0 kubenswrapper[19170]: I0313 01:19:01.763130 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Mar 13 01:19:01.773617 
master-0 kubenswrapper[19170]: I0313 01:19:01.773578 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/2c4c579b-0643-47ac-a729-017c326b0ecc-ca-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-7fc8j\" (UID: \"2c4c579b-0643-47ac-a729-017c326b0ecc\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:19:01.783539 master-0 kubenswrapper[19170]: I0313 01:19:01.783494 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 13 01:19:01.789723 master-0 kubenswrapper[19170]: I0313 01:19:01.789446 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/57eb2020-1560-4352-8b86-76db59de933a-audit-policies\") pod \"apiserver-78885b775b-jrrjv\" (UID: \"57eb2020-1560-4352-8b86-76db59de933a\") " pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:19:01.803371 master-0 kubenswrapper[19170]: I0313 01:19:01.803318 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 13 01:19:01.807137 master-0 kubenswrapper[19170]: I0313 01:19:01.807105 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/57eb2020-1560-4352-8b86-76db59de933a-encryption-config\") pod \"apiserver-78885b775b-jrrjv\" (UID: \"57eb2020-1560-4352-8b86-76db59de933a\") " pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:19:01.822863 master-0 kubenswrapper[19170]: I0313 01:19:01.822812 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 13 01:19:01.830129 master-0 kubenswrapper[19170]: I0313 01:19:01.830089 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/57eb2020-1560-4352-8b86-76db59de933a-trusted-ca-bundle\") pod \"apiserver-78885b775b-jrrjv\" (UID: \"57eb2020-1560-4352-8b86-76db59de933a\") " pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:19:01.842951 master-0 kubenswrapper[19170]: I0313 01:19:01.842926 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 13 01:19:01.846089 master-0 kubenswrapper[19170]: I0313 01:19:01.845936 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/57eb2020-1560-4352-8b86-76db59de933a-etcd-client\") pod \"apiserver-78885b775b-jrrjv\" (UID: \"57eb2020-1560-4352-8b86-76db59de933a\") " pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:19:01.855194 master-0 kubenswrapper[19170]: I0313 01:19:01.854775 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 13 01:19:01.862380 master-0 kubenswrapper[19170]: I0313 01:19:01.862349 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 13 01:19:01.863668 master-0 kubenswrapper[19170]: I0313 01:19:01.863643 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 13 01:19:01.866058 master-0 kubenswrapper[19170]: I0313 01:19:01.866028 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/57eb2020-1560-4352-8b86-76db59de933a-etcd-serving-ca\") pod \"apiserver-78885b775b-jrrjv\" (UID: \"57eb2020-1560-4352-8b86-76db59de933a\") " pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:19:01.884064 master-0 kubenswrapper[19170]: I0313 01:19:01.884022 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 13 01:19:01.893126 master-0 kubenswrapper[19170]: I0313 01:19:01.893089 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57eb2020-1560-4352-8b86-76db59de933a-serving-cert\") pod \"apiserver-78885b775b-jrrjv\" (UID: \"57eb2020-1560-4352-8b86-76db59de933a\") " pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:19:01.902676 master-0 kubenswrapper[19170]: I0313 01:19:01.902623 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 13 01:19:01.922828 master-0 kubenswrapper[19170]: I0313 01:19:01.922770 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 13 01:19:01.942774 master-0 kubenswrapper[19170]: I0313 01:19:01.942499 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 13 01:19:01.956473 master-0 kubenswrapper[19170]: I0313 01:19:01.956393 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/690f916b-6f87-42d9-8168-392a9177bee9-var-lock\") pod \"690f916b-6f87-42d9-8168-392a9177bee9\" (UID: 
\"690f916b-6f87-42d9-8168-392a9177bee9\") " Mar 13 01:19:01.956473 master-0 kubenswrapper[19170]: I0313 01:19:01.956439 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/690f916b-6f87-42d9-8168-392a9177bee9-kubelet-dir\") pod \"690f916b-6f87-42d9-8168-392a9177bee9\" (UID: \"690f916b-6f87-42d9-8168-392a9177bee9\") " Mar 13 01:19:01.956748 master-0 kubenswrapper[19170]: I0313 01:19:01.956727 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/690f916b-6f87-42d9-8168-392a9177bee9-var-lock" (OuterVolumeSpecName: "var-lock") pod "690f916b-6f87-42d9-8168-392a9177bee9" (UID: "690f916b-6f87-42d9-8168-392a9177bee9"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:19:01.956986 master-0 kubenswrapper[19170]: I0313 01:19:01.956915 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/690f916b-6f87-42d9-8168-392a9177bee9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "690f916b-6f87-42d9-8168-392a9177bee9" (UID: "690f916b-6f87-42d9-8168-392a9177bee9"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:19:01.958805 master-0 kubenswrapper[19170]: I0313 01:19:01.958752 19170 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/690f916b-6f87-42d9-8168-392a9177bee9-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 01:19:01.958805 master-0 kubenswrapper[19170]: I0313 01:19:01.958793 19170 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/690f916b-6f87-42d9-8168-392a9177bee9-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:19:01.961669 master-0 kubenswrapper[19170]: I0313 01:19:01.961650 19170 request.go:700] Waited for 1.007468996s due to client-side throttling, not priority and fairness, request: GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-dns/secrets?fieldSelector=metadata.name%3Ddns-default-metrics-tls&limit=500&resourceVersion=0 Mar 13 01:19:01.962821 master-0 kubenswrapper[19170]: I0313 01:19:01.962803 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 13 01:19:01.968989 master-0 kubenswrapper[19170]: I0313 01:19:01.968957 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/58405741-598c-4bf5-bbc8-1ca8e3f10995-metrics-tls\") pod \"dns-default-26mfw\" (UID: \"58405741-598c-4bf5-bbc8-1ca8e3f10995\") " pod="openshift-dns/dns-default-26mfw" Mar 13 01:19:01.985604 master-0 kubenswrapper[19170]: I0313 01:19:01.984879 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 13 01:19:01.993934 master-0 kubenswrapper[19170]: I0313 01:19:01.993877 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/58405741-598c-4bf5-bbc8-1ca8e3f10995-config-volume\") pod \"dns-default-26mfw\" (UID: 
\"58405741-598c-4bf5-bbc8-1ca8e3f10995\") " pod="openshift-dns/dns-default-26mfw" Mar 13 01:19:02.003608 master-0 kubenswrapper[19170]: I0313 01:19:02.003569 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 13 01:19:02.022292 master-0 kubenswrapper[19170]: I0313 01:19:02.022242 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 13 01:19:02.027530 master-0 kubenswrapper[19170]: I0313 01:19:02.027330 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0671fdd0-b358-40f9-ae49-2c5a9004edb3-stats-auth\") pod \"router-default-79f8cd6fdd-cnrhm\" (UID: \"0671fdd0-b358-40f9-ae49-2c5a9004edb3\") " pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" Mar 13 01:19:02.042745 master-0 kubenswrapper[19170]: I0313 01:19:02.042693 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 13 01:19:02.046603 master-0 kubenswrapper[19170]: I0313 01:19:02.046558 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0671fdd0-b358-40f9-ae49-2c5a9004edb3-metrics-certs\") pod \"router-default-79f8cd6fdd-cnrhm\" (UID: \"0671fdd0-b358-40f9-ae49-2c5a9004edb3\") " pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" Mar 13 01:19:02.062245 master-0 kubenswrapper[19170]: I0313 01:19:02.062204 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 13 01:19:02.082596 master-0 kubenswrapper[19170]: I0313 01:19:02.082552 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 13 01:19:02.092072 master-0 kubenswrapper[19170]: I0313 01:19:02.092027 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" 
(UniqueName: \"kubernetes.io/secret/0671fdd0-b358-40f9-ae49-2c5a9004edb3-default-certificate\") pod \"router-default-79f8cd6fdd-cnrhm\" (UID: \"0671fdd0-b358-40f9-ae49-2c5a9004edb3\") " pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" Mar 13 01:19:02.102549 master-0 kubenswrapper[19170]: I0313 01:19:02.102515 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 13 01:19:02.116930 master-0 kubenswrapper[19170]: E0313 01:19:02.116875 19170 configmap.go:193] Couldn't get configMap openshift-cloud-controller-manager-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Mar 13 01:19:02.117103 master-0 kubenswrapper[19170]: E0313 01:19:02.116999 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-auth-proxy-config podName:b3a9c0f6-cfde-4ae8-952a-00e2fb862482 nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.616980862 +0000 UTC m=+3.425101822 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-auth-proxy-config") pod "cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" (UID: "b3a9c0f6-cfde-4ae8-952a-00e2fb862482") : failed to sync configmap cache: timed out waiting for the condition Mar 13 01:19:02.117276 master-0 kubenswrapper[19170]: E0313 01:19:02.117240 19170 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.117372 master-0 kubenswrapper[19170]: E0313 01:19:02.117282 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d313ee4-3bb9-44a9-ad80-8e00540ef1e7-tls-certificates podName:4d313ee4-3bb9-44a9-ad80-8e00540ef1e7 nodeName:}" failed. 
No retries permitted until 2026-03-13 01:19:02.61727409 +0000 UTC m=+3.425395050 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/4d313ee4-3bb9-44a9-ad80-8e00540ef1e7-tls-certificates") pod "prometheus-operator-admission-webhook-8464df8497-kjwgg" (UID: "4d313ee4-3bb9-44a9-ad80-8e00540ef1e7") : failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.118160 master-0 kubenswrapper[19170]: E0313 01:19:02.118110 19170 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/machine-approver-config: failed to sync configmap cache: timed out waiting for the condition Mar 13 01:19:02.118160 master-0 kubenswrapper[19170]: E0313 01:19:02.118149 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/33dfdc31-54a4-4249-99ae-a15180514659-config podName:33dfdc31-54a4-4249-99ae-a15180514659 nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.618141093 +0000 UTC m=+3.426262053 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/33dfdc31-54a4-4249-99ae-a15180514659-config") pod "machine-approver-754bdc9f9d-knlw8" (UID: "33dfdc31-54a4-4249-99ae-a15180514659") : failed to sync configmap cache: timed out waiting for the condition Mar 13 01:19:02.118349 master-0 kubenswrapper[19170]: E0313 01:19:02.118180 19170 secret.go:189] Couldn't get secret openshift-cluster-machine-approver/machine-approver-tls: failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.118349 master-0 kubenswrapper[19170]: E0313 01:19:02.118204 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33dfdc31-54a4-4249-99ae-a15180514659-machine-approver-tls podName:33dfdc31-54a4-4249-99ae-a15180514659 nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.618197754 +0000 UTC m=+3.426318714 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "machine-approver-tls" (UniqueName: "kubernetes.io/secret/33dfdc31-54a4-4249-99ae-a15180514659-machine-approver-tls") pod "machine-approver-754bdc9f9d-knlw8" (UID: "33dfdc31-54a4-4249-99ae-a15180514659") : failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.118349 master-0 kubenswrapper[19170]: E0313 01:19:02.118223 19170 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition Mar 13 01:19:02.118349 master-0 kubenswrapper[19170]: E0313 01:19:02.118233 19170 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.118349 master-0 kubenswrapper[19170]: E0313 01:19:02.118255 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30f7537e-93ed-466b-ba24-78141d004b2f-kube-state-metrics-tls podName:30f7537e-93ed-466b-ba24-78141d004b2f nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.618249935 +0000 UTC m=+3.426370895 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/30f7537e-93ed-466b-ba24-78141d004b2f-kube-state-metrics-tls") pod "kube-state-metrics-68b88f8cb5-plwwd" (UID: "30f7537e-93ed-466b-ba24-78141d004b2f") : failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.118349 master-0 kubenswrapper[19170]: E0313 01:19:02.118280 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/631f5719-2083-4c99-92cb-2ddc04022d86-client-ca podName:631f5719-2083-4c99-92cb-2ddc04022d86 nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.618262926 +0000 UTC m=+3.426383886 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/631f5719-2083-4c99-92cb-2ddc04022d86-client-ca") pod "controller-manager-757fb68448-cj9p5" (UID: "631f5719-2083-4c99-92cb-2ddc04022d86") : failed to sync configmap cache: timed out waiting for the condition Mar 13 01:19:02.119454 master-0 kubenswrapper[19170]: E0313 01:19:02.119416 19170 secret.go:189] Couldn't get secret openshift-insights/openshift-insights-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.119454 master-0 kubenswrapper[19170]: E0313 01:19:02.119423 19170 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Mar 13 01:19:02.119454 master-0 kubenswrapper[19170]: E0313 01:19:02.119451 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77fd9062-0f7d-4255-92ca-7e4325daeddd-serving-cert podName:77fd9062-0f7d-4255-92ca-7e4325daeddd nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.619442677 +0000 UTC m=+3.427563627 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/77fd9062-0f7d-4255-92ca-7e4325daeddd-serving-cert") pod "insights-operator-8f89dfddd-6k2t7" (UID: "77fd9062-0f7d-4255-92ca-7e4325daeddd") : failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.119757 master-0 kubenswrapper[19170]: E0313 01:19:02.119468 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a9462e2e-728d-4076-a876-31dbbd637581-config podName:a9462e2e-728d-4076-a876-31dbbd637581 nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.619460637 +0000 UTC m=+3.427581597 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/a9462e2e-728d-4076-a876-31dbbd637581-config") pod "route-controller-manager-5dc55b5d9c-nlg6m" (UID: "a9462e2e-728d-4076-a876-31dbbd637581") : failed to sync configmap cache: timed out waiting for the condition Mar 13 01:19:02.119757 master-0 kubenswrapper[19170]: E0313 01:19:02.119497 19170 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.119757 master-0 kubenswrapper[19170]: E0313 01:19:02.119522 19170 configmap.go:193] Couldn't get configMap openshift-cloud-credential-operator/cco-trusted-ca: failed to sync configmap cache: timed out waiting for the condition Mar 13 01:19:02.119757 master-0 kubenswrapper[19170]: E0313 01:19:02.119750 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2b5ad07-fa01-4330-9dce-6da3444657ab-prometheus-operator-kube-rbac-proxy-config podName:e2b5ad07-fa01-4330-9dce-6da3444657ab nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.619682953 +0000 UTC m=+3.427803923 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/e2b5ad07-fa01-4330-9dce-6da3444657ab-prometheus-operator-kube-rbac-proxy-config") pod "prometheus-operator-5ff8674d55-6fh8b" (UID: "e2b5ad07-fa01-4330-9dce-6da3444657ab") : failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.120013 master-0 kubenswrapper[19170]: E0313 01:19:02.119779 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/90e6e63d-3cf2-4bb5-883f-6219a0b52c3a-cco-trusted-ca podName:90e6e63d-3cf2-4bb5-883f-6219a0b52c3a nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.619768655 +0000 UTC m=+3.427889625 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cco-trusted-ca" (UniqueName: "kubernetes.io/configmap/90e6e63d-3cf2-4bb5-883f-6219a0b52c3a-cco-trusted-ca") pod "cloud-credential-operator-55d85b7b47-qslvf" (UID: "90e6e63d-3cf2-4bb5-883f-6219a0b52c3a") : failed to sync configmap cache: timed out waiting for the condition Mar 13 01:19:02.121789 master-0 kubenswrapper[19170]: E0313 01:19:02.121748 19170 secret.go:189] Couldn't get secret openshift-machine-api/control-plane-machine-set-operator-tls: failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.121897 master-0 kubenswrapper[19170]: E0313 01:19:02.121804 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cada5bf2-e208-4fd8-bdf5-de8cad31a665-control-plane-machine-set-operator-tls podName:cada5bf2-e208-4fd8-bdf5-de8cad31a665 nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.621790719 +0000 UTC m=+3.429911779 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "control-plane-machine-set-operator-tls" (UniqueName: "kubernetes.io/secret/cada5bf2-e208-4fd8-bdf5-de8cad31a665-control-plane-machine-set-operator-tls") pod "control-plane-machine-set-operator-6686554ddc-w6qs7" (UID: "cada5bf2-e208-4fd8-bdf5-de8cad31a665") : failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.121897 master-0 kubenswrapper[19170]: E0313 01:19:02.121826 19170 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.121897 master-0 kubenswrapper[19170]: E0313 01:19:02.121850 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebf338e6-9725-47d9-8c7f-adbf11a44406-samples-operator-tls podName:ebf338e6-9725-47d9-8c7f-adbf11a44406 nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.62184405 +0000 UTC m=+3.429965010 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ebf338e6-9725-47d9-8c7f-adbf11a44406-samples-operator-tls") pod "cluster-samples-operator-664cb58b85-2xfpz" (UID: "ebf338e6-9725-47d9-8c7f-adbf11a44406") : failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.122114 master-0 kubenswrapper[19170]: E0313 01:19:02.121926 19170 configmap.go:193] Couldn't get configMap openshift-cloud-controller-manager-operator/cloud-controller-manager-images: failed to sync configmap cache: timed out waiting for the condition Mar 13 01:19:02.122114 master-0 kubenswrapper[19170]: E0313 01:19:02.121950 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-images podName:b3a9c0f6-cfde-4ae8-952a-00e2fb862482 nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.621943253 +0000 UTC m=+3.430064213 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-images") pod "cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" (UID: "b3a9c0f6-cfde-4ae8-952a-00e2fb862482") : failed to sync configmap cache: timed out waiting for the condition Mar 13 01:19:02.122114 master-0 kubenswrapper[19170]: E0313 01:19:02.121970 19170 secret.go:189] Couldn't get secret openshift-machine-config-operator/proxy-tls: failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.122114 master-0 kubenswrapper[19170]: E0313 01:19:02.121999 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5956ebf-01e4-4d4c-ae6d-b0995905c6d3-proxy-tls podName:e5956ebf-01e4-4d4c-ae6d-b0995905c6d3 nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.621991064 +0000 UTC m=+3.430112144 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/e5956ebf-01e4-4d4c-ae6d-b0995905c6d3-proxy-tls") pod "machine-config-daemon-pmkpj" (UID: "e5956ebf-01e4-4d4c-ae6d-b0995905c6d3") : failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.122114 master-0 kubenswrapper[19170]: E0313 01:19:02.122044 19170 configmap.go:193] Couldn't get configMap openshift-insights/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 13 01:19:02.122114 master-0 kubenswrapper[19170]: E0313 01:19:02.122080 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/77fd9062-0f7d-4255-92ca-7e4325daeddd-trusted-ca-bundle podName:77fd9062-0f7d-4255-92ca-7e4325daeddd nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.622069836 +0000 UTC m=+3.430190906 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/77fd9062-0f7d-4255-92ca-7e4325daeddd-trusted-ca-bundle") pod "insights-operator-8f89dfddd-6k2t7" (UID: "77fd9062-0f7d-4255-92ca-7e4325daeddd") : failed to sync configmap cache: timed out waiting for the condition Mar 13 01:19:02.122114 master-0 kubenswrapper[19170]: E0313 01:19:02.122091 19170 configmap.go:193] Couldn't get configMap openshift-multus/cni-sysctl-allowlist: failed to sync configmap cache: timed out waiting for the condition Mar 13 01:19:02.122114 master-0 kubenswrapper[19170]: E0313 01:19:02.122120 19170 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-server-audit-profiles: failed to sync configmap cache: timed out waiting for the condition Mar 13 01:19:02.122577 master-0 kubenswrapper[19170]: E0313 01:19:02.122155 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ac2a4c90-32db-4464-8c47-acbcafbcd5d0-cni-sysctl-allowlist podName:ac2a4c90-32db-4464-8c47-acbcafbcd5d0 nodeName:}" failed. 
No retries permitted until 2026-03-13 01:19:02.622135478 +0000 UTC m=+3.430256478 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cni-sysctl-allowlist" (UniqueName: "kubernetes.io/configmap/ac2a4c90-32db-4464-8c47-acbcafbcd5d0-cni-sysctl-allowlist") pod "cni-sysctl-allowlist-ds-hdx2d" (UID: "ac2a4c90-32db-4464-8c47-acbcafbcd5d0") : failed to sync configmap cache: timed out waiting for the condition Mar 13 01:19:02.122577 master-0 kubenswrapper[19170]: E0313 01:19:02.122170 19170 configmap.go:193] Couldn't get configMap openshift-ingress/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 13 01:19:02.122577 master-0 kubenswrapper[19170]: E0313 01:19:02.122180 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-metrics-server-audit-profiles podName:9db888f0-51b6-43cf-8337-69d2d5cc2b0a nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.622168579 +0000 UTC m=+3.430289569 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-server-audit-profiles" (UniqueName: "kubernetes.io/configmap/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-metrics-server-audit-profiles") pod "metrics-server-5575f756f4-hqr5q" (UID: "9db888f0-51b6-43cf-8337-69d2d5cc2b0a") : failed to sync configmap cache: timed out waiting for the condition Mar 13 01:19:02.122577 master-0 kubenswrapper[19170]: E0313 01:19:02.122199 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0671fdd0-b358-40f9-ae49-2c5a9004edb3-service-ca-bundle podName:0671fdd0-b358-40f9-ae49-2c5a9004edb3 nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.622191159 +0000 UTC m=+3.430312229 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/0671fdd0-b358-40f9-ae49-2c5a9004edb3-service-ca-bundle") pod "router-default-79f8cd6fdd-cnrhm" (UID: "0671fdd0-b358-40f9-ae49-2c5a9004edb3") : failed to sync configmap cache: timed out waiting for the condition Mar 13 01:19:02.122577 master-0 kubenswrapper[19170]: E0313 01:19:02.122218 19170 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-tls: failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.122577 master-0 kubenswrapper[19170]: E0313 01:19:02.122228 19170 configmap.go:193] Couldn't get configMap openshift-cluster-version/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 13 01:19:02.122577 master-0 kubenswrapper[19170]: E0313 01:19:02.122257 19170 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.122577 master-0 kubenswrapper[19170]: E0313 01:19:02.122259 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-secret-metrics-server-tls podName:9db888f0-51b6-43cf-8337-69d2d5cc2b0a nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.622247901 +0000 UTC m=+3.430368891 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-metrics-server-tls" (UniqueName: "kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-secret-metrics-server-tls") pod "metrics-server-5575f756f4-hqr5q" (UID: "9db888f0-51b6-43cf-8337-69d2d5cc2b0a") : failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.122577 master-0 kubenswrapper[19170]: E0313 01:19:02.122295 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/85149f21-7ba8-4891-82ef-0fef3d5d7863-service-ca podName:85149f21-7ba8-4891-82ef-0fef3d5d7863 nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.622286252 +0000 UTC m=+3.430407352 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca" (UniqueName: "kubernetes.io/configmap/85149f21-7ba8-4891-82ef-0fef3d5d7863-service-ca") pod "cluster-version-operator-8c9c967c7-4d6fw" (UID: "85149f21-7ba8-4891-82ef-0fef3d5d7863") : failed to sync configmap cache: timed out waiting for the condition Mar 13 01:19:02.122577 master-0 kubenswrapper[19170]: E0313 01:19:02.122308 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd7cca05-3da7-42cf-af64-6e94050e58c0-openshift-state-metrics-kube-rbac-proxy-config podName:cd7cca05-3da7-42cf-af64-6e94050e58c0 nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.622301242 +0000 UTC m=+3.430422332 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/cd7cca05-3da7-42cf-af64-6e94050e58c0-openshift-state-metrics-kube-rbac-proxy-config") pod "openshift-state-metrics-74cc79fd76-6btfg" (UID: "cd7cca05-3da7-42cf-af64-6e94050e58c0") : failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.123575 master-0 kubenswrapper[19170]: E0313 01:19:02.123523 19170 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition Mar 13 01:19:02.123706 master-0 kubenswrapper[19170]: E0313 01:19:02.123595 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-metrics-client-ca podName:04470d64-c6eb-4a62-ae75-2a1d3dfdd53a nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.623575946 +0000 UTC m=+3.431696946 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-metrics-client-ca") pod "node-exporter-2hgwj" (UID: "04470d64-c6eb-4a62-ae75-2a1d3dfdd53a") : failed to sync configmap cache: timed out waiting for the condition Mar 13 01:19:02.123706 master-0 kubenswrapper[19170]: E0313 01:19:02.123669 19170 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.123839 master-0 kubenswrapper[19170]: E0313 01:19:02.123712 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd7cca05-3da7-42cf-af64-6e94050e58c0-openshift-state-metrics-tls podName:cd7cca05-3da7-42cf-af64-6e94050e58c0 nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.623699939 +0000 UTC m=+3.431820929 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/cd7cca05-3da7-42cf-af64-6e94050e58c0-openshift-state-metrics-tls") pod "openshift-state-metrics-74cc79fd76-6btfg" (UID: "cd7cca05-3da7-42cf-af64-6e94050e58c0") : failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.123910 master-0 kubenswrapper[19170]: E0313 01:19:02.123847 19170 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.123910 master-0 kubenswrapper[19170]: E0313 01:19:02.123893 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85149f21-7ba8-4891-82ef-0fef3d5d7863-serving-cert podName:85149f21-7ba8-4891-82ef-0fef3d5d7863 nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.623881904 +0000 UTC m=+3.432002894 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/85149f21-7ba8-4891-82ef-0fef3d5d7863-serving-cert") pod "cluster-version-operator-8c9c967c7-4d6fw" (UID: "85149f21-7ba8-4891-82ef-0fef3d5d7863") : failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.124572 master-0 kubenswrapper[19170]: E0313 01:19:02.124528 19170 secret.go:189] Couldn't get secret openshift-machine-api/machine-api-operator-tls: failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.124711 master-0 kubenswrapper[19170]: E0313 01:19:02.124588 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c24e17c-8bd9-4c23-9876-6f31c9da5cd1-machine-api-operator-tls podName:1c24e17c-8bd9-4c23-9876-6f31c9da5cd1 nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.624576992 +0000 UTC m=+3.432697952 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "machine-api-operator-tls" (UniqueName: "kubernetes.io/secret/1c24e17c-8bd9-4c23-9876-6f31c9da5cd1-machine-api-operator-tls") pod "machine-api-operator-84bf6db4f9-zt229" (UID: "1c24e17c-8bd9-4c23-9876-6f31c9da5cd1") : failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.124711 master-0 kubenswrapper[19170]: E0313 01:19:02.124592 19170 configmap.go:193] Couldn't get configMap openshift-monitoring/kube-state-metrics-custom-resource-state-configmap: failed to sync configmap cache: timed out waiting for the condition Mar 13 01:19:02.124711 master-0 kubenswrapper[19170]: E0313 01:19:02.124617 19170 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.124711 master-0 kubenswrapper[19170]: E0313 01:19:02.124668 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/30f7537e-93ed-466b-ba24-78141d004b2f-kube-state-metrics-custom-resource-state-configmap podName:30f7537e-93ed-466b-ba24-78141d004b2f nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.624652854 +0000 UTC m=+3.432773844 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-custom-resource-state-configmap" (UniqueName: "kubernetes.io/configmap/30f7537e-93ed-466b-ba24-78141d004b2f-kube-state-metrics-custom-resource-state-configmap") pod "kube-state-metrics-68b88f8cb5-plwwd" (UID: "30f7537e-93ed-466b-ba24-78141d004b2f") : failed to sync configmap cache: timed out waiting for the condition Mar 13 01:19:02.124711 master-0 kubenswrapper[19170]: E0313 01:19:02.124691 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2679c6e1-11c1-450c-b03a-30d7ee59ff6f-webhook-certs podName:2679c6e1-11c1-450c-b03a-30d7ee59ff6f nodeName:}" failed. 
No retries permitted until 2026-03-13 01:19:02.624682265 +0000 UTC m=+3.432803225 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2679c6e1-11c1-450c-b03a-30d7ee59ff6f-webhook-certs") pod "multus-admission-controller-7769569c45-zm2jl" (UID: "2679c6e1-11c1-450c-b03a-30d7ee59ff6f") : failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.125048 master-0 kubenswrapper[19170]: E0313 01:19:02.125025 19170 secret.go:189] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.125110 master-0 kubenswrapper[19170]: E0313 01:19:02.125075 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edda0d03-fdb2-4130-8f73-8057efd5815c-node-bootstrap-token podName:edda0d03-fdb2-4130-8f73-8057efd5815c nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.625058344 +0000 UTC m=+3.433179304 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/edda0d03-fdb2-4130-8f73-8057efd5815c-node-bootstrap-token") pod "machine-config-server-4gpcz" (UID: "edda0d03-fdb2-4130-8f73-8057efd5815c") : failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.126023 master-0 kubenswrapper[19170]: E0313 01:19:02.125962 19170 secret.go:189] Couldn't get secret openshift-machine-config-operator/mcc-proxy-tls: failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.126023 master-0 kubenswrapper[19170]: E0313 01:19:02.126004 19170 secret.go:189] Couldn't get secret openshift-machine-api/cluster-autoscaler-operator-cert: failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.126171 master-0 kubenswrapper[19170]: E0313 01:19:02.126042 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93871019-3d0c-4081-9afe-19b6dd108ec6-proxy-tls podName:93871019-3d0c-4081-9afe-19b6dd108ec6 nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.62602453 +0000 UTC m=+3.434145490 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/93871019-3d0c-4081-9afe-19b6dd108ec6-proxy-tls") pod "machine-config-controller-ff46b7bdf-g7wfh" (UID: "93871019-3d0c-4081-9afe-19b6dd108ec6") : failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.126171 master-0 kubenswrapper[19170]: E0313 01:19:02.126072 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d278ed70-786c-4b6c-9f04-f08ede704569-cert podName:d278ed70-786c-4b6c-9f04-f08ede704569 nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.626056131 +0000 UTC m=+3.434177121 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d278ed70-786c-4b6c-9f04-f08ede704569-cert") pod "cluster-autoscaler-operator-69576476f7-2q4qb" (UID: "d278ed70-786c-4b6c-9f04-f08ede704569") : failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.126171 master-0 kubenswrapper[19170]: E0313 01:19:02.126135 19170 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.126171 master-0 kubenswrapper[19170]: E0313 01:19:02.126173 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-node-exporter-tls podName:04470d64-c6eb-4a62-ae75-2a1d3dfdd53a nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.626165064 +0000 UTC m=+3.434286024 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-node-exporter-tls") pod "node-exporter-2hgwj" (UID: "04470d64-c6eb-4a62-ae75-2a1d3dfdd53a") : failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.126429 master-0 kubenswrapper[19170]: E0313 01:19:02.126183 19170 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.126429 master-0 kubenswrapper[19170]: E0313 01:19:02.126211 19170 configmap.go:193] Couldn't get configMap openshift-insights/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 13 01:19:02.126429 master-0 kubenswrapper[19170]: E0313 01:19:02.126227 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/631f5719-2083-4c99-92cb-2ddc04022d86-serving-cert podName:631f5719-2083-4c99-92cb-2ddc04022d86 nodeName:}" failed. 
No retries permitted until 2026-03-13 01:19:02.626215325 +0000 UTC m=+3.434336325 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/631f5719-2083-4c99-92cb-2ddc04022d86-serving-cert") pod "controller-manager-757fb68448-cj9p5" (UID: "631f5719-2083-4c99-92cb-2ddc04022d86") : failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.126429 master-0 kubenswrapper[19170]: E0313 01:19:02.126250 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/77fd9062-0f7d-4255-92ca-7e4325daeddd-service-ca-bundle podName:77fd9062-0f7d-4255-92ca-7e4325daeddd nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.626240706 +0000 UTC m=+3.434361696 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/77fd9062-0f7d-4255-92ca-7e4325daeddd-service-ca-bundle") pod "insights-operator-8f89dfddd-6k2t7" (UID: "77fd9062-0f7d-4255-92ca-7e4325daeddd") : failed to sync configmap cache: timed out waiting for the condition Mar 13 01:19:02.126839 master-0 kubenswrapper[19170]: E0313 01:19:02.126491 19170 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition Mar 13 01:19:02.126839 master-0 kubenswrapper[19170]: E0313 01:19:02.126505 19170 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-b1qe6h41gh39q: failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.126839 master-0 kubenswrapper[19170]: E0313 01:19:02.126531 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cd7cca05-3da7-42cf-af64-6e94050e58c0-metrics-client-ca podName:cd7cca05-3da7-42cf-af64-6e94050e58c0 nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.626519843 +0000 UTC m=+3.434640843 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/cd7cca05-3da7-42cf-af64-6e94050e58c0-metrics-client-ca") pod "openshift-state-metrics-74cc79fd76-6btfg" (UID: "cd7cca05-3da7-42cf-af64-6e94050e58c0") : failed to sync configmap cache: timed out waiting for the condition Mar 13 01:19:02.126839 master-0 kubenswrapper[19170]: E0313 01:19:02.126593 19170 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.126839 master-0 kubenswrapper[19170]: E0313 01:19:02.126624 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ce47660-f7cc-4669-a00d-83422f0f6d55-webhook-cert podName:2ce47660-f7cc-4669-a00d-83422f0f6d55 nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.626613955 +0000 UTC m=+3.434734915 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/2ce47660-f7cc-4669-a00d-83422f0f6d55-webhook-cert") pod "packageserver-68f6795949-v9w8g" (UID: "2ce47660-f7cc-4669-a00d-83422f0f6d55") : failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.126839 master-0 kubenswrapper[19170]: E0313 01:19:02.126657 19170 secret.go:189] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.126839 master-0 kubenswrapper[19170]: E0313 01:19:02.126671 19170 secret.go:189] Couldn't get secret openshift-cloud-credential-operator/cloud-credential-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.126839 master-0 kubenswrapper[19170]: E0313 01:19:02.126695 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edda0d03-fdb2-4130-8f73-8057efd5815c-certs 
podName:edda0d03-fdb2-4130-8f73-8057efd5815c nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.626684347 +0000 UTC m=+3.434805347 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/edda0d03-fdb2-4130-8f73-8057efd5815c-certs") pod "machine-config-server-4gpcz" (UID: "edda0d03-fdb2-4130-8f73-8057efd5815c") : failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.126839 master-0 kubenswrapper[19170]: E0313 01:19:02.126719 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-client-ca-bundle podName:9db888f0-51b6-43cf-8337-69d2d5cc2b0a nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.626709728 +0000 UTC m=+3.434830718 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca-bundle" (UniqueName: "kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-client-ca-bundle") pod "metrics-server-5575f756f4-hqr5q" (UID: "9db888f0-51b6-43cf-8337-69d2d5cc2b0a") : failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.126839 master-0 kubenswrapper[19170]: E0313 01:19:02.126740 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90e6e63d-3cf2-4bb5-883f-6219a0b52c3a-cloud-credential-operator-serving-cert podName:90e6e63d-3cf2-4bb5-883f-6219a0b52c3a nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.626730358 +0000 UTC m=+3.434851358 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cloud-credential-operator-serving-cert" (UniqueName: "kubernetes.io/secret/90e6e63d-3cf2-4bb5-883f-6219a0b52c3a-cloud-credential-operator-serving-cert") pod "cloud-credential-operator-55d85b7b47-qslvf" (UID: "90e6e63d-3cf2-4bb5-883f-6219a0b52c3a") : failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.127443 master-0 kubenswrapper[19170]: E0313 01:19:02.127380 19170 secret.go:189] Couldn't get secret openshift-cloud-controller-manager-operator/cloud-controller-manager-operator-tls: failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.127443 master-0 kubenswrapper[19170]: E0313 01:19:02.127427 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-cloud-controller-manager-operator-tls podName:b3a9c0f6-cfde-4ae8-952a-00e2fb862482 nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.627416766 +0000 UTC m=+3.435537726 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cloud-controller-manager-operator-tls" (UniqueName: "kubernetes.io/secret/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-cloud-controller-manager-operator-tls") pod "cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" (UID: "b3a9c0f6-cfde-4ae8-952a-00e2fb862482") : failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.127565 master-0 kubenswrapper[19170]: E0313 01:19:02.127477 19170 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: failed to sync configmap cache: timed out waiting for the condition Mar 13 01:19:02.127565 master-0 kubenswrapper[19170]: E0313 01:19:02.127529 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/631f5719-2083-4c99-92cb-2ddc04022d86-proxy-ca-bundles podName:631f5719-2083-4c99-92cb-2ddc04022d86 nodeName:}" failed. 
No retries permitted until 2026-03-13 01:19:02.627518029 +0000 UTC m=+3.435638989 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/631f5719-2083-4c99-92cb-2ddc04022d86-proxy-ca-bundles") pod "controller-manager-757fb68448-cj9p5" (UID: "631f5719-2083-4c99-92cb-2ddc04022d86") : failed to sync configmap cache: timed out waiting for the condition Mar 13 01:19:02.127732 master-0 kubenswrapper[19170]: E0313 01:19:02.127541 19170 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.127898 master-0 kubenswrapper[19170]: E0313 01:19:02.127858 19170 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition Mar 13 01:19:02.127898 master-0 kubenswrapper[19170]: E0313 01:19:02.127879 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30f7537e-93ed-466b-ba24-78141d004b2f-kube-state-metrics-kube-rbac-proxy-config podName:30f7537e-93ed-466b-ba24-78141d004b2f nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.627828277 +0000 UTC m=+3.435949267 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/30f7537e-93ed-466b-ba24-78141d004b2f-kube-state-metrics-kube-rbac-proxy-config") pod "kube-state-metrics-68b88f8cb5-plwwd" (UID: "30f7537e-93ed-466b-ba24-78141d004b2f") : failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.127898 master-0 kubenswrapper[19170]: E0313 01:19:02.127903 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e2b5ad07-fa01-4330-9dce-6da3444657ab-metrics-client-ca podName:e2b5ad07-fa01-4330-9dce-6da3444657ab nodeName:}" failed. 
No retries permitted until 2026-03-13 01:19:02.627894769 +0000 UTC m=+3.436015729 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/e2b5ad07-fa01-4330-9dce-6da3444657ab-metrics-client-ca") pod "prometheus-operator-5ff8674d55-6fh8b" (UID: "e2b5ad07-fa01-4330-9dce-6da3444657ab") : failed to sync configmap cache: timed out waiting for the condition Mar 13 01:19:02.128148 master-0 kubenswrapper[19170]: E0313 01:19:02.128120 19170 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition Mar 13 01:19:02.128248 master-0 kubenswrapper[19170]: E0313 01:19:02.128154 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/30f7537e-93ed-466b-ba24-78141d004b2f-metrics-client-ca podName:30f7537e-93ed-466b-ba24-78141d004b2f nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.628147356 +0000 UTC m=+3.436268316 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/30f7537e-93ed-466b-ba24-78141d004b2f-metrics-client-ca") pod "kube-state-metrics-68b88f8cb5-plwwd" (UID: "30f7537e-93ed-466b-ba24-78141d004b2f") : failed to sync configmap cache: timed out waiting for the condition Mar 13 01:19:02.129891 master-0 kubenswrapper[19170]: E0313 01:19:02.129840 19170 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.129891 master-0 kubenswrapper[19170]: E0313 01:19:02.129859 19170 configmap.go:193] Couldn't get configMap openshift-machine-api/machine-api-operator-images: failed to sync configmap cache: timed out waiting for the condition Mar 13 01:19:02.129891 master-0 kubenswrapper[19170]: E0313 01:19:02.129879 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9462e2e-728d-4076-a876-31dbbd637581-serving-cert podName:a9462e2e-728d-4076-a876-31dbbd637581 nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.629870381 +0000 UTC m=+3.437991341 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/a9462e2e-728d-4076-a876-31dbbd637581-serving-cert") pod "route-controller-manager-5dc55b5d9c-nlg6m" (UID: "a9462e2e-728d-4076-a876-31dbbd637581") : failed to sync secret cache: timed out waiting for the condition Mar 13 01:19:02.129891 master-0 kubenswrapper[19170]: E0313 01:19:02.129903 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1c24e17c-8bd9-4c23-9876-6f31c9da5cd1-images podName:1c24e17c-8bd9-4c23-9876-6f31c9da5cd1 nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.629894872 +0000 UTC m=+3.438015832 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/1c24e17c-8bd9-4c23-9876-6f31c9da5cd1-images") pod "machine-api-operator-84bf6db4f9-zt229" (UID: "1c24e17c-8bd9-4c23-9876-6f31c9da5cd1") : failed to sync configmap cache: timed out waiting for the condition
Mar 13 01:19:02.130813 master-0 kubenswrapper[19170]: E0313 01:19:02.129931 19170 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 13 01:19:02.130813 master-0 kubenswrapper[19170]: E0313 01:19:02.129948 19170 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: failed to sync secret cache: timed out waiting for the condition
Mar 13 01:19:02.130813 master-0 kubenswrapper[19170]: E0313 01:19:02.129930 19170 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition
Mar 13 01:19:02.130813 master-0 kubenswrapper[19170]: E0313 01:19:02.129981 19170 secret.go:189] Couldn't get secret openshift-monitoring/metrics-client-certs: failed to sync secret cache: timed out waiting for the condition
Mar 13 01:19:02.130813 master-0 kubenswrapper[19170]: E0313 01:19:02.130009 19170 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition
Mar 13 01:19:02.130813 master-0 kubenswrapper[19170]: E0313 01:19:02.130011 19170 configmap.go:193] Couldn't get configMap openshift-monitoring/kubelet-serving-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Mar 13 01:19:02.130813 master-0 kubenswrapper[19170]: E0313 01:19:02.130035 19170 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition
Mar 13 01:19:02.130813 master-0 kubenswrapper[19170]: E0313 01:19:02.130044 19170 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition
Mar 13 01:19:02.130813 master-0 kubenswrapper[19170]: E0313 01:19:02.130051 19170 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy-cluster-autoscaler-operator: failed to sync configmap cache: timed out waiting for the condition
Mar 13 01:19:02.130813 master-0 kubenswrapper[19170]: E0313 01:19:02.129960 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a9462e2e-728d-4076-a876-31dbbd637581-client-ca podName:a9462e2e-728d-4076-a876-31dbbd637581 nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.629951523 +0000 UTC m=+3.438072603 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/a9462e2e-728d-4076-a876-31dbbd637581-client-ca") pod "route-controller-manager-5dc55b5d9c-nlg6m" (UID: "a9462e2e-728d-4076-a876-31dbbd637581") : failed to sync configmap cache: timed out waiting for the condition
Mar 13 01:19:02.130813 master-0 kubenswrapper[19170]: E0313 01:19:02.130075 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2b5ad07-fa01-4330-9dce-6da3444657ab-prometheus-operator-tls podName:e2b5ad07-fa01-4330-9dce-6da3444657ab nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.630067366 +0000 UTC m=+3.438188316 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/e2b5ad07-fa01-4330-9dce-6da3444657ab-prometheus-operator-tls") pod "prometheus-operator-5ff8674d55-6fh8b" (UID: "e2b5ad07-fa01-4330-9dce-6da3444657ab") : failed to sync secret cache: timed out waiting for the condition
Mar 13 01:19:02.130813 master-0 kubenswrapper[19170]: E0313 01:19:02.130088 19170 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition
Mar 13 01:19:02.130813 master-0 kubenswrapper[19170]: E0313 01:19:02.130098 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1c24e17c-8bd9-4c23-9876-6f31c9da5cd1-config podName:1c24e17c-8bd9-4c23-9876-6f31c9da5cd1 nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.630081377 +0000 UTC m=+3.438202367 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/1c24e17c-8bd9-4c23-9876-6f31c9da5cd1-config") pod "machine-api-operator-84bf6db4f9-zt229" (UID: "1c24e17c-8bd9-4c23-9876-6f31c9da5cd1") : failed to sync configmap cache: timed out waiting for the condition
Mar 13 01:19:02.130813 master-0 kubenswrapper[19170]: E0313 01:19:02.130124 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-node-exporter-kube-rbac-proxy-config podName:04470d64-c6eb-4a62-ae75-2a1d3dfdd53a nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.630112497 +0000 UTC m=+3.438233487 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-node-exporter-kube-rbac-proxy-config") pod "node-exporter-2hgwj" (UID: "04470d64-c6eb-4a62-ae75-2a1d3dfdd53a") : failed to sync secret cache: timed out waiting for the condition
Mar 13 01:19:02.130813 master-0 kubenswrapper[19170]: E0313 01:19:02.130143 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-secret-metrics-client-certs podName:9db888f0-51b6-43cf-8337-69d2d5cc2b0a nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.630134048 +0000 UTC m=+3.438255038 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-metrics-client-certs" (UniqueName: "kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-secret-metrics-client-certs") pod "metrics-server-5575f756f4-hqr5q" (UID: "9db888f0-51b6-43cf-8337-69d2d5cc2b0a") : failed to sync secret cache: timed out waiting for the condition
Mar 13 01:19:02.130813 master-0 kubenswrapper[19170]: E0313 01:19:02.130165 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ce47660-f7cc-4669-a00d-83422f0f6d55-apiservice-cert podName:2ce47660-f7cc-4669-a00d-83422f0f6d55 nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.630153628 +0000 UTC m=+3.438274628 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/2ce47660-f7cc-4669-a00d-83422f0f6d55-apiservice-cert") pod "packageserver-68f6795949-v9w8g" (UID: "2ce47660-f7cc-4669-a00d-83422f0f6d55") : failed to sync secret cache: timed out waiting for the condition
Mar 13 01:19:02.130813 master-0 kubenswrapper[19170]: E0313 01:19:02.130187 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-configmap-kubelet-serving-ca-bundle podName:9db888f0-51b6-43cf-8337-69d2d5cc2b0a nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.630176989 +0000 UTC m=+3.438297979 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "configmap-kubelet-serving-ca-bundle" (UniqueName: "kubernetes.io/configmap/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-configmap-kubelet-serving-ca-bundle") pod "metrics-server-5575f756f4-hqr5q" (UID: "9db888f0-51b6-43cf-8337-69d2d5cc2b0a") : failed to sync configmap cache: timed out waiting for the condition
Mar 13 01:19:02.130813 master-0 kubenswrapper[19170]: E0313 01:19:02.130206 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/631f5719-2083-4c99-92cb-2ddc04022d86-config podName:631f5719-2083-4c99-92cb-2ddc04022d86 nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.63019746 +0000 UTC m=+3.438318450 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/631f5719-2083-4c99-92cb-2ddc04022d86-config") pod "controller-manager-757fb68448-cj9p5" (UID: "631f5719-2083-4c99-92cb-2ddc04022d86") : failed to sync configmap cache: timed out waiting for the condition
Mar 13 01:19:02.130813 master-0 kubenswrapper[19170]: E0313 01:19:02.130237 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d278ed70-786c-4b6c-9f04-f08ede704569-auth-proxy-config podName:d278ed70-786c-4b6c-9f04-f08ede704569 nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.63022628 +0000 UTC m=+3.438347270 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/d278ed70-786c-4b6c-9f04-f08ede704569-auth-proxy-config") pod "cluster-autoscaler-operator-69576476f7-2q4qb" (UID: "d278ed70-786c-4b6c-9f04-f08ede704569") : failed to sync configmap cache: timed out waiting for the condition
Mar 13 01:19:02.130813 master-0 kubenswrapper[19170]: E0313 01:19:02.130260 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/33dfdc31-54a4-4249-99ae-a15180514659-auth-proxy-config podName:33dfdc31-54a4-4249-99ae-a15180514659 nodeName:}" failed. No retries permitted until 2026-03-13 01:19:02.630250411 +0000 UTC m=+3.438371411 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/33dfdc31-54a4-4249-99ae-a15180514659-auth-proxy-config") pod "machine-approver-754bdc9f9d-knlw8" (UID: "33dfdc31-54a4-4249-99ae-a15180514659") : failed to sync configmap cache: timed out waiting for the condition
Mar 13 01:19:02.132065 master-0 kubenswrapper[19170]: I0313 01:19:02.131197 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 13 01:19:02.142338 master-0 kubenswrapper[19170]: I0313 01:19:02.142160 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 13 01:19:02.162670 master-0 kubenswrapper[19170]: I0313 01:19:02.162588 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 13 01:19:02.183101 master-0 kubenswrapper[19170]: I0313 01:19:02.183058 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 13 01:19:02.203741 master-0 kubenswrapper[19170]: I0313 01:19:02.203478 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 13 01:19:02.223040 master-0 kubenswrapper[19170]: I0313 01:19:02.222910 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 13 01:19:02.242721 master-0 kubenswrapper[19170]: I0313 01:19:02.242657 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-bjp2n"
Mar 13 01:19:02.263077 master-0 kubenswrapper[19170]: I0313 01:19:02.262999 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-5qbrz"
Mar 13 01:19:02.283050 master-0 kubenswrapper[19170]: I0313 01:19:02.282985 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 13 01:19:02.301938 master-0 kubenswrapper[19170]: I0313 01:19:02.301875 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 13 01:19:02.322820 master-0 kubenswrapper[19170]: I0313 01:19:02.322767 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 13 01:19:02.342780 master-0 kubenswrapper[19170]: I0313 01:19:02.342729 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 13 01:19:02.362719 master-0 kubenswrapper[19170]: I0313 01:19:02.362650 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-c7thh"
Mar 13 01:19:02.383323 master-0 kubenswrapper[19170]: I0313 01:19:02.383239 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-b6b87"
Mar 13 01:19:02.403154 master-0 kubenswrapper[19170]: I0313 01:19:02.403103 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 13 01:19:02.423748 master-0 kubenswrapper[19170]: I0313 01:19:02.423707 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert"
Mar 13 01:19:02.443144 master-0 kubenswrapper[19170]: I0313 01:19:02.443086 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 13 01:19:02.463351 master-0 kubenswrapper[19170]: I0313 01:19:02.463266 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls"
Mar 13 01:19:02.483023 master-0 kubenswrapper[19170]: I0313 01:19:02.482904 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-6h5r7"
Mar 13 01:19:02.504587 master-0 kubenswrapper[19170]: I0313 01:19:02.504523 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 13 01:19:02.522607 master-0 kubenswrapper[19170]: I0313 01:19:02.522521 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert"
Mar 13 01:19:02.542469 master-0 kubenswrapper[19170]: I0313 01:19:02.542400 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-7q6zv"
Mar 13 01:19:02.563302 master-0 kubenswrapper[19170]: I0313 01:19:02.563244 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-6f5hw"
Mar 13 01:19:02.583749 master-0 kubenswrapper[19170]: I0313 01:19:02.583628 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 13 01:19:02.603449 master-0 kubenswrapper[19170]: I0313 01:19:02.603379 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 13 01:19:02.622990 master-0 kubenswrapper[19170]: I0313 01:19:02.622911 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 13 01:19:02.645224 master-0 kubenswrapper[19170]: I0313 01:19:02.645155 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 13 01:19:02.663606 master-0 kubenswrapper[19170]: I0313 01:19:02.663539 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle"
Mar 13 01:19:02.674191 master-0 kubenswrapper[19170]: I0313 01:19:02.674137 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebf338e6-9725-47d9-8c7f-adbf11a44406-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-2xfpz\" (UID: \"ebf338e6-9725-47d9-8c7f-adbf11a44406\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-2xfpz"
Mar 13 01:19:02.674549 master-0 kubenswrapper[19170]: I0313 01:19:02.674460 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-metrics-client-ca\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj"
Mar 13 01:19:02.674753 master-0 kubenswrapper[19170]: I0313 01:19:02.674562 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85149f21-7ba8-4891-82ef-0fef3d5d7863-serving-cert\") pod \"cluster-version-operator-8c9c967c7-4d6fw\" (UID: \"85149f21-7ba8-4891-82ef-0fef3d5d7863\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-4d6fw"
Mar 13 01:19:02.674753 master-0 kubenswrapper[19170]: I0313 01:19:02.674603 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2679c6e1-11c1-450c-b03a-30d7ee59ff6f-webhook-certs\") pod \"multus-admission-controller-7769569c45-zm2jl\" (UID: \"2679c6e1-11c1-450c-b03a-30d7ee59ff6f\") " pod="openshift-multus/multus-admission-controller-7769569c45-zm2jl"
Mar 13 01:19:02.674987 master-0 kubenswrapper[19170]: I0313 01:19:02.674846 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebf338e6-9725-47d9-8c7f-adbf11a44406-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-2xfpz\" (UID: \"ebf338e6-9725-47d9-8c7f-adbf11a44406\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-2xfpz"
Mar 13 01:19:02.675105 master-0 kubenswrapper[19170]: I0313 01:19:02.675010 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/30f7537e-93ed-466b-ba24-78141d004b2f-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-68b88f8cb5-plwwd\" (UID: \"30f7537e-93ed-466b-ba24-78141d004b2f\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd"
Mar 13 01:19:02.675218 master-0 kubenswrapper[19170]: I0313 01:19:02.675189 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85149f21-7ba8-4891-82ef-0fef3d5d7863-serving-cert\") pod \"cluster-version-operator-8c9c967c7-4d6fw\" (UID: \"85149f21-7ba8-4891-82ef-0fef3d5d7863\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-4d6fw"
Mar 13 01:19:02.675362 master-0 kubenswrapper[19170]: I0313 01:19:02.675300 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1c24e17c-8bd9-4c23-9876-6f31c9da5cd1-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-zt229\" (UID: \"1c24e17c-8bd9-4c23-9876-6f31c9da5cd1\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-zt229"
Mar 13 01:19:02.675477 master-0 kubenswrapper[19170]: I0313 01:19:02.675385 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/edda0d03-fdb2-4130-8f73-8057efd5815c-node-bootstrap-token\") pod \"machine-config-server-4gpcz\" (UID: \"edda0d03-fdb2-4130-8f73-8057efd5815c\") " pod="openshift-machine-config-operator/machine-config-server-4gpcz"
Mar 13 01:19:02.675685 master-0 kubenswrapper[19170]: I0313 01:19:02.675603 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77fd9062-0f7d-4255-92ca-7e4325daeddd-service-ca-bundle\") pod \"insights-operator-8f89dfddd-6k2t7\" (UID: \"77fd9062-0f7d-4255-92ca-7e4325daeddd\") " pod="openshift-insights/insights-operator-8f89dfddd-6k2t7"
Mar 13 01:19:02.675880 master-0 kubenswrapper[19170]: I0313 01:19:02.675830 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cd7cca05-3da7-42cf-af64-6e94050e58c0-metrics-client-ca\") pod \"openshift-state-metrics-74cc79fd76-6btfg\" (UID: \"cd7cca05-3da7-42cf-af64-6e94050e58c0\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-6btfg"
Mar 13 01:19:02.675984 master-0 kubenswrapper[19170]: I0313 01:19:02.675907 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d278ed70-786c-4b6c-9f04-f08ede704569-cert\") pod \"cluster-autoscaler-operator-69576476f7-2q4qb\" (UID: \"d278ed70-786c-4b6c-9f04-f08ede704569\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb"
Mar 13 01:19:02.675984 master-0 kubenswrapper[19170]: I0313 01:19:02.675966 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77fd9062-0f7d-4255-92ca-7e4325daeddd-service-ca-bundle\") pod \"insights-operator-8f89dfddd-6k2t7\" (UID: \"77fd9062-0f7d-4255-92ca-7e4325daeddd\") " pod="openshift-insights/insights-operator-8f89dfddd-6k2t7"
Mar 13 01:19:02.676131 master-0 kubenswrapper[19170]: I0313 01:19:02.676057 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/631f5719-2083-4c99-92cb-2ddc04022d86-serving-cert\") pod \"controller-manager-757fb68448-cj9p5\" (UID: \"631f5719-2083-4c99-92cb-2ddc04022d86\") " pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5"
Mar 13 01:19:02.676131 master-0 kubenswrapper[19170]: I0313 01:19:02.676117 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-node-exporter-tls\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj"
Mar 13 01:19:02.676371 master-0 kubenswrapper[19170]: I0313 01:19:02.676318 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2ce47660-f7cc-4669-a00d-83422f0f6d55-webhook-cert\") pod \"packageserver-68f6795949-v9w8g\" (UID: \"2ce47660-f7cc-4669-a00d-83422f0f6d55\") " pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g"
Mar 13 01:19:02.676371 master-0 kubenswrapper[19170]: I0313 01:19:02.676342 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d278ed70-786c-4b6c-9f04-f08ede704569-cert\") pod \"cluster-autoscaler-operator-69576476f7-2q4qb\" (UID: \"d278ed70-786c-4b6c-9f04-f08ede704569\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb"
Mar 13 01:19:02.676569 master-0 kubenswrapper[19170]: I0313 01:19:02.676434 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/93871019-3d0c-4081-9afe-19b6dd108ec6-proxy-tls\") pod \"machine-config-controller-ff46b7bdf-g7wfh\" (UID: \"93871019-3d0c-4081-9afe-19b6dd108ec6\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-g7wfh"
Mar 13 01:19:02.676569 master-0 kubenswrapper[19170]: I0313 01:19:02.676476 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/631f5719-2083-4c99-92cb-2ddc04022d86-serving-cert\") pod \"controller-manager-757fb68448-cj9p5\" (UID: \"631f5719-2083-4c99-92cb-2ddc04022d86\") " pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5"
Mar 13 01:19:02.676569 master-0 kubenswrapper[19170]: I0313 01:19:02.676512 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-client-ca-bundle\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q"
Mar 13 01:19:02.676876 master-0 kubenswrapper[19170]: I0313 01:19:02.676575 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/edda0d03-fdb2-4130-8f73-8057efd5815c-certs\") pod \"machine-config-server-4gpcz\" (UID: \"edda0d03-fdb2-4130-8f73-8057efd5815c\") " pod="openshift-machine-config-operator/machine-config-server-4gpcz"
Mar 13 01:19:02.676876 master-0 kubenswrapper[19170]: I0313 01:19:02.676727 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/90e6e63d-3cf2-4bb5-883f-6219a0b52c3a-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-55d85b7b47-qslvf\" (UID: \"90e6e63d-3cf2-4bb5-883f-6219a0b52c3a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qslvf"
Mar 13 01:19:02.677076 master-0 kubenswrapper[19170]: I0313 01:19:02.676918 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/631f5719-2083-4c99-92cb-2ddc04022d86-proxy-ca-bundles\") pod \"controller-manager-757fb68448-cj9p5\" (UID: \"631f5719-2083-4c99-92cb-2ddc04022d86\") " pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5"
Mar 13 01:19:02.677076 master-0 kubenswrapper[19170]: I0313 01:19:02.676987 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n\" (UID: \"b3a9c0f6-cfde-4ae8-952a-00e2fb862482\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n"
Mar 13 01:19:02.677076 master-0 kubenswrapper[19170]: I0313 01:19:02.677031 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/30f7537e-93ed-466b-ba24-78141d004b2f-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-68b88f8cb5-plwwd\" (UID: \"30f7537e-93ed-466b-ba24-78141d004b2f\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd"
Mar 13 01:19:02.678461 master-0 kubenswrapper[19170]: I0313 01:19:02.677873 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e2b5ad07-fa01-4330-9dce-6da3444657ab-metrics-client-ca\") pod \"prometheus-operator-5ff8674d55-6fh8b\" (UID: \"e2b5ad07-fa01-4330-9dce-6da3444657ab\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-6fh8b"
Mar 13 01:19:02.678461 master-0 kubenswrapper[19170]: I0313 01:19:02.677949 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/90e6e63d-3cf2-4bb5-883f-6219a0b52c3a-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-55d85b7b47-qslvf\" (UID: \"90e6e63d-3cf2-4bb5-883f-6219a0b52c3a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qslvf"
Mar 13 01:19:02.678461 master-0 kubenswrapper[19170]: I0313 01:19:02.678015 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/30f7537e-93ed-466b-ba24-78141d004b2f-metrics-client-ca\") pod \"kube-state-metrics-68b88f8cb5-plwwd\" (UID: \"30f7537e-93ed-466b-ba24-78141d004b2f\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd"
Mar 13 01:19:02.678461 master-0 kubenswrapper[19170]: I0313 01:19:02.678076 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/631f5719-2083-4c99-92cb-2ddc04022d86-proxy-ca-bundles\") pod \"controller-manager-757fb68448-cj9p5\" (UID: \"631f5719-2083-4c99-92cb-2ddc04022d86\") " pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5"
Mar 13 01:19:02.678461 master-0 kubenswrapper[19170]: I0313 01:19:02.678220 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj"
Mar 13 01:19:02.678461 master-0 kubenswrapper[19170]: I0313 01:19:02.678465 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/e2b5ad07-fa01-4330-9dce-6da3444657ab-prometheus-operator-tls\") pod \"prometheus-operator-5ff8674d55-6fh8b\" (UID: \"e2b5ad07-fa01-4330-9dce-6da3444657ab\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-6fh8b"
Mar 13 01:19:02.685248 master-0 kubenswrapper[19170]: I0313 01:19:02.678551 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/631f5719-2083-4c99-92cb-2ddc04022d86-config\") pod \"controller-manager-757fb68448-cj9p5\" (UID: \"631f5719-2083-4c99-92cb-2ddc04022d86\") " pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5"
Mar 13 01:19:02.685248 master-0 kubenswrapper[19170]: I0313 01:19:02.678630 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c24e17c-8bd9-4c23-9876-6f31c9da5cd1-config\") pod \"machine-api-operator-84bf6db4f9-zt229\" (UID: \"1c24e17c-8bd9-4c23-9876-6f31c9da5cd1\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-zt229"
Mar 13 01:19:02.685248 master-0 kubenswrapper[19170]: I0313 01:19:02.678743 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2ce47660-f7cc-4669-a00d-83422f0f6d55-apiservice-cert\") pod \"packageserver-68f6795949-v9w8g\" (UID: \"2ce47660-f7cc-4669-a00d-83422f0f6d55\") " pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g"
Mar 13 01:19:02.685248 master-0 kubenswrapper[19170]: I0313 01:19:02.678825 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d278ed70-786c-4b6c-9f04-f08ede704569-auth-proxy-config\") pod \"cluster-autoscaler-operator-69576476f7-2q4qb\" (UID: \"d278ed70-786c-4b6c-9f04-f08ede704569\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb"
Mar 13 01:19:02.685248 master-0 kubenswrapper[19170]: I0313 01:19:02.678945 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9462e2e-728d-4076-a876-31dbbd637581-client-ca\") pod \"route-controller-manager-5dc55b5d9c-nlg6m\" (UID: \"a9462e2e-728d-4076-a876-31dbbd637581\") " pod="openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m"
Mar 13 01:19:02.685248 master-0 kubenswrapper[19170]: I0313 01:19:02.679001 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/33dfdc31-54a4-4249-99ae-a15180514659-auth-proxy-config\") pod \"machine-approver-754bdc9f9d-knlw8\" (UID: \"33dfdc31-54a4-4249-99ae-a15180514659\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-knlw8"
Mar 13 01:19:02.685248 master-0 kubenswrapper[19170]: I0313 01:19:02.679073 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1c24e17c-8bd9-4c23-9876-6f31c9da5cd1-images\") pod \"machine-api-operator-84bf6db4f9-zt229\" (UID: \"1c24e17c-8bd9-4c23-9876-6f31c9da5cd1\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-zt229"
Mar 13 01:19:02.685248 master-0 kubenswrapper[19170]: I0313 01:19:02.679174 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9462e2e-728d-4076-a876-31dbbd637581-serving-cert\") pod \"route-controller-manager-5dc55b5d9c-nlg6m\" (UID: \"a9462e2e-728d-4076-a876-31dbbd637581\") " pod="openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m"
Mar 13 01:19:02.685248 master-0 kubenswrapper[19170]: I0313 01:19:02.679243 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q"
Mar 13 01:19:02.685248 master-0 kubenswrapper[19170]: I0313 01:19:02.679305 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-secret-metrics-client-certs\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q"
Mar 13 01:19:02.685248 master-0 kubenswrapper[19170]: I0313 01:19:02.679415 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/631f5719-2083-4c99-92cb-2ddc04022d86-config\") pod \"controller-manager-757fb68448-cj9p5\" (UID: \"631f5719-2083-4c99-92cb-2ddc04022d86\") " pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5"
Mar 13 01:19:02.685248 master-0 kubenswrapper[19170]: I0313 01:19:02.679435 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n\" (UID: \"b3a9c0f6-cfde-4ae8-952a-00e2fb862482\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n"
Mar 13 01:19:02.685248 master-0 kubenswrapper[19170]: I0313 01:19:02.679555 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/4d313ee4-3bb9-44a9-ad80-8e00540ef1e7-tls-certificates\") pod \"prometheus-operator-admission-webhook-8464df8497-kjwgg\" (UID: \"4d313ee4-3bb9-44a9-ad80-8e00540ef1e7\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-kjwgg"
Mar 13 01:19:02.685248 master-0 kubenswrapper[19170]: I0313 01:19:02.680039 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33dfdc31-54a4-4249-99ae-a15180514659-config\") pod \"machine-approver-754bdc9f9d-knlw8\" (UID: \"33dfdc31-54a4-4249-99ae-a15180514659\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-knlw8"
Mar 13 01:19:02.685248 master-0 kubenswrapper[19170]: I0313 01:19:02.680110 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9462e2e-728d-4076-a876-31dbbd637581-client-ca\") pod \"route-controller-manager-5dc55b5d9c-nlg6m\" (UID: \"a9462e2e-728d-4076-a876-31dbbd637581\") " pod="openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m"
Mar 13 01:19:02.685248 master-0 kubenswrapper[19170]: I0313 01:19:02.680165 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9462e2e-728d-4076-a876-31dbbd637581-config\") pod \"route-controller-manager-5dc55b5d9c-nlg6m\" (UID: \"a9462e2e-728d-4076-a876-31dbbd637581\") " pod="openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m"
Mar 13 01:19:02.685248 master-0 kubenswrapper[19170]: I0313 01:19:02.680255 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77fd9062-0f7d-4255-92ca-7e4325daeddd-serving-cert\") pod \"insights-operator-8f89dfddd-6k2t7\" (UID: \"77fd9062-0f7d-4255-92ca-7e4325daeddd\") " pod="openshift-insights/insights-operator-8f89dfddd-6k2t7"
Mar 13 01:19:02.685248 master-0 kubenswrapper[19170]: I0313 01:19:02.680441 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/631f5719-2083-4c99-92cb-2ddc04022d86-client-ca\") pod \"controller-manager-757fb68448-cj9p5\" (UID: \"631f5719-2083-4c99-92cb-2ddc04022d86\") " pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5"
Mar 13 01:19:02.685248 master-0 kubenswrapper[19170]: I0313 01:19:02.680560 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/33dfdc31-54a4-4249-99ae-a15180514659-machine-approver-tls\") pod \"machine-approver-754bdc9f9d-knlw8\" (UID: \"33dfdc31-54a4-4249-99ae-a15180514659\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-knlw8"
Mar 13 01:19:02.685248 master-0 kubenswrapper[19170]: I0313 01:19:02.681208 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9462e2e-728d-4076-a876-31dbbd637581-serving-cert\") pod \"route-controller-manager-5dc55b5d9c-nlg6m\" (UID: \"a9462e2e-728d-4076-a876-31dbbd637581\") " pod="openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m"
Mar 13 01:19:02.685248 master-0 kubenswrapper[19170]: I0313 01:19:02.681529 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9462e2e-728d-4076-a876-31dbbd637581-config\") pod \"route-controller-manager-5dc55b5d9c-nlg6m\" (UID: \"a9462e2e-728d-4076-a876-31dbbd637581\") " pod="openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m"
Mar 13 01:19:02.685248 master-0 kubenswrapper[19170]: I0313 01:19:02.681900 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/30f7537e-93ed-466b-ba24-78141d004b2f-kube-state-metrics-tls\") pod \"kube-state-metrics-68b88f8cb5-plwwd\" (UID: \"30f7537e-93ed-466b-ba24-78141d004b2f\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd"
Mar 13 01:19:02.685248 master-0 kubenswrapper[19170]: I0313 01:19:02.682066 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e2b5ad07-fa01-4330-9dce-6da3444657ab-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5ff8674d55-6fh8b\" (UID: \"e2b5ad07-fa01-4330-9dce-6da3444657ab\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-6fh8b"
Mar 13 01:19:02.685248 master-0 kubenswrapper[19170]: I0313 01:19:02.682185 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/90e6e63d-3cf2-4bb5-883f-6219a0b52c3a-cco-trusted-ca\") pod \"cloud-credential-operator-55d85b7b47-qslvf\" (UID: \"90e6e63d-3cf2-4bb5-883f-6219a0b52c3a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qslvf"
Mar 13 01:19:02.685248 master-0 kubenswrapper[19170]: I0313 01:19:02.682209 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/4d313ee4-3bb9-44a9-ad80-8e00540ef1e7-tls-certificates\") pod \"prometheus-operator-admission-webhook-8464df8497-kjwgg\" (UID: \"4d313ee4-3bb9-44a9-ad80-8e00540ef1e7\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-kjwgg"
Mar 13 01:19:02.685248 master-0 kubenswrapper[19170]: I0313 01:19:02.682300 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e5956ebf-01e4-4d4c-ae6d-b0995905c6d3-proxy-tls\") pod \"machine-config-daemon-pmkpj\" (UID: \"e5956ebf-01e4-4d4c-ae6d-b0995905c6d3\") " pod="openshift-machine-config-operator/machine-config-daemon-pmkpj"
Mar 13 01:19:02.685248 master-0 kubenswrapper[19170]: I0313 01:19:02.683047 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/631f5719-2083-4c99-92cb-2ddc04022d86-client-ca\") pod \"controller-manager-757fb68448-cj9p5\" (UID: \"631f5719-2083-4c99-92cb-2ddc04022d86\") " pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5"
Mar 13 01:19:02.685248 master-0 kubenswrapper[19170]: I0313 01:19:02.683190 19170 reconciler_common.go:218] "operationExecutor.MountVolume
started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0671fdd0-b358-40f9-ae49-2c5a9004edb3-service-ca-bundle\") pod \"router-default-79f8cd6fdd-cnrhm\" (UID: \"0671fdd0-b358-40f9-ae49-2c5a9004edb3\") " pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" Mar 13 01:19:02.685248 master-0 kubenswrapper[19170]: I0313 01:19:02.683362 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/85149f21-7ba8-4891-82ef-0fef3d5d7863-service-ca\") pod \"cluster-version-operator-8c9c967c7-4d6fw\" (UID: \"85149f21-7ba8-4891-82ef-0fef3d5d7863\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-4d6fw" Mar 13 01:19:02.685248 master-0 kubenswrapper[19170]: I0313 01:19:02.683488 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cd7cca05-3da7-42cf-af64-6e94050e58c0-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-74cc79fd76-6btfg\" (UID: \"cd7cca05-3da7-42cf-af64-6e94050e58c0\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-6btfg" Mar 13 01:19:02.685248 master-0 kubenswrapper[19170]: I0313 01:19:02.683580 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/cada5bf2-e208-4fd8-bdf5-de8cad31a665-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6686554ddc-w6qs7\" (UID: \"cada5bf2-e208-4fd8-bdf5-de8cad31a665\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-w6qs7" Mar 13 01:19:02.685248 master-0 kubenswrapper[19170]: I0313 01:19:02.683746 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-images\") pod 
\"cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n\" (UID: \"b3a9c0f6-cfde-4ae8-952a-00e2fb862482\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" Mar 13 01:19:02.685248 master-0 kubenswrapper[19170]: I0313 01:19:02.683917 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-metrics-server-audit-profiles\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:19:02.685248 master-0 kubenswrapper[19170]: I0313 01:19:02.684037 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77fd9062-0f7d-4255-92ca-7e4325daeddd-trusted-ca-bundle\") pod \"insights-operator-8f89dfddd-6k2t7\" (UID: \"77fd9062-0f7d-4255-92ca-7e4325daeddd\") " pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" Mar 13 01:19:02.685248 master-0 kubenswrapper[19170]: I0313 01:19:02.684112 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ac2a4c90-32db-4464-8c47-acbcafbcd5d0-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-hdx2d\" (UID: \"ac2a4c90-32db-4464-8c47-acbcafbcd5d0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-hdx2d" Mar 13 01:19:02.685248 master-0 kubenswrapper[19170]: I0313 01:19:02.684299 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-secret-metrics-server-tls\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:19:02.685248 master-0 
kubenswrapper[19170]: I0313 01:19:02.684384 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/cd7cca05-3da7-42cf-af64-6e94050e58c0-openshift-state-metrics-tls\") pod \"openshift-state-metrics-74cc79fd76-6btfg\" (UID: \"cd7cca05-3da7-42cf-af64-6e94050e58c0\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-6btfg" Mar 13 01:19:02.688888 master-0 kubenswrapper[19170]: I0313 01:19:02.687381 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0671fdd0-b358-40f9-ae49-2c5a9004edb3-service-ca-bundle\") pod \"router-default-79f8cd6fdd-cnrhm\" (UID: \"0671fdd0-b358-40f9-ae49-2c5a9004edb3\") " pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" Mar 13 01:19:02.688888 master-0 kubenswrapper[19170]: I0313 01:19:02.687821 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/85149f21-7ba8-4891-82ef-0fef3d5d7863-service-ca\") pod \"cluster-version-operator-8c9c967c7-4d6fw\" (UID: \"85149f21-7ba8-4891-82ef-0fef3d5d7863\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-4d6fw" Mar 13 01:19:02.688888 master-0 kubenswrapper[19170]: I0313 01:19:02.688266 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/cada5bf2-e208-4fd8-bdf5-de8cad31a665-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6686554ddc-w6qs7\" (UID: \"cada5bf2-e208-4fd8-bdf5-de8cad31a665\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-w6qs7" Mar 13 01:19:02.701132 master-0 kubenswrapper[19170]: I0313 01:19:02.701066 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle" Mar 13 01:19:02.704069 master-0 
kubenswrapper[19170]: I0313 01:19:02.704028 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt" Mar 13 01:19:02.710538 master-0 kubenswrapper[19170]: I0313 01:19:02.710397 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77fd9062-0f7d-4255-92ca-7e4325daeddd-trusted-ca-bundle\") pod \"insights-operator-8f89dfddd-6k2t7\" (UID: \"77fd9062-0f7d-4255-92ca-7e4325daeddd\") " pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" Mar 13 01:19:02.723454 master-0 kubenswrapper[19170]: I0313 01:19:02.723348 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Mar 13 01:19:02.743690 master-0 kubenswrapper[19170]: I0313 01:19:02.743344 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert" Mar 13 01:19:02.754229 master-0 kubenswrapper[19170]: I0313 01:19:02.753560 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77fd9062-0f7d-4255-92ca-7e4325daeddd-serving-cert\") pod \"insights-operator-8f89dfddd-6k2t7\" (UID: \"77fd9062-0f7d-4255-92ca-7e4325daeddd\") " pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" Mar 13 01:19:02.763810 master-0 kubenswrapper[19170]: I0313 01:19:02.763748 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-7wmj8" Mar 13 01:19:02.783715 master-0 kubenswrapper[19170]: I0313 01:19:02.783651 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-sjhm5" Mar 13 01:19:02.804492 master-0 kubenswrapper[19170]: I0313 01:19:02.804429 19170 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 13 01:19:02.807833 master-0 kubenswrapper[19170]: I0313 01:19:02.807775 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2ce47660-f7cc-4669-a00d-83422f0f6d55-webhook-cert\") pod \"packageserver-68f6795949-v9w8g\" (UID: \"2ce47660-f7cc-4669-a00d-83422f0f6d55\") " pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" Mar 13 01:19:02.813882 master-0 kubenswrapper[19170]: I0313 01:19:02.813840 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2ce47660-f7cc-4669-a00d-83422f0f6d55-apiservice-cert\") pod \"packageserver-68f6795949-v9w8g\" (UID: \"2ce47660-f7cc-4669-a00d-83422f0f6d55\") " pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" Mar 13 01:19:02.823167 master-0 kubenswrapper[19170]: I0313 01:19:02.823120 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-qcwkf" Mar 13 01:19:02.842967 master-0 kubenswrapper[19170]: I0313 01:19:02.842907 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-56ljs" Mar 13 01:19:02.861650 master-0 kubenswrapper[19170]: I0313 01:19:02.861564 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 13 01:19:02.863121 master-0 kubenswrapper[19170]: I0313 01:19:02.863078 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 13 01:19:02.883320 master-0 kubenswrapper[19170]: I0313 01:19:02.883260 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-zdrdj" Mar 13 01:19:02.903367 master-0 kubenswrapper[19170]: I0313 01:19:02.903303 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 13 01:19:02.906228 master-0 kubenswrapper[19170]: I0313 01:19:02.906189 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2679c6e1-11c1-450c-b03a-30d7ee59ff6f-webhook-certs\") pod \"multus-admission-controller-7769569c45-zm2jl\" (UID: \"2679c6e1-11c1-450c-b03a-30d7ee59ff6f\") " pod="openshift-multus/multus-admission-controller-7769569c45-zm2jl" Mar 13 01:19:02.922427 master-0 kubenswrapper[19170]: I0313 01:19:02.922383 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 13 01:19:02.930655 master-0 kubenswrapper[19170]: I0313 01:19:02.930600 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/33dfdc31-54a4-4249-99ae-a15180514659-auth-proxy-config\") pod \"machine-approver-754bdc9f9d-knlw8\" (UID: \"33dfdc31-54a4-4249-99ae-a15180514659\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-knlw8" Mar 13 01:19:02.942941 master-0 kubenswrapper[19170]: I0313 01:19:02.942903 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 13 
01:19:02.951674 master-0 kubenswrapper[19170]: I0313 01:19:02.951616 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33dfdc31-54a4-4249-99ae-a15180514659-config\") pod \"machine-approver-754bdc9f9d-knlw8\" (UID: \"33dfdc31-54a4-4249-99ae-a15180514659\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-knlw8" Mar 13 01:19:02.962713 master-0 kubenswrapper[19170]: I0313 01:19:02.962673 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 13 01:19:02.981703 master-0 kubenswrapper[19170]: I0313 01:19:02.981580 19170 request.go:700] Waited for 2.021209987s due to client-side throttling, not priority and fairness, request: GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dproxy-tls&limit=500&resourceVersion=0 Mar 13 01:19:02.983009 master-0 kubenswrapper[19170]: I0313 01:19:02.982945 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 13 01:19:02.988960 master-0 kubenswrapper[19170]: I0313 01:19:02.988911 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e5956ebf-01e4-4d4c-ae6d-b0995905c6d3-proxy-tls\") pod \"machine-config-daemon-pmkpj\" (UID: \"e5956ebf-01e4-4d4c-ae6d-b0995905c6d3\") " pod="openshift-machine-config-operator/machine-config-daemon-pmkpj" Mar 13 01:19:03.002977 master-0 kubenswrapper[19170]: I0313 01:19:03.002841 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-6b6t5" Mar 13 01:19:03.022721 master-0 kubenswrapper[19170]: I0313 01:19:03.022610 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-bclr4" Mar 13 01:19:03.042926 
master-0 kubenswrapper[19170]: I0313 01:19:03.042863 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 13 01:19:03.052667 master-0 kubenswrapper[19170]: I0313 01:19:03.052588 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/33dfdc31-54a4-4249-99ae-a15180514659-machine-approver-tls\") pod \"machine-approver-754bdc9f9d-knlw8\" (UID: \"33dfdc31-54a4-4249-99ae-a15180514659\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-knlw8" Mar 13 01:19:03.063675 master-0 kubenswrapper[19170]: I0313 01:19:03.063624 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt" Mar 13 01:19:03.083153 master-0 kubenswrapper[19170]: I0313 01:19:03.083075 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-5fw2q" Mar 13 01:19:03.111713 master-0 kubenswrapper[19170]: I0313 01:19:03.111647 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Mar 13 01:19:03.113542 master-0 kubenswrapper[19170]: I0313 01:19:03.113466 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/90e6e63d-3cf2-4bb5-883f-6219a0b52c3a-cco-trusted-ca\") pod \"cloud-credential-operator-55d85b7b47-qslvf\" (UID: \"90e6e63d-3cf2-4bb5-883f-6219a0b52c3a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qslvf" Mar 13 01:19:03.123022 master-0 kubenswrapper[19170]: I0313 01:19:03.122981 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 13 01:19:03.126614 master-0 kubenswrapper[19170]: I0313 01:19:03.126548 19170 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1c24e17c-8bd9-4c23-9876-6f31c9da5cd1-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-zt229\" (UID: \"1c24e17c-8bd9-4c23-9876-6f31c9da5cd1\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-zt229" Mar 13 01:19:03.144024 master-0 kubenswrapper[19170]: I0313 01:19:03.143966 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-n4b44" Mar 13 01:19:03.163386 master-0 kubenswrapper[19170]: I0313 01:19:03.163304 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt" Mar 13 01:19:03.182745 master-0 kubenswrapper[19170]: I0313 01:19:03.182690 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 13 01:19:03.191040 master-0 kubenswrapper[19170]: I0313 01:19:03.190982 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c24e17c-8bd9-4c23-9876-6f31c9da5cd1-config\") pod \"machine-api-operator-84bf6db4f9-zt229\" (UID: \"1c24e17c-8bd9-4c23-9876-6f31c9da5cd1\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-zt229" Mar 13 01:19:03.202749 master-0 kubenswrapper[19170]: I0313 01:19:03.202699 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-jf79b" Mar 13 01:19:03.222196 master-0 kubenswrapper[19170]: I0313 01:19:03.222108 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 13 01:19:03.243097 master-0 kubenswrapper[19170]: I0313 01:19:03.243031 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-ps7fb" Mar 13 01:19:03.262189 master-0 
kubenswrapper[19170]: I0313 01:19:03.262012 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Mar 13 01:19:03.269719 master-0 kubenswrapper[19170]: I0313 01:19:03.269623 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ac2a4c90-32db-4464-8c47-acbcafbcd5d0-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-hdx2d\" (UID: \"ac2a4c90-32db-4464-8c47-acbcafbcd5d0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-hdx2d" Mar 13 01:19:03.281281 master-0 kubenswrapper[19170]: I0313 01:19:03.281175 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-kjwgg" Mar 13 01:19:03.283961 master-0 kubenswrapper[19170]: I0313 01:19:03.283911 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 13 01:19:03.291806 master-0 kubenswrapper[19170]: I0313 01:19:03.291733 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-kjwgg" Mar 13 01:19:03.303156 master-0 kubenswrapper[19170]: I0313 01:19:03.303102 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 13 01:19:03.307475 master-0 kubenswrapper[19170]: I0313 01:19:03.307414 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/93871019-3d0c-4081-9afe-19b6dd108ec6-proxy-tls\") pod \"machine-config-controller-ff46b7bdf-g7wfh\" (UID: \"93871019-3d0c-4081-9afe-19b6dd108ec6\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-g7wfh" Mar 13 01:19:03.325026 master-0 kubenswrapper[19170]: I0313 01:19:03.324911 19170 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator" Mar 13 01:19:03.332964 master-0 kubenswrapper[19170]: I0313 01:19:03.332905 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d278ed70-786c-4b6c-9f04-f08ede704569-auth-proxy-config\") pod \"cluster-autoscaler-operator-69576476f7-2q4qb\" (UID: \"d278ed70-786c-4b6c-9f04-f08ede704569\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb" Mar 13 01:19:03.348260 master-0 kubenswrapper[19170]: I0313 01:19:03.348175 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 13 01:19:03.349529 master-0 kubenswrapper[19170]: I0313 01:19:03.349477 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/edda0d03-fdb2-4130-8f73-8057efd5815c-certs\") pod \"machine-config-server-4gpcz\" (UID: \"edda0d03-fdb2-4130-8f73-8057efd5815c\") " pod="openshift-machine-config-operator/machine-config-server-4gpcz" Mar 13 01:19:03.363200 master-0 kubenswrapper[19170]: I0313 01:19:03.363146 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qdr99" Mar 13 01:19:03.403881 master-0 kubenswrapper[19170]: I0313 01:19:03.403811 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 13 01:19:03.407882 master-0 kubenswrapper[19170]: I0313 01:19:03.407840 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/edda0d03-fdb2-4130-8f73-8057efd5815c-node-bootstrap-token\") pod \"machine-config-server-4gpcz\" (UID: \"edda0d03-fdb2-4130-8f73-8057efd5815c\") " pod="openshift-machine-config-operator/machine-config-server-4gpcz" 
Mar 13 01:19:03.423760 master-0 kubenswrapper[19170]: I0313 01:19:03.423721 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 13 01:19:03.431321 master-0 kubenswrapper[19170]: I0313 01:19:03.431256 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1c24e17c-8bd9-4c23-9876-6f31c9da5cd1-images\") pod \"machine-api-operator-84bf6db4f9-zt229\" (UID: \"1c24e17c-8bd9-4c23-9876-6f31c9da5cd1\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-zt229" Mar 13 01:19:03.444011 master-0 kubenswrapper[19170]: I0313 01:19:03.443979 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Mar 13 01:19:03.450505 master-0 kubenswrapper[19170]: I0313 01:19:03.450482 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/e2b5ad07-fa01-4330-9dce-6da3444657ab-prometheus-operator-tls\") pod \"prometheus-operator-5ff8674d55-6fh8b\" (UID: \"e2b5ad07-fa01-4330-9dce-6da3444657ab\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-6fh8b" Mar 13 01:19:03.462602 master-0 kubenswrapper[19170]: I0313 01:19:03.462548 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-vdkmw" Mar 13 01:19:03.484656 master-0 kubenswrapper[19170]: I0313 01:19:03.484575 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-2pmf7" Mar 13 01:19:03.503562 master-0 kubenswrapper[19170]: I0313 01:19:03.503508 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Mar 13 01:19:03.508096 master-0 kubenswrapper[19170]: I0313 01:19:03.508026 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/cd7cca05-3da7-42cf-af64-6e94050e58c0-openshift-state-metrics-tls\") pod \"openshift-state-metrics-74cc79fd76-6btfg\" (UID: \"cd7cca05-3da7-42cf-af64-6e94050e58c0\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-6btfg" Mar 13 01:19:03.523531 master-0 kubenswrapper[19170]: I0313 01:19:03.523400 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Mar 13 01:19:03.533568 master-0 kubenswrapper[19170]: I0313 01:19:03.533352 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-secret-metrics-client-certs\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:19:03.543356 master-0 kubenswrapper[19170]: I0313 01:19:03.543245 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Mar 13 01:19:03.551070 master-0 kubenswrapper[19170]: I0313 01:19:03.550992 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n\" (UID: \"b3a9c0f6-cfde-4ae8-952a-00e2fb862482\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" Mar 13 01:19:03.563870 master-0 kubenswrapper[19170]: I0313 01:19:03.563811 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Mar 13 01:19:03.568440 master-0 kubenswrapper[19170]: I0313 01:19:03.568397 19170 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n\" (UID: \"b3a9c0f6-cfde-4ae8-952a-00e2fb862482\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n"
Mar 13 01:19:03.584099 master-0 kubenswrapper[19170]: I0313 01:19:03.584052 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-sb45h"
Mar 13 01:19:03.602941 master-0 kubenswrapper[19170]: I0313 01:19:03.602862 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt"
Mar 13 01:19:03.623580 master-0 kubenswrapper[19170]: I0313 01:19:03.623524 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config"
Mar 13 01:19:03.633447 master-0 kubenswrapper[19170]: I0313 01:19:03.633384 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e2b5ad07-fa01-4330-9dce-6da3444657ab-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5ff8674d55-6fh8b\" (UID: \"e2b5ad07-fa01-4330-9dce-6da3444657ab\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-6fh8b"
Mar 13 01:19:03.643141 master-0 kubenswrapper[19170]: I0313 01:19:03.643105 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config"
Mar 13 01:19:03.648542 master-0 kubenswrapper[19170]: I0313 01:19:03.648504 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cd7cca05-3da7-42cf-af64-6e94050e58c0-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-74cc79fd76-6btfg\" (UID: \"cd7cca05-3da7-42cf-af64-6e94050e58c0\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-6btfg"
Mar 13 01:19:03.662801 master-0 kubenswrapper[19170]: I0313 01:19:03.662754 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt"
Mar 13 01:19:03.675955 master-0 kubenswrapper[19170]: E0313 01:19:03.675901 19170 configmap.go:193] Couldn't get configMap openshift-monitoring/kube-state-metrics-custom-resource-state-configmap: failed to sync configmap cache: timed out waiting for the condition
Mar 13 01:19:03.676262 master-0 kubenswrapper[19170]: E0313 01:19:03.676210 19170 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 13 01:19:03.676421 master-0 kubenswrapper[19170]: E0313 01:19:03.675926 19170 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 13 01:19:03.676421 master-0 kubenswrapper[19170]: E0313 01:19:03.676374 19170 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: failed to sync secret cache: timed out waiting for the condition
Mar 13 01:19:03.676744 master-0 kubenswrapper[19170]: E0313 01:19:03.676237 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/30f7537e-93ed-466b-ba24-78141d004b2f-kube-state-metrics-custom-resource-state-configmap podName:30f7537e-93ed-466b-ba24-78141d004b2f nodeName:}" failed. No retries permitted until 2026-03-13 01:19:04.676189023 +0000 UTC m=+5.484310053 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-state-metrics-custom-resource-state-configmap" (UniqueName: "kubernetes.io/configmap/30f7537e-93ed-466b-ba24-78141d004b2f-kube-state-metrics-custom-resource-state-configmap") pod "kube-state-metrics-68b88f8cb5-plwwd" (UID: "30f7537e-93ed-466b-ba24-78141d004b2f") : failed to sync configmap cache: timed out waiting for the condition
Mar 13 01:19:03.676969 master-0 kubenswrapper[19170]: E0313 01:19:03.676941 19170 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-b1qe6h41gh39q: failed to sync secret cache: timed out waiting for the condition
Mar 13 01:19:03.677248 master-0 kubenswrapper[19170]: E0313 01:19:03.677023 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cd7cca05-3da7-42cf-af64-6e94050e58c0-metrics-client-ca podName:cd7cca05-3da7-42cf-af64-6e94050e58c0 nodeName:}" failed. No retries permitted until 2026-03-13 01:19:04.676992254 +0000 UTC m=+5.485113244 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/cd7cca05-3da7-42cf-af64-6e94050e58c0-metrics-client-ca") pod "openshift-state-metrics-74cc79fd76-6btfg" (UID: "cd7cca05-3da7-42cf-af64-6e94050e58c0") : failed to sync configmap cache: timed out waiting for the condition
Mar 13 01:19:03.677472 master-0 kubenswrapper[19170]: E0313 01:19:03.677449 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-metrics-client-ca podName:04470d64-c6eb-4a62-ae75-2a1d3dfdd53a nodeName:}" failed. No retries permitted until 2026-03-13 01:19:04.677422975 +0000 UTC m=+5.485543965 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-metrics-client-ca") pod "node-exporter-2hgwj" (UID: "04470d64-c6eb-4a62-ae75-2a1d3dfdd53a") : failed to sync configmap cache: timed out waiting for the condition
Mar 13 01:19:03.677716 master-0 kubenswrapper[19170]: E0313 01:19:03.677687 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-node-exporter-tls podName:04470d64-c6eb-4a62-ae75-2a1d3dfdd53a nodeName:}" failed. No retries permitted until 2026-03-13 01:19:04.67762486 +0000 UTC m=+5.485745930 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-node-exporter-tls") pod "node-exporter-2hgwj" (UID: "04470d64-c6eb-4a62-ae75-2a1d3dfdd53a") : failed to sync secret cache: timed out waiting for the condition
Mar 13 01:19:03.677983 master-0 kubenswrapper[19170]: E0313 01:19:03.677940 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-client-ca-bundle podName:9db888f0-51b6-43cf-8337-69d2d5cc2b0a nodeName:}" failed. No retries permitted until 2026-03-13 01:19:04.677911488 +0000 UTC m=+5.486032488 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca-bundle" (UniqueName: "kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-client-ca-bundle") pod "metrics-server-5575f756f4-hqr5q" (UID: "9db888f0-51b6-43cf-8337-69d2d5cc2b0a") : failed to sync secret cache: timed out waiting for the condition
Mar 13 01:19:03.679296 master-0 kubenswrapper[19170]: E0313 01:19:03.679268 19170 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 13 01:19:03.679483 master-0 kubenswrapper[19170]: E0313 01:19:03.679463 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/30f7537e-93ed-466b-ba24-78141d004b2f-metrics-client-ca podName:30f7537e-93ed-466b-ba24-78141d004b2f nodeName:}" failed. No retries permitted until 2026-03-13 01:19:04.679441618 +0000 UTC m=+5.487562608 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/30f7537e-93ed-466b-ba24-78141d004b2f-metrics-client-ca") pod "kube-state-metrics-68b88f8cb5-plwwd" (UID: "30f7537e-93ed-466b-ba24-78141d004b2f") : failed to sync configmap cache: timed out waiting for the condition
Mar 13 01:19:03.679709 master-0 kubenswrapper[19170]: E0313 01:19:03.679681 19170 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition
Mar 13 01:19:03.679977 master-0 kubenswrapper[19170]: E0313 01:19:03.679950 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-node-exporter-kube-rbac-proxy-config podName:04470d64-c6eb-4a62-ae75-2a1d3dfdd53a nodeName:}" failed. No retries permitted until 2026-03-13 01:19:04.679918071 +0000 UTC m=+5.488039111 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-node-exporter-kube-rbac-proxy-config") pod "node-exporter-2hgwj" (UID: "04470d64-c6eb-4a62-ae75-2a1d3dfdd53a") : failed to sync secret cache: timed out waiting for the condition
Mar 13 01:19:03.680177 master-0 kubenswrapper[19170]: E0313 01:19:03.679707 19170 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 13 01:19:03.680392 master-0 kubenswrapper[19170]: E0313 01:19:03.680364 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e2b5ad07-fa01-4330-9dce-6da3444657ab-metrics-client-ca podName:e2b5ad07-fa01-4330-9dce-6da3444657ab nodeName:}" failed. No retries permitted until 2026-03-13 01:19:04.680335922 +0000 UTC m=+5.488456982 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/e2b5ad07-fa01-4330-9dce-6da3444657ab-metrics-client-ca") pod "prometheus-operator-5ff8674d55-6fh8b" (UID: "e2b5ad07-fa01-4330-9dce-6da3444657ab") : failed to sync configmap cache: timed out waiting for the condition
Mar 13 01:19:03.682335 master-0 kubenswrapper[19170]: E0313 01:19:03.682305 19170 configmap.go:193] Couldn't get configMap openshift-monitoring/kubelet-serving-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Mar 13 01:19:03.682551 master-0 kubenswrapper[19170]: E0313 01:19:03.682529 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-configmap-kubelet-serving-ca-bundle podName:9db888f0-51b6-43cf-8337-69d2d5cc2b0a nodeName:}" failed. No retries permitted until 2026-03-13 01:19:04.682507929 +0000 UTC m=+5.490628929 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "configmap-kubelet-serving-ca-bundle" (UniqueName: "kubernetes.io/configmap/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-configmap-kubelet-serving-ca-bundle") pod "metrics-server-5575f756f4-hqr5q" (UID: "9db888f0-51b6-43cf-8337-69d2d5cc2b0a") : failed to sync configmap cache: timed out waiting for the condition
Mar 13 01:19:03.683178 master-0 kubenswrapper[19170]: I0313 01:19:03.683141 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca"
Mar 13 01:19:03.687771 master-0 kubenswrapper[19170]: E0313 01:19:03.687715 19170 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-server-audit-profiles: failed to sync configmap cache: timed out waiting for the condition
Mar 13 01:19:03.687937 master-0 kubenswrapper[19170]: E0313 01:19:03.687792 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-metrics-server-audit-profiles podName:9db888f0-51b6-43cf-8337-69d2d5cc2b0a nodeName:}" failed. No retries permitted until 2026-03-13 01:19:04.687773727 +0000 UTC m=+5.495894717 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-server-audit-profiles" (UniqueName: "kubernetes.io/configmap/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-metrics-server-audit-profiles") pod "metrics-server-5575f756f4-hqr5q" (UID: "9db888f0-51b6-43cf-8337-69d2d5cc2b0a") : failed to sync configmap cache: timed out waiting for the condition
Mar 13 01:19:03.687937 master-0 kubenswrapper[19170]: E0313 01:19:03.687915 19170 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: failed to sync secret cache: timed out waiting for the condition
Mar 13 01:19:03.688145 master-0 kubenswrapper[19170]: E0313 01:19:03.688002 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30f7537e-93ed-466b-ba24-78141d004b2f-kube-state-metrics-tls podName:30f7537e-93ed-466b-ba24-78141d004b2f nodeName:}" failed. No retries permitted until 2026-03-13 01:19:04.687975683 +0000 UTC m=+5.496096703 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/30f7537e-93ed-466b-ba24-78141d004b2f-kube-state-metrics-tls") pod "kube-state-metrics-68b88f8cb5-plwwd" (UID: "30f7537e-93ed-466b-ba24-78141d004b2f") : failed to sync secret cache: timed out waiting for the condition
Mar 13 01:19:03.688323 master-0 kubenswrapper[19170]: E0313 01:19:03.688287 19170 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition
Mar 13 01:19:03.688583 master-0 kubenswrapper[19170]: E0313 01:19:03.688537 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30f7537e-93ed-466b-ba24-78141d004b2f-kube-state-metrics-kube-rbac-proxy-config podName:30f7537e-93ed-466b-ba24-78141d004b2f nodeName:}" failed. No retries permitted until 2026-03-13 01:19:04.688509717 +0000 UTC m=+5.496630757 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-state-metrics-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/30f7537e-93ed-466b-ba24-78141d004b2f-kube-state-metrics-kube-rbac-proxy-config") pod "kube-state-metrics-68b88f8cb5-plwwd" (UID: "30f7537e-93ed-466b-ba24-78141d004b2f") : failed to sync secret cache: timed out waiting for the condition
Mar 13 01:19:03.688811 master-0 kubenswrapper[19170]: E0313 01:19:03.688605 19170 configmap.go:193] Couldn't get configMap openshift-cloud-controller-manager-operator/cloud-controller-manager-images: failed to sync configmap cache: timed out waiting for the condition
Mar 13 01:19:03.688991 master-0 kubenswrapper[19170]: E0313 01:19:03.688969 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-images podName:b3a9c0f6-cfde-4ae8-952a-00e2fb862482 nodeName:}" failed. No retries permitted until 2026-03-13 01:19:04.688947998 +0000 UTC m=+5.497068988 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-images") pod "cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" (UID: "b3a9c0f6-cfde-4ae8-952a-00e2fb862482") : failed to sync configmap cache: timed out waiting for the condition
Mar 13 01:19:03.689172 master-0 kubenswrapper[19170]: E0313 01:19:03.688665 19170 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-tls: failed to sync secret cache: timed out waiting for the condition
Mar 13 01:19:03.689399 master-0 kubenswrapper[19170]: E0313 01:19:03.689367 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-secret-metrics-server-tls podName:9db888f0-51b6-43cf-8337-69d2d5cc2b0a nodeName:}" failed. No retries permitted until 2026-03-13 01:19:04.689342869 +0000 UTC m=+5.497463939 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-metrics-server-tls" (UniqueName: "kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-secret-metrics-server-tls") pod "metrics-server-5575f756f4-hqr5q" (UID: "9db888f0-51b6-43cf-8337-69d2d5cc2b0a") : failed to sync secret cache: timed out waiting for the condition
Mar 13 01:19:03.693460 master-0 kubenswrapper[19170]: I0313 01:19:03.693370 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 01:19:03.693762 master-0 kubenswrapper[19170]: I0313 01:19:03.693715 19170 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 13 01:19:03.703020 master-0 kubenswrapper[19170]: I0313 01:19:03.702980 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 01:19:03.703776 master-0 kubenswrapper[19170]: I0313 01:19:03.703714 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-4lcm6"
Mar 13 01:19:03.722815 master-0 kubenswrapper[19170]: I0313 01:19:03.722766 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls"
Mar 13 01:19:03.742894 master-0 kubenswrapper[19170]: I0313 01:19:03.742829 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls"
Mar 13 01:19:03.762929 master-0 kubenswrapper[19170]: I0313 01:19:03.762879 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config"
Mar 13 01:19:03.784845 master-0 kubenswrapper[19170]: I0313 01:19:03.783708 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-b1qe6h41gh39q"
Mar 13 01:19:03.802323 master-0 kubenswrapper[19170]: I0313 01:19:03.802270 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-h6wj2"
Mar 13 01:19:03.823233 master-0 kubenswrapper[19170]: I0313 01:19:03.823167 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls"
Mar 13 01:19:03.842765 master-0 kubenswrapper[19170]: I0313 01:19:03.842703 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle"
Mar 13 01:19:03.863476 master-0 kubenswrapper[19170]: I0313 01:19:03.863413 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-tkjm8"
Mar 13 01:19:03.869829 master-0 kubenswrapper[19170]: I0313 01:19:03.869779 19170 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 13 01:19:03.883212 master-0 kubenswrapper[19170]: I0313 01:19:03.883164 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap"
Mar 13 01:19:03.903291 master-0 kubenswrapper[19170]: I0313 01:19:03.903236 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config"
Mar 13 01:19:03.923693 master-0 kubenswrapper[19170]: I0313 01:19:03.923620 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images"
Mar 13 01:19:03.932755 master-0 kubenswrapper[19170]: I0313 01:19:03.932709 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 01:19:03.939733 master-0 kubenswrapper[19170]: I0313 01:19:03.939701 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 01:19:03.943754 master-0 kubenswrapper[19170]: I0313 01:19:03.943722 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles"
Mar 13 01:19:03.988191 master-0 kubenswrapper[19170]: I0313 01:19:03.988104 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmvs5\" (UniqueName: \"kubernetes.io/projected/4c5174b9-ca9e-4917-ab3a-ca403ce4f017-kube-api-access-mmvs5\") pod \"cluster-node-tuning-operator-66c7586884-4m9c9\" (UID: \"4c5174b9-ca9e-4917-ab3a-ca403ce4f017\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-4m9c9"
Mar 13 01:19:04.000989 master-0 kubenswrapper[19170]: I0313 01:19:04.000915 19170 request.go:700] Waited for 2.993646282s due to client-side throttling, not priority and fairness, request: POST:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-image-registry/serviceaccounts/cluster-image-registry-operator/token
Mar 13 01:19:04.006445 master-0 kubenswrapper[19170]: I0313 01:19:04.006395 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7mn4\" (UniqueName: \"kubernetes.io/projected/48375ae2-d4b4-4db4-b832-3e3db1834fb9-kube-api-access-q7mn4\") pod \"network-check-source-7c67b67d47-5fv6h\" (UID: \"48375ae2-d4b4-4db4-b832-3e3db1834fb9\") " pod="openshift-network-diagnostics/network-check-source-7c67b67d47-5fv6h"
Mar 13 01:19:04.026738 master-0 kubenswrapper[19170]: I0313 01:19:04.026625 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f97819d0-2840-4352-a435-19ef1a8c22c9-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-jjdk8\" (UID: \"f97819d0-2840-4352-a435-19ef1a8c22c9\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8"
Mar 13 01:19:04.050821 master-0 kubenswrapper[19170]: I0313 01:19:04.050679 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzklz\" (UniqueName: \"kubernetes.io/projected/f97819d0-2840-4352-a435-19ef1a8c22c9-kube-api-access-fzklz\") pod \"cluster-image-registry-operator-86d6d77c7c-jjdk8\" (UID: \"f97819d0-2840-4352-a435-19ef1a8c22c9\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-jjdk8"
Mar 13 01:19:04.057834 master-0 kubenswrapper[19170]: I0313 01:19:04.057782 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhcll\" (UniqueName: \"kubernetes.io/projected/95d4e785-6663-417d-b380-6905773613c8-kube-api-access-nhcll\") pod \"service-ca-operator-69b6fc6b88-2v42g\" (UID: \"95d4e785-6663-417d-b380-6905773613c8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-2v42g"
Mar 13 01:19:04.087178 master-0 kubenswrapper[19170]: I0313 01:19:04.087091 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwm5w\" (UniqueName: \"kubernetes.io/projected/ca2fa86b-a966-49dc-8577-d2b54b111d14-kube-api-access-gwm5w\") pod \"cluster-storage-operator-6fbfc8dc8f-c2xl8\" (UID: \"ca2fa86b-a966-49dc-8577-d2b54b111d14\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-c2xl8"
Mar 13 01:19:04.118106 master-0 kubenswrapper[19170]: I0313 01:19:04.118002 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spxfj\" (UniqueName: \"kubernetes.io/projected/2937cbe2-3125-4c3f-96f8-2febeb5942cc-kube-api-access-spxfj\") pod \"multus-rvt5h\" (UID: \"2937cbe2-3125-4c3f-96f8-2febeb5942cc\") " pod="openshift-multus/multus-rvt5h"
Mar 13 01:19:04.129726 master-0 kubenswrapper[19170]: I0313 01:19:04.129616 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg54c\" (UniqueName: \"kubernetes.io/projected/21cbea73-f779-43e4-b5ba-d6fa06275d34-kube-api-access-wg54c\") pod \"etcd-operator-5884b9cd56-h4kkj\" (UID: \"21cbea73-f779-43e4-b5ba-d6fa06275d34\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-h4kkj"
Mar 13 01:19:04.146860 master-0 kubenswrapper[19170]: I0313 01:19:04.146795 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/58035e42-37d8-48f6-9861-9b4ce6014119-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-v9pv6\" (UID: \"58035e42-37d8-48f6-9861-9b4ce6014119\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-v9pv6"
Mar 13 01:19:04.164295 master-0 kubenswrapper[19170]: I0313 01:19:04.164157 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-mlslx\" (UID: \"b61ae6f3-d8eb-4803-a0bf-8aab29c8bd35\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mlslx"
Mar 13 01:19:04.189042 master-0 kubenswrapper[19170]: I0313 01:19:04.188988 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg7x6\" (UniqueName: \"kubernetes.io/projected/916d9fc9-388b-4506-a17c-36a7f626356a-kube-api-access-jg7x6\") pod \"kube-storage-version-migrator-operator-7f65c457f5-v9nfg\" (UID: \"916d9fc9-388b-4506-a17c-36a7f626356a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-v9nfg"
Mar 13 01:19:04.198890 master-0 kubenswrapper[19170]: I0313 01:19:04.198853 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnxgm\" (UniqueName: \"kubernetes.io/projected/d5456c8b-3c98-4824-8700-a04e9c12fb2e-kube-api-access-mnxgm\") pod \"network-check-target-xs8pt\" (UID: \"d5456c8b-3c98-4824-8700-a04e9c12fb2e\") " pod="openshift-network-diagnostics/network-check-target-xs8pt"
Mar 13 01:19:04.230039 master-0 kubenswrapper[19170]: I0313 01:19:04.229982 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zd92\" (UniqueName: \"kubernetes.io/projected/7f35cc1e-3376-4dbd-b215-2a32bf62cc71-kube-api-access-5zd92\") pod \"catalog-operator-7d9c49f57b-h46pz\" (UID: \"7f35cc1e-3376-4dbd-b215-2a32bf62cc71\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz"
Mar 13 01:19:04.249483 master-0 kubenswrapper[19170]: I0313 01:19:04.249395 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz7v9\" (UniqueName: \"kubernetes.io/projected/7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71-kube-api-access-bz7v9\") pod \"cluster-monitoring-operator-674cbfbd9d-2tr2t\" (UID: \"7d5ef34c-4581-4b6c-95aa-d7d6ac60ea71\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-2tr2t"
Mar 13 01:19:04.268157 master-0 kubenswrapper[19170]: I0313 01:19:04.268068 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh2bv\" (UniqueName: \"kubernetes.io/projected/13fac7b0-ce55-467d-9d0c-6a122d87cb3c-kube-api-access-wh2bv\") pod \"dns-operator-589895fbb7-qvl2k\" (UID: \"13fac7b0-ce55-467d-9d0c-6a122d87cb3c\") " pod="openshift-dns-operator/dns-operator-589895fbb7-qvl2k"
Mar 13 01:19:04.288074 master-0 kubenswrapper[19170]: I0313 01:19:04.287991 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxf58\" (UniqueName: \"kubernetes.io/projected/8c6bf2d5-1881-4b63-b247-7e7426707fa1-kube-api-access-vxf58\") pod \"cluster-baremetal-operator-5cdb4c5598-47sjr\" (UID: \"8c6bf2d5-1881-4b63-b247-7e7426707fa1\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr"
Mar 13 01:19:04.298507 master-0 kubenswrapper[19170]: I0313 01:19:04.298443 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkgvg\" (UniqueName: \"kubernetes.io/projected/3d2e7338-a6d6-4872-ab72-a4e631075ab3-kube-api-access-vkgvg\") pod \"csi-snapshot-controller-7577d6f48-2slj5\" (UID: \"3d2e7338-a6d6-4872-ab72-a4e631075ab3\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-2slj5"
Mar 13 01:19:04.327542 master-0 kubenswrapper[19170]: I0313 01:19:04.327409 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5n7m\" (UniqueName: \"kubernetes.io/projected/46662e51-44af-4732-83a1-9509a579b373-kube-api-access-m5n7m\") pod \"iptables-alerter-qclwv\" (UID: \"46662e51-44af-4732-83a1-9509a579b373\") " pod="openshift-network-operator/iptables-alerter-qclwv"
Mar 13 01:19:04.350382 master-0 kubenswrapper[19170]: I0313 01:19:04.350277 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74tvv\" (UniqueName: \"kubernetes.io/projected/486c7e33-3dd8-4a98-87e3-8216ee2e05ef-kube-api-access-74tvv\") pod \"openshift-apiserver-operator-799b6db4d7-mvmt2\" (UID: \"486c7e33-3dd8-4a98-87e3-8216ee2e05ef\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-mvmt2"
Mar 13 01:19:04.359611 master-0 kubenswrapper[19170]: I0313 01:19:04.359545 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-bound-sa-token\") pod \"ingress-operator-677db989d6-kdn2l\" (UID: \"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l"
Mar 13 01:19:04.379554 master-0 kubenswrapper[19170]: I0313 01:19:04.379464 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz8jz\" (UniqueName: \"kubernetes.io/projected/ae44526f-5858-42a0-ba77-3a22f171456f-kube-api-access-mz8jz\") pod \"csi-snapshot-controller-operator-5685fbc7d-7nstm\" (UID: \"ae44526f-5858-42a0-ba77-3a22f171456f\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-7nstm"
Mar 13 01:19:04.399131 master-0 kubenswrapper[19170]: I0313 01:19:04.399032 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxctn\" (UniqueName: \"kubernetes.io/projected/78d2cd80-23b9-426d-a7ac-1daa27668a47-kube-api-access-mxctn\") pod \"marketplace-operator-64bf9778cb-dszg5\" (UID: \"78d2cd80-23b9-426d-a7ac-1daa27668a47\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5"
Mar 13 01:19:04.418662 master-0 kubenswrapper[19170]: I0313 01:19:04.418573 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4rfg\" (UniqueName: \"kubernetes.io/projected/da44d750-31e5-46f4-b3ef-dd4384c22aaf-kube-api-access-n4rfg\") pod \"service-ca-84bfdbbb7f-qr9tk\" (UID: \"da44d750-31e5-46f4-b3ef-dd4384c22aaf\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-qr9tk"
Mar 13 01:19:04.447158 master-0 kubenswrapper[19170]: I0313 01:19:04.447027 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkdbm\" (UniqueName: \"kubernetes.io/projected/039acb44-a9b3-4ad6-a091-be4d18edc34f-kube-api-access-kkdbm\") pod \"openshift-controller-manager-operator-8565d84698-sslxh\" (UID: \"039acb44-a9b3-4ad6-a091-be4d18edc34f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-sslxh"
Mar 13 01:19:04.459031 master-0 kubenswrapper[19170]: I0313 01:19:04.458977 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdzjn\" (UniqueName: \"kubernetes.io/projected/bfc49699-9428-4bff-804d-da0e60551759-kube-api-access-zdzjn\") pod \"network-operator-7c649bf6d4-bdc4j\" (UID: \"bfc49699-9428-4bff-804d-da0e60551759\") " pod="openshift-network-operator/network-operator-7c649bf6d4-bdc4j"
Mar 13 01:19:04.475175 master-0 kubenswrapper[19170]: I0313 01:19:04.475119 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjhjj\" (UniqueName: \"kubernetes.io/projected/e68ab3cb-c372-45d9-a758-beaf4c213714-kube-api-access-zjhjj\") pod \"network-metrics-daemon-zh5fh\" (UID: \"e68ab3cb-c372-45d9-a758-beaf4c213714\") " pod="openshift-multus/network-metrics-daemon-zh5fh"
Mar 13 01:19:04.499946 master-0 kubenswrapper[19170]: I0313 01:19:04.499846 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mnf5\" (UniqueName: \"kubernetes.io/projected/6c88187c-d011-4043-a6d3-4a8a7ec4e204-kube-api-access-7mnf5\") pod \"olm-operator-d64cfc9db-8l7kq\" (UID: \"6c88187c-d011-4043-a6d3-4a8a7ec4e204\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq"
Mar 13 01:19:04.523714 master-0 kubenswrapper[19170]: I0313 01:19:04.523613 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22587300-2448-4862-9fd8-68197d17a9f2-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-wbmqn\" (UID: \"22587300-2448-4862-9fd8-68197d17a9f2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-wbmqn"
Mar 13 01:19:04.555116 master-0 kubenswrapper[19170]: I0313 01:19:04.555061 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29w76\" (UniqueName: \"kubernetes.io/projected/70c097a1-90d9-4344-b0ae-5a59ec2ad8ad-kube-api-access-29w76\") pod \"ingress-operator-677db989d6-kdn2l\" (UID: \"70c097a1-90d9-4344-b0ae-5a59ec2ad8ad\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-kdn2l"
Mar 13 01:19:04.556750 master-0 kubenswrapper[19170]: I0313 01:19:04.556709 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv2rb\" (UniqueName: \"kubernetes.io/projected/1308fba1-a50d-48b3-b272-7bef44727b7f-kube-api-access-zv2rb\") pod \"ovnkube-control-plane-66b55d57d-cjmvd\" (UID: \"1308fba1-a50d-48b3-b272-7bef44727b7f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd"
Mar 13 01:19:04.586700 master-0 kubenswrapper[19170]: I0313 01:19:04.586508 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg77t\" (UniqueName: \"kubernetes.io/projected/64477504-5cb6-42dc-a7eb-662981daec4a-kube-api-access-gg77t\") pod \"migrator-57ccdf9b5-kxxzc\" (UID: \"64477504-5cb6-42dc-a7eb-662981daec4a\") " pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-kxxzc"
Mar 13 01:19:04.607973 master-0 kubenswrapper[19170]: I0313 01:19:04.607914 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrf2s\" (UniqueName: \"kubernetes.io/projected/61fb4b86-f978-4ae1-80bc-18d2f386cbc2-kube-api-access-lrf2s\") pod \"cluster-olm-operator-77899cf6d-ck7rt\" (UID: \"61fb4b86-f978-4ae1-80bc-18d2f386cbc2\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-ck7rt"
Mar 13 01:19:04.619289 master-0 kubenswrapper[19170]: I0313 01:19:04.619243 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tfdv\" (UniqueName: \"kubernetes.io/projected/4edb3e1a-9082-4fc2-ae6f-99d49c078a34-kube-api-access-6tfdv\") pod \"ovnkube-node-v56ct\" (UID: \"4edb3e1a-9082-4fc2-ae6f-99d49c078a34\") " pod="openshift-ovn-kubernetes/ovnkube-node-v56ct"
Mar 13 01:19:04.637326 master-0 kubenswrapper[19170]: I0313 01:19:04.637273 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl6k6\" (UniqueName: \"kubernetes.io/projected/738ebdcd-b78b-495a-b8f2-84af11a7d35c-kube-api-access-tl6k6\") pod \"apiserver-69c74d8d69-jpj8z\" (UID: \"738ebdcd-b78b-495a-b8f2-84af11a7d35c\") " pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z"
Mar 13 01:19:04.664976 master-0 kubenswrapper[19170]: I0313 01:19:04.664909 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fppkf\" (UniqueName: \"kubernetes.io/projected/d56480e0-0885-41e5-a1fc-931a068fbadb-kube-api-access-fppkf\") pod \"openshift-config-operator-64488f9d78-bqmmf\" (UID: \"d56480e0-0885-41e5-a1fc-931a068fbadb\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf"
Mar 13 01:19:04.678963 master-0 kubenswrapper[19170]: I0313 01:19:04.678896 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gt69\" (UniqueName: \"kubernetes.io/projected/4738c93d-62e6-44ce-a289-e646b9302e71-kube-api-access-9gt69\") pod \"multus-additional-cni-plugins-xn5t5\" (UID: \"4738c93d-62e6-44ce-a289-e646b9302e71\") " pod="openshift-multus/multus-additional-cni-plugins-xn5t5"
Mar 13 01:19:04.698994 master-0 kubenswrapper[19170]: I0313 01:19:04.698902 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2m48\" (UniqueName: \"kubernetes.io/projected/4976e608-07a0-4cef-8fdd-7cec3324b4b5-kube-api-access-w2m48\") pod \"machine-config-operator-fdb5c78b5-6slg8\" (UID: \"4976e608-07a0-4cef-8fdd-7cec3324b4b5\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-6slg8"
Mar 13 01:19:04.731464 master-0 kubenswrapper[19170]: I0313 01:19:04.731406 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srlst\" (UniqueName: \"kubernetes.io/projected/d1153bb3-30dd-458f-b0a4-c05358a8b3f8-kube-api-access-srlst\") pod \"network-node-identity-znqwc\" (UID: \"d1153bb3-30dd-458f-b0a4-c05358a8b3f8\") " pod="openshift-network-node-identity/network-node-identity-znqwc"
Mar 13 01:19:04.736988 master-0 kubenswrapper[19170]: I0313 01:19:04.736941 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e2b5ad07-fa01-4330-9dce-6da3444657ab-metrics-client-ca\") pod \"prometheus-operator-5ff8674d55-6fh8b\" (UID: \"e2b5ad07-fa01-4330-9dce-6da3444657ab\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-6fh8b"
Mar 13 01:19:04.737124 master-0 kubenswrapper[19170]: I0313 01:19:04.737031 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/30f7537e-93ed-466b-ba24-78141d004b2f-metrics-client-ca\") pod \"kube-state-metrics-68b88f8cb5-plwwd\" (UID: \"30f7537e-93ed-466b-ba24-78141d004b2f\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd"
Mar 13 01:19:04.737176 master-0 kubenswrapper[19170]: I0313 01:19:04.737150 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj"
Mar 13 01:19:04.737251 master-0 kubenswrapper[19170]: I0313 01:19:04.737216 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e2b5ad07-fa01-4330-9dce-6da3444657ab-metrics-client-ca\") pod \"prometheus-operator-5ff8674d55-6fh8b\" (UID: \"e2b5ad07-fa01-4330-9dce-6da3444657ab\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-6fh8b"
Mar 13 01:19:04.737251 master-0 kubenswrapper[19170]: I0313 01:19:04.737226 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q"
Mar 13 01:19:04.737341 master-0 kubenswrapper[19170]: I0313 01:19:04.737326 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/30f7537e-93ed-466b-ba24-78141d004b2f-kube-state-metrics-tls\") pod \"kube-state-metrics-68b88f8cb5-plwwd\" (UID: \"30f7537e-93ed-466b-ba24-78141d004b2f\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd"
Mar 13 01:19:04.737448 master-0 kubenswrapper[19170]: I0313 01:19:04.737425 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-images\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n\" (UID: \"b3a9c0f6-cfde-4ae8-952a-00e2fb862482\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n"
Mar 13 01:19:04.737597 master-0 kubenswrapper[19170]: I0313 01:19:04.737561 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q"
Mar 13 01:19:04.737668 master-0 kubenswrapper[19170]: I0313 01:19:04.737644 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-metrics-server-audit-profiles\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q"
Mar 13 01:19:04.738017 master-0 kubenswrapper[19170]: I0313 01:19:04.737984 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/30f7537e-93ed-466b-ba24-78141d004b2f-metrics-client-ca\") pod \"kube-state-metrics-68b88f8cb5-plwwd\" (UID: \"30f7537e-93ed-466b-ba24-78141d004b2f\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd"
Mar 13 01:19:04.738113 master-0 kubenswrapper[19170]: I0313 01:19:04.738044 19170 operation_generator.go:637]
"MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/30f7537e-93ed-466b-ba24-78141d004b2f-kube-state-metrics-tls\") pod \"kube-state-metrics-68b88f8cb5-plwwd\" (UID: \"30f7537e-93ed-466b-ba24-78141d004b2f\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" Mar 13 01:19:04.738240 master-0 kubenswrapper[19170]: I0313 01:19:04.738210 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:19:04.738297 master-0 kubenswrapper[19170]: I0313 01:19:04.738079 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-secret-metrics-server-tls\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:19:04.738418 master-0 kubenswrapper[19170]: I0313 01:19:04.738377 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-metrics-client-ca\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:19:04.738468 master-0 kubenswrapper[19170]: I0313 01:19:04.738062 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-images\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n\" (UID: \"b3a9c0f6-cfde-4ae8-952a-00e2fb862482\") " 
pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" Mar 13 01:19:04.738514 master-0 kubenswrapper[19170]: I0313 01:19:04.738487 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-secret-metrics-server-tls\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:19:04.738554 master-0 kubenswrapper[19170]: I0313 01:19:04.738526 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-metrics-server-audit-profiles\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:19:04.738655 master-0 kubenswrapper[19170]: I0313 01:19:04.738586 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/30f7537e-93ed-466b-ba24-78141d004b2f-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-68b88f8cb5-plwwd\" (UID: \"30f7537e-93ed-466b-ba24-78141d004b2f\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" Mar 13 01:19:04.738721 master-0 kubenswrapper[19170]: I0313 01:19:04.738697 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cd7cca05-3da7-42cf-af64-6e94050e58c0-metrics-client-ca\") pod \"openshift-state-metrics-74cc79fd76-6btfg\" (UID: \"cd7cca05-3da7-42cf-af64-6e94050e58c0\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-6btfg" Mar 13 01:19:04.738762 master-0 kubenswrapper[19170]: I0313 
01:19:04.738735 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-node-exporter-tls\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:19:04.738905 master-0 kubenswrapper[19170]: I0313 01:19:04.738780 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-client-ca-bundle\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:19:04.738905 master-0 kubenswrapper[19170]: I0313 01:19:04.738822 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-metrics-client-ca\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:19:04.738905 master-0 kubenswrapper[19170]: I0313 01:19:04.738890 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/30f7537e-93ed-466b-ba24-78141d004b2f-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-68b88f8cb5-plwwd\" (UID: \"30f7537e-93ed-466b-ba24-78141d004b2f\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" Mar 13 01:19:04.739116 master-0 kubenswrapper[19170]: I0313 01:19:04.739084 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-node-exporter-tls\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " 
pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:19:04.739193 master-0 kubenswrapper[19170]: I0313 01:19:04.739092 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/30f7537e-93ed-466b-ba24-78141d004b2f-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-68b88f8cb5-plwwd\" (UID: \"30f7537e-93ed-466b-ba24-78141d004b2f\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" Mar 13 01:19:04.739193 master-0 kubenswrapper[19170]: I0313 01:19:04.739147 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cd7cca05-3da7-42cf-af64-6e94050e58c0-metrics-client-ca\") pod \"openshift-state-metrics-74cc79fd76-6btfg\" (UID: \"cd7cca05-3da7-42cf-af64-6e94050e58c0\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-6btfg" Mar 13 01:19:04.739281 master-0 kubenswrapper[19170]: I0313 01:19:04.739192 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-client-ca-bundle\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:19:04.739323 master-0 kubenswrapper[19170]: I0313 01:19:04.739294 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/30f7537e-93ed-466b-ba24-78141d004b2f-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-68b88f8cb5-plwwd\" (UID: \"30f7537e-93ed-466b-ba24-78141d004b2f\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" Mar 13 01:19:04.747781 master-0 kubenswrapper[19170]: I0313 01:19:04.747749 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4chtg\" (UniqueName: \"kubernetes.io/projected/f9b713fb-64ce-4a01-951c-1f31df62e1ae-kube-api-access-4chtg\") pod \"authentication-operator-7c6989d6c4-bxqp2\" (UID: \"f9b713fb-64ce-4a01-951c-1f31df62e1ae\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-bxqp2" Mar 13 01:19:04.758079 master-0 kubenswrapper[19170]: I0313 01:19:04.758034 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn6w7\" (UniqueName: \"kubernetes.io/projected/0d4e6150-432c-4a11-b5a6-4d62dd701fc8-kube-api-access-gn6w7\") pod \"package-server-manager-854648ff6d-nrzpj\" (UID: \"0d4e6150-432c-4a11-b5a6-4d62dd701fc8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj" Mar 13 01:19:04.785566 master-0 kubenswrapper[19170]: I0313 01:19:04.785492 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g44dw\" (UniqueName: \"kubernetes.io/projected/2c4c579b-0643-47ac-a729-017c326b0ecc-kube-api-access-g44dw\") pod \"catalogd-controller-manager-7f8b8b6f4c-7fc8j\" (UID: \"2c4c579b-0643-47ac-a729-017c326b0ecc\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:19:04.796925 master-0 kubenswrapper[19170]: I0313 01:19:04.796880 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/85149f21-7ba8-4891-82ef-0fef3d5d7863-kube-api-access\") pod \"cluster-version-operator-8c9c967c7-4d6fw\" (UID: \"85149f21-7ba8-4891-82ef-0fef3d5d7863\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-4d6fw" Mar 13 01:19:04.817876 master-0 kubenswrapper[19170]: I0313 01:19:04.817833 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx8zl\" (UniqueName: \"kubernetes.io/projected/cd7cca05-3da7-42cf-af64-6e94050e58c0-kube-api-access-gx8zl\") pod \"openshift-state-metrics-74cc79fd76-6btfg\" (UID: 
\"cd7cca05-3da7-42cf-af64-6e94050e58c0\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-6btfg" Mar 13 01:19:04.835393 master-0 kubenswrapper[19170]: I0313 01:19:04.835352 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" Mar 13 01:19:04.837026 master-0 kubenswrapper[19170]: I0313 01:19:04.836937 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt79p\" (UniqueName: \"kubernetes.io/projected/e5956ebf-01e4-4d4c-ae6d-b0995905c6d3-kube-api-access-jt79p\") pod \"machine-config-daemon-pmkpj\" (UID: \"e5956ebf-01e4-4d4c-ae6d-b0995905c6d3\") " pod="openshift-machine-config-operator/machine-config-daemon-pmkpj" Mar 13 01:19:04.842088 master-0 kubenswrapper[19170]: I0313 01:19:04.842047 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" Mar 13 01:19:04.853240 master-0 kubenswrapper[19170]: I0313 01:19:04.853179 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:19:04.853313 master-0 kubenswrapper[19170]: I0313 01:19:04.853274 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:19:04.858874 master-0 kubenswrapper[19170]: I0313 01:19:04.858813 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5gmv\" (UniqueName: \"kubernetes.io/projected/edda0d03-fdb2-4130-8f73-8057efd5815c-kube-api-access-h5gmv\") pod \"machine-config-server-4gpcz\" (UID: \"edda0d03-fdb2-4130-8f73-8057efd5815c\") " pod="openshift-machine-config-operator/machine-config-server-4gpcz" Mar 13 01:19:04.878879 master-0 kubenswrapper[19170]: I0313 01:19:04.878832 19170 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 
01:19:04.891492 master-0 kubenswrapper[19170]: I0313 01:19:04.891423 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f82n\" (UniqueName: \"kubernetes.io/projected/d278ed70-786c-4b6c-9f04-f08ede704569-kube-api-access-7f82n\") pod \"cluster-autoscaler-operator-69576476f7-2q4qb\" (UID: \"d278ed70-786c-4b6c-9f04-f08ede704569\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-2q4qb" Mar 13 01:19:04.904932 master-0 kubenswrapper[19170]: I0313 01:19:04.904224 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:04.910894 master-0 kubenswrapper[19170]: I0313 01:19:04.910863 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkjrm\" (UniqueName: \"kubernetes.io/projected/30f7537e-93ed-466b-ba24-78141d004b2f-kube-api-access-jkjrm\") pod \"kube-state-metrics-68b88f8cb5-plwwd\" (UID: \"30f7537e-93ed-466b-ba24-78141d004b2f\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-plwwd" Mar 13 01:19:04.921672 master-0 kubenswrapper[19170]: I0313 01:19:04.921599 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szx9m\" (UniqueName: \"kubernetes.io/projected/d23bbaec-b635-4649-b26e-2829f32d21f0-kube-api-access-szx9m\") pod \"certified-operators-9zvz2\" (UID: \"d23bbaec-b635-4649-b26e-2829f32d21f0\") " pod="openshift-marketplace/certified-operators-9zvz2" Mar 13 01:19:04.934618 master-0 kubenswrapper[19170]: I0313 01:19:04.934594 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:04.936891 master-0 kubenswrapper[19170]: I0313 01:19:04.936870 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42sqh\" (UniqueName: \"kubernetes.io/projected/b3a9c0f6-cfde-4ae8-952a-00e2fb862482-kube-api-access-42sqh\") pod 
\"cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n\" (UID: \"b3a9c0f6-cfde-4ae8-952a-00e2fb862482\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" Mar 13 01:19:04.961619 master-0 kubenswrapper[19170]: I0313 01:19:04.961560 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gj56\" (UniqueName: \"kubernetes.io/projected/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-kube-api-access-8gj56\") pod \"metrics-server-5575f756f4-hqr5q\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:19:04.974759 master-0 kubenswrapper[19170]: I0313 01:19:04.974710 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qlks\" (UniqueName: \"kubernetes.io/projected/58405741-598c-4bf5-bbc8-1ca8e3f10995-kube-api-access-6qlks\") pod \"dns-default-26mfw\" (UID: \"58405741-598c-4bf5-bbc8-1ca8e3f10995\") " pod="openshift-dns/dns-default-26mfw" Mar 13 01:19:04.997593 master-0 kubenswrapper[19170]: I0313 01:19:04.997541 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd849\" (UniqueName: \"kubernetes.io/projected/57eb2020-1560-4352-8b86-76db59de933a-kube-api-access-kd849\") pod \"apiserver-78885b775b-jrrjv\" (UID: \"57eb2020-1560-4352-8b86-76db59de933a\") " pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:19:05.001787 master-0 kubenswrapper[19170]: I0313 01:19:05.001718 19170 request.go:700] Waited for 3.879689436s due to client-side throttling, not priority and fairness, request: POST:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-insights/serviceaccounts/operator/token Mar 13 01:19:05.014832 master-0 kubenswrapper[19170]: I0313 01:19:05.014734 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtqjr\" (UniqueName: 
\"kubernetes.io/projected/77fd9062-0f7d-4255-92ca-7e4325daeddd-kube-api-access-vtqjr\") pod \"insights-operator-8f89dfddd-6k2t7\" (UID: \"77fd9062-0f7d-4255-92ca-7e4325daeddd\") " pod="openshift-insights/insights-operator-8f89dfddd-6k2t7" Mar 13 01:19:05.036151 master-0 kubenswrapper[19170]: I0313 01:19:05.035976 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr6vn\" (UniqueName: \"kubernetes.io/projected/3f9728b4-4e1e-4165-a276-3daa00e95839-kube-api-access-xr6vn\") pod \"redhat-operators-k52lh\" (UID: \"3f9728b4-4e1e-4165-a276-3daa00e95839\") " pod="openshift-marketplace/redhat-operators-k52lh" Mar 13 01:19:05.052141 master-0 kubenswrapper[19170]: I0313 01:19:05.052037 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:19:05.061200 master-0 kubenswrapper[19170]: I0313 01:19:05.061158 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n9fb\" (UniqueName: \"kubernetes.io/projected/2679c6e1-11c1-450c-b03a-30d7ee59ff6f-kube-api-access-4n9fb\") pod \"multus-admission-controller-7769569c45-zm2jl\" (UID: \"2679c6e1-11c1-450c-b03a-30d7ee59ff6f\") " pod="openshift-multus/multus-admission-controller-7769569c45-zm2jl" Mar 13 01:19:05.061582 master-0 kubenswrapper[19170]: I0313 01:19:05.061540 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:19:05.065082 master-0 kubenswrapper[19170]: I0313 01:19:05.065053 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0" Mar 13 01:19:05.074810 master-0 kubenswrapper[19170]: I0313 01:19:05.074755 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sddd9\" (UniqueName: \"kubernetes.io/projected/a9462e2e-728d-4076-a876-31dbbd637581-kube-api-access-sddd9\") pod 
\"route-controller-manager-5dc55b5d9c-nlg6m\" (UID: \"a9462e2e-728d-4076-a876-31dbbd637581\") " pod="openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m" Mar 13 01:19:05.094426 master-0 kubenswrapper[19170]: I0313 01:19:05.094254 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6frm\" (UniqueName: \"kubernetes.io/projected/2ce47660-f7cc-4669-a00d-83422f0f6d55-kube-api-access-d6frm\") pod \"packageserver-68f6795949-v9w8g\" (UID: \"2ce47660-f7cc-4669-a00d-83422f0f6d55\") " pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" Mar 13 01:19:05.115768 master-0 kubenswrapper[19170]: I0313 01:19:05.115665 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtrb2\" (UniqueName: \"kubernetes.io/projected/e2b5ad07-fa01-4330-9dce-6da3444657ab-kube-api-access-rtrb2\") pod \"prometheus-operator-5ff8674d55-6fh8b\" (UID: \"e2b5ad07-fa01-4330-9dce-6da3444657ab\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-6fh8b" Mar 13 01:19:05.147083 master-0 kubenswrapper[19170]: I0313 01:19:05.147006 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwnml\" (UniqueName: \"kubernetes.io/projected/a9c7c6a4-4f5b-4807-932c-1b0f53ceed22-kube-api-access-wwnml\") pod \"redhat-marketplace-z254g\" (UID: \"a9c7c6a4-4f5b-4807-932c-1b0f53ceed22\") " pod="openshift-marketplace/redhat-marketplace-z254g" Mar 13 01:19:05.154950 master-0 kubenswrapper[19170]: I0313 01:19:05.154308 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:19:05.154950 master-0 kubenswrapper[19170]: I0313 01:19:05.154383 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:19:05.154950 master-0 kubenswrapper[19170]: I0313 01:19:05.154783 19170 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-dns/dns-default-26mfw" Mar 13 01:19:05.156723 master-0 kubenswrapper[19170]: I0313 01:19:05.156659 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-26mfw" Mar 13 01:19:05.169309 master-0 kubenswrapper[19170]: I0313 01:19:05.169225 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwxhc\" (UniqueName: \"kubernetes.io/projected/04470d64-c6eb-4a62-ae75-2a1d3dfdd53a-kube-api-access-pwxhc\") pod \"node-exporter-2hgwj\" (UID: \"04470d64-c6eb-4a62-ae75-2a1d3dfdd53a\") " pod="openshift-monitoring/node-exporter-2hgwj" Mar 13 01:19:05.170335 master-0 kubenswrapper[19170]: I0313 01:19:05.170222 19170 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 01:19:05.177688 master-0 kubenswrapper[19170]: I0313 01:19:05.177600 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-929r9\" (UniqueName: \"kubernetes.io/projected/cada5bf2-e208-4fd8-bdf5-de8cad31a665-kube-api-access-929r9\") pod \"control-plane-machine-set-operator-6686554ddc-w6qs7\" (UID: \"cada5bf2-e208-4fd8-bdf5-de8cad31a665\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-w6qs7" Mar 13 01:19:05.198970 master-0 kubenswrapper[19170]: I0313 01:19:05.198882 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvn9n\" (UniqueName: \"kubernetes.io/projected/631f5719-2083-4c99-92cb-2ddc04022d86-kube-api-access-jvn9n\") pod \"controller-manager-757fb68448-cj9p5\" (UID: \"631f5719-2083-4c99-92cb-2ddc04022d86\") " pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" Mar 13 01:19:05.228193 master-0 kubenswrapper[19170]: I0313 01:19:05.228131 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gmkr\" (UniqueName: 
\"kubernetes.io/projected/a561a1d1-b20f-45fd-9e0c-ee4399a1d31b-kube-api-access-7gmkr\") pod \"community-operators-bbptx\" (UID: \"a561a1d1-b20f-45fd-9e0c-ee4399a1d31b\") " pod="openshift-marketplace/community-operators-bbptx" Mar 13 01:19:05.238442 master-0 kubenswrapper[19170]: I0313 01:19:05.238394 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v79j\" (UniqueName: \"kubernetes.io/projected/30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3-kube-api-access-6v79j\") pod \"operator-controller-controller-manager-6598bfb6c4-2wh5w\" (UID: \"30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" Mar 13 01:19:05.252363 master-0 kubenswrapper[19170]: I0313 01:19:05.252309 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltxpc\" (UniqueName: \"kubernetes.io/projected/ebf338e6-9725-47d9-8c7f-adbf11a44406-kube-api-access-ltxpc\") pod \"cluster-samples-operator-664cb58b85-2xfpz\" (UID: \"ebf338e6-9725-47d9-8c7f-adbf11a44406\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-2xfpz" Mar 13 01:19:05.272792 master-0 kubenswrapper[19170]: I0313 01:19:05.271337 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:19:05.276162 master-0 kubenswrapper[19170]: I0313 01:19:05.276093 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hhwp\" (UniqueName: \"kubernetes.io/projected/90e6e63d-3cf2-4bb5-883f-6219a0b52c3a-kube-api-access-6hhwp\") pod \"cloud-credential-operator-55d85b7b47-qslvf\" (UID: \"90e6e63d-3cf2-4bb5-883f-6219a0b52c3a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qslvf" Mar 13 01:19:05.279137 master-0 kubenswrapper[19170]: I0313 01:19:05.279070 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:19:05.295080 master-0 kubenswrapper[19170]: I0313 01:19:05.294997 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9nfk\" (UniqueName: \"kubernetes.io/projected/93871019-3d0c-4081-9afe-19b6dd108ec6-kube-api-access-s9nfk\") pod \"machine-config-controller-ff46b7bdf-g7wfh\" (UID: \"93871019-3d0c-4081-9afe-19b6dd108ec6\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-g7wfh" Mar 13 01:19:05.326065 master-0 kubenswrapper[19170]: I0313 01:19:05.326019 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqsdm\" (UniqueName: \"kubernetes.io/projected/1c24e17c-8bd9-4c23-9876-6f31c9da5cd1-kube-api-access-dqsdm\") pod \"machine-api-operator-84bf6db4f9-zt229\" (UID: \"1c24e17c-8bd9-4c23-9876-6f31c9da5cd1\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-zt229" Mar 13 01:19:05.336923 master-0 kubenswrapper[19170]: I0313 01:19:05.336866 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftn5x\" (UniqueName: \"kubernetes.io/projected/0671fdd0-b358-40f9-ae49-2c5a9004edb3-kube-api-access-ftn5x\") pod \"router-default-79f8cd6fdd-cnrhm\" (UID: \"0671fdd0-b358-40f9-ae49-2c5a9004edb3\") " pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" Mar 13 01:19:05.365134 master-0 kubenswrapper[19170]: I0313 01:19:05.365026 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpvtc\" (UniqueName: \"kubernetes.io/projected/6a5ab1d5-dabd-45e7-a688-71a282f61e67-kube-api-access-lpvtc\") pod \"tuned-9vzj5\" (UID: \"6a5ab1d5-dabd-45e7-a688-71a282f61e67\") " pod="openshift-cluster-node-tuning-operator/tuned-9vzj5" Mar 13 01:19:05.389661 master-0 kubenswrapper[19170]: I0313 01:19:05.389590 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7wj9\" 
(UniqueName: \"kubernetes.io/projected/d81bcb58-efe3-4577-8e88-67f92c645f6f-kube-api-access-k7wj9\") pod \"node-resolver-lw6xm\" (UID: \"d81bcb58-efe3-4577-8e88-67f92c645f6f\") " pod="openshift-dns/node-resolver-lw6xm" Mar 13 01:19:05.397709 master-0 kubenswrapper[19170]: I0313 01:19:05.397663 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-898lt\" (UniqueName: \"kubernetes.io/projected/33dfdc31-54a4-4249-99ae-a15180514659-kube-api-access-898lt\") pod \"machine-approver-754bdc9f9d-knlw8\" (UID: \"33dfdc31-54a4-4249-99ae-a15180514659\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-knlw8" Mar 13 01:19:05.430278 master-0 kubenswrapper[19170]: I0313 01:19:05.430226 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw5hs\" (UniqueName: \"kubernetes.io/projected/ac2a4c90-32db-4464-8c47-acbcafbcd5d0-kube-api-access-sw5hs\") pod \"cni-sysctl-allowlist-ds-hdx2d\" (UID: \"ac2a4c90-32db-4464-8c47-acbcafbcd5d0\") " pod="openshift-multus/cni-sysctl-allowlist-ds-hdx2d" Mar 13 01:19:05.447698 master-0 kubenswrapper[19170]: E0313 01:19:05.447487 19170 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 13 01:19:05.447698 master-0 kubenswrapper[19170]: E0313 01:19:05.447545 19170 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-retry-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 13 01:19:05.447698 master-0 kubenswrapper[19170]: E0313 01:19:05.447683 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/690f916b-6f87-42d9-8168-392a9177bee9-kube-api-access podName:690f916b-6f87-42d9-8168-392a9177bee9 nodeName:}" failed. No retries permitted until 2026-03-13 01:19:05.947619653 +0000 UTC m=+6.755740653 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/690f916b-6f87-42d9-8168-392a9177bee9-kube-api-access") pod "installer-1-retry-1-master-0" (UID: "690f916b-6f87-42d9-8168-392a9177bee9") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 13 01:19:05.455747 master-0 kubenswrapper[19170]: I0313 01:19:05.455489 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" Mar 13 01:19:05.885200 master-0 kubenswrapper[19170]: I0313 01:19:05.885132 19170 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 01:19:05.885200 master-0 kubenswrapper[19170]: I0313 01:19:05.885171 19170 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 01:19:05.968908 master-0 kubenswrapper[19170]: I0313 01:19:05.968864 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/690f916b-6f87-42d9-8168-392a9177bee9-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"690f916b-6f87-42d9-8168-392a9177bee9\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 13 01:19:05.969703 master-0 kubenswrapper[19170]: E0313 01:19:05.969020 19170 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 13 01:19:05.969703 master-0 kubenswrapper[19170]: E0313 01:19:05.969232 19170 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-retry-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 13 01:19:05.969703 master-0 kubenswrapper[19170]: E0313 01:19:05.969290 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/690f916b-6f87-42d9-8168-392a9177bee9-kube-api-access 
podName:690f916b-6f87-42d9-8168-392a9177bee9 nodeName:}" failed. No retries permitted until 2026-03-13 01:19:06.969268288 +0000 UTC m=+7.777389258 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/690f916b-6f87-42d9-8168-392a9177bee9-kube-api-access") pod "installer-1-retry-1-master-0" (UID: "690f916b-6f87-42d9-8168-392a9177bee9") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 13 01:19:06.458780 master-0 kubenswrapper[19170]: I0313 01:19:06.458711 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" Mar 13 01:19:06.476261 master-0 kubenswrapper[19170]: I0313 01:19:06.476214 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xs8pt" Mar 13 01:19:06.476393 master-0 kubenswrapper[19170]: I0313 01:19:06.476372 19170 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 01:19:06.477437 master-0 kubenswrapper[19170]: I0313 01:19:06.477400 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xs8pt" Mar 13 01:19:06.650937 master-0 kubenswrapper[19170]: I0313 01:19:06.650869 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:19:06.846891 master-0 kubenswrapper[19170]: I0313 01:19:06.846846 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" Mar 13 01:19:06.852249 master-0 kubenswrapper[19170]: I0313 01:19:06.852212 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" Mar 13 01:19:06.893275 master-0 kubenswrapper[19170]: I0313 01:19:06.892491 
19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" event={"ID":"9db888f0-51b6-43cf-8337-69d2d5cc2b0a","Type":"ContainerStarted","Data":"177c5ae11bdea7716ced4a8d9a5025a0f7f5545fda78c1cbb79962beaa511cbb"} Mar 13 01:19:06.893275 master-0 kubenswrapper[19170]: I0313 01:19:06.892618 19170 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 01:19:06.893275 master-0 kubenswrapper[19170]: I0313 01:19:06.892659 19170 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 01:19:06.893275 master-0 kubenswrapper[19170]: I0313 01:19:06.892709 19170 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 01:19:06.903159 master-0 kubenswrapper[19170]: I0313 01:19:06.903005 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=16.902975764 podStartE2EDuration="16.902975764s" podCreationTimestamp="2026-03-13 01:18:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:19:06.901345131 +0000 UTC m=+7.709466111" watchObservedRunningTime="2026-03-13 01:19:06.902975764 +0000 UTC m=+7.711096734" Mar 13 01:19:06.995352 master-0 kubenswrapper[19170]: I0313 01:19:06.992700 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/690f916b-6f87-42d9-8168-392a9177bee9-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"690f916b-6f87-42d9-8168-392a9177bee9\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 13 01:19:06.997978 master-0 kubenswrapper[19170]: E0313 01:19:06.997858 19170 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 13 
01:19:06.997978 master-0 kubenswrapper[19170]: E0313 01:19:06.997888 19170 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-retry-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 13 01:19:06.997978 master-0 kubenswrapper[19170]: E0313 01:19:06.997932 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/690f916b-6f87-42d9-8168-392a9177bee9-kube-api-access podName:690f916b-6f87-42d9-8168-392a9177bee9 nodeName:}" failed. No retries permitted until 2026-03-13 01:19:08.99791388 +0000 UTC m=+9.806034840 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/690f916b-6f87-42d9-8168-392a9177bee9-kube-api-access") pod "installer-1-retry-1-master-0" (UID: "690f916b-6f87-42d9-8168-392a9177bee9") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 13 01:19:07.230307 master-0 kubenswrapper[19170]: I0313 01:19:07.230254 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" Mar 13 01:19:07.234034 master-0 kubenswrapper[19170]: I0313 01:19:07.233995 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-79f8cd6fdd-cnrhm" Mar 13 01:19:07.319181 master-0 kubenswrapper[19170]: I0313 01:19:07.319067 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" Mar 13 01:19:07.319380 master-0 kubenswrapper[19170]: I0313 01:19:07.319200 19170 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 01:19:07.321656 master-0 kubenswrapper[19170]: I0313 01:19:07.321590 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" Mar 13 01:19:07.742096 master-0 kubenswrapper[19170]: I0313 01:19:07.742039 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m" Mar 13 01:19:07.742840 master-0 kubenswrapper[19170]: I0313 01:19:07.742214 19170 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 01:19:07.748010 master-0 kubenswrapper[19170]: I0313 01:19:07.747976 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m" Mar 13 01:19:08.002978 master-0 kubenswrapper[19170]: I0313 01:19:08.002822 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:19:08.003179 master-0 kubenswrapper[19170]: I0313 01:19:08.002991 19170 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 01:19:08.006875 master-0 kubenswrapper[19170]: I0313 01:19:08.006843 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:19:08.194859 master-0 kubenswrapper[19170]: I0313 01:19:08.194810 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bbptx" Mar 13 01:19:08.266287 master-0 kubenswrapper[19170]: I0313 01:19:08.266114 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq" Mar 13 01:19:08.266687 master-0 kubenswrapper[19170]: I0313 01:19:08.266618 19170 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 01:19:08.277602 master-0 kubenswrapper[19170]: I0313 01:19:08.277566 19170 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-8l7kq" Mar 13 01:19:09.026711 master-0 kubenswrapper[19170]: I0313 01:19:09.025492 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/690f916b-6f87-42d9-8168-392a9177bee9-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"690f916b-6f87-42d9-8168-392a9177bee9\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 13 01:19:09.026711 master-0 kubenswrapper[19170]: E0313 01:19:09.026138 19170 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 13 01:19:09.026711 master-0 kubenswrapper[19170]: E0313 01:19:09.026171 19170 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-retry-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 13 01:19:09.026711 master-0 kubenswrapper[19170]: E0313 01:19:09.026244 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/690f916b-6f87-42d9-8168-392a9177bee9-kube-api-access podName:690f916b-6f87-42d9-8168-392a9177bee9 nodeName:}" failed. No retries permitted until 2026-03-13 01:19:13.026218784 +0000 UTC m=+13.834339844 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/690f916b-6f87-42d9-8168-392a9177bee9-kube-api-access") pod "installer-1-retry-1-master-0" (UID: "690f916b-6f87-42d9-8168-392a9177bee9") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 13 01:19:09.145238 master-0 kubenswrapper[19170]: I0313 01:19:09.145194 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z254g" Mar 13 01:19:09.233352 master-0 kubenswrapper[19170]: I0313 01:19:09.233293 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" Mar 13 01:19:09.233618 master-0 kubenswrapper[19170]: I0313 01:19:09.233457 19170 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 01:19:09.242901 master-0 kubenswrapper[19170]: I0313 01:19:09.242847 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-68f6795949-v9w8g" Mar 13 01:19:09.314977 master-0 kubenswrapper[19170]: I0313 01:19:09.314867 19170 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 13 01:19:09.315144 master-0 kubenswrapper[19170]: I0313 01:19:09.315078 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="f417e14665db2ffffa887ce21c9ff0ed" containerName="startup-monitor" containerID="cri-o://488ba7a7a8dd3c0fa5c308fff5e919aa8bf6afd116d2c4ea7906bc2734b837de" gracePeriod=5 Mar 13 01:19:09.481442 master-0 kubenswrapper[19170]: I0313 01:19:09.481397 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Mar 13 01:19:09.498863 master-0 kubenswrapper[19170]: I0313 01:19:09.498816 19170 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0" Mar 13 01:19:09.596793 master-0 kubenswrapper[19170]: I0313 01:19:09.596675 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k52lh" Mar 13 01:19:09.654271 master-0 kubenswrapper[19170]: I0313 01:19:09.654227 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k52lh" Mar 13 01:19:09.860976 master-0 kubenswrapper[19170]: I0313 01:19:09.860856 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:19:09.865369 master-0 kubenswrapper[19170]: I0313 01:19:09.865317 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-69c74d8d69-jpj8z" Mar 13 01:19:09.939366 master-0 kubenswrapper[19170]: I0313 01:19:09.939270 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0" Mar 13 01:19:09.987663 master-0 kubenswrapper[19170]: I0313 01:19:09.983705 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bbptx" Mar 13 01:19:10.045817 master-0 kubenswrapper[19170]: I0313 01:19:10.045783 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bbptx" Mar 13 01:19:10.160772 master-0 kubenswrapper[19170]: I0313 01:19:10.160672 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:19:10.164706 master-0 kubenswrapper[19170]: I0313 01:19:10.164668 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-78885b775b-jrrjv" Mar 13 01:19:10.256688 master-0 kubenswrapper[19170]: I0313 01:19:10.255735 
19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9zvz2" Mar 13 01:19:10.456460 master-0 kubenswrapper[19170]: I0313 01:19:10.456394 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:19:10.462407 master-0 kubenswrapper[19170]: I0313 01:19:10.462352 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:19:10.553286 master-0 kubenswrapper[19170]: I0313 01:19:10.553239 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:10.553527 master-0 kubenswrapper[19170]: I0313 01:19:10.553504 19170 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 01:19:10.553584 master-0 kubenswrapper[19170]: I0313 01:19:10.553540 19170 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 01:19:10.564811 master-0 kubenswrapper[19170]: I0313 01:19:10.563285 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj" Mar 13 01:19:10.564811 master-0 kubenswrapper[19170]: I0313 01:19:10.563496 19170 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 01:19:10.568080 master-0 kubenswrapper[19170]: I0313 01:19:10.567754 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-nrzpj" Mar 13 01:19:10.590443 master-0 kubenswrapper[19170]: I0313 01:19:10.590403 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:10.850449 master-0 kubenswrapper[19170]: I0313 01:19:10.850340 19170 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-hdx2d" Mar 13 01:19:10.850449 master-0 kubenswrapper[19170]: I0313 01:19:10.850441 19170 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 01:19:10.867581 master-0 kubenswrapper[19170]: I0313 01:19:10.867531 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-hdx2d" Mar 13 01:19:10.919476 master-0 kubenswrapper[19170]: I0313 01:19:10.919420 19170 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 01:19:10.925376 master-0 kubenswrapper[19170]: I0313 01:19:10.925340 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:19:10.997403 master-0 kubenswrapper[19170]: I0313 01:19:10.997341 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bbptx" Mar 13 01:19:11.529021 master-0 kubenswrapper[19170]: I0313 01:19:11.528951 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:11.602092 master-0 kubenswrapper[19170]: I0313 01:19:11.602048 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-v56ct" Mar 13 01:19:11.669802 master-0 kubenswrapper[19170]: I0313 01:19:11.669732 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz" Mar 13 01:19:11.670053 master-0 kubenswrapper[19170]: I0313 01:19:11.669943 19170 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 01:19:11.679482 master-0 kubenswrapper[19170]: I0313 01:19:11.679429 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-h46pz" Mar 13 01:19:12.999973 master-0 kubenswrapper[19170]: I0313 01:19:12.999911 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9zvz2" Mar 13 01:19:13.051882 master-0 kubenswrapper[19170]: I0313 01:19:13.051818 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9zvz2" Mar 13 01:19:13.088836 master-0 kubenswrapper[19170]: I0313 01:19:13.088776 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/690f916b-6f87-42d9-8168-392a9177bee9-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"690f916b-6f87-42d9-8168-392a9177bee9\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 13 01:19:13.089138 master-0 kubenswrapper[19170]: E0313 01:19:13.088917 19170 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 13 01:19:13.089138 master-0 kubenswrapper[19170]: E0313 01:19:13.088932 19170 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-retry-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 13 01:19:13.089138 master-0 kubenswrapper[19170]: E0313 01:19:13.088967 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/690f916b-6f87-42d9-8168-392a9177bee9-kube-api-access podName:690f916b-6f87-42d9-8168-392a9177bee9 nodeName:}" failed. No retries permitted until 2026-03-13 01:19:21.088953702 +0000 UTC m=+21.897074662 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/690f916b-6f87-42d9-8168-392a9177bee9-kube-api-access") pod "installer-1-retry-1-master-0" (UID: "690f916b-6f87-42d9-8168-392a9177bee9") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 13 01:19:13.186595 master-0 kubenswrapper[19170]: I0313 01:19:13.186528 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" Mar 13 01:19:13.186988 master-0 kubenswrapper[19170]: I0313 01:19:13.186954 19170 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 01:19:13.190204 master-0 kubenswrapper[19170]: I0313 01:19:13.190151 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" Mar 13 01:19:13.632784 master-0 kubenswrapper[19170]: I0313 01:19:13.632710 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k52lh" Mar 13 01:19:13.633064 master-0 kubenswrapper[19170]: I0313 01:19:13.632923 19170 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 01:19:13.676048 master-0 kubenswrapper[19170]: I0313 01:19:13.675985 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-k52lh" Mar 13 01:19:13.795281 master-0 kubenswrapper[19170]: I0313 01:19:13.795239 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:19:13.975715 master-0 kubenswrapper[19170]: I0313 01:19:13.975680 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9zvz2" Mar 13 01:19:14.661287 master-0 kubenswrapper[19170]: I0313 01:19:14.659975 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:19:14.661287 master-0 kubenswrapper[19170]: I0313 01:19:14.660196 19170 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 01:19:14.662994 master-0 kubenswrapper[19170]: I0313 01:19:14.662593 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:19:14.904699 master-0 kubenswrapper[19170]: I0313 01:19:14.904435 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_f417e14665db2ffffa887ce21c9ff0ed/startup-monitor/0.log" Mar 13 01:19:14.904699 master-0 kubenswrapper[19170]: I0313 01:19:14.904530 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:19:14.951787 master-0 kubenswrapper[19170]: I0313 01:19:14.951532 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_f417e14665db2ffffa887ce21c9ff0ed/startup-monitor/0.log" Mar 13 01:19:14.951787 master-0 kubenswrapper[19170]: I0313 01:19:14.951663 19170 generic.go:334] "Generic (PLEG): container finished" podID="f417e14665db2ffffa887ce21c9ff0ed" containerID="488ba7a7a8dd3c0fa5c308fff5e919aa8bf6afd116d2c4ea7906bc2734b837de" exitCode=137 Mar 13 01:19:14.951787 master-0 kubenswrapper[19170]: I0313 01:19:14.951756 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:19:14.952126 master-0 kubenswrapper[19170]: I0313 01:19:14.951788 19170 scope.go:117] "RemoveContainer" containerID="488ba7a7a8dd3c0fa5c308fff5e919aa8bf6afd116d2c4ea7906bc2734b837de" Mar 13 01:19:14.972226 master-0 kubenswrapper[19170]: I0313 01:19:14.972196 19170 scope.go:117] "RemoveContainer" containerID="488ba7a7a8dd3c0fa5c308fff5e919aa8bf6afd116d2c4ea7906bc2734b837de" Mar 13 01:19:14.973148 master-0 kubenswrapper[19170]: E0313 01:19:14.972677 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"488ba7a7a8dd3c0fa5c308fff5e919aa8bf6afd116d2c4ea7906bc2734b837de\": container with ID starting with 488ba7a7a8dd3c0fa5c308fff5e919aa8bf6afd116d2c4ea7906bc2734b837de not found: ID does not exist" containerID="488ba7a7a8dd3c0fa5c308fff5e919aa8bf6afd116d2c4ea7906bc2734b837de" Mar 13 01:19:14.973148 master-0 kubenswrapper[19170]: I0313 01:19:14.972745 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"488ba7a7a8dd3c0fa5c308fff5e919aa8bf6afd116d2c4ea7906bc2734b837de"} err="failed to get container status \"488ba7a7a8dd3c0fa5c308fff5e919aa8bf6afd116d2c4ea7906bc2734b837de\": rpc error: code = NotFound desc = could not find container \"488ba7a7a8dd3c0fa5c308fff5e919aa8bf6afd116d2c4ea7906bc2734b837de\": container with ID starting with 488ba7a7a8dd3c0fa5c308fff5e919aa8bf6afd116d2c4ea7906bc2734b837de not found: ID does not exist" Mar 13 01:19:15.016468 master-0 kubenswrapper[19170]: I0313 01:19:15.016415 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"f417e14665db2ffffa887ce21c9ff0ed\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " Mar 13 01:19:15.016738 master-0 kubenswrapper[19170]: I0313 01:19:15.016564 19170 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"f417e14665db2ffffa887ce21c9ff0ed\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " Mar 13 01:19:15.016738 master-0 kubenswrapper[19170]: I0313 01:19:15.016597 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"f417e14665db2ffffa887ce21c9ff0ed\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " Mar 13 01:19:15.016738 master-0 kubenswrapper[19170]: I0313 01:19:15.016597 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests" (OuterVolumeSpecName: "manifests") pod "f417e14665db2ffffa887ce21c9ff0ed" (UID: "f417e14665db2ffffa887ce21c9ff0ed"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:19:15.016738 master-0 kubenswrapper[19170]: I0313 01:19:15.016665 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"f417e14665db2ffffa887ce21c9ff0ed\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " Mar 13 01:19:15.016738 master-0 kubenswrapper[19170]: I0313 01:19:15.016729 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"f417e14665db2ffffa887ce21c9ff0ed\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " Mar 13 01:19:15.017108 master-0 kubenswrapper[19170]: I0313 01:19:15.017075 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log" (OuterVolumeSpecName: "var-log") 
pod "f417e14665db2ffffa887ce21c9ff0ed" (UID: "f417e14665db2ffffa887ce21c9ff0ed"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:19:15.017475 master-0 kubenswrapper[19170]: I0313 01:19:15.017434 19170 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") on node \"master-0\" DevicePath \"\"" Mar 13 01:19:15.017475 master-0 kubenswrapper[19170]: I0313 01:19:15.017472 19170 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") on node \"master-0\" DevicePath \"\"" Mar 13 01:19:15.017755 master-0 kubenswrapper[19170]: I0313 01:19:15.017713 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock" (OuterVolumeSpecName: "var-lock") pod "f417e14665db2ffffa887ce21c9ff0ed" (UID: "f417e14665db2ffffa887ce21c9ff0ed"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:19:15.017858 master-0 kubenswrapper[19170]: I0313 01:19:15.017768 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f417e14665db2ffffa887ce21c9ff0ed" (UID: "f417e14665db2ffffa887ce21c9ff0ed"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:19:15.025776 master-0 kubenswrapper[19170]: I0313 01:19:15.025725 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f417e14665db2ffffa887ce21c9ff0ed" (UID: "f417e14665db2ffffa887ce21c9ff0ed"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:19:15.118332 master-0 kubenswrapper[19170]: I0313 01:19:15.118274 19170 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:19:15.118332 master-0 kubenswrapper[19170]: I0313 01:19:15.118317 19170 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 01:19:15.118332 master-0 kubenswrapper[19170]: I0313 01:19:15.118331 19170 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:19:15.140909 master-0 kubenswrapper[19170]: I0313 01:19:15.140870 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-z254g" Mar 13 01:19:15.188609 master-0 kubenswrapper[19170]: I0313 01:19:15.188538 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z254g" Mar 13 01:19:15.307874 master-0 kubenswrapper[19170]: I0313 01:19:15.307804 19170 kubelet.go:2706] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="ed57b3e5-06b6-4a5d-b5c6-e021c84904f3" Mar 13 01:19:15.437243 master-0 kubenswrapper[19170]: I0313 01:19:15.437169 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f417e14665db2ffffa887ce21c9ff0ed" path="/var/lib/kubelet/pods/f417e14665db2ffffa887ce21c9ff0ed/volumes" Mar 13 01:19:15.437787 master-0 kubenswrapper[19170]: I0313 01:19:15.437679 19170 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="" Mar 13 01:19:15.508199 master-0 kubenswrapper[19170]: I0313 01:19:15.490087 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 13 01:19:15.508199 master-0 kubenswrapper[19170]: I0313 01:19:15.490125 19170 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="ed57b3e5-06b6-4a5d-b5c6-e021c84904f3" Mar 13 01:19:15.508199 master-0 kubenswrapper[19170]: I0313 01:19:15.501074 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 13 01:19:15.508199 master-0 kubenswrapper[19170]: I0313 01:19:15.501160 19170 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="ed57b3e5-06b6-4a5d-b5c6-e021c84904f3" Mar 13 01:19:16.029611 master-0 kubenswrapper[19170]: I0313 01:19:16.029550 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z254g" Mar 13 01:19:17.656654 master-0 kubenswrapper[19170]: I0313 01:19:17.656580 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-6c7fb6b958-c7cfk"] Mar 13 01:19:17.657204 master-0 kubenswrapper[19170]: E0313 01:19:17.656906 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 13 01:19:17.657204 master-0 kubenswrapper[19170]: I0313 01:19:17.656926 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 13 01:19:17.657204 master-0 kubenswrapper[19170]: E0313 01:19:17.656958 19170 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="f78c05e1499b533b83f091333d61f045" containerName="cluster-policy-controller" Mar 13 01:19:17.657204 master-0 kubenswrapper[19170]: I0313 01:19:17.656969 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78c05e1499b533b83f091333d61f045" containerName="cluster-policy-controller" Mar 13 01:19:17.657204 master-0 kubenswrapper[19170]: E0313 01:19:17.656984 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abf2ead5-b97d-4160-8120-28cb8a3d843e" containerName="assisted-installer-controller" Mar 13 01:19:17.657204 master-0 kubenswrapper[19170]: I0313 01:19:17.656992 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="abf2ead5-b97d-4160-8120-28cb8a3d843e" containerName="assisted-installer-controller" Mar 13 01:19:17.657204 master-0 kubenswrapper[19170]: E0313 01:19:17.657009 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="690f916b-6f87-42d9-8168-392a9177bee9" containerName="installer" Mar 13 01:19:17.657204 master-0 kubenswrapper[19170]: I0313 01:19:17.657017 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="690f916b-6f87-42d9-8168-392a9177bee9" containerName="installer" Mar 13 01:19:17.657204 master-0 kubenswrapper[19170]: E0313 01:19:17.657035 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36b2d6ee-3ae7-444b-b327-f024a8a06ab7" containerName="installer" Mar 13 01:19:17.657204 master-0 kubenswrapper[19170]: I0313 01:19:17.657042 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="36b2d6ee-3ae7-444b-b327-f024a8a06ab7" containerName="installer" Mar 13 01:19:17.657204 master-0 kubenswrapper[19170]: E0313 01:19:17.657052 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f417e14665db2ffffa887ce21c9ff0ed" containerName="startup-monitor" Mar 13 01:19:17.657204 master-0 kubenswrapper[19170]: I0313 01:19:17.657059 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="f417e14665db2ffffa887ce21c9ff0ed" 
containerName="startup-monitor" Mar 13 01:19:17.657204 master-0 kubenswrapper[19170]: E0313 01:19:17.657069 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="728435e4-9fdb-4fea-9f5b-eb5ff5444da0" containerName="installer" Mar 13 01:19:17.657204 master-0 kubenswrapper[19170]: I0313 01:19:17.657077 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="728435e4-9fdb-4fea-9f5b-eb5ff5444da0" containerName="installer" Mar 13 01:19:17.657204 master-0 kubenswrapper[19170]: E0313 01:19:17.657091 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b3a64f4-e94f-4916-8c91-a255d987735d" containerName="installer" Mar 13 01:19:17.657204 master-0 kubenswrapper[19170]: I0313 01:19:17.657098 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b3a64f4-e94f-4916-8c91-a255d987735d" containerName="installer" Mar 13 01:19:17.657204 master-0 kubenswrapper[19170]: E0313 01:19:17.657110 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29e096ea-ca9d-477b-b0aa-1d10244d51d9" containerName="installer" Mar 13 01:19:17.657204 master-0 kubenswrapper[19170]: I0313 01:19:17.657117 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="29e096ea-ca9d-477b-b0aa-1d10244d51d9" containerName="installer" Mar 13 01:19:17.657204 master-0 kubenswrapper[19170]: E0313 01:19:17.657153 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bd94289-7109-4419-9a51-bd289082b9f5" containerName="multus-admission-controller" Mar 13 01:19:17.657204 master-0 kubenswrapper[19170]: I0313 01:19:17.657162 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bd94289-7109-4419-9a51-bd289082b9f5" containerName="multus-admission-controller" Mar 13 01:19:17.657204 master-0 kubenswrapper[19170]: E0313 01:19:17.657176 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47806631-9d60-4658-832d-f160f93f42ea" containerName="installer" Mar 13 01:19:17.657204 master-0 kubenswrapper[19170]: I0313 
01:19:17.657184 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="47806631-9d60-4658-832d-f160f93f42ea" containerName="installer" Mar 13 01:19:17.657204 master-0 kubenswrapper[19170]: E0313 01:19:17.657200 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c9c1c81-eae9-4481-9870-b598deb1dcac" containerName="pruner" Mar 13 01:19:17.657204 master-0 kubenswrapper[19170]: I0313 01:19:17.657208 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c9c1c81-eae9-4481-9870-b598deb1dcac" containerName="pruner" Mar 13 01:19:17.657204 master-0 kubenswrapper[19170]: E0313 01:19:17.657220 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bd94289-7109-4419-9a51-bd289082b9f5" containerName="kube-rbac-proxy" Mar 13 01:19:17.657204 master-0 kubenswrapper[19170]: I0313 01:19:17.657231 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bd94289-7109-4419-9a51-bd289082b9f5" containerName="kube-rbac-proxy" Mar 13 01:19:17.658007 master-0 kubenswrapper[19170]: I0313 01:19:17.657359 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="728435e4-9fdb-4fea-9f5b-eb5ff5444da0" containerName="installer" Mar 13 01:19:17.658007 master-0 kubenswrapper[19170]: I0313 01:19:17.657375 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c9c1c81-eae9-4481-9870-b598deb1dcac" containerName="pruner" Mar 13 01:19:17.658007 master-0 kubenswrapper[19170]: I0313 01:19:17.657387 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bd94289-7109-4419-9a51-bd289082b9f5" containerName="kube-rbac-proxy" Mar 13 01:19:17.658007 master-0 kubenswrapper[19170]: I0313 01:19:17.657399 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="abf2ead5-b97d-4160-8120-28cb8a3d843e" containerName="assisted-installer-controller" Mar 13 01:19:17.658007 master-0 kubenswrapper[19170]: I0313 01:19:17.657413 19170 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="36b2d6ee-3ae7-444b-b327-f024a8a06ab7" containerName="installer" Mar 13 01:19:17.658007 master-0 kubenswrapper[19170]: I0313 01:19:17.657427 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="47806631-9d60-4658-832d-f160f93f42ea" containerName="installer" Mar 13 01:19:17.658007 master-0 kubenswrapper[19170]: I0313 01:19:17.657441 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="29e096ea-ca9d-477b-b0aa-1d10244d51d9" containerName="installer" Mar 13 01:19:17.658007 master-0 kubenswrapper[19170]: I0313 01:19:17.657450 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bd94289-7109-4419-9a51-bd289082b9f5" containerName="multus-admission-controller" Mar 13 01:19:17.658007 master-0 kubenswrapper[19170]: I0313 01:19:17.657461 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b3a64f4-e94f-4916-8c91-a255d987735d" containerName="installer" Mar 13 01:19:17.658007 master-0 kubenswrapper[19170]: I0313 01:19:17.657472 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="f417e14665db2ffffa887ce21c9ff0ed" containerName="startup-monitor" Mar 13 01:19:17.658007 master-0 kubenswrapper[19170]: I0313 01:19:17.657483 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78c05e1499b533b83f091333d61f045" containerName="cluster-policy-controller" Mar 13 01:19:17.658007 master-0 kubenswrapper[19170]: I0313 01:19:17.657496 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="690f916b-6f87-42d9-8168-392a9177bee9" containerName="installer" Mar 13 01:19:17.658007 master-0 kubenswrapper[19170]: I0313 01:19:17.657508 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 13 01:19:17.658007 master-0 kubenswrapper[19170]: I0313 01:19:17.657980 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-6c7fb6b958-c7cfk" Mar 13 01:19:17.661961 master-0 kubenswrapper[19170]: I0313 01:19:17.661915 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 13 01:19:17.662092 master-0 kubenswrapper[19170]: I0313 01:19:17.661996 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 13 01:19:17.662092 master-0 kubenswrapper[19170]: I0313 01:19:17.662003 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-m57n6" Mar 13 01:19:17.662092 master-0 kubenswrapper[19170]: I0313 01:19:17.662065 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 13 01:19:17.662092 master-0 kubenswrapper[19170]: I0313 01:19:17.662072 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 13 01:19:17.662313 master-0 kubenswrapper[19170]: I0313 01:19:17.662289 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 13 01:19:17.681699 master-0 kubenswrapper[19170]: I0313 01:19:17.681659 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-6c7fb6b958-c7cfk"] Mar 13 01:19:17.760308 master-0 kubenswrapper[19170]: I0313 01:19:17.760260 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d8ea986-6e18-4d6c-86b3-b6eaa5c9d4c1-config\") pod \"console-operator-6c7fb6b958-c7cfk\" (UID: \"7d8ea986-6e18-4d6c-86b3-b6eaa5c9d4c1\") " pod="openshift-console-operator/console-operator-6c7fb6b958-c7cfk" Mar 13 01:19:17.760543 master-0 kubenswrapper[19170]: I0313 01:19:17.760501 19170 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d8ea986-6e18-4d6c-86b3-b6eaa5c9d4c1-trusted-ca\") pod \"console-operator-6c7fb6b958-c7cfk\" (UID: \"7d8ea986-6e18-4d6c-86b3-b6eaa5c9d4c1\") " pod="openshift-console-operator/console-operator-6c7fb6b958-c7cfk" Mar 13 01:19:17.760619 master-0 kubenswrapper[19170]: I0313 01:19:17.760582 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d8ea986-6e18-4d6c-86b3-b6eaa5c9d4c1-serving-cert\") pod \"console-operator-6c7fb6b958-c7cfk\" (UID: \"7d8ea986-6e18-4d6c-86b3-b6eaa5c9d4c1\") " pod="openshift-console-operator/console-operator-6c7fb6b958-c7cfk" Mar 13 01:19:17.760679 master-0 kubenswrapper[19170]: I0313 01:19:17.760614 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qk8v\" (UniqueName: \"kubernetes.io/projected/7d8ea986-6e18-4d6c-86b3-b6eaa5c9d4c1-kube-api-access-9qk8v\") pod \"console-operator-6c7fb6b958-c7cfk\" (UID: \"7d8ea986-6e18-4d6c-86b3-b6eaa5c9d4c1\") " pod="openshift-console-operator/console-operator-6c7fb6b958-c7cfk" Mar 13 01:19:17.862273 master-0 kubenswrapper[19170]: I0313 01:19:17.862216 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d8ea986-6e18-4d6c-86b3-b6eaa5c9d4c1-trusted-ca\") pod \"console-operator-6c7fb6b958-c7cfk\" (UID: \"7d8ea986-6e18-4d6c-86b3-b6eaa5c9d4c1\") " pod="openshift-console-operator/console-operator-6c7fb6b958-c7cfk" Mar 13 01:19:17.862501 master-0 kubenswrapper[19170]: E0313 01:19:17.862428 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7d8ea986-6e18-4d6c-86b3-b6eaa5c9d4c1-trusted-ca podName:7d8ea986-6e18-4d6c-86b3-b6eaa5c9d4c1 nodeName:}" failed. 
No retries permitted until 2026-03-13 01:19:18.362405774 +0000 UTC m=+19.170526734 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/7d8ea986-6e18-4d6c-86b3-b6eaa5c9d4c1-trusted-ca") pod "console-operator-6c7fb6b958-c7cfk" (UID: "7d8ea986-6e18-4d6c-86b3-b6eaa5c9d4c1") : configmap references non-existent config key: ca-bundle.crt Mar 13 01:19:17.862501 master-0 kubenswrapper[19170]: I0313 01:19:17.862475 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d8ea986-6e18-4d6c-86b3-b6eaa5c9d4c1-serving-cert\") pod \"console-operator-6c7fb6b958-c7cfk\" (UID: \"7d8ea986-6e18-4d6c-86b3-b6eaa5c9d4c1\") " pod="openshift-console-operator/console-operator-6c7fb6b958-c7cfk" Mar 13 01:19:17.862587 master-0 kubenswrapper[19170]: I0313 01:19:17.862505 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qk8v\" (UniqueName: \"kubernetes.io/projected/7d8ea986-6e18-4d6c-86b3-b6eaa5c9d4c1-kube-api-access-9qk8v\") pod \"console-operator-6c7fb6b958-c7cfk\" (UID: \"7d8ea986-6e18-4d6c-86b3-b6eaa5c9d4c1\") " pod="openshift-console-operator/console-operator-6c7fb6b958-c7cfk" Mar 13 01:19:17.862587 master-0 kubenswrapper[19170]: I0313 01:19:17.862562 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d8ea986-6e18-4d6c-86b3-b6eaa5c9d4c1-config\") pod \"console-operator-6c7fb6b958-c7cfk\" (UID: \"7d8ea986-6e18-4d6c-86b3-b6eaa5c9d4c1\") " pod="openshift-console-operator/console-operator-6c7fb6b958-c7cfk" Mar 13 01:19:17.863221 master-0 kubenswrapper[19170]: I0313 01:19:17.863176 19170 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 13 01:19:17.863567 master-0 kubenswrapper[19170]: I0313 01:19:17.863522 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d8ea986-6e18-4d6c-86b3-b6eaa5c9d4c1-config\") pod \"console-operator-6c7fb6b958-c7cfk\" (UID: \"7d8ea986-6e18-4d6c-86b3-b6eaa5c9d4c1\") " pod="openshift-console-operator/console-operator-6c7fb6b958-c7cfk" Mar 13 01:19:17.867469 master-0 kubenswrapper[19170]: I0313 01:19:17.867415 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d8ea986-6e18-4d6c-86b3-b6eaa5c9d4c1-serving-cert\") pod \"console-operator-6c7fb6b958-c7cfk\" (UID: \"7d8ea986-6e18-4d6c-86b3-b6eaa5c9d4c1\") " pod="openshift-console-operator/console-operator-6c7fb6b958-c7cfk" Mar 13 01:19:17.894279 master-0 kubenswrapper[19170]: I0313 01:19:17.894204 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qk8v\" (UniqueName: \"kubernetes.io/projected/7d8ea986-6e18-4d6c-86b3-b6eaa5c9d4c1-kube-api-access-9qk8v\") pod \"console-operator-6c7fb6b958-c7cfk\" (UID: \"7d8ea986-6e18-4d6c-86b3-b6eaa5c9d4c1\") " pod="openshift-console-operator/console-operator-6c7fb6b958-c7cfk" Mar 13 01:19:18.371377 master-0 kubenswrapper[19170]: I0313 01:19:18.371302 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d8ea986-6e18-4d6c-86b3-b6eaa5c9d4c1-trusted-ca\") pod \"console-operator-6c7fb6b958-c7cfk\" (UID: \"7d8ea986-6e18-4d6c-86b3-b6eaa5c9d4c1\") " pod="openshift-console-operator/console-operator-6c7fb6b958-c7cfk" Mar 13 01:19:18.373665 master-0 kubenswrapper[19170]: I0313 01:19:18.373587 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/7d8ea986-6e18-4d6c-86b3-b6eaa5c9d4c1-trusted-ca\") pod \"console-operator-6c7fb6b958-c7cfk\" (UID: \"7d8ea986-6e18-4d6c-86b3-b6eaa5c9d4c1\") " pod="openshift-console-operator/console-operator-6c7fb6b958-c7cfk" Mar 13 01:19:18.566327 master-0 kubenswrapper[19170]: I0313 01:19:18.566243 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Mar 13 01:19:18.567447 master-0 kubenswrapper[19170]: I0313 01:19:18.567410 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Mar 13 01:19:18.570249 master-0 kubenswrapper[19170]: I0313 01:19:18.570219 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 13 01:19:18.570646 master-0 kubenswrapper[19170]: I0313 01:19:18.570595 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-ctg9b" Mar 13 01:19:18.579809 master-0 kubenswrapper[19170]: I0313 01:19:18.579751 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-6c7fb6b958-c7cfk" Mar 13 01:19:18.582122 master-0 kubenswrapper[19170]: I0313 01:19:18.581583 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Mar 13 01:19:18.675209 master-0 kubenswrapper[19170]: I0313 01:19:18.675138 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bf1a1318-e714-42a4-82b6-17265862b2a5-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"bf1a1318-e714-42a4-82b6-17265862b2a5\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 13 01:19:18.675777 master-0 kubenswrapper[19170]: I0313 01:19:18.675256 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bf1a1318-e714-42a4-82b6-17265862b2a5-kube-api-access\") pod \"installer-4-master-0\" (UID: \"bf1a1318-e714-42a4-82b6-17265862b2a5\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 13 01:19:18.675777 master-0 kubenswrapper[19170]: I0313 01:19:18.675392 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bf1a1318-e714-42a4-82b6-17265862b2a5-var-lock\") pod \"installer-4-master-0\" (UID: \"bf1a1318-e714-42a4-82b6-17265862b2a5\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 13 01:19:18.791233 master-0 kubenswrapper[19170]: I0313 01:19:18.790452 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bf1a1318-e714-42a4-82b6-17265862b2a5-var-lock\") pod \"installer-4-master-0\" (UID: \"bf1a1318-e714-42a4-82b6-17265862b2a5\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 13 01:19:18.791233 master-0 
kubenswrapper[19170]: I0313 01:19:18.790560 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bf1a1318-e714-42a4-82b6-17265862b2a5-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"bf1a1318-e714-42a4-82b6-17265862b2a5\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 13 01:19:18.791233 master-0 kubenswrapper[19170]: I0313 01:19:18.790669 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bf1a1318-e714-42a4-82b6-17265862b2a5-kube-api-access\") pod \"installer-4-master-0\" (UID: \"bf1a1318-e714-42a4-82b6-17265862b2a5\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 13 01:19:18.791233 master-0 kubenswrapper[19170]: I0313 01:19:18.790671 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bf1a1318-e714-42a4-82b6-17265862b2a5-var-lock\") pod \"installer-4-master-0\" (UID: \"bf1a1318-e714-42a4-82b6-17265862b2a5\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 13 01:19:18.791233 master-0 kubenswrapper[19170]: I0313 01:19:18.790790 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bf1a1318-e714-42a4-82b6-17265862b2a5-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"bf1a1318-e714-42a4-82b6-17265862b2a5\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 13 01:19:18.815089 master-0 kubenswrapper[19170]: I0313 01:19:18.815019 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bf1a1318-e714-42a4-82b6-17265862b2a5-kube-api-access\") pod \"installer-4-master-0\" (UID: \"bf1a1318-e714-42a4-82b6-17265862b2a5\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 13 
01:19:18.877700 master-0 kubenswrapper[19170]: I0313 01:19:18.868964 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-kjk8n"] Mar 13 01:19:18.877700 master-0 kubenswrapper[19170]: I0313 01:19:18.869850 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kjk8n" Mar 13 01:19:18.877700 master-0 kubenswrapper[19170]: I0313 01:19:18.874095 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-rq7qf" Mar 13 01:19:18.877700 master-0 kubenswrapper[19170]: I0313 01:19:18.874399 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 13 01:19:18.877700 master-0 kubenswrapper[19170]: I0313 01:19:18.874618 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 13 01:19:18.880780 master-0 kubenswrapper[19170]: I0313 01:19:18.880552 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 13 01:19:18.895194 master-0 kubenswrapper[19170]: I0313 01:19:18.895149 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kjk8n"] Mar 13 01:19:18.904445 master-0 kubenswrapper[19170]: I0313 01:19:18.904115 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Mar 13 01:19:18.993939 master-0 kubenswrapper[19170]: I0313 01:19:18.993519 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q96l\" (UniqueName: \"kubernetes.io/projected/7cb8fbc0-2b68-4823-ba2b-ddfa60d41502-kube-api-access-4q96l\") pod \"ingress-canary-kjk8n\" (UID: \"7cb8fbc0-2b68-4823-ba2b-ddfa60d41502\") " pod="openshift-ingress-canary/ingress-canary-kjk8n" Mar 13 01:19:18.993939 master-0 kubenswrapper[19170]: I0313 01:19:18.993775 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7cb8fbc0-2b68-4823-ba2b-ddfa60d41502-cert\") pod \"ingress-canary-kjk8n\" (UID: \"7cb8fbc0-2b68-4823-ba2b-ddfa60d41502\") " pod="openshift-ingress-canary/ingress-canary-kjk8n" Mar 13 01:19:19.095914 master-0 kubenswrapper[19170]: I0313 01:19:19.095783 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4q96l\" (UniqueName: \"kubernetes.io/projected/7cb8fbc0-2b68-4823-ba2b-ddfa60d41502-kube-api-access-4q96l\") pod \"ingress-canary-kjk8n\" (UID: \"7cb8fbc0-2b68-4823-ba2b-ddfa60d41502\") " pod="openshift-ingress-canary/ingress-canary-kjk8n" Mar 13 01:19:19.096173 master-0 kubenswrapper[19170]: I0313 01:19:19.096019 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7cb8fbc0-2b68-4823-ba2b-ddfa60d41502-cert\") pod \"ingress-canary-kjk8n\" (UID: \"7cb8fbc0-2b68-4823-ba2b-ddfa60d41502\") " pod="openshift-ingress-canary/ingress-canary-kjk8n" Mar 13 01:19:19.100724 master-0 kubenswrapper[19170]: I0313 01:19:19.100458 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7cb8fbc0-2b68-4823-ba2b-ddfa60d41502-cert\") pod \"ingress-canary-kjk8n\" (UID: 
\"7cb8fbc0-2b68-4823-ba2b-ddfa60d41502\") " pod="openshift-ingress-canary/ingress-canary-kjk8n" Mar 13 01:19:19.107531 master-0 kubenswrapper[19170]: I0313 01:19:19.107480 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-6c7fb6b958-c7cfk"] Mar 13 01:19:19.127699 master-0 kubenswrapper[19170]: I0313 01:19:19.127611 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q96l\" (UniqueName: \"kubernetes.io/projected/7cb8fbc0-2b68-4823-ba2b-ddfa60d41502-kube-api-access-4q96l\") pod \"ingress-canary-kjk8n\" (UID: \"7cb8fbc0-2b68-4823-ba2b-ddfa60d41502\") " pod="openshift-ingress-canary/ingress-canary-kjk8n" Mar 13 01:19:19.215433 master-0 kubenswrapper[19170]: I0313 01:19:19.215288 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kjk8n" Mar 13 01:19:19.324688 master-0 kubenswrapper[19170]: I0313 01:19:19.324502 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Mar 13 01:19:19.361148 master-0 kubenswrapper[19170]: I0313 01:19:19.361069 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:19:19.698036 master-0 kubenswrapper[19170]: I0313 01:19:19.697981 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kjk8n"] Mar 13 01:19:19.705048 master-0 kubenswrapper[19170]: W0313 01:19:19.704996 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cb8fbc0_2b68_4823_ba2b_ddfa60d41502.slice/crio-7493be61d397515ff66e6c12268e69bbb3b87e14954a233f90153fb50d315dc8 WatchSource:0}: Error finding container 7493be61d397515ff66e6c12268e69bbb3b87e14954a233f90153fb50d315dc8: Status 404 returned error can't find the container with id 
7493be61d397515ff66e6c12268e69bbb3b87e14954a233f90153fb50d315dc8 Mar 13 01:19:20.000137 master-0 kubenswrapper[19170]: I0313 01:19:20.000068 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kjk8n" event={"ID":"7cb8fbc0-2b68-4823-ba2b-ddfa60d41502","Type":"ContainerStarted","Data":"f3c5ad7576026a80f6b31d548b88775fc0be041301f5033c86f47aeea969c8f6"} Mar 13 01:19:20.000137 master-0 kubenswrapper[19170]: I0313 01:19:20.000121 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kjk8n" event={"ID":"7cb8fbc0-2b68-4823-ba2b-ddfa60d41502","Type":"ContainerStarted","Data":"7493be61d397515ff66e6c12268e69bbb3b87e14954a233f90153fb50d315dc8"} Mar 13 01:19:20.001763 master-0 kubenswrapper[19170]: I0313 01:19:20.001711 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"bf1a1318-e714-42a4-82b6-17265862b2a5","Type":"ContainerStarted","Data":"7f31e173b6981d921b01c4ef90be95351f61662568e535b5423ed55139bc69e0"} Mar 13 01:19:20.001763 master-0 kubenswrapper[19170]: I0313 01:19:20.001760 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"bf1a1318-e714-42a4-82b6-17265862b2a5","Type":"ContainerStarted","Data":"18d390ccbea0e205cf30f6a90982255d83fa8f8518bae570e74823ab95c85e9d"} Mar 13 01:19:20.002814 master-0 kubenswrapper[19170]: I0313 01:19:20.002781 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-6c7fb6b958-c7cfk" event={"ID":"7d8ea986-6e18-4d6c-86b3-b6eaa5c9d4c1","Type":"ContainerStarted","Data":"ab7bb534a612ab0c649af71f9d2d6e57164b39b9b760491af99aa67ac5392e80"} Mar 13 01:19:20.018777 master-0 kubenswrapper[19170]: I0313 01:19:20.018710 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-kjk8n" 
podStartSLOduration=2.018689453 podStartE2EDuration="2.018689453s" podCreationTimestamp="2026-03-13 01:19:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:19:20.017153322 +0000 UTC m=+20.825274292" watchObservedRunningTime="2026-03-13 01:19:20.018689453 +0000 UTC m=+20.826810413" Mar 13 01:19:20.043742 master-0 kubenswrapper[19170]: I0313 01:19:20.041822 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-4-master-0" podStartSLOduration=2.04180198 podStartE2EDuration="2.04180198s" podCreationTimestamp="2026-03-13 01:19:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:19:20.041085541 +0000 UTC m=+20.849206511" watchObservedRunningTime="2026-03-13 01:19:20.04180198 +0000 UTC m=+20.849922950" Mar 13 01:19:21.145500 master-0 kubenswrapper[19170]: I0313 01:19:21.145324 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/690f916b-6f87-42d9-8168-392a9177bee9-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"690f916b-6f87-42d9-8168-392a9177bee9\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 13 01:19:21.147616 master-0 kubenswrapper[19170]: E0313 01:19:21.145856 19170 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 13 01:19:21.147616 master-0 kubenswrapper[19170]: E0313 01:19:21.145875 19170 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-retry-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 13 01:19:21.147616 master-0 kubenswrapper[19170]: E0313 01:19:21.145921 
19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/690f916b-6f87-42d9-8168-392a9177bee9-kube-api-access podName:690f916b-6f87-42d9-8168-392a9177bee9 nodeName:}" failed. No retries permitted until 2026-03-13 01:19:37.145908846 +0000 UTC m=+37.954029806 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/690f916b-6f87-42d9-8168-392a9177bee9-kube-api-access") pod "installer-1-retry-1-master-0" (UID: "690f916b-6f87-42d9-8168-392a9177bee9") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 13 01:19:21.768207 master-0 kubenswrapper[19170]: I0313 01:19:21.768092 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-c6d678564-c872b"] Mar 13 01:19:21.771890 master-0 kubenswrapper[19170]: I0313 01:19:21.771790 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-c6d678564-c872b" Mar 13 01:19:21.777471 master-0 kubenswrapper[19170]: I0313 01:19:21.776527 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Mar 13 01:19:21.778408 master-0 kubenswrapper[19170]: I0313 01:19:21.778359 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-lgs6j" Mar 13 01:19:21.785271 master-0 kubenswrapper[19170]: I0313 01:19:21.785217 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-c6d678564-c872b"] Mar 13 01:19:21.865137 master-0 kubenswrapper[19170]: I0313 01:19:21.865070 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/19031e51-1c3e-412a-a914-ffdfe394103c-monitoring-plugin-cert\") pod \"monitoring-plugin-c6d678564-c872b\" (UID: \"19031e51-1c3e-412a-a914-ffdfe394103c\") 
" pod="openshift-monitoring/monitoring-plugin-c6d678564-c872b" Mar 13 01:19:21.866606 master-0 kubenswrapper[19170]: I0313 01:19:21.866559 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-rxv8s"] Mar 13 01:19:21.867459 master-0 kubenswrapper[19170]: I0313 01:19:21.867435 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rxv8s" Mar 13 01:19:21.871340 master-0 kubenswrapper[19170]: I0313 01:19:21.870697 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-gtb4f" Mar 13 01:19:21.871340 master-0 kubenswrapper[19170]: I0313 01:19:21.871109 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 13 01:19:21.966575 master-0 kubenswrapper[19170]: I0313 01:19:21.966496 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/19031e51-1c3e-412a-a914-ffdfe394103c-monitoring-plugin-cert\") pod \"monitoring-plugin-c6d678564-c872b\" (UID: \"19031e51-1c3e-412a-a914-ffdfe394103c\") " pod="openshift-monitoring/monitoring-plugin-c6d678564-c872b" Mar 13 01:19:21.966575 master-0 kubenswrapper[19170]: I0313 01:19:21.966566 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cfcd9718-ad50-4c1d-819e-dc7ff2eaa03a-host\") pod \"node-ca-rxv8s\" (UID: \"cfcd9718-ad50-4c1d-819e-dc7ff2eaa03a\") " pod="openshift-image-registry/node-ca-rxv8s" Mar 13 01:19:21.966973 master-0 kubenswrapper[19170]: I0313 01:19:21.966714 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqnsb\" (UniqueName: \"kubernetes.io/projected/cfcd9718-ad50-4c1d-819e-dc7ff2eaa03a-kube-api-access-rqnsb\") pod \"node-ca-rxv8s\" (UID: 
\"cfcd9718-ad50-4c1d-819e-dc7ff2eaa03a\") " pod="openshift-image-registry/node-ca-rxv8s" Mar 13 01:19:21.966973 master-0 kubenswrapper[19170]: I0313 01:19:21.966780 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/cfcd9718-ad50-4c1d-819e-dc7ff2eaa03a-serviceca\") pod \"node-ca-rxv8s\" (UID: \"cfcd9718-ad50-4c1d-819e-dc7ff2eaa03a\") " pod="openshift-image-registry/node-ca-rxv8s" Mar 13 01:19:21.971281 master-0 kubenswrapper[19170]: I0313 01:19:21.971240 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/19031e51-1c3e-412a-a914-ffdfe394103c-monitoring-plugin-cert\") pod \"monitoring-plugin-c6d678564-c872b\" (UID: \"19031e51-1c3e-412a-a914-ffdfe394103c\") " pod="openshift-monitoring/monitoring-plugin-c6d678564-c872b" Mar 13 01:19:22.020653 master-0 kubenswrapper[19170]: I0313 01:19:22.020460 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-6c7fb6b958-c7cfk_7d8ea986-6e18-4d6c-86b3-b6eaa5c9d4c1/console-operator/0.log" Mar 13 01:19:22.020653 master-0 kubenswrapper[19170]: I0313 01:19:22.020535 19170 generic.go:334] "Generic (PLEG): container finished" podID="7d8ea986-6e18-4d6c-86b3-b6eaa5c9d4c1" containerID="00f4c9d2a6e681e60f25a703330f19f590140797280de8a2723dba53d7f150de" exitCode=255 Mar 13 01:19:22.020653 master-0 kubenswrapper[19170]: I0313 01:19:22.020578 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-6c7fb6b958-c7cfk" event={"ID":"7d8ea986-6e18-4d6c-86b3-b6eaa5c9d4c1","Type":"ContainerDied","Data":"00f4c9d2a6e681e60f25a703330f19f590140797280de8a2723dba53d7f150de"} Mar 13 01:19:22.021268 master-0 kubenswrapper[19170]: I0313 01:19:22.021215 19170 scope.go:117] "RemoveContainer" containerID="00f4c9d2a6e681e60f25a703330f19f590140797280de8a2723dba53d7f150de" Mar 13 
01:19:22.068386 master-0 kubenswrapper[19170]: I0313 01:19:22.068302 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqnsb\" (UniqueName: \"kubernetes.io/projected/cfcd9718-ad50-4c1d-819e-dc7ff2eaa03a-kube-api-access-rqnsb\") pod \"node-ca-rxv8s\" (UID: \"cfcd9718-ad50-4c1d-819e-dc7ff2eaa03a\") " pod="openshift-image-registry/node-ca-rxv8s" Mar 13 01:19:22.068629 master-0 kubenswrapper[19170]: I0313 01:19:22.068464 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/cfcd9718-ad50-4c1d-819e-dc7ff2eaa03a-serviceca\") pod \"node-ca-rxv8s\" (UID: \"cfcd9718-ad50-4c1d-819e-dc7ff2eaa03a\") " pod="openshift-image-registry/node-ca-rxv8s" Mar 13 01:19:22.068629 master-0 kubenswrapper[19170]: I0313 01:19:22.068523 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cfcd9718-ad50-4c1d-819e-dc7ff2eaa03a-host\") pod \"node-ca-rxv8s\" (UID: \"cfcd9718-ad50-4c1d-819e-dc7ff2eaa03a\") " pod="openshift-image-registry/node-ca-rxv8s" Mar 13 01:19:22.068629 master-0 kubenswrapper[19170]: I0313 01:19:22.068611 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cfcd9718-ad50-4c1d-819e-dc7ff2eaa03a-host\") pod \"node-ca-rxv8s\" (UID: \"cfcd9718-ad50-4c1d-819e-dc7ff2eaa03a\") " pod="openshift-image-registry/node-ca-rxv8s" Mar 13 01:19:22.069429 master-0 kubenswrapper[19170]: I0313 01:19:22.069363 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/cfcd9718-ad50-4c1d-819e-dc7ff2eaa03a-serviceca\") pod \"node-ca-rxv8s\" (UID: \"cfcd9718-ad50-4c1d-819e-dc7ff2eaa03a\") " pod="openshift-image-registry/node-ca-rxv8s" Mar 13 01:19:22.090835 master-0 kubenswrapper[19170]: I0313 01:19:22.090780 19170 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-rqnsb\" (UniqueName: \"kubernetes.io/projected/cfcd9718-ad50-4c1d-819e-dc7ff2eaa03a-kube-api-access-rqnsb\") pod \"node-ca-rxv8s\" (UID: \"cfcd9718-ad50-4c1d-819e-dc7ff2eaa03a\") " pod="openshift-image-registry/node-ca-rxv8s" Mar 13 01:19:22.190457 master-0 kubenswrapper[19170]: I0313 01:19:22.190337 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-c6d678564-c872b" Mar 13 01:19:22.204363 master-0 kubenswrapper[19170]: I0313 01:19:22.204298 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rxv8s" Mar 13 01:19:22.230298 master-0 kubenswrapper[19170]: W0313 01:19:22.230248 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfcd9718_ad50_4c1d_819e_dc7ff2eaa03a.slice/crio-c2c54620f822e51d78ed0403dbeda2304cb309b9c6f3b0ca07697febaa2c288f WatchSource:0}: Error finding container c2c54620f822e51d78ed0403dbeda2304cb309b9c6f3b0ca07697febaa2c288f: Status 404 returned error can't find the container with id c2c54620f822e51d78ed0403dbeda2304cb309b9c6f3b0ca07697febaa2c288f Mar 13 01:19:22.624262 master-0 kubenswrapper[19170]: I0313 01:19:22.624156 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-c6d678564-c872b"] Mar 13 01:19:22.631890 master-0 kubenswrapper[19170]: W0313 01:19:22.631852 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19031e51_1c3e_412a_a914_ffdfe394103c.slice/crio-66f82131fe88b4742eb54248dfc258dfaa8498ad5e8a0cae5a4cd6f7dbe24428 WatchSource:0}: Error finding container 66f82131fe88b4742eb54248dfc258dfaa8498ad5e8a0cae5a4cd6f7dbe24428: Status 404 returned error can't find the container with id 66f82131fe88b4742eb54248dfc258dfaa8498ad5e8a0cae5a4cd6f7dbe24428 Mar 13 01:19:22.872120 
master-0 kubenswrapper[19170]: I0313 01:19:22.872081 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-84f57b9877-5k2pr"] Mar 13 01:19:22.873047 master-0 kubenswrapper[19170]: I0313 01:19:22.873032 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-84f57b9877-5k2pr" Mar 13 01:19:22.875360 master-0 kubenswrapper[19170]: I0313 01:19:22.875305 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 13 01:19:22.875753 master-0 kubenswrapper[19170]: I0313 01:19:22.875739 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 13 01:19:22.875915 master-0 kubenswrapper[19170]: I0313 01:19:22.875750 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-z62g7" Mar 13 01:19:22.890529 master-0 kubenswrapper[19170]: I0313 01:19:22.890486 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-84f57b9877-5k2pr"] Mar 13 01:19:22.980485 master-0 kubenswrapper[19170]: I0313 01:19:22.980414 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9m85\" (UniqueName: \"kubernetes.io/projected/81d4d1af-7da3-4848-b566-39741e905928-kube-api-access-k9m85\") pod \"downloads-84f57b9877-5k2pr\" (UID: \"81d4d1af-7da3-4848-b566-39741e905928\") " pod="openshift-console/downloads-84f57b9877-5k2pr" Mar 13 01:19:23.038780 master-0 kubenswrapper[19170]: I0313 01:19:23.038740 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-c6d678564-c872b" event={"ID":"19031e51-1c3e-412a-a914-ffdfe394103c","Type":"ContainerStarted","Data":"66f82131fe88b4742eb54248dfc258dfaa8498ad5e8a0cae5a4cd6f7dbe24428"} Mar 13 01:19:23.043430 master-0 kubenswrapper[19170]: I0313 01:19:23.043407 19170 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-6c7fb6b958-c7cfk_7d8ea986-6e18-4d6c-86b3-b6eaa5c9d4c1/console-operator/0.log" Mar 13 01:19:23.043684 master-0 kubenswrapper[19170]: I0313 01:19:23.043665 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-6c7fb6b958-c7cfk" event={"ID":"7d8ea986-6e18-4d6c-86b3-b6eaa5c9d4c1","Type":"ContainerStarted","Data":"60d2ad0ded7239363528f0ed2674790c4da71261c7a946bd6c9082ff20e0dfc2"} Mar 13 01:19:23.044098 master-0 kubenswrapper[19170]: I0313 01:19:23.044084 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-6c7fb6b958-c7cfk" Mar 13 01:19:23.055262 master-0 kubenswrapper[19170]: I0313 01:19:23.055226 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rxv8s" event={"ID":"cfcd9718-ad50-4c1d-819e-dc7ff2eaa03a","Type":"ContainerStarted","Data":"c2c54620f822e51d78ed0403dbeda2304cb309b9c6f3b0ca07697febaa2c288f"} Mar 13 01:19:23.055683 master-0 kubenswrapper[19170]: I0313 01:19:23.055669 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-6c7fb6b958-c7cfk" Mar 13 01:19:23.071965 master-0 kubenswrapper[19170]: I0313 01:19:23.071911 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-6c7fb6b958-c7cfk" podStartSLOduration=3.71390615 podStartE2EDuration="6.07189359s" podCreationTimestamp="2026-03-13 01:19:17 +0000 UTC" firstStartedPulling="2026-03-13 01:19:19.127705079 +0000 UTC m=+19.935826079" lastFinishedPulling="2026-03-13 01:19:21.485692559 +0000 UTC m=+22.293813519" observedRunningTime="2026-03-13 01:19:23.071322655 +0000 UTC m=+23.879443615" watchObservedRunningTime="2026-03-13 01:19:23.07189359 +0000 UTC m=+23.880014550" Mar 13 01:19:23.082313 master-0 kubenswrapper[19170]: I0313 
01:19:23.082284 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9m85\" (UniqueName: \"kubernetes.io/projected/81d4d1af-7da3-4848-b566-39741e905928-kube-api-access-k9m85\") pod \"downloads-84f57b9877-5k2pr\" (UID: \"81d4d1af-7da3-4848-b566-39741e905928\") " pod="openshift-console/downloads-84f57b9877-5k2pr" Mar 13 01:19:23.105970 master-0 kubenswrapper[19170]: I0313 01:19:23.105943 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9m85\" (UniqueName: \"kubernetes.io/projected/81d4d1af-7da3-4848-b566-39741e905928-kube-api-access-k9m85\") pod \"downloads-84f57b9877-5k2pr\" (UID: \"81d4d1af-7da3-4848-b566-39741e905928\") " pod="openshift-console/downloads-84f57b9877-5k2pr" Mar 13 01:19:23.190499 master-0 kubenswrapper[19170]: I0313 01:19:23.190456 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-84f57b9877-5k2pr" Mar 13 01:19:23.588933 master-0 kubenswrapper[19170]: I0313 01:19:23.588818 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-84f57b9877-5k2pr"] Mar 13 01:19:24.071313 master-0 kubenswrapper[19170]: I0313 01:19:24.071239 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-84f57b9877-5k2pr" event={"ID":"81d4d1af-7da3-4848-b566-39741e905928","Type":"ContainerStarted","Data":"a1cb642043c7fa405f3da24d356579a6e9f1b4e7fd31ec3154b270a922678a82"} Mar 13 01:19:26.089670 master-0 kubenswrapper[19170]: I0313 01:19:26.089192 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-c6d678564-c872b" event={"ID":"19031e51-1c3e-412a-a914-ffdfe394103c","Type":"ContainerStarted","Data":"1a542903f25dd012f76a18c9e62bff08a67418edc0f3ef47b2f93fa36aa6c602"} Mar 13 01:19:26.089670 master-0 kubenswrapper[19170]: I0313 01:19:26.089442 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-monitoring/monitoring-plugin-c6d678564-c872b" Mar 13 01:19:26.092185 master-0 kubenswrapper[19170]: I0313 01:19:26.092119 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rxv8s" event={"ID":"cfcd9718-ad50-4c1d-819e-dc7ff2eaa03a","Type":"ContainerStarted","Data":"5698d4f245dbfa4d293c5706adcd1c3eb1734efd3aa02778b4a87e917eb15f10"} Mar 13 01:19:26.098153 master-0 kubenswrapper[19170]: I0313 01:19:26.098084 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-c6d678564-c872b" Mar 13 01:19:26.110711 master-0 kubenswrapper[19170]: I0313 01:19:26.110598 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-c6d678564-c872b" podStartSLOduration=1.978209288 podStartE2EDuration="5.110581497s" podCreationTimestamp="2026-03-13 01:19:21 +0000 UTC" firstStartedPulling="2026-03-13 01:19:22.634102811 +0000 UTC m=+23.442223781" lastFinishedPulling="2026-03-13 01:19:25.76647503 +0000 UTC m=+26.574595990" observedRunningTime="2026-03-13 01:19:26.107588708 +0000 UTC m=+26.915709678" watchObservedRunningTime="2026-03-13 01:19:26.110581497 +0000 UTC m=+26.918702457" Mar 13 01:19:29.448986 master-0 kubenswrapper[19170]: I0313 01:19:29.448669 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-rxv8s" podStartSLOduration=4.9107150520000005 podStartE2EDuration="8.448607602s" podCreationTimestamp="2026-03-13 01:19:21 +0000 UTC" firstStartedPulling="2026-03-13 01:19:22.233601812 +0000 UTC m=+23.041722812" lastFinishedPulling="2026-03-13 01:19:25.771494402 +0000 UTC m=+26.579615362" observedRunningTime="2026-03-13 01:19:26.160601972 +0000 UTC m=+26.968722932" watchObservedRunningTime="2026-03-13 01:19:29.448607602 +0000 UTC m=+30.256728592" Mar 13 01:19:29.574829 master-0 kubenswrapper[19170]: I0313 01:19:29.574780 19170 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-console/console-68ccfc6c58-cjm5c"] Mar 13 01:19:29.576407 master-0 kubenswrapper[19170]: I0313 01:19:29.576376 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68ccfc6c58-cjm5c" Mar 13 01:19:29.581607 master-0 kubenswrapper[19170]: I0313 01:19:29.581118 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 13 01:19:29.581607 master-0 kubenswrapper[19170]: I0313 01:19:29.581255 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 13 01:19:29.581607 master-0 kubenswrapper[19170]: I0313 01:19:29.581353 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-s8vsn" Mar 13 01:19:29.581607 master-0 kubenswrapper[19170]: I0313 01:19:29.581436 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 13 01:19:29.581607 master-0 kubenswrapper[19170]: I0313 01:19:29.581501 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 13 01:19:29.587301 master-0 kubenswrapper[19170]: I0313 01:19:29.587264 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 13 01:19:29.590408 master-0 kubenswrapper[19170]: I0313 01:19:29.590389 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68ccfc6c58-cjm5c"] Mar 13 01:19:29.704449 master-0 kubenswrapper[19170]: I0313 01:19:29.704288 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e6c9432a-bfa0-4725-8cd1-8cf2967535f5-oauth-serving-cert\") pod \"console-68ccfc6c58-cjm5c\" (UID: \"e6c9432a-bfa0-4725-8cd1-8cf2967535f5\") " pod="openshift-console/console-68ccfc6c58-cjm5c" 
Mar 13 01:19:29.704449 master-0 kubenswrapper[19170]: I0313 01:19:29.704344 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e6c9432a-bfa0-4725-8cd1-8cf2967535f5-console-serving-cert\") pod \"console-68ccfc6c58-cjm5c\" (UID: \"e6c9432a-bfa0-4725-8cd1-8cf2967535f5\") " pod="openshift-console/console-68ccfc6c58-cjm5c" Mar 13 01:19:29.704449 master-0 kubenswrapper[19170]: I0313 01:19:29.704392 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e6c9432a-bfa0-4725-8cd1-8cf2967535f5-console-config\") pod \"console-68ccfc6c58-cjm5c\" (UID: \"e6c9432a-bfa0-4725-8cd1-8cf2967535f5\") " pod="openshift-console/console-68ccfc6c58-cjm5c" Mar 13 01:19:29.704760 master-0 kubenswrapper[19170]: I0313 01:19:29.704493 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e6c9432a-bfa0-4725-8cd1-8cf2967535f5-console-oauth-config\") pod \"console-68ccfc6c58-cjm5c\" (UID: \"e6c9432a-bfa0-4725-8cd1-8cf2967535f5\") " pod="openshift-console/console-68ccfc6c58-cjm5c" Mar 13 01:19:29.704760 master-0 kubenswrapper[19170]: I0313 01:19:29.704547 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e6c9432a-bfa0-4725-8cd1-8cf2967535f5-service-ca\") pod \"console-68ccfc6c58-cjm5c\" (UID: \"e6c9432a-bfa0-4725-8cd1-8cf2967535f5\") " pod="openshift-console/console-68ccfc6c58-cjm5c" Mar 13 01:19:29.704760 master-0 kubenswrapper[19170]: I0313 01:19:29.704700 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhrtt\" (UniqueName: 
\"kubernetes.io/projected/e6c9432a-bfa0-4725-8cd1-8cf2967535f5-kube-api-access-nhrtt\") pod \"console-68ccfc6c58-cjm5c\" (UID: \"e6c9432a-bfa0-4725-8cd1-8cf2967535f5\") " pod="openshift-console/console-68ccfc6c58-cjm5c" Mar 13 01:19:29.806561 master-0 kubenswrapper[19170]: I0313 01:19:29.806510 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e6c9432a-bfa0-4725-8cd1-8cf2967535f5-oauth-serving-cert\") pod \"console-68ccfc6c58-cjm5c\" (UID: \"e6c9432a-bfa0-4725-8cd1-8cf2967535f5\") " pod="openshift-console/console-68ccfc6c58-cjm5c" Mar 13 01:19:29.806561 master-0 kubenswrapper[19170]: I0313 01:19:29.806562 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e6c9432a-bfa0-4725-8cd1-8cf2967535f5-console-serving-cert\") pod \"console-68ccfc6c58-cjm5c\" (UID: \"e6c9432a-bfa0-4725-8cd1-8cf2967535f5\") " pod="openshift-console/console-68ccfc6c58-cjm5c" Mar 13 01:19:29.806836 master-0 kubenswrapper[19170]: I0313 01:19:29.806592 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e6c9432a-bfa0-4725-8cd1-8cf2967535f5-console-config\") pod \"console-68ccfc6c58-cjm5c\" (UID: \"e6c9432a-bfa0-4725-8cd1-8cf2967535f5\") " pod="openshift-console/console-68ccfc6c58-cjm5c" Mar 13 01:19:29.806904 master-0 kubenswrapper[19170]: I0313 01:19:29.806853 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e6c9432a-bfa0-4725-8cd1-8cf2967535f5-console-oauth-config\") pod \"console-68ccfc6c58-cjm5c\" (UID: \"e6c9432a-bfa0-4725-8cd1-8cf2967535f5\") " pod="openshift-console/console-68ccfc6c58-cjm5c" Mar 13 01:19:29.806961 master-0 kubenswrapper[19170]: I0313 01:19:29.806912 19170 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e6c9432a-bfa0-4725-8cd1-8cf2967535f5-service-ca\") pod \"console-68ccfc6c58-cjm5c\" (UID: \"e6c9432a-bfa0-4725-8cd1-8cf2967535f5\") " pod="openshift-console/console-68ccfc6c58-cjm5c" Mar 13 01:19:29.807013 master-0 kubenswrapper[19170]: I0313 01:19:29.806977 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhrtt\" (UniqueName: \"kubernetes.io/projected/e6c9432a-bfa0-4725-8cd1-8cf2967535f5-kube-api-access-nhrtt\") pod \"console-68ccfc6c58-cjm5c\" (UID: \"e6c9432a-bfa0-4725-8cd1-8cf2967535f5\") " pod="openshift-console/console-68ccfc6c58-cjm5c" Mar 13 01:19:29.807690 master-0 kubenswrapper[19170]: I0313 01:19:29.807649 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e6c9432a-bfa0-4725-8cd1-8cf2967535f5-oauth-serving-cert\") pod \"console-68ccfc6c58-cjm5c\" (UID: \"e6c9432a-bfa0-4725-8cd1-8cf2967535f5\") " pod="openshift-console/console-68ccfc6c58-cjm5c" Mar 13 01:19:29.807806 master-0 kubenswrapper[19170]: I0313 01:19:29.807694 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e6c9432a-bfa0-4725-8cd1-8cf2967535f5-console-config\") pod \"console-68ccfc6c58-cjm5c\" (UID: \"e6c9432a-bfa0-4725-8cd1-8cf2967535f5\") " pod="openshift-console/console-68ccfc6c58-cjm5c" Mar 13 01:19:29.808110 master-0 kubenswrapper[19170]: I0313 01:19:29.808085 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e6c9432a-bfa0-4725-8cd1-8cf2967535f5-service-ca\") pod \"console-68ccfc6c58-cjm5c\" (UID: \"e6c9432a-bfa0-4725-8cd1-8cf2967535f5\") " pod="openshift-console/console-68ccfc6c58-cjm5c" Mar 13 01:19:29.820708 master-0 kubenswrapper[19170]: I0313 01:19:29.810692 19170 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e6c9432a-bfa0-4725-8cd1-8cf2967535f5-console-serving-cert\") pod \"console-68ccfc6c58-cjm5c\" (UID: \"e6c9432a-bfa0-4725-8cd1-8cf2967535f5\") " pod="openshift-console/console-68ccfc6c58-cjm5c" Mar 13 01:19:29.820708 master-0 kubenswrapper[19170]: I0313 01:19:29.815601 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e6c9432a-bfa0-4725-8cd1-8cf2967535f5-console-oauth-config\") pod \"console-68ccfc6c58-cjm5c\" (UID: \"e6c9432a-bfa0-4725-8cd1-8cf2967535f5\") " pod="openshift-console/console-68ccfc6c58-cjm5c" Mar 13 01:19:29.823943 master-0 kubenswrapper[19170]: I0313 01:19:29.823905 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhrtt\" (UniqueName: \"kubernetes.io/projected/e6c9432a-bfa0-4725-8cd1-8cf2967535f5-kube-api-access-nhrtt\") pod \"console-68ccfc6c58-cjm5c\" (UID: \"e6c9432a-bfa0-4725-8cd1-8cf2967535f5\") " pod="openshift-console/console-68ccfc6c58-cjm5c" Mar 13 01:19:29.941234 master-0 kubenswrapper[19170]: I0313 01:19:29.941176 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-68ccfc6c58-cjm5c" Mar 13 01:19:30.365889 master-0 kubenswrapper[19170]: I0313 01:19:30.365107 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68ccfc6c58-cjm5c"] Mar 13 01:19:30.377110 master-0 kubenswrapper[19170]: W0313 01:19:30.377070 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6c9432a_bfa0_4725_8cd1_8cf2967535f5.slice/crio-ebcc7af8dcaf791a1a0e9aee557f3ea2b39086b684a51e6692020026e7866742 WatchSource:0}: Error finding container ebcc7af8dcaf791a1a0e9aee557f3ea2b39086b684a51e6692020026e7866742: Status 404 returned error can't find the container with id ebcc7af8dcaf791a1a0e9aee557f3ea2b39086b684a51e6692020026e7866742 Mar 13 01:19:31.131429 master-0 kubenswrapper[19170]: I0313 01:19:31.131374 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68ccfc6c58-cjm5c" event={"ID":"e6c9432a-bfa0-4725-8cd1-8cf2967535f5","Type":"ContainerStarted","Data":"ebcc7af8dcaf791a1a0e9aee557f3ea2b39086b684a51e6692020026e7866742"} Mar 13 01:19:31.910707 master-0 kubenswrapper[19170]: I0313 01:19:31.907664 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-575758dfc4-r6mb4"] Mar 13 01:19:31.910707 master-0 kubenswrapper[19170]: I0313 01:19:31.908527 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-575758dfc4-r6mb4" Mar 13 01:19:31.941306 master-0 kubenswrapper[19170]: I0313 01:19:31.941230 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 13 01:19:31.952149 master-0 kubenswrapper[19170]: I0313 01:19:31.952114 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-575758dfc4-r6mb4"] Mar 13 01:19:32.049581 master-0 kubenswrapper[19170]: I0313 01:19:32.049520 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmjlf\" (UniqueName: \"kubernetes.io/projected/78354ba8-21d5-4774-aa7f-8c72fee1995d-kube-api-access-tmjlf\") pod \"console-575758dfc4-r6mb4\" (UID: \"78354ba8-21d5-4774-aa7f-8c72fee1995d\") " pod="openshift-console/console-575758dfc4-r6mb4" Mar 13 01:19:32.049837 master-0 kubenswrapper[19170]: I0313 01:19:32.049593 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/78354ba8-21d5-4774-aa7f-8c72fee1995d-console-config\") pod \"console-575758dfc4-r6mb4\" (UID: \"78354ba8-21d5-4774-aa7f-8c72fee1995d\") " pod="openshift-console/console-575758dfc4-r6mb4" Mar 13 01:19:32.049893 master-0 kubenswrapper[19170]: I0313 01:19:32.049819 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/78354ba8-21d5-4774-aa7f-8c72fee1995d-console-oauth-config\") pod \"console-575758dfc4-r6mb4\" (UID: \"78354ba8-21d5-4774-aa7f-8c72fee1995d\") " pod="openshift-console/console-575758dfc4-r6mb4" Mar 13 01:19:32.050093 master-0 kubenswrapper[19170]: I0313 01:19:32.050044 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/78354ba8-21d5-4774-aa7f-8c72fee1995d-service-ca\") pod \"console-575758dfc4-r6mb4\" (UID: \"78354ba8-21d5-4774-aa7f-8c72fee1995d\") " pod="openshift-console/console-575758dfc4-r6mb4" Mar 13 01:19:32.050235 master-0 kubenswrapper[19170]: I0313 01:19:32.050096 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/78354ba8-21d5-4774-aa7f-8c72fee1995d-oauth-serving-cert\") pod \"console-575758dfc4-r6mb4\" (UID: \"78354ba8-21d5-4774-aa7f-8c72fee1995d\") " pod="openshift-console/console-575758dfc4-r6mb4" Mar 13 01:19:32.050235 master-0 kubenswrapper[19170]: I0313 01:19:32.050218 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78354ba8-21d5-4774-aa7f-8c72fee1995d-trusted-ca-bundle\") pod \"console-575758dfc4-r6mb4\" (UID: \"78354ba8-21d5-4774-aa7f-8c72fee1995d\") " pod="openshift-console/console-575758dfc4-r6mb4" Mar 13 01:19:32.050413 master-0 kubenswrapper[19170]: I0313 01:19:32.050392 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/78354ba8-21d5-4774-aa7f-8c72fee1995d-console-serving-cert\") pod \"console-575758dfc4-r6mb4\" (UID: \"78354ba8-21d5-4774-aa7f-8c72fee1995d\") " pod="openshift-console/console-575758dfc4-r6mb4" Mar 13 01:19:32.151388 master-0 kubenswrapper[19170]: I0313 01:19:32.151346 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78354ba8-21d5-4774-aa7f-8c72fee1995d-trusted-ca-bundle\") pod \"console-575758dfc4-r6mb4\" (UID: \"78354ba8-21d5-4774-aa7f-8c72fee1995d\") " pod="openshift-console/console-575758dfc4-r6mb4" Mar 13 01:19:32.151845 master-0 kubenswrapper[19170]: I0313 01:19:32.151429 19170 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/78354ba8-21d5-4774-aa7f-8c72fee1995d-console-serving-cert\") pod \"console-575758dfc4-r6mb4\" (UID: \"78354ba8-21d5-4774-aa7f-8c72fee1995d\") " pod="openshift-console/console-575758dfc4-r6mb4" Mar 13 01:19:32.151845 master-0 kubenswrapper[19170]: I0313 01:19:32.151459 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmjlf\" (UniqueName: \"kubernetes.io/projected/78354ba8-21d5-4774-aa7f-8c72fee1995d-kube-api-access-tmjlf\") pod \"console-575758dfc4-r6mb4\" (UID: \"78354ba8-21d5-4774-aa7f-8c72fee1995d\") " pod="openshift-console/console-575758dfc4-r6mb4" Mar 13 01:19:32.151845 master-0 kubenswrapper[19170]: I0313 01:19:32.151481 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/78354ba8-21d5-4774-aa7f-8c72fee1995d-console-config\") pod \"console-575758dfc4-r6mb4\" (UID: \"78354ba8-21d5-4774-aa7f-8c72fee1995d\") " pod="openshift-console/console-575758dfc4-r6mb4" Mar 13 01:19:32.151845 master-0 kubenswrapper[19170]: I0313 01:19:32.151497 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/78354ba8-21d5-4774-aa7f-8c72fee1995d-console-oauth-config\") pod \"console-575758dfc4-r6mb4\" (UID: \"78354ba8-21d5-4774-aa7f-8c72fee1995d\") " pod="openshift-console/console-575758dfc4-r6mb4" Mar 13 01:19:32.151845 master-0 kubenswrapper[19170]: I0313 01:19:32.151539 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/78354ba8-21d5-4774-aa7f-8c72fee1995d-service-ca\") pod \"console-575758dfc4-r6mb4\" (UID: \"78354ba8-21d5-4774-aa7f-8c72fee1995d\") " pod="openshift-console/console-575758dfc4-r6mb4" Mar 13 01:19:32.152051 master-0 
kubenswrapper[19170]: I0313 01:19:32.151859 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/78354ba8-21d5-4774-aa7f-8c72fee1995d-oauth-serving-cert\") pod \"console-575758dfc4-r6mb4\" (UID: \"78354ba8-21d5-4774-aa7f-8c72fee1995d\") " pod="openshift-console/console-575758dfc4-r6mb4" Mar 13 01:19:32.152743 master-0 kubenswrapper[19170]: I0313 01:19:32.152681 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/78354ba8-21d5-4774-aa7f-8c72fee1995d-oauth-serving-cert\") pod \"console-575758dfc4-r6mb4\" (UID: \"78354ba8-21d5-4774-aa7f-8c72fee1995d\") " pod="openshift-console/console-575758dfc4-r6mb4" Mar 13 01:19:32.152743 master-0 kubenswrapper[19170]: I0313 01:19:32.152729 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/78354ba8-21d5-4774-aa7f-8c72fee1995d-service-ca\") pod \"console-575758dfc4-r6mb4\" (UID: \"78354ba8-21d5-4774-aa7f-8c72fee1995d\") " pod="openshift-console/console-575758dfc4-r6mb4" Mar 13 01:19:32.152987 master-0 kubenswrapper[19170]: I0313 01:19:32.152905 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/78354ba8-21d5-4774-aa7f-8c72fee1995d-console-config\") pod \"console-575758dfc4-r6mb4\" (UID: \"78354ba8-21d5-4774-aa7f-8c72fee1995d\") " pod="openshift-console/console-575758dfc4-r6mb4" Mar 13 01:19:32.153360 master-0 kubenswrapper[19170]: I0313 01:19:32.153325 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78354ba8-21d5-4774-aa7f-8c72fee1995d-trusted-ca-bundle\") pod \"console-575758dfc4-r6mb4\" (UID: \"78354ba8-21d5-4774-aa7f-8c72fee1995d\") " pod="openshift-console/console-575758dfc4-r6mb4" Mar 13 01:19:32.154734 master-0 
kubenswrapper[19170]: I0313 01:19:32.154711 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/78354ba8-21d5-4774-aa7f-8c72fee1995d-console-oauth-config\") pod \"console-575758dfc4-r6mb4\" (UID: \"78354ba8-21d5-4774-aa7f-8c72fee1995d\") " pod="openshift-console/console-575758dfc4-r6mb4" Mar 13 01:19:32.155561 master-0 kubenswrapper[19170]: I0313 01:19:32.155537 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/78354ba8-21d5-4774-aa7f-8c72fee1995d-console-serving-cert\") pod \"console-575758dfc4-r6mb4\" (UID: \"78354ba8-21d5-4774-aa7f-8c72fee1995d\") " pod="openshift-console/console-575758dfc4-r6mb4" Mar 13 01:19:32.170621 master-0 kubenswrapper[19170]: I0313 01:19:32.170546 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmjlf\" (UniqueName: \"kubernetes.io/projected/78354ba8-21d5-4774-aa7f-8c72fee1995d-kube-api-access-tmjlf\") pod \"console-575758dfc4-r6mb4\" (UID: \"78354ba8-21d5-4774-aa7f-8c72fee1995d\") " pod="openshift-console/console-575758dfc4-r6mb4" Mar 13 01:19:32.259595 master-0 kubenswrapper[19170]: I0313 01:19:32.259164 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-575758dfc4-r6mb4" Mar 13 01:19:32.647640 master-0 kubenswrapper[19170]: I0313 01:19:32.647599 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-575758dfc4-r6mb4"] Mar 13 01:19:33.967734 master-0 kubenswrapper[19170]: W0313 01:19:33.967660 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78354ba8_21d5_4774_aa7f_8c72fee1995d.slice/crio-90bd91b509cf7d4f07d52aa508350d4502f27f957a22cb734d93d41581720044 WatchSource:0}: Error finding container 90bd91b509cf7d4f07d52aa508350d4502f27f957a22cb734d93d41581720044: Status 404 returned error can't find the container with id 90bd91b509cf7d4f07d52aa508350d4502f27f957a22cb734d93d41581720044 Mar 13 01:19:34.150469 master-0 kubenswrapper[19170]: I0313 01:19:34.150407 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-575758dfc4-r6mb4" event={"ID":"78354ba8-21d5-4774-aa7f-8c72fee1995d","Type":"ContainerStarted","Data":"90bd91b509cf7d4f07d52aa508350d4502f27f957a22cb734d93d41581720044"} Mar 13 01:19:35.162965 master-0 kubenswrapper[19170]: I0313 01:19:35.162822 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68ccfc6c58-cjm5c" event={"ID":"e6c9432a-bfa0-4725-8cd1-8cf2967535f5","Type":"ContainerStarted","Data":"34345f6c94e362f61167d8ab04d3f70a5c2ba66641f5d70d19ac7218528fc827"} Mar 13 01:19:35.166283 master-0 kubenswrapper[19170]: I0313 01:19:35.166237 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-575758dfc4-r6mb4" event={"ID":"78354ba8-21d5-4774-aa7f-8c72fee1995d","Type":"ContainerStarted","Data":"8f8d1117fb3a13d425005f9269dde217d7bd9db550efcafcd9b5d352dea722d9"} Mar 13 01:19:35.204018 master-0 kubenswrapper[19170]: I0313 01:19:35.203908 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/console-68ccfc6c58-cjm5c" podStartSLOduration=2.547906331 podStartE2EDuration="6.203877586s" podCreationTimestamp="2026-03-13 01:19:29 +0000 UTC" firstStartedPulling="2026-03-13 01:19:30.380408619 +0000 UTC m=+31.188529619" lastFinishedPulling="2026-03-13 01:19:34.036379874 +0000 UTC m=+34.844500874" observedRunningTime="2026-03-13 01:19:35.188235395 +0000 UTC m=+35.996356385" watchObservedRunningTime="2026-03-13 01:19:35.203877586 +0000 UTC m=+36.011998586" Mar 13 01:19:37.172260 master-0 kubenswrapper[19170]: I0313 01:19:37.172133 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/690f916b-6f87-42d9-8168-392a9177bee9-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"690f916b-6f87-42d9-8168-392a9177bee9\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 13 01:19:37.173267 master-0 kubenswrapper[19170]: E0313 01:19:37.172488 19170 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 13 01:19:37.173267 master-0 kubenswrapper[19170]: E0313 01:19:37.172517 19170 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-retry-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 13 01:19:37.173267 master-0 kubenswrapper[19170]: E0313 01:19:37.172591 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/690f916b-6f87-42d9-8168-392a9177bee9-kube-api-access podName:690f916b-6f87-42d9-8168-392a9177bee9 nodeName:}" failed. No retries permitted until 2026-03-13 01:20:09.172560063 +0000 UTC m=+69.980681023 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/690f916b-6f87-42d9-8168-392a9177bee9-kube-api-access") pod "installer-1-retry-1-master-0" (UID: "690f916b-6f87-42d9-8168-392a9177bee9") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 13 01:19:39.367620 master-0 kubenswrapper[19170]: I0313 01:19:39.367511 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:19:39.374956 master-0 kubenswrapper[19170]: I0313 01:19:39.374887 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:19:39.407114 master-0 kubenswrapper[19170]: I0313 01:19:39.407002 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-575758dfc4-r6mb4" podStartSLOduration=8.020299689 podStartE2EDuration="8.406974925s" podCreationTimestamp="2026-03-13 01:19:31 +0000 UTC" firstStartedPulling="2026-03-13 01:19:33.972757601 +0000 UTC m=+34.780878571" lastFinishedPulling="2026-03-13 01:19:34.359432847 +0000 UTC m=+35.167553807" observedRunningTime="2026-03-13 01:19:35.224685963 +0000 UTC m=+36.032806983" watchObservedRunningTime="2026-03-13 01:19:39.406974925 +0000 UTC m=+40.215095925" Mar 13 01:19:39.943467 master-0 kubenswrapper[19170]: I0313 01:19:39.943399 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-68ccfc6c58-cjm5c" Mar 13 01:19:39.943467 master-0 kubenswrapper[19170]: I0313 01:19:39.943468 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-68ccfc6c58-cjm5c" Mar 13 01:19:39.945094 master-0 kubenswrapper[19170]: I0313 01:19:39.945040 19170 patch_prober.go:28] interesting pod/console-68ccfc6c58-cjm5c container/console namespace/openshift-console: Startup probe status=failure output="Get 
\"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused" start-of-body= Mar 13 01:19:39.945165 master-0 kubenswrapper[19170]: I0313 01:19:39.945128 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-68ccfc6c58-cjm5c" podUID="e6c9432a-bfa0-4725-8cd1-8cf2967535f5" containerName="console" probeResult="failure" output="Get \"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused" Mar 13 01:19:41.743495 master-0 kubenswrapper[19170]: I0313 01:19:41.743413 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Mar 13 01:19:41.745619 master-0 kubenswrapper[19170]: I0313 01:19:41.744171 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Mar 13 01:19:41.746764 master-0 kubenswrapper[19170]: I0313 01:19:41.746497 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-k6d2j" Mar 13 01:19:41.746946 master-0 kubenswrapper[19170]: I0313 01:19:41.746807 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 13 01:19:41.808721 master-0 kubenswrapper[19170]: I0313 01:19:41.762507 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Mar 13 01:19:41.822642 master-0 kubenswrapper[19170]: I0313 01:19:41.822593 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/096197ed-4407-4771-bfec-436af289b243-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"096197ed-4407-4771-bfec-436af289b243\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 13 01:19:41.822747 master-0 kubenswrapper[19170]: I0313 01:19:41.822682 19170 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/096197ed-4407-4771-bfec-436af289b243-var-lock\") pod \"installer-2-master-0\" (UID: \"096197ed-4407-4771-bfec-436af289b243\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 13 01:19:41.822882 master-0 kubenswrapper[19170]: I0313 01:19:41.822851 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/096197ed-4407-4771-bfec-436af289b243-kube-api-access\") pod \"installer-2-master-0\" (UID: \"096197ed-4407-4771-bfec-436af289b243\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 13 01:19:41.924697 master-0 kubenswrapper[19170]: I0313 01:19:41.924617 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/096197ed-4407-4771-bfec-436af289b243-var-lock\") pod \"installer-2-master-0\" (UID: \"096197ed-4407-4771-bfec-436af289b243\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 13 01:19:41.924917 master-0 kubenswrapper[19170]: I0313 01:19:41.924750 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/096197ed-4407-4771-bfec-436af289b243-kube-api-access\") pod \"installer-2-master-0\" (UID: \"096197ed-4407-4771-bfec-436af289b243\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 13 01:19:41.924917 master-0 kubenswrapper[19170]: I0313 01:19:41.924807 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/096197ed-4407-4771-bfec-436af289b243-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"096197ed-4407-4771-bfec-436af289b243\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 13 01:19:41.924917 master-0 kubenswrapper[19170]: I0313 01:19:41.924873 19170 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/096197ed-4407-4771-bfec-436af289b243-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"096197ed-4407-4771-bfec-436af289b243\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 13 01:19:41.924917 master-0 kubenswrapper[19170]: I0313 01:19:41.924887 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/096197ed-4407-4771-bfec-436af289b243-var-lock\") pod \"installer-2-master-0\" (UID: \"096197ed-4407-4771-bfec-436af289b243\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 13 01:19:41.942186 master-0 kubenswrapper[19170]: I0313 01:19:41.942110 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/096197ed-4407-4771-bfec-436af289b243-kube-api-access\") pod \"installer-2-master-0\" (UID: \"096197ed-4407-4771-bfec-436af289b243\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 13 01:19:42.128424 master-0 kubenswrapper[19170]: I0313 01:19:42.128138 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Mar 13 01:19:42.259929 master-0 kubenswrapper[19170]: I0313 01:19:42.259703 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-575758dfc4-r6mb4" Mar 13 01:19:42.260513 master-0 kubenswrapper[19170]: I0313 01:19:42.260229 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-575758dfc4-r6mb4" Mar 13 01:19:42.263341 master-0 kubenswrapper[19170]: I0313 01:19:42.262946 19170 patch_prober.go:28] interesting pod/console-575758dfc4-r6mb4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 13 01:19:42.263341 master-0 kubenswrapper[19170]: I0313 01:19:42.262985 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-575758dfc4-r6mb4" podUID="78354ba8-21d5-4774-aa7f-8c72fee1995d" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 13 01:19:42.531599 master-0 kubenswrapper[19170]: I0313 01:19:42.530990 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Mar 13 01:19:42.542033 master-0 kubenswrapper[19170]: W0313 01:19:42.541984 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod096197ed_4407_4771_bfec_436af289b243.slice/crio-0750f10fe8acda674e75c80eeb8d90960e22757b52d29411e12d11458e9cb58f WatchSource:0}: Error finding container 0750f10fe8acda674e75c80eeb8d90960e22757b52d29411e12d11458e9cb58f: Status 404 returned error can't find the container with id 0750f10fe8acda674e75c80eeb8d90960e22757b52d29411e12d11458e9cb58f Mar 13 01:19:43.227735 master-0 kubenswrapper[19170]: I0313 01:19:43.227682 19170 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"096197ed-4407-4771-bfec-436af289b243","Type":"ContainerStarted","Data":"49a29482b801c2a846c3708d0930c3fe171c11461fc6e881ef0ac73d356082e9"} Mar 13 01:19:43.228214 master-0 kubenswrapper[19170]: I0313 01:19:43.227749 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"096197ed-4407-4771-bfec-436af289b243","Type":"ContainerStarted","Data":"0750f10fe8acda674e75c80eeb8d90960e22757b52d29411e12d11458e9cb58f"} Mar 13 01:19:43.265267 master-0 kubenswrapper[19170]: I0313 01:19:43.265192 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-2-master-0" podStartSLOduration=2.265173086 podStartE2EDuration="2.265173086s" podCreationTimestamp="2026-03-13 01:19:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:19:43.254832014 +0000 UTC m=+44.062953014" watchObservedRunningTime="2026-03-13 01:19:43.265173086 +0000 UTC m=+44.073294046" Mar 13 01:19:47.337283 master-0 kubenswrapper[19170]: I0313 01:19:47.337218 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-757fb68448-cj9p5"] Mar 13 01:19:47.338546 master-0 kubenswrapper[19170]: I0313 01:19:47.338476 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" podUID="631f5719-2083-4c99-92cb-2ddc04022d86" containerName="controller-manager" containerID="cri-o://e14a5c9268dcf9e208f9ac3a76c232808e5f1b8857d2cf23ad9b91be0a6dacf8" gracePeriod=30 Mar 13 01:19:47.359759 master-0 kubenswrapper[19170]: I0313 01:19:47.359253 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m"] Mar 13 
01:19:47.359759 master-0 kubenswrapper[19170]: I0313 01:19:47.359506 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m" podUID="a9462e2e-728d-4076-a876-31dbbd637581" containerName="route-controller-manager" containerID="cri-o://b84f9a3482a3526292f0e31987fb336bbf05fad12c589b42051a3df366361ed6" gracePeriod=30 Mar 13 01:19:47.847171 master-0 kubenswrapper[19170]: I0313 01:19:47.847142 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" Mar 13 01:19:47.858272 master-0 kubenswrapper[19170]: I0313 01:19:47.858240 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m" Mar 13 01:19:48.032038 master-0 kubenswrapper[19170]: I0313 01:19:48.031917 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9462e2e-728d-4076-a876-31dbbd637581-client-ca\") pod \"a9462e2e-728d-4076-a876-31dbbd637581\" (UID: \"a9462e2e-728d-4076-a876-31dbbd637581\") " Mar 13 01:19:48.032038 master-0 kubenswrapper[19170]: I0313 01:19:48.032004 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/631f5719-2083-4c99-92cb-2ddc04022d86-client-ca\") pod \"631f5719-2083-4c99-92cb-2ddc04022d86\" (UID: \"631f5719-2083-4c99-92cb-2ddc04022d86\") " Mar 13 01:19:48.032251 master-0 kubenswrapper[19170]: I0313 01:19:48.032081 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sddd9\" (UniqueName: \"kubernetes.io/projected/a9462e2e-728d-4076-a876-31dbbd637581-kube-api-access-sddd9\") pod \"a9462e2e-728d-4076-a876-31dbbd637581\" (UID: \"a9462e2e-728d-4076-a876-31dbbd637581\") " Mar 13 
01:19:48.032251 master-0 kubenswrapper[19170]: I0313 01:19:48.032105 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvn9n\" (UniqueName: \"kubernetes.io/projected/631f5719-2083-4c99-92cb-2ddc04022d86-kube-api-access-jvn9n\") pod \"631f5719-2083-4c99-92cb-2ddc04022d86\" (UID: \"631f5719-2083-4c99-92cb-2ddc04022d86\") " Mar 13 01:19:48.032251 master-0 kubenswrapper[19170]: I0313 01:19:48.032129 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/631f5719-2083-4c99-92cb-2ddc04022d86-proxy-ca-bundles\") pod \"631f5719-2083-4c99-92cb-2ddc04022d86\" (UID: \"631f5719-2083-4c99-92cb-2ddc04022d86\") " Mar 13 01:19:48.032251 master-0 kubenswrapper[19170]: I0313 01:19:48.032151 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/631f5719-2083-4c99-92cb-2ddc04022d86-serving-cert\") pod \"631f5719-2083-4c99-92cb-2ddc04022d86\" (UID: \"631f5719-2083-4c99-92cb-2ddc04022d86\") " Mar 13 01:19:48.032251 master-0 kubenswrapper[19170]: I0313 01:19:48.032171 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9462e2e-728d-4076-a876-31dbbd637581-config\") pod \"a9462e2e-728d-4076-a876-31dbbd637581\" (UID: \"a9462e2e-728d-4076-a876-31dbbd637581\") " Mar 13 01:19:48.032251 master-0 kubenswrapper[19170]: I0313 01:19:48.032206 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/631f5719-2083-4c99-92cb-2ddc04022d86-config\") pod \"631f5719-2083-4c99-92cb-2ddc04022d86\" (UID: \"631f5719-2083-4c99-92cb-2ddc04022d86\") " Mar 13 01:19:48.032251 master-0 kubenswrapper[19170]: I0313 01:19:48.032227 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/a9462e2e-728d-4076-a876-31dbbd637581-serving-cert\") pod \"a9462e2e-728d-4076-a876-31dbbd637581\" (UID: \"a9462e2e-728d-4076-a876-31dbbd637581\") " Mar 13 01:19:48.032840 master-0 kubenswrapper[19170]: I0313 01:19:48.032802 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/631f5719-2083-4c99-92cb-2ddc04022d86-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "631f5719-2083-4c99-92cb-2ddc04022d86" (UID: "631f5719-2083-4c99-92cb-2ddc04022d86"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:19:48.033071 master-0 kubenswrapper[19170]: I0313 01:19:48.033038 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9462e2e-728d-4076-a876-31dbbd637581-client-ca" (OuterVolumeSpecName: "client-ca") pod "a9462e2e-728d-4076-a876-31dbbd637581" (UID: "a9462e2e-728d-4076-a876-31dbbd637581"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:19:48.033437 master-0 kubenswrapper[19170]: I0313 01:19:48.033368 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9462e2e-728d-4076-a876-31dbbd637581-config" (OuterVolumeSpecName: "config") pod "a9462e2e-728d-4076-a876-31dbbd637581" (UID: "a9462e2e-728d-4076-a876-31dbbd637581"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:19:48.033578 master-0 kubenswrapper[19170]: I0313 01:19:48.033533 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/631f5719-2083-4c99-92cb-2ddc04022d86-config" (OuterVolumeSpecName: "config") pod "631f5719-2083-4c99-92cb-2ddc04022d86" (UID: "631f5719-2083-4c99-92cb-2ddc04022d86"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:19:48.033965 master-0 kubenswrapper[19170]: I0313 01:19:48.033863 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/631f5719-2083-4c99-92cb-2ddc04022d86-client-ca" (OuterVolumeSpecName: "client-ca") pod "631f5719-2083-4c99-92cb-2ddc04022d86" (UID: "631f5719-2083-4c99-92cb-2ddc04022d86"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:19:48.036420 master-0 kubenswrapper[19170]: I0313 01:19:48.036372 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/631f5719-2083-4c99-92cb-2ddc04022d86-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "631f5719-2083-4c99-92cb-2ddc04022d86" (UID: "631f5719-2083-4c99-92cb-2ddc04022d86"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:19:48.036476 master-0 kubenswrapper[19170]: I0313 01:19:48.036448 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9462e2e-728d-4076-a876-31dbbd637581-kube-api-access-sddd9" (OuterVolumeSpecName: "kube-api-access-sddd9") pod "a9462e2e-728d-4076-a876-31dbbd637581" (UID: "a9462e2e-728d-4076-a876-31dbbd637581"). InnerVolumeSpecName "kube-api-access-sddd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:19:48.037483 master-0 kubenswrapper[19170]: I0313 01:19:48.037450 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9462e2e-728d-4076-a876-31dbbd637581-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a9462e2e-728d-4076-a876-31dbbd637581" (UID: "a9462e2e-728d-4076-a876-31dbbd637581"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:19:48.037786 master-0 kubenswrapper[19170]: I0313 01:19:48.037749 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/631f5719-2083-4c99-92cb-2ddc04022d86-kube-api-access-jvn9n" (OuterVolumeSpecName: "kube-api-access-jvn9n") pod "631f5719-2083-4c99-92cb-2ddc04022d86" (UID: "631f5719-2083-4c99-92cb-2ddc04022d86"). InnerVolumeSpecName "kube-api-access-jvn9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:19:48.136137 master-0 kubenswrapper[19170]: I0313 01:19:48.135971 19170 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/631f5719-2083-4c99-92cb-2ddc04022d86-config\") on node \"master-0\" DevicePath \"\"" Mar 13 01:19:48.136137 master-0 kubenswrapper[19170]: I0313 01:19:48.136050 19170 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9462e2e-728d-4076-a876-31dbbd637581-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 13 01:19:48.136137 master-0 kubenswrapper[19170]: I0313 01:19:48.136060 19170 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9462e2e-728d-4076-a876-31dbbd637581-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 13 01:19:48.136137 master-0 kubenswrapper[19170]: I0313 01:19:48.136133 19170 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/631f5719-2083-4c99-92cb-2ddc04022d86-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 13 01:19:48.136137 master-0 kubenswrapper[19170]: I0313 01:19:48.136149 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sddd9\" (UniqueName: \"kubernetes.io/projected/a9462e2e-728d-4076-a876-31dbbd637581-kube-api-access-sddd9\") on node \"master-0\" DevicePath \"\"" Mar 13 01:19:48.136719 master-0 kubenswrapper[19170]: I0313 
01:19:48.136158 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvn9n\" (UniqueName: \"kubernetes.io/projected/631f5719-2083-4c99-92cb-2ddc04022d86-kube-api-access-jvn9n\") on node \"master-0\" DevicePath \"\"" Mar 13 01:19:48.136719 master-0 kubenswrapper[19170]: I0313 01:19:48.136168 19170 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/631f5719-2083-4c99-92cb-2ddc04022d86-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Mar 13 01:19:48.136719 master-0 kubenswrapper[19170]: I0313 01:19:48.136177 19170 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/631f5719-2083-4c99-92cb-2ddc04022d86-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 13 01:19:48.136719 master-0 kubenswrapper[19170]: I0313 01:19:48.136186 19170 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9462e2e-728d-4076-a876-31dbbd637581-config\") on node \"master-0\" DevicePath \"\"" Mar 13 01:19:48.273573 master-0 kubenswrapper[19170]: I0313 01:19:48.273499 19170 generic.go:334] "Generic (PLEG): container finished" podID="631f5719-2083-4c99-92cb-2ddc04022d86" containerID="e14a5c9268dcf9e208f9ac3a76c232808e5f1b8857d2cf23ad9b91be0a6dacf8" exitCode=0 Mar 13 01:19:48.273825 master-0 kubenswrapper[19170]: I0313 01:19:48.273588 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" Mar 13 01:19:48.273825 master-0 kubenswrapper[19170]: I0313 01:19:48.273629 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" event={"ID":"631f5719-2083-4c99-92cb-2ddc04022d86","Type":"ContainerDied","Data":"e14a5c9268dcf9e208f9ac3a76c232808e5f1b8857d2cf23ad9b91be0a6dacf8"} Mar 13 01:19:48.273825 master-0 kubenswrapper[19170]: I0313 01:19:48.273706 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-757fb68448-cj9p5" event={"ID":"631f5719-2083-4c99-92cb-2ddc04022d86","Type":"ContainerDied","Data":"16477c5f389a1fdfcf2af6bfe8b7efe63c0f62df56e3f2ed990e9acc1a597b7d"} Mar 13 01:19:48.273825 master-0 kubenswrapper[19170]: I0313 01:19:48.273727 19170 scope.go:117] "RemoveContainer" containerID="e14a5c9268dcf9e208f9ac3a76c232808e5f1b8857d2cf23ad9b91be0a6dacf8" Mar 13 01:19:48.280734 master-0 kubenswrapper[19170]: I0313 01:19:48.277197 19170 generic.go:334] "Generic (PLEG): container finished" podID="a9462e2e-728d-4076-a876-31dbbd637581" containerID="b84f9a3482a3526292f0e31987fb336bbf05fad12c589b42051a3df366361ed6" exitCode=0 Mar 13 01:19:48.280734 master-0 kubenswrapper[19170]: I0313 01:19:48.277245 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m" event={"ID":"a9462e2e-728d-4076-a876-31dbbd637581","Type":"ContainerDied","Data":"b84f9a3482a3526292f0e31987fb336bbf05fad12c589b42051a3df366361ed6"} Mar 13 01:19:48.280734 master-0 kubenswrapper[19170]: I0313 01:19:48.277291 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m" event={"ID":"a9462e2e-728d-4076-a876-31dbbd637581","Type":"ContainerDied","Data":"35121dd9298456a2fd716c2169a2e2eb4131993976aa53fb1bfd36bc3158f01e"} Mar 13 
01:19:48.280734 master-0 kubenswrapper[19170]: I0313 01:19:48.277316 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m" Mar 13 01:19:48.292790 master-0 kubenswrapper[19170]: I0313 01:19:48.292735 19170 scope.go:117] "RemoveContainer" containerID="06ac9b51646581f1c80a1fe6c8f1464c3cd508ac9e8d30abbb2fa8a9cb8c167d" Mar 13 01:19:48.310324 master-0 kubenswrapper[19170]: I0313 01:19:48.310235 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-757fb68448-cj9p5"] Mar 13 01:19:48.312247 master-0 kubenswrapper[19170]: I0313 01:19:48.312192 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-757fb68448-cj9p5"] Mar 13 01:19:48.335650 master-0 kubenswrapper[19170]: I0313 01:19:48.335595 19170 scope.go:117] "RemoveContainer" containerID="e14a5c9268dcf9e208f9ac3a76c232808e5f1b8857d2cf23ad9b91be0a6dacf8" Mar 13 01:19:48.336297 master-0 kubenswrapper[19170]: E0313 01:19:48.336259 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e14a5c9268dcf9e208f9ac3a76c232808e5f1b8857d2cf23ad9b91be0a6dacf8\": container with ID starting with e14a5c9268dcf9e208f9ac3a76c232808e5f1b8857d2cf23ad9b91be0a6dacf8 not found: ID does not exist" containerID="e14a5c9268dcf9e208f9ac3a76c232808e5f1b8857d2cf23ad9b91be0a6dacf8" Mar 13 01:19:48.336356 master-0 kubenswrapper[19170]: I0313 01:19:48.336326 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e14a5c9268dcf9e208f9ac3a76c232808e5f1b8857d2cf23ad9b91be0a6dacf8"} err="failed to get container status \"e14a5c9268dcf9e208f9ac3a76c232808e5f1b8857d2cf23ad9b91be0a6dacf8\": rpc error: code = NotFound desc = could not find container \"e14a5c9268dcf9e208f9ac3a76c232808e5f1b8857d2cf23ad9b91be0a6dacf8\": container 
with ID starting with e14a5c9268dcf9e208f9ac3a76c232808e5f1b8857d2cf23ad9b91be0a6dacf8 not found: ID does not exist" Mar 13 01:19:48.336409 master-0 kubenswrapper[19170]: I0313 01:19:48.336362 19170 scope.go:117] "RemoveContainer" containerID="06ac9b51646581f1c80a1fe6c8f1464c3cd508ac9e8d30abbb2fa8a9cb8c167d" Mar 13 01:19:48.336902 master-0 kubenswrapper[19170]: E0313 01:19:48.336794 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06ac9b51646581f1c80a1fe6c8f1464c3cd508ac9e8d30abbb2fa8a9cb8c167d\": container with ID starting with 06ac9b51646581f1c80a1fe6c8f1464c3cd508ac9e8d30abbb2fa8a9cb8c167d not found: ID does not exist" containerID="06ac9b51646581f1c80a1fe6c8f1464c3cd508ac9e8d30abbb2fa8a9cb8c167d" Mar 13 01:19:48.337129 master-0 kubenswrapper[19170]: I0313 01:19:48.336997 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06ac9b51646581f1c80a1fe6c8f1464c3cd508ac9e8d30abbb2fa8a9cb8c167d"} err="failed to get container status \"06ac9b51646581f1c80a1fe6c8f1464c3cd508ac9e8d30abbb2fa8a9cb8c167d\": rpc error: code = NotFound desc = could not find container \"06ac9b51646581f1c80a1fe6c8f1464c3cd508ac9e8d30abbb2fa8a9cb8c167d\": container with ID starting with 06ac9b51646581f1c80a1fe6c8f1464c3cd508ac9e8d30abbb2fa8a9cb8c167d not found: ID does not exist" Mar 13 01:19:48.337129 master-0 kubenswrapper[19170]: I0313 01:19:48.337032 19170 scope.go:117] "RemoveContainer" containerID="b84f9a3482a3526292f0e31987fb336bbf05fad12c589b42051a3df366361ed6" Mar 13 01:19:48.337843 master-0 kubenswrapper[19170]: I0313 01:19:48.337818 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m"] Mar 13 01:19:48.340576 master-0 kubenswrapper[19170]: I0313 01:19:48.340527 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m"] Mar 13 01:19:48.354307 master-0 kubenswrapper[19170]: I0313 01:19:48.354202 19170 scope.go:117] "RemoveContainer" containerID="b84f9a3482a3526292f0e31987fb336bbf05fad12c589b42051a3df366361ed6" Mar 13 01:19:48.354684 master-0 kubenswrapper[19170]: E0313 01:19:48.354645 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b84f9a3482a3526292f0e31987fb336bbf05fad12c589b42051a3df366361ed6\": container with ID starting with b84f9a3482a3526292f0e31987fb336bbf05fad12c589b42051a3df366361ed6 not found: ID does not exist" containerID="b84f9a3482a3526292f0e31987fb336bbf05fad12c589b42051a3df366361ed6" Mar 13 01:19:48.354737 master-0 kubenswrapper[19170]: I0313 01:19:48.354685 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b84f9a3482a3526292f0e31987fb336bbf05fad12c589b42051a3df366361ed6"} err="failed to get container status \"b84f9a3482a3526292f0e31987fb336bbf05fad12c589b42051a3df366361ed6\": rpc error: code = NotFound desc = could not find container \"b84f9a3482a3526292f0e31987fb336bbf05fad12c589b42051a3df366361ed6\": container with ID starting with b84f9a3482a3526292f0e31987fb336bbf05fad12c589b42051a3df366361ed6 not found: ID does not exist" Mar 13 01:19:48.673450 master-0 kubenswrapper[19170]: I0313 01:19:48.673379 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-d8dbf7c4d-v2gdg"] Mar 13 01:19:48.673797 master-0 kubenswrapper[19170]: E0313 01:19:48.673767 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="631f5719-2083-4c99-92cb-2ddc04022d86" containerName="controller-manager" Mar 13 01:19:48.673797 master-0 kubenswrapper[19170]: I0313 01:19:48.673795 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="631f5719-2083-4c99-92cb-2ddc04022d86" containerName="controller-manager" Mar 
13 01:19:48.674069 master-0 kubenswrapper[19170]: E0313 01:19:48.673817 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="631f5719-2083-4c99-92cb-2ddc04022d86" containerName="controller-manager" Mar 13 01:19:48.674069 master-0 kubenswrapper[19170]: I0313 01:19:48.673827 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="631f5719-2083-4c99-92cb-2ddc04022d86" containerName="controller-manager" Mar 13 01:19:48.674069 master-0 kubenswrapper[19170]: E0313 01:19:48.673873 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9462e2e-728d-4076-a876-31dbbd637581" containerName="route-controller-manager" Mar 13 01:19:48.674069 master-0 kubenswrapper[19170]: I0313 01:19:48.673882 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9462e2e-728d-4076-a876-31dbbd637581" containerName="route-controller-manager" Mar 13 01:19:48.674069 master-0 kubenswrapper[19170]: I0313 01:19:48.674023 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="631f5719-2083-4c99-92cb-2ddc04022d86" containerName="controller-manager" Mar 13 01:19:48.674069 master-0 kubenswrapper[19170]: I0313 01:19:48.674035 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="631f5719-2083-4c99-92cb-2ddc04022d86" containerName="controller-manager" Mar 13 01:19:48.674069 master-0 kubenswrapper[19170]: I0313 01:19:48.674059 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9462e2e-728d-4076-a876-31dbbd637581" containerName="route-controller-manager" Mar 13 01:19:48.674706 master-0 kubenswrapper[19170]: I0313 01:19:48.674589 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-d8dbf7c4d-v2gdg" Mar 13 01:19:48.678766 master-0 kubenswrapper[19170]: I0313 01:19:48.678004 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 01:19:48.678766 master-0 kubenswrapper[19170]: I0313 01:19:48.678077 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 01:19:48.678766 master-0 kubenswrapper[19170]: I0313 01:19:48.678451 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-bjp2n" Mar 13 01:19:48.678766 master-0 kubenswrapper[19170]: I0313 01:19:48.678500 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 01:19:48.678766 master-0 kubenswrapper[19170]: I0313 01:19:48.678725 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 01:19:48.683619 master-0 kubenswrapper[19170]: I0313 01:19:48.682581 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 01:19:48.689791 master-0 kubenswrapper[19170]: I0313 01:19:48.685692 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bbc74ffc7-zd8vc"] Mar 13 01:19:48.689791 master-0 kubenswrapper[19170]: I0313 01:19:48.688902 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bbc74ffc7-zd8vc" Mar 13 01:19:48.690422 master-0 kubenswrapper[19170]: I0313 01:19:48.690167 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 01:19:48.691811 master-0 kubenswrapper[19170]: I0313 01:19:48.691741 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-b6b87" Mar 13 01:19:48.691972 master-0 kubenswrapper[19170]: I0313 01:19:48.691847 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 01:19:48.692411 master-0 kubenswrapper[19170]: I0313 01:19:48.692257 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 01:19:48.692411 master-0 kubenswrapper[19170]: I0313 01:19:48.692320 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 01:19:48.693749 master-0 kubenswrapper[19170]: I0313 01:19:48.692489 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 01:19:48.693749 master-0 kubenswrapper[19170]: I0313 01:19:48.692577 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 01:19:48.693749 master-0 kubenswrapper[19170]: I0313 01:19:48.692810 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-d8dbf7c4d-v2gdg"] Mar 13 01:19:48.699114 master-0 kubenswrapper[19170]: I0313 01:19:48.698731 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bbc74ffc7-zd8vc"] Mar 13 01:19:48.743743 master-0 
kubenswrapper[19170]: I0313 01:19:48.743680 19170 patch_prober.go:28] interesting pod/route-controller-manager-5dc55b5d9c-nlg6m container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.57:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 01:19:48.743898 master-0 kubenswrapper[19170]: I0313 01:19:48.743749 19170 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5dc55b5d9c-nlg6m" podUID="a9462e2e-728d-4076-a876-31dbbd637581" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.57:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 01:19:48.845887 master-0 kubenswrapper[19170]: I0313 01:19:48.845818 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6745198c-7559-45e5-af6c-1eb493a0a496-config\") pod \"controller-manager-d8dbf7c4d-v2gdg\" (UID: \"6745198c-7559-45e5-af6c-1eb493a0a496\") " pod="openshift-controller-manager/controller-manager-d8dbf7c4d-v2gdg" Mar 13 01:19:48.845887 master-0 kubenswrapper[19170]: I0313 01:19:48.845870 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6745198c-7559-45e5-af6c-1eb493a0a496-serving-cert\") pod \"controller-manager-d8dbf7c4d-v2gdg\" (UID: \"6745198c-7559-45e5-af6c-1eb493a0a496\") " pod="openshift-controller-manager/controller-manager-d8dbf7c4d-v2gdg" Mar 13 01:19:48.846218 master-0 kubenswrapper[19170]: I0313 01:19:48.845932 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/47b548ab-06c2-471f-926b-dd622019de24-serving-cert\") pod \"route-controller-manager-6bbc74ffc7-zd8vc\" (UID: \"47b548ab-06c2-471f-926b-dd622019de24\") " pod="openshift-route-controller-manager/route-controller-manager-6bbc74ffc7-zd8vc" Mar 13 01:19:48.846218 master-0 kubenswrapper[19170]: I0313 01:19:48.845966 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47b548ab-06c2-471f-926b-dd622019de24-config\") pod \"route-controller-manager-6bbc74ffc7-zd8vc\" (UID: \"47b548ab-06c2-471f-926b-dd622019de24\") " pod="openshift-route-controller-manager/route-controller-manager-6bbc74ffc7-zd8vc" Mar 13 01:19:48.846218 master-0 kubenswrapper[19170]: I0313 01:19:48.845998 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6745198c-7559-45e5-af6c-1eb493a0a496-proxy-ca-bundles\") pod \"controller-manager-d8dbf7c4d-v2gdg\" (UID: \"6745198c-7559-45e5-af6c-1eb493a0a496\") " pod="openshift-controller-manager/controller-manager-d8dbf7c4d-v2gdg" Mar 13 01:19:48.846218 master-0 kubenswrapper[19170]: I0313 01:19:48.846043 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94bmx\" (UniqueName: \"kubernetes.io/projected/47b548ab-06c2-471f-926b-dd622019de24-kube-api-access-94bmx\") pod \"route-controller-manager-6bbc74ffc7-zd8vc\" (UID: \"47b548ab-06c2-471f-926b-dd622019de24\") " pod="openshift-route-controller-manager/route-controller-manager-6bbc74ffc7-zd8vc" Mar 13 01:19:48.846218 master-0 kubenswrapper[19170]: I0313 01:19:48.846070 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkmpz\" (UniqueName: \"kubernetes.io/projected/6745198c-7559-45e5-af6c-1eb493a0a496-kube-api-access-wkmpz\") pod 
\"controller-manager-d8dbf7c4d-v2gdg\" (UID: \"6745198c-7559-45e5-af6c-1eb493a0a496\") " pod="openshift-controller-manager/controller-manager-d8dbf7c4d-v2gdg" Mar 13 01:19:48.846218 master-0 kubenswrapper[19170]: I0313 01:19:48.846093 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6745198c-7559-45e5-af6c-1eb493a0a496-client-ca\") pod \"controller-manager-d8dbf7c4d-v2gdg\" (UID: \"6745198c-7559-45e5-af6c-1eb493a0a496\") " pod="openshift-controller-manager/controller-manager-d8dbf7c4d-v2gdg" Mar 13 01:19:48.846218 master-0 kubenswrapper[19170]: I0313 01:19:48.846126 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/47b548ab-06c2-471f-926b-dd622019de24-client-ca\") pod \"route-controller-manager-6bbc74ffc7-zd8vc\" (UID: \"47b548ab-06c2-471f-926b-dd622019de24\") " pod="openshift-route-controller-manager/route-controller-manager-6bbc74ffc7-zd8vc" Mar 13 01:19:48.947493 master-0 kubenswrapper[19170]: I0313 01:19:48.947362 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6745198c-7559-45e5-af6c-1eb493a0a496-proxy-ca-bundles\") pod \"controller-manager-d8dbf7c4d-v2gdg\" (UID: \"6745198c-7559-45e5-af6c-1eb493a0a496\") " pod="openshift-controller-manager/controller-manager-d8dbf7c4d-v2gdg" Mar 13 01:19:48.947770 master-0 kubenswrapper[19170]: I0313 01:19:48.947689 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94bmx\" (UniqueName: \"kubernetes.io/projected/47b548ab-06c2-471f-926b-dd622019de24-kube-api-access-94bmx\") pod \"route-controller-manager-6bbc74ffc7-zd8vc\" (UID: \"47b548ab-06c2-471f-926b-dd622019de24\") " pod="openshift-route-controller-manager/route-controller-manager-6bbc74ffc7-zd8vc" Mar 13 01:19:48.947865 
master-0 kubenswrapper[19170]: I0313 01:19:48.947803 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkmpz\" (UniqueName: \"kubernetes.io/projected/6745198c-7559-45e5-af6c-1eb493a0a496-kube-api-access-wkmpz\") pod \"controller-manager-d8dbf7c4d-v2gdg\" (UID: \"6745198c-7559-45e5-af6c-1eb493a0a496\") " pod="openshift-controller-manager/controller-manager-d8dbf7c4d-v2gdg" Mar 13 01:19:48.947934 master-0 kubenswrapper[19170]: I0313 01:19:48.947859 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6745198c-7559-45e5-af6c-1eb493a0a496-client-ca\") pod \"controller-manager-d8dbf7c4d-v2gdg\" (UID: \"6745198c-7559-45e5-af6c-1eb493a0a496\") " pod="openshift-controller-manager/controller-manager-d8dbf7c4d-v2gdg" Mar 13 01:19:48.948000 master-0 kubenswrapper[19170]: I0313 01:19:48.947960 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/47b548ab-06c2-471f-926b-dd622019de24-client-ca\") pod \"route-controller-manager-6bbc74ffc7-zd8vc\" (UID: \"47b548ab-06c2-471f-926b-dd622019de24\") " pod="openshift-route-controller-manager/route-controller-manager-6bbc74ffc7-zd8vc" Mar 13 01:19:48.948101 master-0 kubenswrapper[19170]: I0313 01:19:48.948075 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6745198c-7559-45e5-af6c-1eb493a0a496-config\") pod \"controller-manager-d8dbf7c4d-v2gdg\" (UID: \"6745198c-7559-45e5-af6c-1eb493a0a496\") " pod="openshift-controller-manager/controller-manager-d8dbf7c4d-v2gdg" Mar 13 01:19:48.948166 master-0 kubenswrapper[19170]: I0313 01:19:48.948129 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6745198c-7559-45e5-af6c-1eb493a0a496-serving-cert\") pod 
\"controller-manager-d8dbf7c4d-v2gdg\" (UID: \"6745198c-7559-45e5-af6c-1eb493a0a496\") " pod="openshift-controller-manager/controller-manager-d8dbf7c4d-v2gdg" Mar 13 01:19:48.948288 master-0 kubenswrapper[19170]: I0313 01:19:48.948249 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47b548ab-06c2-471f-926b-dd622019de24-serving-cert\") pod \"route-controller-manager-6bbc74ffc7-zd8vc\" (UID: \"47b548ab-06c2-471f-926b-dd622019de24\") " pod="openshift-route-controller-manager/route-controller-manager-6bbc74ffc7-zd8vc" Mar 13 01:19:48.948355 master-0 kubenswrapper[19170]: I0313 01:19:48.948327 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47b548ab-06c2-471f-926b-dd622019de24-config\") pod \"route-controller-manager-6bbc74ffc7-zd8vc\" (UID: \"47b548ab-06c2-471f-926b-dd622019de24\") " pod="openshift-route-controller-manager/route-controller-manager-6bbc74ffc7-zd8vc" Mar 13 01:19:48.948988 master-0 kubenswrapper[19170]: I0313 01:19:48.948944 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/47b548ab-06c2-471f-926b-dd622019de24-client-ca\") pod \"route-controller-manager-6bbc74ffc7-zd8vc\" (UID: \"47b548ab-06c2-471f-926b-dd622019de24\") " pod="openshift-route-controller-manager/route-controller-manager-6bbc74ffc7-zd8vc" Mar 13 01:19:48.949391 master-0 kubenswrapper[19170]: I0313 01:19:48.949318 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6745198c-7559-45e5-af6c-1eb493a0a496-client-ca\") pod \"controller-manager-d8dbf7c4d-v2gdg\" (UID: \"6745198c-7559-45e5-af6c-1eb493a0a496\") " pod="openshift-controller-manager/controller-manager-d8dbf7c4d-v2gdg" Mar 13 01:19:48.951455 master-0 kubenswrapper[19170]: I0313 01:19:48.951408 19170 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47b548ab-06c2-471f-926b-dd622019de24-config\") pod \"route-controller-manager-6bbc74ffc7-zd8vc\" (UID: \"47b548ab-06c2-471f-926b-dd622019de24\") " pod="openshift-route-controller-manager/route-controller-manager-6bbc74ffc7-zd8vc" Mar 13 01:19:48.951788 master-0 kubenswrapper[19170]: I0313 01:19:48.951752 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6745198c-7559-45e5-af6c-1eb493a0a496-proxy-ca-bundles\") pod \"controller-manager-d8dbf7c4d-v2gdg\" (UID: \"6745198c-7559-45e5-af6c-1eb493a0a496\") " pod="openshift-controller-manager/controller-manager-d8dbf7c4d-v2gdg" Mar 13 01:19:48.953095 master-0 kubenswrapper[19170]: I0313 01:19:48.953040 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6745198c-7559-45e5-af6c-1eb493a0a496-config\") pod \"controller-manager-d8dbf7c4d-v2gdg\" (UID: \"6745198c-7559-45e5-af6c-1eb493a0a496\") " pod="openshift-controller-manager/controller-manager-d8dbf7c4d-v2gdg" Mar 13 01:19:48.962209 master-0 kubenswrapper[19170]: I0313 01:19:48.962151 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6745198c-7559-45e5-af6c-1eb493a0a496-serving-cert\") pod \"controller-manager-d8dbf7c4d-v2gdg\" (UID: \"6745198c-7559-45e5-af6c-1eb493a0a496\") " pod="openshift-controller-manager/controller-manager-d8dbf7c4d-v2gdg" Mar 13 01:19:48.962390 master-0 kubenswrapper[19170]: I0313 01:19:48.962328 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47b548ab-06c2-471f-926b-dd622019de24-serving-cert\") pod \"route-controller-manager-6bbc74ffc7-zd8vc\" (UID: \"47b548ab-06c2-471f-926b-dd622019de24\") " 
pod="openshift-route-controller-manager/route-controller-manager-6bbc74ffc7-zd8vc" Mar 13 01:19:48.966128 master-0 kubenswrapper[19170]: I0313 01:19:48.966067 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkmpz\" (UniqueName: \"kubernetes.io/projected/6745198c-7559-45e5-af6c-1eb493a0a496-kube-api-access-wkmpz\") pod \"controller-manager-d8dbf7c4d-v2gdg\" (UID: \"6745198c-7559-45e5-af6c-1eb493a0a496\") " pod="openshift-controller-manager/controller-manager-d8dbf7c4d-v2gdg" Mar 13 01:19:48.967321 master-0 kubenswrapper[19170]: I0313 01:19:48.967266 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94bmx\" (UniqueName: \"kubernetes.io/projected/47b548ab-06c2-471f-926b-dd622019de24-kube-api-access-94bmx\") pod \"route-controller-manager-6bbc74ffc7-zd8vc\" (UID: \"47b548ab-06c2-471f-926b-dd622019de24\") " pod="openshift-route-controller-manager/route-controller-manager-6bbc74ffc7-zd8vc" Mar 13 01:19:49.013663 master-0 kubenswrapper[19170]: I0313 01:19:49.013545 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-d8dbf7c4d-v2gdg" Mar 13 01:19:49.064377 master-0 kubenswrapper[19170]: I0313 01:19:49.064316 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bbc74ffc7-zd8vc" Mar 13 01:19:49.431071 master-0 kubenswrapper[19170]: I0313 01:19:49.430997 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="631f5719-2083-4c99-92cb-2ddc04022d86" path="/var/lib/kubelet/pods/631f5719-2083-4c99-92cb-2ddc04022d86/volumes" Mar 13 01:19:49.432243 master-0 kubenswrapper[19170]: I0313 01:19:49.432196 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9462e2e-728d-4076-a876-31dbbd637581" path="/var/lib/kubelet/pods/a9462e2e-728d-4076-a876-31dbbd637581/volumes" Mar 13 01:19:49.943577 master-0 kubenswrapper[19170]: I0313 01:19:49.943527 19170 patch_prober.go:28] interesting pod/console-68ccfc6c58-cjm5c container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused" start-of-body= Mar 13 01:19:49.943781 master-0 kubenswrapper[19170]: I0313 01:19:49.943610 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-68ccfc6c58-cjm5c" podUID="e6c9432a-bfa0-4725-8cd1-8cf2967535f5" containerName="console" probeResult="failure" output="Get \"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused" Mar 13 01:19:52.260496 master-0 kubenswrapper[19170]: I0313 01:19:52.260413 19170 patch_prober.go:28] interesting pod/console-575758dfc4-r6mb4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 13 01:19:52.261263 master-0 kubenswrapper[19170]: I0313 01:19:52.260515 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-575758dfc4-r6mb4" podUID="78354ba8-21d5-4774-aa7f-8c72fee1995d" containerName="console" probeResult="failure" output="Get 
\"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 13 01:19:52.436303 master-0 kubenswrapper[19170]: I0313 01:19:52.436208 19170 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 13 01:19:52.436823 master-0 kubenswrapper[19170]: I0313 01:19:52.436614 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="b8addabb4549d659b9d15764990d3747" containerName="kube-controller-manager" containerID="cri-o://1f339c4da756baa27443d470023f6e1410367639df127c26cecf3952f778ca16" gracePeriod=30 Mar 13 01:19:52.436923 master-0 kubenswrapper[19170]: I0313 01:19:52.436809 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="b8addabb4549d659b9d15764990d3747" containerName="kube-controller-manager-recovery-controller" containerID="cri-o://9d118311f33a13bf01d58b99e2e28890870103c6d6d9e80b3f327feb4a6e5c10" gracePeriod=30 Mar 13 01:19:52.437010 master-0 kubenswrapper[19170]: I0313 01:19:52.436770 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="b8addabb4549d659b9d15764990d3747" containerName="kube-controller-manager-cert-syncer" containerID="cri-o://7146d2748a69888b0e230f968d6a455dc052e3a4f925338980f5ac24afb23fd4" gracePeriod=30 Mar 13 01:19:52.437144 master-0 kubenswrapper[19170]: I0313 01:19:52.437104 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="b8addabb4549d659b9d15764990d3747" containerName="cluster-policy-controller" containerID="cri-o://6ecf1cf9a4925a48c1305c992f6b26c6dc5493f27b0413a75a2a0cbd559a27b9" gracePeriod=30 Mar 13 01:19:52.438160 
master-0 kubenswrapper[19170]: I0313 01:19:52.437805 19170 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 13 01:19:52.438234 master-0 kubenswrapper[19170]: E0313 01:19:52.438180 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8addabb4549d659b9d15764990d3747" containerName="kube-controller-manager" Mar 13 01:19:52.438234 master-0 kubenswrapper[19170]: I0313 01:19:52.438199 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8addabb4549d659b9d15764990d3747" containerName="kube-controller-manager" Mar 13 01:19:52.438234 master-0 kubenswrapper[19170]: E0313 01:19:52.438229 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8addabb4549d659b9d15764990d3747" containerName="cluster-policy-controller" Mar 13 01:19:52.438414 master-0 kubenswrapper[19170]: I0313 01:19:52.438245 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8addabb4549d659b9d15764990d3747" containerName="cluster-policy-controller" Mar 13 01:19:52.438414 master-0 kubenswrapper[19170]: E0313 01:19:52.438269 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8addabb4549d659b9d15764990d3747" containerName="kube-controller-manager-recovery-controller" Mar 13 01:19:52.438414 master-0 kubenswrapper[19170]: I0313 01:19:52.438283 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8addabb4549d659b9d15764990d3747" containerName="kube-controller-manager-recovery-controller" Mar 13 01:19:52.438414 master-0 kubenswrapper[19170]: E0313 01:19:52.438303 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8addabb4549d659b9d15764990d3747" containerName="kube-controller-manager-cert-syncer" Mar 13 01:19:52.438414 master-0 kubenswrapper[19170]: I0313 01:19:52.438315 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8addabb4549d659b9d15764990d3747" containerName="kube-controller-manager-cert-syncer" Mar 13 
01:19:52.438740 master-0 kubenswrapper[19170]: I0313 01:19:52.438557 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8addabb4549d659b9d15764990d3747" containerName="kube-controller-manager" Mar 13 01:19:52.438740 master-0 kubenswrapper[19170]: I0313 01:19:52.438584 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8addabb4549d659b9d15764990d3747" containerName="cluster-policy-controller" Mar 13 01:19:52.438740 master-0 kubenswrapper[19170]: I0313 01:19:52.438613 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8addabb4549d659b9d15764990d3747" containerName="kube-controller-manager-cert-syncer" Mar 13 01:19:52.438740 master-0 kubenswrapper[19170]: I0313 01:19:52.438678 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8addabb4549d659b9d15764990d3747" containerName="kube-controller-manager-recovery-controller" Mar 13 01:19:52.524328 master-0 kubenswrapper[19170]: I0313 01:19:52.524214 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/43028f0e2cfc9ffb600b4d08ad84e12d-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"43028f0e2cfc9ffb600b4d08ad84e12d\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:19:52.524328 master-0 kubenswrapper[19170]: I0313 01:19:52.524289 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/43028f0e2cfc9ffb600b4d08ad84e12d-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"43028f0e2cfc9ffb600b4d08ad84e12d\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:19:52.625038 master-0 kubenswrapper[19170]: I0313 01:19:52.624976 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/43028f0e2cfc9ffb600b4d08ad84e12d-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"43028f0e2cfc9ffb600b4d08ad84e12d\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:19:52.625038 master-0 kubenswrapper[19170]: I0313 01:19:52.625038 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/43028f0e2cfc9ffb600b4d08ad84e12d-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"43028f0e2cfc9ffb600b4d08ad84e12d\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:19:52.625318 master-0 kubenswrapper[19170]: I0313 01:19:52.625097 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/43028f0e2cfc9ffb600b4d08ad84e12d-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"43028f0e2cfc9ffb600b4d08ad84e12d\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:19:52.625318 master-0 kubenswrapper[19170]: I0313 01:19:52.625169 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/43028f0e2cfc9ffb600b4d08ad84e12d-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"43028f0e2cfc9ffb600b4d08ad84e12d\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:19:55.941158 master-0 kubenswrapper[19170]: I0313 01:19:55.937972 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Mar 13 01:19:55.941158 master-0 kubenswrapper[19170]: I0313 01:19:55.938170 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/installer-2-master-0" podUID="096197ed-4407-4771-bfec-436af289b243" containerName="installer" 
containerID="cri-o://49a29482b801c2a846c3708d0930c3fe171c11461fc6e881ef0ac73d356082e9" gracePeriod=30 Mar 13 01:19:59.378882 master-0 kubenswrapper[19170]: I0313 01:19:59.378813 19170 scope.go:117] "RemoveContainer" containerID="be52d87237e2c88231046564bd2dfcdbd780faa45f3647245e1d0a9837eb7182" Mar 13 01:19:59.408282 master-0 kubenswrapper[19170]: I0313 01:19:59.408233 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Mar 13 01:19:59.409039 master-0 kubenswrapper[19170]: I0313 01:19:59.409024 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 13 01:19:59.420221 master-0 kubenswrapper[19170]: I0313 01:19:59.419942 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Mar 13 01:19:59.465354 master-0 kubenswrapper[19170]: I0313 01:19:59.465313 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/33ec6670-5dff-4e77-8ea1-2ca0e3538a0d-kube-api-access\") pod \"installer-3-master-0\" (UID: \"33ec6670-5dff-4e77-8ea1-2ca0e3538a0d\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 13 01:19:59.465563 master-0 kubenswrapper[19170]: I0313 01:19:59.465543 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/33ec6670-5dff-4e77-8ea1-2ca0e3538a0d-var-lock\") pod \"installer-3-master-0\" (UID: \"33ec6670-5dff-4e77-8ea1-2ca0e3538a0d\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 13 01:19:59.465715 master-0 kubenswrapper[19170]: I0313 01:19:59.465604 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/33ec6670-5dff-4e77-8ea1-2ca0e3538a0d-kubelet-dir\") pod 
\"installer-3-master-0\" (UID: \"33ec6670-5dff-4e77-8ea1-2ca0e3538a0d\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 13 01:19:59.567218 master-0 kubenswrapper[19170]: I0313 01:19:59.567166 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/33ec6670-5dff-4e77-8ea1-2ca0e3538a0d-var-lock\") pod \"installer-3-master-0\" (UID: \"33ec6670-5dff-4e77-8ea1-2ca0e3538a0d\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 13 01:19:59.567403 master-0 kubenswrapper[19170]: I0313 01:19:59.567314 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/33ec6670-5dff-4e77-8ea1-2ca0e3538a0d-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"33ec6670-5dff-4e77-8ea1-2ca0e3538a0d\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 13 01:19:59.567403 master-0 kubenswrapper[19170]: I0313 01:19:59.567358 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/33ec6670-5dff-4e77-8ea1-2ca0e3538a0d-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"33ec6670-5dff-4e77-8ea1-2ca0e3538a0d\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 13 01:19:59.567469 master-0 kubenswrapper[19170]: I0313 01:19:59.567304 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/33ec6670-5dff-4e77-8ea1-2ca0e3538a0d-var-lock\") pod \"installer-3-master-0\" (UID: \"33ec6670-5dff-4e77-8ea1-2ca0e3538a0d\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 13 01:19:59.567469 master-0 kubenswrapper[19170]: I0313 01:19:59.567373 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/33ec6670-5dff-4e77-8ea1-2ca0e3538a0d-kube-api-access\") pod \"installer-3-master-0\" (UID: 
\"33ec6670-5dff-4e77-8ea1-2ca0e3538a0d\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 13 01:19:59.594407 master-0 kubenswrapper[19170]: I0313 01:19:59.594026 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/33ec6670-5dff-4e77-8ea1-2ca0e3538a0d-kube-api-access\") pod \"installer-3-master-0\" (UID: \"33ec6670-5dff-4e77-8ea1-2ca0e3538a0d\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 13 01:19:59.787820 master-0 kubenswrapper[19170]: I0313 01:19:59.787761 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 13 01:19:59.943782 master-0 kubenswrapper[19170]: I0313 01:19:59.943720 19170 patch_prober.go:28] interesting pod/console-68ccfc6c58-cjm5c container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused" start-of-body= Mar 13 01:19:59.943967 master-0 kubenswrapper[19170]: I0313 01:19:59.943819 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-68ccfc6c58-cjm5c" podUID="e6c9432a-bfa0-4725-8cd1-8cf2967535f5" containerName="console" probeResult="failure" output="Get \"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused" Mar 13 01:20:01.389825 master-0 kubenswrapper[19170]: I0313 01:20:01.389758 19170 generic.go:334] "Generic (PLEG): container finished" podID="bf1a1318-e714-42a4-82b6-17265862b2a5" containerID="7f31e173b6981d921b01c4ef90be95351f61662568e535b5423ed55139bc69e0" exitCode=0 Mar 13 01:20:01.390388 master-0 kubenswrapper[19170]: I0313 01:20:01.389861 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" 
event={"ID":"bf1a1318-e714-42a4-82b6-17265862b2a5","Type":"ContainerDied","Data":"7f31e173b6981d921b01c4ef90be95351f61662568e535b5423ed55139bc69e0"} Mar 13 01:20:01.395002 master-0 kubenswrapper[19170]: I0313 01:20:01.394955 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_b8addabb4549d659b9d15764990d3747/kube-controller-manager-cert-syncer/0.log" Mar 13 01:20:01.396785 master-0 kubenswrapper[19170]: I0313 01:20:01.396567 19170 generic.go:334] "Generic (PLEG): container finished" podID="b8addabb4549d659b9d15764990d3747" containerID="9d118311f33a13bf01d58b99e2e28890870103c6d6d9e80b3f327feb4a6e5c10" exitCode=0 Mar 13 01:20:01.396785 master-0 kubenswrapper[19170]: I0313 01:20:01.396594 19170 generic.go:334] "Generic (PLEG): container finished" podID="b8addabb4549d659b9d15764990d3747" containerID="7146d2748a69888b0e230f968d6a455dc052e3a4f925338980f5ac24afb23fd4" exitCode=2 Mar 13 01:20:01.396785 master-0 kubenswrapper[19170]: I0313 01:20:01.396606 19170 generic.go:334] "Generic (PLEG): container finished" podID="b8addabb4549d659b9d15764990d3747" containerID="6ecf1cf9a4925a48c1305c992f6b26c6dc5493f27b0413a75a2a0cbd559a27b9" exitCode=0 Mar 13 01:20:01.396785 master-0 kubenswrapper[19170]: I0313 01:20:01.396617 19170 generic.go:334] "Generic (PLEG): container finished" podID="b8addabb4549d659b9d15764990d3747" containerID="1f339c4da756baa27443d470023f6e1410367639df127c26cecf3952f778ca16" exitCode=0 Mar 13 01:20:01.628452 master-0 kubenswrapper[19170]: I0313 01:20:01.628408 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_b8addabb4549d659b9d15764990d3747/kube-controller-manager-cert-syncer/0.log" Mar 13 01:20:01.629111 master-0 kubenswrapper[19170]: I0313 01:20:01.629082 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:20:01.810466 master-0 kubenswrapper[19170]: I0313 01:20:01.809763 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b8addabb4549d659b9d15764990d3747-resource-dir\") pod \"b8addabb4549d659b9d15764990d3747\" (UID: \"b8addabb4549d659b9d15764990d3747\") " Mar 13 01:20:01.810466 master-0 kubenswrapper[19170]: I0313 01:20:01.809895 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8addabb4549d659b9d15764990d3747-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "b8addabb4549d659b9d15764990d3747" (UID: "b8addabb4549d659b9d15764990d3747"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:20:01.810466 master-0 kubenswrapper[19170]: I0313 01:20:01.809954 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b8addabb4549d659b9d15764990d3747-cert-dir\") pod \"b8addabb4549d659b9d15764990d3747\" (UID: \"b8addabb4549d659b9d15764990d3747\") " Mar 13 01:20:01.810466 master-0 kubenswrapper[19170]: I0313 01:20:01.809987 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8addabb4549d659b9d15764990d3747-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "b8addabb4549d659b9d15764990d3747" (UID: "b8addabb4549d659b9d15764990d3747"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:20:01.810466 master-0 kubenswrapper[19170]: I0313 01:20:01.810404 19170 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b8addabb4549d659b9d15764990d3747-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:20:01.810466 master-0 kubenswrapper[19170]: I0313 01:20:01.810427 19170 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b8addabb4549d659b9d15764990d3747-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:20:02.275035 master-0 kubenswrapper[19170]: I0313 01:20:02.274961 19170 patch_prober.go:28] interesting pod/console-575758dfc4-r6mb4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 13 01:20:02.275232 master-0 kubenswrapper[19170]: I0313 01:20:02.275039 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-575758dfc4-r6mb4" podUID="78354ba8-21d5-4774-aa7f-8c72fee1995d" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 13 01:20:02.408772 master-0 kubenswrapper[19170]: I0313 01:20:02.408699 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_b8addabb4549d659b9d15764990d3747/kube-controller-manager-cert-syncer/0.log" Mar 13 01:20:02.410793 master-0 kubenswrapper[19170]: I0313 01:20:02.410727 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:20:02.410958 master-0 kubenswrapper[19170]: I0313 01:20:02.410873 19170 scope.go:117] "RemoveContainer" containerID="9d118311f33a13bf01d58b99e2e28890870103c6d6d9e80b3f327feb4a6e5c10" Mar 13 01:20:02.471683 master-0 kubenswrapper[19170]: I0313 01:20:02.471574 19170 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="b8addabb4549d659b9d15764990d3747" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" Mar 13 01:20:02.492136 master-0 kubenswrapper[19170]: I0313 01:20:02.490078 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-d8dbf7c4d-v2gdg"] Mar 13 01:20:02.505734 master-0 kubenswrapper[19170]: I0313 01:20:02.501746 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bbc74ffc7-zd8vc"] Mar 13 01:20:02.505734 master-0 kubenswrapper[19170]: I0313 01:20:02.502787 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Mar 13 01:20:02.752051 master-0 kubenswrapper[19170]: I0313 01:20:02.744148 19170 scope.go:117] "RemoveContainer" containerID="7146d2748a69888b0e230f968d6a455dc052e3a4f925338980f5ac24afb23fd4" Mar 13 01:20:03.031519 master-0 kubenswrapper[19170]: I0313 01:20:03.031030 19170 scope.go:117] "RemoveContainer" containerID="6ecf1cf9a4925a48c1305c992f6b26c6dc5493f27b0413a75a2a0cbd559a27b9" Mar 13 01:20:03.096900 master-0 kubenswrapper[19170]: I0313 01:20:03.096443 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Mar 13 01:20:03.114907 master-0 kubenswrapper[19170]: I0313 01:20:03.114854 19170 scope.go:117] "RemoveContainer" containerID="1f339c4da756baa27443d470023f6e1410367639df127c26cecf3952f778ca16" Mar 13 01:20:03.235110 master-0 kubenswrapper[19170]: I0313 01:20:03.233833 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bf1a1318-e714-42a4-82b6-17265862b2a5-kubelet-dir\") pod \"bf1a1318-e714-42a4-82b6-17265862b2a5\" (UID: \"bf1a1318-e714-42a4-82b6-17265862b2a5\") " Mar 13 01:20:03.235110 master-0 kubenswrapper[19170]: I0313 01:20:03.233939 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bf1a1318-e714-42a4-82b6-17265862b2a5-kube-api-access\") pod \"bf1a1318-e714-42a4-82b6-17265862b2a5\" (UID: \"bf1a1318-e714-42a4-82b6-17265862b2a5\") " Mar 13 01:20:03.235110 master-0 kubenswrapper[19170]: I0313 01:20:03.233985 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bf1a1318-e714-42a4-82b6-17265862b2a5-var-lock\") pod \"bf1a1318-e714-42a4-82b6-17265862b2a5\" (UID: \"bf1a1318-e714-42a4-82b6-17265862b2a5\") " Mar 13 01:20:03.235110 master-0 kubenswrapper[19170]: I0313 01:20:03.234613 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf1a1318-e714-42a4-82b6-17265862b2a5-var-lock" (OuterVolumeSpecName: "var-lock") pod "bf1a1318-e714-42a4-82b6-17265862b2a5" (UID: "bf1a1318-e714-42a4-82b6-17265862b2a5"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:20:03.235110 master-0 kubenswrapper[19170]: I0313 01:20:03.234664 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf1a1318-e714-42a4-82b6-17265862b2a5-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "bf1a1318-e714-42a4-82b6-17265862b2a5" (UID: "bf1a1318-e714-42a4-82b6-17265862b2a5"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:20:03.245955 master-0 kubenswrapper[19170]: I0313 01:20:03.243892 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf1a1318-e714-42a4-82b6-17265862b2a5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "bf1a1318-e714-42a4-82b6-17265862b2a5" (UID: "bf1a1318-e714-42a4-82b6-17265862b2a5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:20:03.335609 master-0 kubenswrapper[19170]: I0313 01:20:03.335567 19170 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bf1a1318-e714-42a4-82b6-17265862b2a5-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 01:20:03.335609 master-0 kubenswrapper[19170]: I0313 01:20:03.335610 19170 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bf1a1318-e714-42a4-82b6-17265862b2a5-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:20:03.335838 master-0 kubenswrapper[19170]: I0313 01:20:03.335625 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bf1a1318-e714-42a4-82b6-17265862b2a5-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 13 01:20:03.429269 master-0 kubenswrapper[19170]: I0313 01:20:03.426533 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Mar 13 01:20:03.433652 master-0 kubenswrapper[19170]: I0313 01:20:03.433591 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8addabb4549d659b9d15764990d3747" path="/var/lib/kubelet/pods/b8addabb4549d659b9d15764990d3747/volumes" Mar 13 01:20:03.435202 master-0 kubenswrapper[19170]: I0313 01:20:03.435163 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bbc74ffc7-zd8vc" event={"ID":"47b548ab-06c2-471f-926b-dd622019de24","Type":"ContainerStarted","Data":"13f3a0878377a4aaad3bd58762060fb9d64b32bfab2c6675de8a2ea44e7fd187"} Mar 13 01:20:03.435268 master-0 kubenswrapper[19170]: I0313 01:20:03.435212 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d8dbf7c4d-v2gdg" event={"ID":"6745198c-7559-45e5-af6c-1eb493a0a496","Type":"ContainerStarted","Data":"0144943a2c5123b0b8e5c45e867c4752dde3ef22f384e51497f03888456d16e5"} Mar 13 01:20:03.435268 master-0 kubenswrapper[19170]: I0313 01:20:03.435238 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d8dbf7c4d-v2gdg" event={"ID":"6745198c-7559-45e5-af6c-1eb493a0a496","Type":"ContainerStarted","Data":"9d09567e496ffb5249274e0f2e415b4658d2fd0dd3708ced1d1d45688a2d2567"} Mar 13 01:20:03.435338 master-0 kubenswrapper[19170]: I0313 01:20:03.435267 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"bf1a1318-e714-42a4-82b6-17265862b2a5","Type":"ContainerDied","Data":"18d390ccbea0e205cf30f6a90982255d83fa8f8518bae570e74823ab95c85e9d"} Mar 13 01:20:03.435338 master-0 kubenswrapper[19170]: I0313 01:20:03.435295 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18d390ccbea0e205cf30f6a90982255d83fa8f8518bae570e74823ab95c85e9d" Mar 13 
01:20:03.435338 master-0 kubenswrapper[19170]: I0313 01:20:03.435320 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"33ec6670-5dff-4e77-8ea1-2ca0e3538a0d","Type":"ContainerStarted","Data":"b38bde1c90fe45f7f2c5d512e526862834d38953f345f5516890ee34a8b7f4be"} Mar 13 01:20:04.443260 master-0 kubenswrapper[19170]: I0313 01:20:04.443175 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-84f57b9877-5k2pr" event={"ID":"81d4d1af-7da3-4848-b566-39741e905928","Type":"ContainerStarted","Data":"946f365355e8b1ebd4c285334b4de66c21be9de85d0e4f65cde78a72d49de460"} Mar 13 01:20:04.443899 master-0 kubenswrapper[19170]: I0313 01:20:04.443883 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-84f57b9877-5k2pr" Mar 13 01:20:04.446002 master-0 kubenswrapper[19170]: I0313 01:20:04.445958 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bbc74ffc7-zd8vc" event={"ID":"47b548ab-06c2-471f-926b-dd622019de24","Type":"ContainerStarted","Data":"239c2e150c6c5dfc3628f5b3af0ffdd7b1fe942ee1afe4a69456f807ea6f38d4"} Mar 13 01:20:04.446213 master-0 kubenswrapper[19170]: I0313 01:20:04.446183 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6bbc74ffc7-zd8vc" Mar 13 01:20:04.446362 master-0 kubenswrapper[19170]: I0313 01:20:04.446321 19170 patch_prober.go:28] interesting pod/downloads-84f57b9877-5k2pr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.89:8080/\": dial tcp 10.128.0.89:8080: connect: connection refused" start-of-body= Mar 13 01:20:04.446412 master-0 kubenswrapper[19170]: I0313 01:20:04.446382 19170 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-84f57b9877-5k2pr" 
podUID="81d4d1af-7da3-4848-b566-39741e905928" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.89:8080/\": dial tcp 10.128.0.89:8080: connect: connection refused" Mar 13 01:20:04.449233 master-0 kubenswrapper[19170]: I0313 01:20:04.449218 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"33ec6670-5dff-4e77-8ea1-2ca0e3538a0d","Type":"ContainerStarted","Data":"93070398078caeb2ec5143258964a2454e6befcc698ae218156e7b1ae60be571"} Mar 13 01:20:04.449420 master-0 kubenswrapper[19170]: I0313 01:20:04.449406 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-d8dbf7c4d-v2gdg" Mar 13 01:20:04.457124 master-0 kubenswrapper[19170]: I0313 01:20:04.457027 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-d8dbf7c4d-v2gdg" Mar 13 01:20:04.788089 master-0 kubenswrapper[19170]: I0313 01:20:04.787968 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6bbc74ffc7-zd8vc" Mar 13 01:20:05.107405 master-0 kubenswrapper[19170]: I0313 01:20:05.107263 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-84f57b9877-5k2pr" podStartSLOduration=3.453749663 podStartE2EDuration="43.107235902s" podCreationTimestamp="2026-03-13 01:19:22 +0000 UTC" firstStartedPulling="2026-03-13 01:19:23.600328523 +0000 UTC m=+24.408449483" lastFinishedPulling="2026-03-13 01:20:03.253814762 +0000 UTC m=+64.061935722" observedRunningTime="2026-03-13 01:20:04.776492349 +0000 UTC m=+65.584613349" watchObservedRunningTime="2026-03-13 01:20:05.107235902 +0000 UTC m=+65.915356892" Mar 13 01:20:05.110310 master-0 kubenswrapper[19170]: I0313 01:20:05.110269 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-apiserver/installer-3-master-0" podStartSLOduration=6.110260383 podStartE2EDuration="6.110260383s" podCreationTimestamp="2026-03-13 01:19:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:20:05.106577854 +0000 UTC m=+65.914698804" watchObservedRunningTime="2026-03-13 01:20:05.110260383 +0000 UTC m=+65.918381353" Mar 13 01:20:05.137510 master-0 kubenswrapper[19170]: I0313 01:20:05.137443 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-d8dbf7c4d-v2gdg" podStartSLOduration=18.137423435 podStartE2EDuration="18.137423435s" podCreationTimestamp="2026-03-13 01:19:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:20:05.137136177 +0000 UTC m=+65.945257137" watchObservedRunningTime="2026-03-13 01:20:05.137423435 +0000 UTC m=+65.945544395" Mar 13 01:20:05.161305 master-0 kubenswrapper[19170]: I0313 01:20:05.161237 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6bbc74ffc7-zd8vc" podStartSLOduration=18.161216155 podStartE2EDuration="18.161216155s" podCreationTimestamp="2026-03-13 01:19:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:20:05.160724222 +0000 UTC m=+65.968845182" watchObservedRunningTime="2026-03-13 01:20:05.161216155 +0000 UTC m=+65.969337115" Mar 13 01:20:05.453559 master-0 kubenswrapper[19170]: I0313 01:20:05.453514 19170 patch_prober.go:28] interesting pod/downloads-84f57b9877-5k2pr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.89:8080/\": dial tcp 10.128.0.89:8080: connect: connection refused" 
start-of-body= Mar 13 01:20:05.454007 master-0 kubenswrapper[19170]: I0313 01:20:05.453570 19170 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-84f57b9877-5k2pr" podUID="81d4d1af-7da3-4848-b566-39741e905928" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.89:8080/\": dial tcp 10.128.0.89:8080: connect: connection refused" Mar 13 01:20:06.419402 master-0 kubenswrapper[19170]: I0313 01:20:06.419342 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:20:06.442428 master-0 kubenswrapper[19170]: I0313 01:20:06.442375 19170 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="6ccd962a-c9e2-49a3-943a-21897524cd4a" Mar 13 01:20:06.442428 master-0 kubenswrapper[19170]: I0313 01:20:06.442418 19170 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="6ccd962a-c9e2-49a3-943a-21897524cd4a" Mar 13 01:20:06.455890 master-0 kubenswrapper[19170]: I0313 01:20:06.455783 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 13 01:20:06.470069 master-0 kubenswrapper[19170]: I0313 01:20:06.469780 19170 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:20:06.474761 master-0 kubenswrapper[19170]: I0313 01:20:06.474425 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 13 01:20:06.482417 master-0 kubenswrapper[19170]: I0313 01:20:06.482364 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:20:06.489127 master-0 kubenswrapper[19170]: I0313 01:20:06.486143 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 13 01:20:07.472924 master-0 kubenswrapper[19170]: I0313 01:20:07.472880 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"43028f0e2cfc9ffb600b4d08ad84e12d","Type":"ContainerStarted","Data":"ad501dba73c8e02ebebbb7bcaac88dd3b49aa2dc75d4621c004e56223548c569"} Mar 13 01:20:07.472924 master-0 kubenswrapper[19170]: I0313 01:20:07.472923 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"43028f0e2cfc9ffb600b4d08ad84e12d","Type":"ContainerStarted","Data":"682183e6d4ff8fe19dbc6c9d30df58dc630f77f76ed3714d2f40580e0122e18e"} Mar 13 01:20:07.472924 master-0 kubenswrapper[19170]: I0313 01:20:07.472935 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"43028f0e2cfc9ffb600b4d08ad84e12d","Type":"ContainerStarted","Data":"c54d358f4c1b4f029bbd5121747727fca60685060770ce8f314c1f52c6412116"} Mar 13 01:20:07.473457 master-0 kubenswrapper[19170]: I0313 01:20:07.472944 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"43028f0e2cfc9ffb600b4d08ad84e12d","Type":"ContainerStarted","Data":"61f8bb415a20f5d2771ed5dc744f9982ea8e0a80ebb3d2449dda00410833c158"} Mar 13 01:20:08.484260 master-0 kubenswrapper[19170]: I0313 01:20:08.484212 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" 
event={"ID":"43028f0e2cfc9ffb600b4d08ad84e12d","Type":"ContainerStarted","Data":"2aa87006447655fcce0c7dda89b0736fa8fe5d6d6b3d7992f1e605fe121770e9"} Mar 13 01:20:08.612230 master-0 kubenswrapper[19170]: I0313 01:20:08.612001 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=2.61197921 podStartE2EDuration="2.61197921s" podCreationTimestamp="2026-03-13 01:20:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:20:08.606688538 +0000 UTC m=+69.414809538" watchObservedRunningTime="2026-03-13 01:20:08.61197921 +0000 UTC m=+69.420100180" Mar 13 01:20:09.225250 master-0 kubenswrapper[19170]: I0313 01:20:09.223129 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/690f916b-6f87-42d9-8168-392a9177bee9-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"690f916b-6f87-42d9-8168-392a9177bee9\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 13 01:20:09.226298 master-0 kubenswrapper[19170]: I0313 01:20:09.226249 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/690f916b-6f87-42d9-8168-392a9177bee9-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"690f916b-6f87-42d9-8168-392a9177bee9\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 13 01:20:09.324314 master-0 kubenswrapper[19170]: I0313 01:20:09.324254 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/690f916b-6f87-42d9-8168-392a9177bee9-kube-api-access\") pod \"690f916b-6f87-42d9-8168-392a9177bee9\" (UID: \"690f916b-6f87-42d9-8168-392a9177bee9\") " Mar 13 01:20:09.326697 master-0 
kubenswrapper[19170]: I0313 01:20:09.326645 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/690f916b-6f87-42d9-8168-392a9177bee9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "690f916b-6f87-42d9-8168-392a9177bee9" (UID: "690f916b-6f87-42d9-8168-392a9177bee9"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 01:20:09.425998 master-0 kubenswrapper[19170]: I0313 01:20:09.425952 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/690f916b-6f87-42d9-8168-392a9177bee9-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 13 01:20:09.943900 master-0 kubenswrapper[19170]: I0313 01:20:09.943808 19170 patch_prober.go:28] interesting pod/console-68ccfc6c58-cjm5c container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused" start-of-body=
Mar 13 01:20:09.944387 master-0 kubenswrapper[19170]: I0313 01:20:09.943914 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-68ccfc6c58-cjm5c" podUID="e6c9432a-bfa0-4725-8cd1-8cf2967535f5" containerName="console" probeResult="failure" output="Get \"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused"
Mar 13 01:20:12.260053 master-0 kubenswrapper[19170]: I0313 01:20:12.259982 19170 patch_prober.go:28] interesting pod/console-575758dfc4-r6mb4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body=
Mar 13 01:20:12.260053 master-0 kubenswrapper[19170]: I0313 01:20:12.260053 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-575758dfc4-r6mb4" podUID="78354ba8-21d5-4774-aa7f-8c72fee1995d" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused"
Mar 13 01:20:13.199977 master-0 kubenswrapper[19170]: I0313 01:20:13.199933 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-84f57b9877-5k2pr"
Mar 13 01:20:13.871572 master-0 kubenswrapper[19170]: E0313 01:20:13.871527 19170 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-pod096197ed_4407_4771_bfec_436af289b243.slice/crio-conmon-49a29482b801c2a846c3708d0930c3fe171c11461fc6e881ef0ac73d356082e9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-pod096197ed_4407_4771_bfec_436af289b243.slice/crio-49a29482b801c2a846c3708d0930c3fe171c11461fc6e881ef0ac73d356082e9.scope\": RecentStats: unable to find data in memory cache]"
Mar 13 01:20:13.872911 master-0 kubenswrapper[19170]: E0313 01:20:13.872859 19170 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-pod096197ed_4407_4771_bfec_436af289b243.slice/crio-49a29482b801c2a846c3708d0930c3fe171c11461fc6e881ef0ac73d356082e9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-podbf1a1318_e714_42a4_82b6_17265862b2a5.slice/crio-18d390ccbea0e205cf30f6a90982255d83fa8f8518bae570e74823ab95c85e9d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-podbf1a1318_e714_42a4_82b6_17265862b2a5.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-pod096197ed_4407_4771_bfec_436af289b243.slice/crio-conmon-49a29482b801c2a846c3708d0930c3fe171c11461fc6e881ef0ac73d356082e9.scope\": RecentStats: unable to find data in memory cache]"
Mar 13 01:20:14.154811 master-0 kubenswrapper[19170]: I0313 01:20:14.154745 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-2-master-0_096197ed-4407-4771-bfec-436af289b243/installer/0.log"
Mar 13 01:20:14.155022 master-0 kubenswrapper[19170]: I0313 01:20:14.154842 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0"
Mar 13 01:20:14.304188 master-0 kubenswrapper[19170]: I0313 01:20:14.304084 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/096197ed-4407-4771-bfec-436af289b243-kube-api-access\") pod \"096197ed-4407-4771-bfec-436af289b243\" (UID: \"096197ed-4407-4771-bfec-436af289b243\") "
Mar 13 01:20:14.304466 master-0 kubenswrapper[19170]: I0313 01:20:14.304232 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/096197ed-4407-4771-bfec-436af289b243-var-lock\") pod \"096197ed-4407-4771-bfec-436af289b243\" (UID: \"096197ed-4407-4771-bfec-436af289b243\") "
Mar 13 01:20:14.304466 master-0 kubenswrapper[19170]: I0313 01:20:14.304342 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/096197ed-4407-4771-bfec-436af289b243-kubelet-dir\") pod \"096197ed-4407-4771-bfec-436af289b243\" (UID: \"096197ed-4407-4771-bfec-436af289b243\") "
Mar 13 01:20:14.304466 master-0 kubenswrapper[19170]: I0313 01:20:14.304367 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/096197ed-4407-4771-bfec-436af289b243-var-lock" (OuterVolumeSpecName: "var-lock") pod "096197ed-4407-4771-bfec-436af289b243" (UID: "096197ed-4407-4771-bfec-436af289b243"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 01:20:14.304466 master-0 kubenswrapper[19170]: I0313 01:20:14.304402 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/096197ed-4407-4771-bfec-436af289b243-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "096197ed-4407-4771-bfec-436af289b243" (UID: "096197ed-4407-4771-bfec-436af289b243"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 01:20:14.304872 master-0 kubenswrapper[19170]: I0313 01:20:14.304816 19170 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/096197ed-4407-4771-bfec-436af289b243-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 13 01:20:14.304956 master-0 kubenswrapper[19170]: I0313 01:20:14.304868 19170 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/096197ed-4407-4771-bfec-436af289b243-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 13 01:20:14.311170 master-0 kubenswrapper[19170]: I0313 01:20:14.311108 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/096197ed-4407-4771-bfec-436af289b243-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "096197ed-4407-4771-bfec-436af289b243" (UID: "096197ed-4407-4771-bfec-436af289b243"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 01:20:14.407380 master-0 kubenswrapper[19170]: I0313 01:20:14.407210 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/096197ed-4407-4771-bfec-436af289b243-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 13 01:20:14.532877 master-0 kubenswrapper[19170]: I0313 01:20:14.532516 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-2-master-0_096197ed-4407-4771-bfec-436af289b243/installer/0.log"
Mar 13 01:20:14.532877 master-0 kubenswrapper[19170]: I0313 01:20:14.532582 19170 generic.go:334] "Generic (PLEG): container finished" podID="096197ed-4407-4771-bfec-436af289b243" containerID="49a29482b801c2a846c3708d0930c3fe171c11461fc6e881ef0ac73d356082e9" exitCode=1
Mar 13 01:20:14.532877 master-0 kubenswrapper[19170]: I0313 01:20:14.532621 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"096197ed-4407-4771-bfec-436af289b243","Type":"ContainerDied","Data":"49a29482b801c2a846c3708d0930c3fe171c11461fc6e881ef0ac73d356082e9"}
Mar 13 01:20:14.532877 master-0 kubenswrapper[19170]: I0313 01:20:14.532691 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"096197ed-4407-4771-bfec-436af289b243","Type":"ContainerDied","Data":"0750f10fe8acda674e75c80eeb8d90960e22757b52d29411e12d11458e9cb58f"}
Mar 13 01:20:14.532877 master-0 kubenswrapper[19170]: I0313 01:20:14.532720 19170 scope.go:117] "RemoveContainer" containerID="49a29482b801c2a846c3708d0930c3fe171c11461fc6e881ef0ac73d356082e9"
Mar 13 01:20:14.538322 master-0 kubenswrapper[19170]: I0313 01:20:14.533553 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0"
Mar 13 01:20:14.570207 master-0 kubenswrapper[19170]: I0313 01:20:14.570147 19170 scope.go:117] "RemoveContainer" containerID="49a29482b801c2a846c3708d0930c3fe171c11461fc6e881ef0ac73d356082e9"
Mar 13 01:20:14.571032 master-0 kubenswrapper[19170]: E0313 01:20:14.570963 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49a29482b801c2a846c3708d0930c3fe171c11461fc6e881ef0ac73d356082e9\": container with ID starting with 49a29482b801c2a846c3708d0930c3fe171c11461fc6e881ef0ac73d356082e9 not found: ID does not exist" containerID="49a29482b801c2a846c3708d0930c3fe171c11461fc6e881ef0ac73d356082e9"
Mar 13 01:20:14.578114 master-0 kubenswrapper[19170]: I0313 01:20:14.571050 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49a29482b801c2a846c3708d0930c3fe171c11461fc6e881ef0ac73d356082e9"} err="failed to get container status \"49a29482b801c2a846c3708d0930c3fe171c11461fc6e881ef0ac73d356082e9\": rpc error: code = NotFound desc = could not find container \"49a29482b801c2a846c3708d0930c3fe171c11461fc6e881ef0ac73d356082e9\": container with ID starting with 49a29482b801c2a846c3708d0930c3fe171c11461fc6e881ef0ac73d356082e9 not found: ID does not exist"
Mar 13 01:20:14.584695 master-0 kubenswrapper[19170]: I0313 01:20:14.584619 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"]
Mar 13 01:20:14.591488 master-0 kubenswrapper[19170]: I0313 01:20:14.591422 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"]
Mar 13 01:20:15.432429 master-0 kubenswrapper[19170]: I0313 01:20:15.432364 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="096197ed-4407-4771-bfec-436af289b243" path="/var/lib/kubelet/pods/096197ed-4407-4771-bfec-436af289b243/volumes"
Mar 13 01:20:16.482494 master-0 kubenswrapper[19170]: I0313 01:20:16.482441 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 01:20:16.483349 master-0 kubenswrapper[19170]: I0313 01:20:16.482524 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 01:20:16.483349 master-0 kubenswrapper[19170]: I0313 01:20:16.482548 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 01:20:16.483349 master-0 kubenswrapper[19170]: I0313 01:20:16.482569 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 01:20:16.483349 master-0 kubenswrapper[19170]: I0313 01:20:16.482964 19170 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body=
Mar 13 01:20:16.483349 master-0 kubenswrapper[19170]: I0313 01:20:16.483061 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused"
Mar 13 01:20:16.490160 master-0 kubenswrapper[19170]: I0313 01:20:16.490102 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 01:20:17.573732 master-0 kubenswrapper[19170]: I0313 01:20:17.572354 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 01:20:19.943902 master-0 kubenswrapper[19170]: I0313 01:20:19.943805 19170 patch_prober.go:28] interesting pod/console-68ccfc6c58-cjm5c container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused" start-of-body=
Mar 13 01:20:19.944865 master-0 kubenswrapper[19170]: I0313 01:20:19.943928 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-68ccfc6c58-cjm5c" podUID="e6c9432a-bfa0-4725-8cd1-8cf2967535f5" containerName="console" probeResult="failure" output="Get \"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused"
Mar 13 01:20:22.260522 master-0 kubenswrapper[19170]: I0313 01:20:22.260444 19170 patch_prober.go:28] interesting pod/console-575758dfc4-r6mb4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body=
Mar 13 01:20:22.260522 master-0 kubenswrapper[19170]: I0313 01:20:22.260508 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-575758dfc4-r6mb4" podUID="78354ba8-21d5-4774-aa7f-8c72fee1995d" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused"
Mar 13 01:20:26.483958 master-0 kubenswrapper[19170]: I0313 01:20:26.483844 19170 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body=
Mar 13 01:20:26.485400 master-0 kubenswrapper[19170]: I0313 01:20:26.483960 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused"
Mar 13 01:20:29.943838 master-0 kubenswrapper[19170]: I0313 01:20:29.943748 19170 patch_prober.go:28] interesting pod/console-68ccfc6c58-cjm5c container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused" start-of-body=
Mar 13 01:20:29.944617 master-0 kubenswrapper[19170]: I0313 01:20:29.943829 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-68ccfc6c58-cjm5c" podUID="e6c9432a-bfa0-4725-8cd1-8cf2967535f5" containerName="console" probeResult="failure" output="Get \"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused"
Mar 13 01:20:32.261055 master-0 kubenswrapper[19170]: I0313 01:20:32.260945 19170 patch_prober.go:28] interesting pod/console-575758dfc4-r6mb4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body=
Mar 13 01:20:32.261055 master-0 kubenswrapper[19170]: I0313 01:20:32.261037 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-575758dfc4-r6mb4" podUID="78354ba8-21d5-4774-aa7f-8c72fee1995d" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused"
Mar 13 01:20:36.483317 master-0 kubenswrapper[19170]: I0313 01:20:36.482843 19170 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body=
Mar 13 01:20:36.483317 master-0 kubenswrapper[19170]: I0313 01:20:36.482924 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused"
Mar 13 01:20:36.483317 master-0 kubenswrapper[19170]: I0313 01:20:36.482994 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 01:20:36.485721 master-0 kubenswrapper[19170]: I0313 01:20:36.483875 19170 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"c54d358f4c1b4f029bbd5121747727fca60685060770ce8f314c1f52c6412116"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container kube-controller-manager failed startup probe, will be restarted"
Mar 13 01:20:36.485721 master-0 kubenswrapper[19170]: I0313 01:20:36.484047 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="kube-controller-manager" containerID="cri-o://c54d358f4c1b4f029bbd5121747727fca60685060770ce8f314c1f52c6412116" gracePeriod=30
Mar 13 01:20:39.944199 master-0 kubenswrapper[19170]: I0313 01:20:39.944092 19170 patch_prober.go:28] interesting pod/console-68ccfc6c58-cjm5c container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused" start-of-body=
Mar 13 01:20:39.944199 master-0 kubenswrapper[19170]: I0313 01:20:39.944173 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-68ccfc6c58-cjm5c" podUID="e6c9432a-bfa0-4725-8cd1-8cf2967535f5" containerName="console" probeResult="failure" output="Get \"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused"
Mar 13 01:20:42.259990 master-0 kubenswrapper[19170]: I0313 01:20:42.259911 19170 patch_prober.go:28] interesting pod/console-575758dfc4-r6mb4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body=
Mar 13 01:20:42.260761 master-0 kubenswrapper[19170]: I0313 01:20:42.260012 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-575758dfc4-r6mb4" podUID="78354ba8-21d5-4774-aa7f-8c72fee1995d" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused"
Mar 13 01:20:49.944720 master-0 kubenswrapper[19170]: I0313 01:20:49.944599 19170 patch_prober.go:28] interesting pod/console-68ccfc6c58-cjm5c container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused" start-of-body=
Mar 13 01:20:49.945490 master-0 kubenswrapper[19170]: I0313 01:20:49.944767 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-68ccfc6c58-cjm5c" podUID="e6c9432a-bfa0-4725-8cd1-8cf2967535f5" containerName="console" probeResult="failure" output="Get \"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused"
Mar 13 01:20:52.260584 master-0 kubenswrapper[19170]: I0313 01:20:52.260474 19170 patch_prober.go:28] interesting pod/console-575758dfc4-r6mb4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body=
Mar 13 01:20:52.261472 master-0 kubenswrapper[19170]: I0313 01:20:52.260562 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-575758dfc4-r6mb4" podUID="78354ba8-21d5-4774-aa7f-8c72fee1995d" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused"
Mar 13 01:20:52.336923 master-0 kubenswrapper[19170]: I0313 01:20:52.336832 19170 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 13 01:20:52.337662 master-0 kubenswrapper[19170]: E0313 01:20:52.337590 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf1a1318-e714-42a4-82b6-17265862b2a5" containerName="installer"
Mar 13 01:20:52.337755 master-0 kubenswrapper[19170]: I0313 01:20:52.337678 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf1a1318-e714-42a4-82b6-17265862b2a5" containerName="installer"
Mar 13 01:20:52.337755 master-0 kubenswrapper[19170]: E0313 01:20:52.337704 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="096197ed-4407-4771-bfec-436af289b243" containerName="installer"
Mar 13 01:20:52.337755 master-0 kubenswrapper[19170]: I0313 01:20:52.337719 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="096197ed-4407-4771-bfec-436af289b243" containerName="installer"
Mar 13 01:20:52.338169 master-0 kubenswrapper[19170]: I0313 01:20:52.338084 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="096197ed-4407-4771-bfec-436af289b243" containerName="installer"
Mar 13 01:20:52.338169 master-0 kubenswrapper[19170]: I0313 01:20:52.338166 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf1a1318-e714-42a4-82b6-17265862b2a5" containerName="installer"
Mar 13 01:20:52.339069 master-0 kubenswrapper[19170]: I0313 01:20:52.339017 19170 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 13 01:20:52.339317 master-0 kubenswrapper[19170]: I0313 01:20:52.339258 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 01:20:52.339528 master-0 kubenswrapper[19170]: I0313 01:20:52.339466 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver" containerID="cri-o://f8cf1a80a339c43b2198f0c2a2cdc38630309c5c4e5c0430f6bc0266b483c766" gracePeriod=15
Mar 13 01:20:52.339777 master-0 kubenswrapper[19170]: I0313 01:20:52.339609 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://2463d467448269901f4fcbebf08a1f5694b763780f210069e6a1981a0f7ca79a" gracePeriod=15
Mar 13 01:20:52.339873 master-0 kubenswrapper[19170]: I0313 01:20:52.339728 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://208cb89282bf1ed747c2d5d200ffda0b8e2d1d13703537447889ef8c2f67cf1a" gracePeriod=15
Mar 13 01:20:52.339873 master-0 kubenswrapper[19170]: I0313 01:20:52.339754 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-check-endpoints" containerID="cri-o://e5dcff415884960d9218f44f945518a8b20cab930c49510412686808bb43a09d" gracePeriod=15
Mar 13 01:20:52.339992 master-0 kubenswrapper[19170]: I0313 01:20:52.339728 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-syncer" containerID="cri-o://f494a23b4d1580cff31d9268d6fdc726455248c951c34158f5755806baa34285" gracePeriod=15
Mar 13 01:20:52.341205 master-0 kubenswrapper[19170]: I0313 01:20:52.340990 19170 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 13 01:20:52.341771 master-0 kubenswrapper[19170]: E0313 01:20:52.341566 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-check-endpoints"
Mar 13 01:20:52.341771 master-0 kubenswrapper[19170]: I0313 01:20:52.341653 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-check-endpoints"
Mar 13 01:20:52.341771 master-0 kubenswrapper[19170]: E0313 01:20:52.341759 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-insecure-readyz"
Mar 13 01:20:52.341988 master-0 kubenswrapper[19170]: I0313 01:20:52.341788 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-insecure-readyz"
Mar 13 01:20:52.341988 master-0 kubenswrapper[19170]: E0313 01:20:52.341871 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="setup"
Mar 13 01:20:52.341988 master-0 kubenswrapper[19170]: I0313 01:20:52.341894 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="setup"
Mar 13 01:20:52.342146 master-0 kubenswrapper[19170]: E0313 01:20:52.342036 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver"
Mar 13 01:20:52.342146 master-0 kubenswrapper[19170]: I0313 01:20:52.342064 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver"
Mar 13 01:20:52.342254 master-0 kubenswrapper[19170]: E0313 01:20:52.342164 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-regeneration-controller"
Mar 13 01:20:52.342254 master-0 kubenswrapper[19170]: I0313 01:20:52.342230 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-regeneration-controller"
Mar 13 01:20:52.342368 master-0 kubenswrapper[19170]: E0313 01:20:52.342263 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-syncer"
Mar 13 01:20:52.342368 master-0 kubenswrapper[19170]: I0313 01:20:52.342331 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-syncer"
Mar 13 01:20:52.344329 master-0 kubenswrapper[19170]: I0313 01:20:52.342816 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-check-endpoints"
Mar 13 01:20:52.344329 master-0 kubenswrapper[19170]: I0313 01:20:52.342967 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="setup"
Mar 13 01:20:52.344329 master-0 kubenswrapper[19170]: I0313 01:20:52.343001 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-regeneration-controller"
Mar 13 01:20:52.344329 master-0 kubenswrapper[19170]: I0313 01:20:52.343077 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-insecure-readyz"
Mar 13 01:20:52.344329 master-0 kubenswrapper[19170]: I0313 01:20:52.343104 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-syncer"
Mar 13 01:20:52.344329 master-0 kubenswrapper[19170]: I0313 01:20:52.343174 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver"
Mar 13 01:20:52.389262 master-0 kubenswrapper[19170]: I0313 01:20:52.386116 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 01:20:52.389262 master-0 kubenswrapper[19170]: I0313 01:20:52.386196 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 01:20:52.389262 master-0 kubenswrapper[19170]: I0313 01:20:52.386243 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 01:20:52.389262 master-0 kubenswrapper[19170]: I0313 01:20:52.386275 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 01:20:52.389262 master-0 kubenswrapper[19170]: I0313 01:20:52.386375 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 01:20:52.389262 master-0 kubenswrapper[19170]: I0313 01:20:52.386418 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 01:20:52.389262 master-0 kubenswrapper[19170]: I0313 01:20:52.386455 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 01:20:52.389262 master-0 kubenswrapper[19170]: I0313 01:20:52.386494 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 01:20:52.455097 master-0 kubenswrapper[19170]: E0313 01:20:52.455030 19170 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 01:20:52.486955 master-0 kubenswrapper[19170]: I0313 01:20:52.486911 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 01:20:52.487091 master-0 kubenswrapper[19170]: I0313 01:20:52.487047 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 01:20:52.487162 master-0 kubenswrapper[19170]: I0313 01:20:52.487147 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 01:20:52.487253 master-0 kubenswrapper[19170]: I0313 01:20:52.487233 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 01:20:52.487343 master-0 kubenswrapper[19170]: I0313 01:20:52.487310 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 01:20:52.487398 master-0 kubenswrapper[19170]: I0313 01:20:52.487328 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 01:20:52.487466 master-0 kubenswrapper[19170]: I0313 01:20:52.487439 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 01:20:52.487536 master-0 kubenswrapper[19170]: I0313 01:20:52.487432 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 01:20:52.487676 master-0 kubenswrapper[19170]: I0313 01:20:52.487627 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 01:20:52.487746 master-0 kubenswrapper[19170]: I0313 01:20:52.487732 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 01:20:52.487850 master-0 kubenswrapper[19170]: I0313 01:20:52.487839 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 01:20:52.487937 master-0 kubenswrapper[19170]: I0313 01:20:52.487925 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 01:20:52.488012 master-0 kubenswrapper[19170]: I0313 01:20:52.488000 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 01:20:52.488097
master-0 kubenswrapper[19170]: I0313 01:20:52.488024 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:20:52.488171 master-0 kubenswrapper[19170]: I0313 01:20:52.487887 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:20:52.488230 master-0 kubenswrapper[19170]: I0313 01:20:52.488054 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:20:52.527080 master-0 kubenswrapper[19170]: E0313 01:20:52.526957 19170 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdcecc61ff5eeb08bd2a3ac12599e4f9.slice/crio-f494a23b4d1580cff31d9268d6fdc726455248c951c34158f5755806baa34285.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdcecc61ff5eeb08bd2a3ac12599e4f9.slice/crio-2463d467448269901f4fcbebf08a1f5694b763780f210069e6a1981a0f7ca79a.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdcecc61ff5eeb08bd2a3ac12599e4f9.slice/crio-208cb89282bf1ed747c2d5d200ffda0b8e2d1d13703537447889ef8c2f67cf1a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdcecc61ff5eeb08bd2a3ac12599e4f9.slice/crio-conmon-f494a23b4d1580cff31d9268d6fdc726455248c951c34158f5755806baa34285.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdcecc61ff5eeb08bd2a3ac12599e4f9.slice/crio-conmon-e5dcff415884960d9218f44f945518a8b20cab930c49510412686808bb43a09d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-pod33ec6670_5dff_4e77_8ea1_2ca0e3538a0d.slice/crio-93070398078caeb2ec5143258964a2454e6befcc698ae218156e7b1ae60be571.scope\": RecentStats: unable to find data in memory cache]" Mar 13 01:20:52.756285 master-0 kubenswrapper[19170]: I0313 01:20:52.756214 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:20:52.790359 master-0 kubenswrapper[19170]: W0313 01:20:52.790279 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod899242a15b2bdf3b4a04fb323647ca94.slice/crio-59fd7cf6489843aebd6cd289d8f69e3ffd0a20b745befcbc4674967786e5378c WatchSource:0}: Error finding container 59fd7cf6489843aebd6cd289d8f69e3ffd0a20b745befcbc4674967786e5378c: Status 404 returned error can't find the container with id 59fd7cf6489843aebd6cd289d8f69e3ffd0a20b745befcbc4674967786e5378c Mar 13 01:20:52.869273 master-0 kubenswrapper[19170]: I0313 01:20:52.869176 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_cdcecc61ff5eeb08bd2a3ac12599e4f9/kube-apiserver-cert-syncer/0.log" Mar 13 01:20:52.870769 master-0 kubenswrapper[19170]: I0313 01:20:52.870711 19170 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerID="e5dcff415884960d9218f44f945518a8b20cab930c49510412686808bb43a09d" exitCode=0 Mar 13 01:20:52.871039 master-0 kubenswrapper[19170]: I0313 01:20:52.871002 19170 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerID="2463d467448269901f4fcbebf08a1f5694b763780f210069e6a1981a0f7ca79a" exitCode=0 Mar 13 01:20:52.871039 master-0 kubenswrapper[19170]: I0313 01:20:52.871028 19170 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerID="208cb89282bf1ed747c2d5d200ffda0b8e2d1d13703537447889ef8c2f67cf1a" exitCode=0 Mar 13 01:20:52.871217 master-0 kubenswrapper[19170]: I0313 01:20:52.871044 19170 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerID="f494a23b4d1580cff31d9268d6fdc726455248c951c34158f5755806baa34285" exitCode=2 Mar 13 01:20:52.873238 master-0 
kubenswrapper[19170]: I0313 01:20:52.873173 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"899242a15b2bdf3b4a04fb323647ca94","Type":"ContainerStarted","Data":"59fd7cf6489843aebd6cd289d8f69e3ffd0a20b745befcbc4674967786e5378c"} Mar 13 01:20:52.876068 master-0 kubenswrapper[19170]: I0313 01:20:52.876018 19170 generic.go:334] "Generic (PLEG): container finished" podID="33ec6670-5dff-4e77-8ea1-2ca0e3538a0d" containerID="93070398078caeb2ec5143258964a2454e6befcc698ae218156e7b1ae60be571" exitCode=0 Mar 13 01:20:52.876262 master-0 kubenswrapper[19170]: I0313 01:20:52.876114 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"33ec6670-5dff-4e77-8ea1-2ca0e3538a0d","Type":"ContainerDied","Data":"93070398078caeb2ec5143258964a2454e6befcc698ae218156e7b1ae60be571"} Mar 13 01:20:52.877881 master-0 kubenswrapper[19170]: I0313 01:20:52.877814 19170 status_manager.go:851] "Failed to get status for pod" podUID="33ec6670-5dff-4e77-8ea1-2ca0e3538a0d" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:20:52.878796 master-0 kubenswrapper[19170]: I0313 01:20:52.878748 19170 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:20:53.888045 master-0 kubenswrapper[19170]: I0313 01:20:53.887988 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" 
event={"ID":"899242a15b2bdf3b4a04fb323647ca94","Type":"ContainerStarted","Data":"235a13b04b9dbc747e5d8d2d8403d2908c7307c11854d776d6ea514a70233940"} Mar 13 01:20:53.890376 master-0 kubenswrapper[19170]: E0313 01:20:53.890302 19170 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:20:53.890469 master-0 kubenswrapper[19170]: I0313 01:20:53.890300 19170 status_manager.go:851] "Failed to get status for pod" podUID="33ec6670-5dff-4e77-8ea1-2ca0e3538a0d" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:20:54.284728 master-0 kubenswrapper[19170]: I0313 01:20:54.284579 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 13 01:20:54.286156 master-0 kubenswrapper[19170]: I0313 01:20:54.286019 19170 status_manager.go:851] "Failed to get status for pod" podUID="33ec6670-5dff-4e77-8ea1-2ca0e3538a0d" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:20:54.453988 master-0 kubenswrapper[19170]: I0313 01:20:54.440743 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/33ec6670-5dff-4e77-8ea1-2ca0e3538a0d-var-lock\") pod \"33ec6670-5dff-4e77-8ea1-2ca0e3538a0d\" (UID: \"33ec6670-5dff-4e77-8ea1-2ca0e3538a0d\") " Mar 13 01:20:54.453988 master-0 kubenswrapper[19170]: I0313 01:20:54.440781 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/33ec6670-5dff-4e77-8ea1-2ca0e3538a0d-kubelet-dir\") pod \"33ec6670-5dff-4e77-8ea1-2ca0e3538a0d\" (UID: \"33ec6670-5dff-4e77-8ea1-2ca0e3538a0d\") " Mar 13 01:20:54.453988 master-0 kubenswrapper[19170]: I0313 01:20:54.440817 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/33ec6670-5dff-4e77-8ea1-2ca0e3538a0d-kube-api-access\") pod \"33ec6670-5dff-4e77-8ea1-2ca0e3538a0d\" (UID: \"33ec6670-5dff-4e77-8ea1-2ca0e3538a0d\") " Mar 13 01:20:54.453988 master-0 kubenswrapper[19170]: I0313 01:20:54.441065 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33ec6670-5dff-4e77-8ea1-2ca0e3538a0d-var-lock" (OuterVolumeSpecName: "var-lock") pod "33ec6670-5dff-4e77-8ea1-2ca0e3538a0d" (UID: "33ec6670-5dff-4e77-8ea1-2ca0e3538a0d"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:20:54.453988 master-0 kubenswrapper[19170]: I0313 01:20:54.441154 19170 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/33ec6670-5dff-4e77-8ea1-2ca0e3538a0d-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 01:20:54.453988 master-0 kubenswrapper[19170]: I0313 01:20:54.441510 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33ec6670-5dff-4e77-8ea1-2ca0e3538a0d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "33ec6670-5dff-4e77-8ea1-2ca0e3538a0d" (UID: "33ec6670-5dff-4e77-8ea1-2ca0e3538a0d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:20:54.453988 master-0 kubenswrapper[19170]: I0313 01:20:54.450024 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33ec6670-5dff-4e77-8ea1-2ca0e3538a0d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "33ec6670-5dff-4e77-8ea1-2ca0e3538a0d" (UID: "33ec6670-5dff-4e77-8ea1-2ca0e3538a0d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:20:54.542134 master-0 kubenswrapper[19170]: I0313 01:20:54.542016 19170 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/33ec6670-5dff-4e77-8ea1-2ca0e3538a0d-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:20:54.542134 master-0 kubenswrapper[19170]: I0313 01:20:54.542055 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/33ec6670-5dff-4e77-8ea1-2ca0e3538a0d-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 13 01:20:54.738333 master-0 kubenswrapper[19170]: I0313 01:20:54.738273 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_cdcecc61ff5eeb08bd2a3ac12599e4f9/kube-apiserver-cert-syncer/0.log" Mar 13 01:20:54.739574 master-0 kubenswrapper[19170]: I0313 01:20:54.739541 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:20:54.741112 master-0 kubenswrapper[19170]: I0313 01:20:54.741011 19170 status_manager.go:851] "Failed to get status for pod" podUID="33ec6670-5dff-4e77-8ea1-2ca0e3538a0d" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:20:54.742171 master-0 kubenswrapper[19170]: I0313 01:20:54.742108 19170 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:20:54.846183 master-0 kubenswrapper[19170]: I0313 01:20:54.846053 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod \"cdcecc61ff5eeb08bd2a3ac12599e4f9\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " Mar 13 01:20:54.846183 master-0 kubenswrapper[19170]: I0313 01:20:54.846131 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"cdcecc61ff5eeb08bd2a3ac12599e4f9\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " Mar 13 01:20:54.846471 master-0 kubenswrapper[19170]: I0313 01:20:54.846213 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "cdcecc61ff5eeb08bd2a3ac12599e4f9" (UID: "cdcecc61ff5eeb08bd2a3ac12599e4f9"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:20:54.846471 master-0 kubenswrapper[19170]: I0313 01:20:54.846327 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"cdcecc61ff5eeb08bd2a3ac12599e4f9\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " Mar 13 01:20:54.846471 master-0 kubenswrapper[19170]: I0313 01:20:54.846344 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "cdcecc61ff5eeb08bd2a3ac12599e4f9" (UID: "cdcecc61ff5eeb08bd2a3ac12599e4f9"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:20:54.846471 master-0 kubenswrapper[19170]: I0313 01:20:54.846450 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "cdcecc61ff5eeb08bd2a3ac12599e4f9" (UID: "cdcecc61ff5eeb08bd2a3ac12599e4f9"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:20:54.846785 master-0 kubenswrapper[19170]: I0313 01:20:54.846750 19170 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:20:54.846828 master-0 kubenswrapper[19170]: I0313 01:20:54.846785 19170 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:20:54.846828 master-0 kubenswrapper[19170]: I0313 01:20:54.846807 19170 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:20:54.901354 master-0 kubenswrapper[19170]: I0313 01:20:54.901270 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_cdcecc61ff5eeb08bd2a3ac12599e4f9/kube-apiserver-cert-syncer/0.log" Mar 13 01:20:54.902491 master-0 kubenswrapper[19170]: I0313 01:20:54.902424 19170 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerID="f8cf1a80a339c43b2198f0c2a2cdc38630309c5c4e5c0430f6bc0266b483c766" exitCode=0 Mar 13 01:20:54.902557 master-0 kubenswrapper[19170]: I0313 01:20:54.902513 19170 scope.go:117] "RemoveContainer" containerID="e5dcff415884960d9218f44f945518a8b20cab930c49510412686808bb43a09d" Mar 13 01:20:54.902663 master-0 kubenswrapper[19170]: I0313 01:20:54.902594 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:20:54.905700 master-0 kubenswrapper[19170]: I0313 01:20:54.905295 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"33ec6670-5dff-4e77-8ea1-2ca0e3538a0d","Type":"ContainerDied","Data":"b38bde1c90fe45f7f2c5d512e526862834d38953f345f5516890ee34a8b7f4be"} Mar 13 01:20:54.905700 master-0 kubenswrapper[19170]: I0313 01:20:54.905349 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 13 01:20:54.905700 master-0 kubenswrapper[19170]: I0313 01:20:54.905356 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b38bde1c90fe45f7f2c5d512e526862834d38953f345f5516890ee34a8b7f4be" Mar 13 01:20:54.906712 master-0 kubenswrapper[19170]: E0313 01:20:54.906667 19170 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:20:54.922156 master-0 kubenswrapper[19170]: I0313 01:20:54.922103 19170 scope.go:117] "RemoveContainer" containerID="2463d467448269901f4fcbebf08a1f5694b763780f210069e6a1981a0f7ca79a" Mar 13 01:20:54.949182 master-0 kubenswrapper[19170]: I0313 01:20:54.949105 19170 status_manager.go:851] "Failed to get status for pod" podUID="33ec6670-5dff-4e77-8ea1-2ca0e3538a0d" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:20:54.951239 master-0 kubenswrapper[19170]: I0313 01:20:54.950348 19170 status_manager.go:851] "Failed to get status for pod" 
podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:20:54.951239 master-0 kubenswrapper[19170]: I0313 01:20:54.950864 19170 status_manager.go:851] "Failed to get status for pod" podUID="33ec6670-5dff-4e77-8ea1-2ca0e3538a0d" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:20:54.951536 master-0 kubenswrapper[19170]: I0313 01:20:54.951429 19170 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:20:54.960587 master-0 kubenswrapper[19170]: I0313 01:20:54.960448 19170 scope.go:117] "RemoveContainer" containerID="208cb89282bf1ed747c2d5d200ffda0b8e2d1d13703537447889ef8c2f67cf1a" Mar 13 01:20:54.981981 master-0 kubenswrapper[19170]: I0313 01:20:54.981905 19170 scope.go:117] "RemoveContainer" containerID="f494a23b4d1580cff31d9268d6fdc726455248c951c34158f5755806baa34285" Mar 13 01:20:55.005249 master-0 kubenswrapper[19170]: I0313 01:20:55.005190 19170 scope.go:117] "RemoveContainer" containerID="f8cf1a80a339c43b2198f0c2a2cdc38630309c5c4e5c0430f6bc0266b483c766" Mar 13 01:20:55.031004 master-0 kubenswrapper[19170]: I0313 01:20:55.030938 19170 scope.go:117] "RemoveContainer" containerID="6504d7e301bf358b754655b91d4ab759cf0c2f8f3e948397175f8d84bd27b624" Mar 13 01:20:55.058244 master-0 kubenswrapper[19170]: I0313 01:20:55.058179 19170 scope.go:117] "RemoveContainer" 
containerID="e5dcff415884960d9218f44f945518a8b20cab930c49510412686808bb43a09d" Mar 13 01:20:55.058979 master-0 kubenswrapper[19170]: E0313 01:20:55.058907 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5dcff415884960d9218f44f945518a8b20cab930c49510412686808bb43a09d\": container with ID starting with e5dcff415884960d9218f44f945518a8b20cab930c49510412686808bb43a09d not found: ID does not exist" containerID="e5dcff415884960d9218f44f945518a8b20cab930c49510412686808bb43a09d" Mar 13 01:20:55.059118 master-0 kubenswrapper[19170]: I0313 01:20:55.058973 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5dcff415884960d9218f44f945518a8b20cab930c49510412686808bb43a09d"} err="failed to get container status \"e5dcff415884960d9218f44f945518a8b20cab930c49510412686808bb43a09d\": rpc error: code = NotFound desc = could not find container \"e5dcff415884960d9218f44f945518a8b20cab930c49510412686808bb43a09d\": container with ID starting with e5dcff415884960d9218f44f945518a8b20cab930c49510412686808bb43a09d not found: ID does not exist" Mar 13 01:20:55.059118 master-0 kubenswrapper[19170]: I0313 01:20:55.059014 19170 scope.go:117] "RemoveContainer" containerID="2463d467448269901f4fcbebf08a1f5694b763780f210069e6a1981a0f7ca79a" Mar 13 01:20:55.059627 master-0 kubenswrapper[19170]: E0313 01:20:55.059554 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2463d467448269901f4fcbebf08a1f5694b763780f210069e6a1981a0f7ca79a\": container with ID starting with 2463d467448269901f4fcbebf08a1f5694b763780f210069e6a1981a0f7ca79a not found: ID does not exist" containerID="2463d467448269901f4fcbebf08a1f5694b763780f210069e6a1981a0f7ca79a" Mar 13 01:20:55.059764 master-0 kubenswrapper[19170]: I0313 01:20:55.059613 19170 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2463d467448269901f4fcbebf08a1f5694b763780f210069e6a1981a0f7ca79a"} err="failed to get container status \"2463d467448269901f4fcbebf08a1f5694b763780f210069e6a1981a0f7ca79a\": rpc error: code = NotFound desc = could not find container \"2463d467448269901f4fcbebf08a1f5694b763780f210069e6a1981a0f7ca79a\": container with ID starting with 2463d467448269901f4fcbebf08a1f5694b763780f210069e6a1981a0f7ca79a not found: ID does not exist" Mar 13 01:20:55.059764 master-0 kubenswrapper[19170]: I0313 01:20:55.059683 19170 scope.go:117] "RemoveContainer" containerID="208cb89282bf1ed747c2d5d200ffda0b8e2d1d13703537447889ef8c2f67cf1a" Mar 13 01:20:55.060283 master-0 kubenswrapper[19170]: E0313 01:20:55.060211 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"208cb89282bf1ed747c2d5d200ffda0b8e2d1d13703537447889ef8c2f67cf1a\": container with ID starting with 208cb89282bf1ed747c2d5d200ffda0b8e2d1d13703537447889ef8c2f67cf1a not found: ID does not exist" containerID="208cb89282bf1ed747c2d5d200ffda0b8e2d1d13703537447889ef8c2f67cf1a" Mar 13 01:20:55.060420 master-0 kubenswrapper[19170]: I0313 01:20:55.060291 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"208cb89282bf1ed747c2d5d200ffda0b8e2d1d13703537447889ef8c2f67cf1a"} err="failed to get container status \"208cb89282bf1ed747c2d5d200ffda0b8e2d1d13703537447889ef8c2f67cf1a\": rpc error: code = NotFound desc = could not find container \"208cb89282bf1ed747c2d5d200ffda0b8e2d1d13703537447889ef8c2f67cf1a\": container with ID starting with 208cb89282bf1ed747c2d5d200ffda0b8e2d1d13703537447889ef8c2f67cf1a not found: ID does not exist" Mar 13 01:20:55.060420 master-0 kubenswrapper[19170]: I0313 01:20:55.060359 19170 scope.go:117] "RemoveContainer" containerID="f494a23b4d1580cff31d9268d6fdc726455248c951c34158f5755806baa34285" Mar 13 01:20:55.061129 master-0 kubenswrapper[19170]: E0313 
01:20:55.061066 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f494a23b4d1580cff31d9268d6fdc726455248c951c34158f5755806baa34285\": container with ID starting with f494a23b4d1580cff31d9268d6fdc726455248c951c34158f5755806baa34285 not found: ID does not exist" containerID="f494a23b4d1580cff31d9268d6fdc726455248c951c34158f5755806baa34285" Mar 13 01:20:55.061267 master-0 kubenswrapper[19170]: I0313 01:20:55.061119 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f494a23b4d1580cff31d9268d6fdc726455248c951c34158f5755806baa34285"} err="failed to get container status \"f494a23b4d1580cff31d9268d6fdc726455248c951c34158f5755806baa34285\": rpc error: code = NotFound desc = could not find container \"f494a23b4d1580cff31d9268d6fdc726455248c951c34158f5755806baa34285\": container with ID starting with f494a23b4d1580cff31d9268d6fdc726455248c951c34158f5755806baa34285 not found: ID does not exist" Mar 13 01:20:55.061267 master-0 kubenswrapper[19170]: I0313 01:20:55.061154 19170 scope.go:117] "RemoveContainer" containerID="f8cf1a80a339c43b2198f0c2a2cdc38630309c5c4e5c0430f6bc0266b483c766" Mar 13 01:20:55.061680 master-0 kubenswrapper[19170]: E0313 01:20:55.061584 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8cf1a80a339c43b2198f0c2a2cdc38630309c5c4e5c0430f6bc0266b483c766\": container with ID starting with f8cf1a80a339c43b2198f0c2a2cdc38630309c5c4e5c0430f6bc0266b483c766 not found: ID does not exist" containerID="f8cf1a80a339c43b2198f0c2a2cdc38630309c5c4e5c0430f6bc0266b483c766" Mar 13 01:20:55.061680 master-0 kubenswrapper[19170]: I0313 01:20:55.061657 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8cf1a80a339c43b2198f0c2a2cdc38630309c5c4e5c0430f6bc0266b483c766"} err="failed to get container status 
\"f8cf1a80a339c43b2198f0c2a2cdc38630309c5c4e5c0430f6bc0266b483c766\": rpc error: code = NotFound desc = could not find container \"f8cf1a80a339c43b2198f0c2a2cdc38630309c5c4e5c0430f6bc0266b483c766\": container with ID starting with f8cf1a80a339c43b2198f0c2a2cdc38630309c5c4e5c0430f6bc0266b483c766 not found: ID does not exist" Mar 13 01:20:55.061864 master-0 kubenswrapper[19170]: I0313 01:20:55.061686 19170 scope.go:117] "RemoveContainer" containerID="6504d7e301bf358b754655b91d4ab759cf0c2f8f3e948397175f8d84bd27b624" Mar 13 01:20:55.062501 master-0 kubenswrapper[19170]: E0313 01:20:55.062435 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6504d7e301bf358b754655b91d4ab759cf0c2f8f3e948397175f8d84bd27b624\": container with ID starting with 6504d7e301bf358b754655b91d4ab759cf0c2f8f3e948397175f8d84bd27b624 not found: ID does not exist" containerID="6504d7e301bf358b754655b91d4ab759cf0c2f8f3e948397175f8d84bd27b624" Mar 13 01:20:55.062625 master-0 kubenswrapper[19170]: I0313 01:20:55.062485 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6504d7e301bf358b754655b91d4ab759cf0c2f8f3e948397175f8d84bd27b624"} err="failed to get container status \"6504d7e301bf358b754655b91d4ab759cf0c2f8f3e948397175f8d84bd27b624\": rpc error: code = NotFound desc = could not find container \"6504d7e301bf358b754655b91d4ab759cf0c2f8f3e948397175f8d84bd27b624\": container with ID starting with 6504d7e301bf358b754655b91d4ab759cf0c2f8f3e948397175f8d84bd27b624 not found: ID does not exist" Mar 13 01:20:55.434431 master-0 kubenswrapper[19170]: I0313 01:20:55.434357 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" path="/var/lib/kubelet/pods/cdcecc61ff5eeb08bd2a3ac12599e4f9/volumes" Mar 13 01:20:57.393470 master-0 kubenswrapper[19170]: E0313 01:20:57.393164 19170 event.go:368] "Unable to write event (may retry after 
sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-master-0.189c41f000deea4a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-master-0,UID:cdcecc61ff5eeb08bd2a3ac12599e4f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Killing,Message:Stopping container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:20:52.33969825 +0000 UTC m=+113.147819250,LastTimestamp:2026-03-13 01:20:52.33969825 +0000 UTC m=+113.147819250,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:20:59.424889 master-0 kubenswrapper[19170]: I0313 01:20:59.424663 19170 status_manager.go:851] "Failed to get status for pod" podUID="33ec6670-5dff-4e77-8ea1-2ca0e3538a0d" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:20:59.944141 master-0 kubenswrapper[19170]: I0313 01:20:59.944035 19170 patch_prober.go:28] interesting pod/console-68ccfc6c58-cjm5c container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused" start-of-body= Mar 13 01:20:59.944141 master-0 kubenswrapper[19170]: I0313 01:20:59.944136 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-68ccfc6c58-cjm5c" podUID="e6c9432a-bfa0-4725-8cd1-8cf2967535f5" containerName="console" 
probeResult="failure" output="Get \"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused" Mar 13 01:21:01.411110 master-0 kubenswrapper[19170]: I0313 01:21:01.410997 19170 scope.go:117] "RemoveContainer" containerID="2feba70148d30ccf8b16cda1bcbd40be5871412538af7a3c70d1cbdb9b96ea4e" Mar 13 01:21:01.434345 master-0 kubenswrapper[19170]: I0313 01:21:01.434288 19170 scope.go:117] "RemoveContainer" containerID="cabc0d0daac0ff5b74f3e06882a4fbae2aaadefec9cc5e2009027b89d0897c41" Mar 13 01:21:01.982319 master-0 kubenswrapper[19170]: E0313 01:21:01.982229 19170 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:21:01.983600 master-0 kubenswrapper[19170]: E0313 01:21:01.983548 19170 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:21:01.985129 master-0 kubenswrapper[19170]: E0313 01:21:01.985047 19170 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:21:01.986106 master-0 kubenswrapper[19170]: E0313 01:21:01.986027 19170 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:21:01.987347 master-0 kubenswrapper[19170]: E0313 01:21:01.987254 19170 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:21:01.987555 master-0 kubenswrapper[19170]: I0313 01:21:01.987527 19170 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 13 01:21:01.988871 master-0 kubenswrapper[19170]: E0313 01:21:01.988807 19170 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms" Mar 13 01:21:02.190682 master-0 kubenswrapper[19170]: E0313 01:21:02.190554 19170 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms" Mar 13 01:21:02.260570 master-0 kubenswrapper[19170]: I0313 01:21:02.260386 19170 patch_prober.go:28] interesting pod/console-575758dfc4-r6mb4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 13 01:21:02.260570 master-0 kubenswrapper[19170]: I0313 01:21:02.260470 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-575758dfc4-r6mb4" podUID="78354ba8-21d5-4774-aa7f-8c72fee1995d" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 13 01:21:02.591620 master-0 kubenswrapper[19170]: E0313 01:21:02.591484 19170 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms" Mar 13 01:21:02.892420 master-0 kubenswrapper[19170]: E0313 01:21:02.891924 19170 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-master-0.189c41f000deea4a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-master-0,UID:cdcecc61ff5eeb08bd2a3ac12599e4f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Killing,Message:Stopping container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:20:52.33969825 +0000 UTC m=+113.147819250,LastTimestamp:2026-03-13 01:20:52.33969825 +0000 UTC m=+113.147819250,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:21:02.982021 master-0 kubenswrapper[19170]: I0313 01:21:02.981936 19170 generic.go:334] "Generic (PLEG): container finished" podID="a1a56802af72ce1aac6b5077f1695ac0" containerID="d71b1cc6761a45082d16174057a8ba98300f0e56574d0c45301dacf3ba713ba9" exitCode=1 Mar 13 01:21:02.982021 master-0 kubenswrapper[19170]: I0313 01:21:02.982013 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerDied","Data":"d71b1cc6761a45082d16174057a8ba98300f0e56574d0c45301dacf3ba713ba9"} Mar 13 01:21:02.982409 master-0 kubenswrapper[19170]: I0313 
01:21:02.982080 19170 scope.go:117] "RemoveContainer" containerID="d17962d23b6425584e2488789adc4521ea90d5d79405cef63024d09b7df17fd9" Mar 13 01:21:02.982817 master-0 kubenswrapper[19170]: I0313 01:21:02.982778 19170 scope.go:117] "RemoveContainer" containerID="d71b1cc6761a45082d16174057a8ba98300f0e56574d0c45301dacf3ba713ba9" Mar 13 01:21:02.984511 master-0 kubenswrapper[19170]: I0313 01:21:02.984077 19170 status_manager.go:851] "Failed to get status for pod" podUID="33ec6670-5dff-4e77-8ea1-2ca0e3538a0d" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:21:02.985249 master-0 kubenswrapper[19170]: I0313 01:21:02.985185 19170 status_manager.go:851] "Failed to get status for pod" podUID="a1a56802af72ce1aac6b5077f1695ac0" pod="kube-system/bootstrap-kube-scheduler-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/kube-system/pods/bootstrap-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:21:03.393884 master-0 kubenswrapper[19170]: E0313 01:21:03.393802 19170 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s" Mar 13 01:21:03.991190 master-0 kubenswrapper[19170]: I0313 01:21:03.991114 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"1c8e674976731a2c93a50660b066eae9c4fd5ada4c595229db11941c5c7b1510"} Mar 13 01:21:03.992285 master-0 kubenswrapper[19170]: I0313 01:21:03.992199 19170 status_manager.go:851] "Failed to get status for pod" 
podUID="33ec6670-5dff-4e77-8ea1-2ca0e3538a0d" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:21:03.995062 master-0 kubenswrapper[19170]: I0313 01:21:03.994995 19170 status_manager.go:851] "Failed to get status for pod" podUID="a1a56802af72ce1aac6b5077f1695ac0" pod="kube-system/bootstrap-kube-scheduler-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/kube-system/pods/bootstrap-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:21:04.995108 master-0 kubenswrapper[19170]: E0313 01:21:04.995015 19170 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s" Mar 13 01:21:05.418814 master-0 kubenswrapper[19170]: I0313 01:21:05.418730 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:21:05.421129 master-0 kubenswrapper[19170]: I0313 01:21:05.421041 19170 status_manager.go:851] "Failed to get status for pod" podUID="33ec6670-5dff-4e77-8ea1-2ca0e3538a0d" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:21:05.422449 master-0 kubenswrapper[19170]: I0313 01:21:05.422325 19170 status_manager.go:851] "Failed to get status for pod" podUID="a1a56802af72ce1aac6b5077f1695ac0" pod="kube-system/bootstrap-kube-scheduler-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/kube-system/pods/bootstrap-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:21:05.452756 master-0 kubenswrapper[19170]: I0313 01:21:05.452621 19170 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="873bf025-bb37-49de-978b-81ffaaa173c7" Mar 13 01:21:05.452756 master-0 kubenswrapper[19170]: I0313 01:21:05.452713 19170 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="873bf025-bb37-49de-978b-81ffaaa173c7" Mar 13 01:21:05.453975 master-0 kubenswrapper[19170]: E0313 01:21:05.453918 19170 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:21:05.454783 master-0 kubenswrapper[19170]: I0313 01:21:05.454734 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:21:05.486809 master-0 kubenswrapper[19170]: W0313 01:21:05.486246 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod077dd10388b9e3e48a07382126e86621.slice/crio-c6bd0bf1e8d098710971bb56839339f5970bd24045f07fe7749fa9dbdb8e2081 WatchSource:0}: Error finding container c6bd0bf1e8d098710971bb56839339f5970bd24045f07fe7749fa9dbdb8e2081: Status 404 returned error can't find the container with id c6bd0bf1e8d098710971bb56839339f5970bd24045f07fe7749fa9dbdb8e2081 Mar 13 01:21:06.017734 master-0 kubenswrapper[19170]: I0313 01:21:06.017548 19170 generic.go:334] "Generic (PLEG): container finished" podID="077dd10388b9e3e48a07382126e86621" containerID="309294b76ccf30a6ad7117625866b5cda39484ec8cec95dbb511ad0d3d49a3fa" exitCode=0 Mar 13 01:21:06.017734 master-0 kubenswrapper[19170]: I0313 01:21:06.017601 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"077dd10388b9e3e48a07382126e86621","Type":"ContainerDied","Data":"309294b76ccf30a6ad7117625866b5cda39484ec8cec95dbb511ad0d3d49a3fa"} Mar 13 01:21:06.017734 master-0 kubenswrapper[19170]: I0313 01:21:06.017668 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"077dd10388b9e3e48a07382126e86621","Type":"ContainerStarted","Data":"c6bd0bf1e8d098710971bb56839339f5970bd24045f07fe7749fa9dbdb8e2081"} Mar 13 01:21:06.018573 master-0 kubenswrapper[19170]: I0313 01:21:06.017986 19170 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="873bf025-bb37-49de-978b-81ffaaa173c7" Mar 13 01:21:06.018573 master-0 kubenswrapper[19170]: I0313 01:21:06.018002 19170 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" 
podUID="873bf025-bb37-49de-978b-81ffaaa173c7" Mar 13 01:21:06.019153 master-0 kubenswrapper[19170]: I0313 01:21:06.019073 19170 status_manager.go:851] "Failed to get status for pod" podUID="a1a56802af72ce1aac6b5077f1695ac0" pod="kube-system/bootstrap-kube-scheduler-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/kube-system/pods/bootstrap-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:21:06.019468 master-0 kubenswrapper[19170]: E0313 01:21:06.019107 19170 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:21:06.021897 master-0 kubenswrapper[19170]: I0313 01:21:06.021845 19170 status_manager.go:851] "Failed to get status for pod" podUID="33ec6670-5dff-4e77-8ea1-2ca0e3538a0d" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:21:07.039209 master-0 kubenswrapper[19170]: I0313 01:21:07.038815 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"077dd10388b9e3e48a07382126e86621","Type":"ContainerStarted","Data":"afea8e1223cab11ef744a0add0ec1f18a6c8de0e98e1a3b36b5f58d62c9b92f7"} Mar 13 01:21:07.039209 master-0 kubenswrapper[19170]: I0313 01:21:07.038880 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"077dd10388b9e3e48a07382126e86621","Type":"ContainerStarted","Data":"0d8e29f7be01ebf5d7b5d718b370b03910c6c8753aa2db240e0eb704aec61199"} Mar 13 01:21:07.048307 master-0 kubenswrapper[19170]: I0313 01:21:07.048104 19170 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_43028f0e2cfc9ffb600b4d08ad84e12d/kube-controller-manager/0.log" Mar 13 01:21:07.048307 master-0 kubenswrapper[19170]: I0313 01:21:07.048187 19170 generic.go:334] "Generic (PLEG): container finished" podID="43028f0e2cfc9ffb600b4d08ad84e12d" containerID="c54d358f4c1b4f029bbd5121747727fca60685060770ce8f314c1f52c6412116" exitCode=137 Mar 13 01:21:07.048307 master-0 kubenswrapper[19170]: I0313 01:21:07.048241 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"43028f0e2cfc9ffb600b4d08ad84e12d","Type":"ContainerDied","Data":"c54d358f4c1b4f029bbd5121747727fca60685060770ce8f314c1f52c6412116"} Mar 13 01:21:08.055580 master-0 kubenswrapper[19170]: I0313 01:21:08.055533 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_43028f0e2cfc9ffb600b4d08ad84e12d/kube-controller-manager/0.log" Mar 13 01:21:08.056128 master-0 kubenswrapper[19170]: I0313 01:21:08.055842 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"43028f0e2cfc9ffb600b4d08ad84e12d","Type":"ContainerStarted","Data":"46c26b717b497322454dcf7c105249ed590c5f0f850f5c9e1de33f73e6f55637"} Mar 13 01:21:08.058840 master-0 kubenswrapper[19170]: I0313 01:21:08.058809 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"077dd10388b9e3e48a07382126e86621","Type":"ContainerStarted","Data":"6a4d890dbbfc910000a615b5523b09e007462310e9f635c66ef6a4133c865e3c"} Mar 13 01:21:08.058840 master-0 kubenswrapper[19170]: I0313 01:21:08.058840 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" 
event={"ID":"077dd10388b9e3e48a07382126e86621","Type":"ContainerStarted","Data":"efdeab99b2245e9d6b33e3c3a1ab63b3ecef2af6627c0f17296a93c127af552a"} Mar 13 01:21:08.058991 master-0 kubenswrapper[19170]: I0313 01:21:08.058852 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"077dd10388b9e3e48a07382126e86621","Type":"ContainerStarted","Data":"ffb771f0d26cb88c120c142068a509a53a6792018a1af856aac41ccbd68ce39b"} Mar 13 01:21:08.058991 master-0 kubenswrapper[19170]: I0313 01:21:08.058954 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:21:08.059058 master-0 kubenswrapper[19170]: I0313 01:21:08.059030 19170 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="873bf025-bb37-49de-978b-81ffaaa173c7" Mar 13 01:21:08.059058 master-0 kubenswrapper[19170]: I0313 01:21:08.059047 19170 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="873bf025-bb37-49de-978b-81ffaaa173c7" Mar 13 01:21:09.944550 master-0 kubenswrapper[19170]: I0313 01:21:09.944424 19170 patch_prober.go:28] interesting pod/console-68ccfc6c58-cjm5c container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused" start-of-body= Mar 13 01:21:09.945472 master-0 kubenswrapper[19170]: I0313 01:21:09.944578 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-68ccfc6c58-cjm5c" podUID="e6c9432a-bfa0-4725-8cd1-8cf2967535f5" containerName="console" probeResult="failure" output="Get \"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused" Mar 13 01:21:10.455919 master-0 kubenswrapper[19170]: I0313 01:21:10.455827 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:21:10.455919 master-0 kubenswrapper[19170]: I0313 01:21:10.455928 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:21:10.463369 master-0 kubenswrapper[19170]: I0313 01:21:10.463282 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:21:12.261815 master-0 kubenswrapper[19170]: I0313 01:21:12.261717 19170 patch_prober.go:28] interesting pod/console-575758dfc4-r6mb4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 13 01:21:12.261815 master-0 kubenswrapper[19170]: I0313 01:21:12.261802 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-575758dfc4-r6mb4" podUID="78354ba8-21d5-4774-aa7f-8c72fee1995d" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 13 01:21:13.076538 master-0 kubenswrapper[19170]: I0313 01:21:13.076447 19170 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:21:13.203804 master-0 kubenswrapper[19170]: I0313 01:21:13.203746 19170 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="077dd10388b9e3e48a07382126e86621" podUID="9273af3e-d978-486d-97ac-a7eef867cef1" Mar 13 01:21:14.112811 master-0 kubenswrapper[19170]: I0313 01:21:14.112699 19170 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="873bf025-bb37-49de-978b-81ffaaa173c7" Mar 13 01:21:14.112811 master-0 kubenswrapper[19170]: I0313 01:21:14.112797 
19170 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="873bf025-bb37-49de-978b-81ffaaa173c7" Mar 13 01:21:14.116985 master-0 kubenswrapper[19170]: I0313 01:21:14.116934 19170 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="077dd10388b9e3e48a07382126e86621" podUID="9273af3e-d978-486d-97ac-a7eef867cef1" Mar 13 01:21:16.482734 master-0 kubenswrapper[19170]: I0313 01:21:16.482624 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:21:16.483522 master-0 kubenswrapper[19170]: I0313 01:21:16.483250 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:21:16.490175 master-0 kubenswrapper[19170]: I0313 01:21:16.490090 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:21:17.144459 master-0 kubenswrapper[19170]: I0313 01:21:17.144374 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:21:19.944399 master-0 kubenswrapper[19170]: I0313 01:21:19.944326 19170 patch_prober.go:28] interesting pod/console-68ccfc6c58-cjm5c container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused" start-of-body= Mar 13 01:21:19.945404 master-0 kubenswrapper[19170]: I0313 01:21:19.944402 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-68ccfc6c58-cjm5c" podUID="e6c9432a-bfa0-4725-8cd1-8cf2967535f5" containerName="console" probeResult="failure" output="Get 
\"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused" Mar 13 01:21:22.259811 master-0 kubenswrapper[19170]: I0313 01:21:22.259739 19170 patch_prober.go:28] interesting pod/console-575758dfc4-r6mb4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 13 01:21:22.260754 master-0 kubenswrapper[19170]: I0313 01:21:22.259820 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-575758dfc4-r6mb4" podUID="78354ba8-21d5-4774-aa7f-8c72fee1995d" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 13 01:21:22.602022 master-0 kubenswrapper[19170]: I0313 01:21:22.601865 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 13 01:21:23.230667 master-0 kubenswrapper[19170]: I0313 01:21:23.230413 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 13 01:21:23.290659 master-0 kubenswrapper[19170]: I0313 01:21:23.290537 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 13 01:21:23.314426 master-0 kubenswrapper[19170]: I0313 01:21:23.314358 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 01:21:24.022114 master-0 kubenswrapper[19170]: I0313 01:21:24.022052 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 13 01:21:24.644946 master-0 kubenswrapper[19170]: I0313 01:21:24.644866 19170 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-monitoring"/"kube-state-metrics-tls" Mar 13 01:21:24.689817 master-0 kubenswrapper[19170]: I0313 01:21:24.689747 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-lgs6j" Mar 13 01:21:24.843742 master-0 kubenswrapper[19170]: I0313 01:21:24.843620 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls" Mar 13 01:21:24.945058 master-0 kubenswrapper[19170]: I0313 01:21:24.944991 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-b6b87" Mar 13 01:21:25.068237 master-0 kubenswrapper[19170]: I0313 01:21:25.068100 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 13 01:21:25.210192 master-0 kubenswrapper[19170]: I0313 01:21:25.210029 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 13 01:21:25.287165 master-0 kubenswrapper[19170]: I0313 01:21:25.287078 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 13 01:21:25.306431 master-0 kubenswrapper[19170]: I0313 01:21:25.306352 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Mar 13 01:21:25.309743 master-0 kubenswrapper[19170]: I0313 01:21:25.309686 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 13 01:21:25.329180 master-0 kubenswrapper[19170]: I0313 01:21:25.329094 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 13 01:21:25.347766 master-0 kubenswrapper[19170]: I0313 01:21:25.347708 19170 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Mar 13 01:21:25.458034 master-0 kubenswrapper[19170]: I0313 01:21:25.457932 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 13 01:21:25.581070 master-0 kubenswrapper[19170]: I0313 01:21:25.580929 19170 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 13 01:21:25.872405 master-0 kubenswrapper[19170]: I0313 01:21:25.872225 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 13 01:21:25.976661 master-0 kubenswrapper[19170]: I0313 01:21:25.976542 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-bclr4" Mar 13 01:21:25.990987 master-0 kubenswrapper[19170]: I0313 01:21:25.990726 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 13 01:21:26.089121 master-0 kubenswrapper[19170]: I0313 01:21:26.089039 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 13 01:21:26.192835 master-0 kubenswrapper[19170]: I0313 01:21:26.192725 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 13 01:21:26.270063 master-0 kubenswrapper[19170]: I0313 01:21:26.269969 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 01:21:26.449947 master-0 kubenswrapper[19170]: I0313 01:21:26.449733 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 13 01:21:26.452865 master-0 kubenswrapper[19170]: I0313 01:21:26.452791 19170 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-7wmj8" Mar 13 01:21:26.484845 master-0 kubenswrapper[19170]: I0313 01:21:26.484758 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Mar 13 01:21:26.607359 master-0 kubenswrapper[19170]: I0313 01:21:26.607282 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 13 01:21:26.607742 master-0 kubenswrapper[19170]: I0313 01:21:26.607443 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 13 01:21:26.633027 master-0 kubenswrapper[19170]: I0313 01:21:26.632970 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 13 01:21:26.661380 master-0 kubenswrapper[19170]: I0313 01:21:26.661322 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-b1qe6h41gh39q" Mar 13 01:21:26.767406 master-0 kubenswrapper[19170]: I0313 01:21:26.767213 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 13 01:21:26.802575 master-0 kubenswrapper[19170]: I0313 01:21:26.802496 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Mar 13 01:21:26.804430 master-0 kubenswrapper[19170]: I0313 01:21:26.804379 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 13 01:21:26.887977 master-0 kubenswrapper[19170]: I0313 01:21:26.887912 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 13 01:21:26.901563 master-0 kubenswrapper[19170]: I0313 01:21:26.901494 19170 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 13 01:21:26.913726 master-0 kubenswrapper[19170]: I0313 01:21:26.913682 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Mar 13 01:21:26.929785 master-0 kubenswrapper[19170]: I0313 01:21:26.929501 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 13 01:21:26.986186 master-0 kubenswrapper[19170]: I0313 01:21:26.986099 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Mar 13 01:21:27.017338 master-0 kubenswrapper[19170]: I0313 01:21:27.017269 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 13 01:21:27.049017 master-0 kubenswrapper[19170]: I0313 01:21:27.047840 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 13 01:21:27.068755 master-0 kubenswrapper[19170]: I0313 01:21:27.067905 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 13 01:21:27.313970 master-0 kubenswrapper[19170]: I0313 01:21:27.313796 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 13 01:21:27.330339 master-0 kubenswrapper[19170]: I0313 01:21:27.330205 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle" Mar 13 01:21:27.343437 master-0 kubenswrapper[19170]: I0313 01:21:27.343337 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 13 01:21:27.528319 master-0 kubenswrapper[19170]: I0313 01:21:27.528230 19170 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Mar 13 01:21:27.567469 master-0 kubenswrapper[19170]: I0313 01:21:27.567310 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 13 01:21:27.573283 master-0 kubenswrapper[19170]: I0313 01:21:27.573214 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 13 01:21:27.604264 master-0 kubenswrapper[19170]: I0313 01:21:27.604165 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-6h5r7" Mar 13 01:21:27.768768 master-0 kubenswrapper[19170]: I0313 01:21:27.768623 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 13 01:21:27.799125 master-0 kubenswrapper[19170]: I0313 01:21:27.799036 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-5qbrz" Mar 13 01:21:27.813821 master-0 kubenswrapper[19170]: I0313 01:21:27.813738 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 13 01:21:28.004875 master-0 kubenswrapper[19170]: I0313 01:21:28.004785 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 13 01:21:28.145564 master-0 kubenswrapper[19170]: I0313 01:21:28.145474 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 13 01:21:28.155084 master-0 kubenswrapper[19170]: I0313 01:21:28.155025 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 13 01:21:28.174367 master-0 kubenswrapper[19170]: I0313 01:21:28.174304 19170 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-s8vsn" Mar 13 01:21:28.180841 master-0 kubenswrapper[19170]: I0313 01:21:28.180786 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 13 01:21:28.241286 master-0 kubenswrapper[19170]: I0313 01:21:28.240098 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Mar 13 01:21:28.360912 master-0 kubenswrapper[19170]: I0313 01:21:28.360711 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 13 01:21:28.362954 master-0 kubenswrapper[19170]: I0313 01:21:28.362876 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 13 01:21:28.397793 master-0 kubenswrapper[19170]: I0313 01:21:28.397703 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-z62g7" Mar 13 01:21:28.521780 master-0 kubenswrapper[19170]: I0313 01:21:28.521043 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Mar 13 01:21:28.558072 master-0 kubenswrapper[19170]: I0313 01:21:28.557337 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 13 01:21:28.686624 master-0 kubenswrapper[19170]: I0313 01:21:28.686531 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 13 01:21:28.725173 master-0 kubenswrapper[19170]: I0313 01:21:28.725107 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 13 01:21:28.747266 master-0 kubenswrapper[19170]: I0313 01:21:28.747219 19170 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-admission-controller-secret" Mar 13 01:21:28.921847 master-0 kubenswrapper[19170]: I0313 01:21:28.921810 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 13 01:21:28.944871 master-0 kubenswrapper[19170]: I0313 01:21:28.944769 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 13 01:21:28.993362 master-0 kubenswrapper[19170]: I0313 01:21:28.991964 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 13 01:21:28.997803 master-0 kubenswrapper[19170]: I0313 01:21:28.997777 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 13 01:21:29.003058 master-0 kubenswrapper[19170]: I0313 01:21:29.003035 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Mar 13 01:21:29.097936 master-0 kubenswrapper[19170]: I0313 01:21:29.097885 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 13 01:21:29.143727 master-0 kubenswrapper[19170]: I0313 01:21:29.143695 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 13 01:21:29.196007 master-0 kubenswrapper[19170]: I0313 01:21:29.195915 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 13 01:21:29.248576 master-0 kubenswrapper[19170]: I0313 01:21:29.248523 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 13 01:21:29.432448 master-0 kubenswrapper[19170]: I0313 01:21:29.432394 19170 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console-operator"/"console-operator-dockercfg-m57n6" Mar 13 01:21:29.450071 master-0 kubenswrapper[19170]: I0313 01:21:29.449489 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-ps7fb" Mar 13 01:21:29.460775 master-0 kubenswrapper[19170]: I0313 01:21:29.458132 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Mar 13 01:21:29.656504 master-0 kubenswrapper[19170]: I0313 01:21:29.656458 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Mar 13 01:21:29.665526 master-0 kubenswrapper[19170]: I0313 01:21:29.665483 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 13 01:21:29.685426 master-0 kubenswrapper[19170]: I0313 01:21:29.685366 19170 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 13 01:21:29.686456 master-0 kubenswrapper[19170]: I0313 01:21:29.686437 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 13 01:21:29.721527 master-0 kubenswrapper[19170]: I0313 01:21:29.721418 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 13 01:21:29.791970 master-0 kubenswrapper[19170]: I0313 01:21:29.791909 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Mar 13 01:21:29.907505 master-0 kubenswrapper[19170]: I0313 01:21:29.907442 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 01:21:29.944042 master-0 kubenswrapper[19170]: I0313 01:21:29.943959 19170 
patch_prober.go:28] interesting pod/console-68ccfc6c58-cjm5c container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused" start-of-body= Mar 13 01:21:29.944221 master-0 kubenswrapper[19170]: I0313 01:21:29.944048 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-68ccfc6c58-cjm5c" podUID="e6c9432a-bfa0-4725-8cd1-8cf2967535f5" containerName="console" probeResult="failure" output="Get \"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused" Mar 13 01:21:29.997837 master-0 kubenswrapper[19170]: I0313 01:21:29.997626 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 01:21:30.074291 master-0 kubenswrapper[19170]: I0313 01:21:30.074194 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 13 01:21:30.075675 master-0 kubenswrapper[19170]: I0313 01:21:30.075588 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-c7thh" Mar 13 01:21:30.181411 master-0 kubenswrapper[19170]: I0313 01:21:30.181360 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 13 01:21:30.185077 master-0 kubenswrapper[19170]: I0313 01:21:30.185021 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 13 01:21:30.219347 master-0 kubenswrapper[19170]: I0313 01:21:30.219286 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 13 01:21:30.271490 master-0 kubenswrapper[19170]: I0313 01:21:30.271330 19170 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Mar 13 01:21:30.323114 master-0 kubenswrapper[19170]: I0313 01:21:30.323038 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt" Mar 13 01:21:30.360586 master-0 kubenswrapper[19170]: I0313 01:21:30.360501 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 13 01:21:30.362431 master-0 kubenswrapper[19170]: I0313 01:21:30.362386 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Mar 13 01:21:30.366877 master-0 kubenswrapper[19170]: I0313 01:21:30.366820 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 13 01:21:30.373668 master-0 kubenswrapper[19170]: I0313 01:21:30.373595 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 13 01:21:30.434177 master-0 kubenswrapper[19170]: I0313 01:21:30.434134 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 13 01:21:30.458365 master-0 kubenswrapper[19170]: I0313 01:21:30.458311 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 13 01:21:30.473232 master-0 kubenswrapper[19170]: I0313 01:21:30.473191 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 13 01:21:30.582043 master-0 kubenswrapper[19170]: I0313 01:21:30.580247 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert" Mar 13 01:21:30.615019 master-0 kubenswrapper[19170]: I0313 01:21:30.614928 19170 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Mar 13 01:21:30.624580 master-0 kubenswrapper[19170]: I0313 01:21:30.624475 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 13 01:21:30.701520 master-0 kubenswrapper[19170]: I0313 01:21:30.701445 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Mar 13 01:21:30.704281 master-0 kubenswrapper[19170]: I0313 01:21:30.704220 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 13 01:21:30.722380 master-0 kubenswrapper[19170]: I0313 01:21:30.722326 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Mar 13 01:21:30.742713 master-0 kubenswrapper[19170]: I0313 01:21:30.742662 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Mar 13 01:21:30.866259 master-0 kubenswrapper[19170]: I0313 01:21:30.866080 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 13 01:21:30.875527 master-0 kubenswrapper[19170]: I0313 01:21:30.875468 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-rq7qf" Mar 13 01:21:30.892818 master-0 kubenswrapper[19170]: I0313 01:21:30.892771 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 13 01:21:30.896456 master-0 kubenswrapper[19170]: I0313 01:21:30.896392 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 01:21:30.904158 master-0 kubenswrapper[19170]: I0313 01:21:30.904105 
19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 01:21:30.913137 master-0 kubenswrapper[19170]: I0313 01:21:30.913101 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 13 01:21:30.915986 master-0 kubenswrapper[19170]: I0313 01:21:30.915942 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 13 01:21:30.927838 master-0 kubenswrapper[19170]: I0313 01:21:30.927752 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 13 01:21:30.956595 master-0 kubenswrapper[19170]: I0313 01:21:30.956559 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 13 01:21:31.052866 master-0 kubenswrapper[19170]: I0313 01:21:31.052798 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 13 01:21:31.067686 master-0 kubenswrapper[19170]: I0313 01:21:31.067573 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 13 01:21:31.158476 master-0 kubenswrapper[19170]: I0313 01:21:31.158318 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 13 01:21:31.221109 master-0 kubenswrapper[19170]: I0313 01:21:31.221040 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 13 01:21:31.242131 master-0 kubenswrapper[19170]: I0313 01:21:31.242085 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-2pmf7" Mar 13 01:21:31.254994 master-0 kubenswrapper[19170]: I0313 01:21:31.254915 19170 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 13 01:21:31.274027 master-0 kubenswrapper[19170]: I0313 01:21:31.273978 19170 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 13 01:21:31.342414 master-0 kubenswrapper[19170]: I0313 01:21:31.342329 19170 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 13 01:21:31.371578 master-0 kubenswrapper[19170]: I0313 01:21:31.371526 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 13 01:21:31.382337 master-0 kubenswrapper[19170]: I0313 01:21:31.382271 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-5fw2q" Mar 13 01:21:31.400377 master-0 kubenswrapper[19170]: I0313 01:21:31.400303 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 13 01:21:31.645307 master-0 kubenswrapper[19170]: I0313 01:21:31.645218 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert" Mar 13 01:21:31.693491 master-0 kubenswrapper[19170]: I0313 01:21:31.693437 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 13 01:21:31.702010 master-0 kubenswrapper[19170]: I0313 01:21:31.701954 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 13 01:21:31.715670 master-0 kubenswrapper[19170]: I0313 01:21:31.715588 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 01:21:31.804030 master-0 kubenswrapper[19170]: I0313 01:21:31.803985 19170 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 13 01:21:31.923561 master-0 kubenswrapper[19170]: I0313 01:21:31.923514 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Mar 13 01:21:31.984270 master-0 kubenswrapper[19170]: I0313 01:21:31.984199 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 13 01:21:31.991827 master-0 kubenswrapper[19170]: I0313 01:21:31.991765 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 13 01:21:32.124145 master-0 kubenswrapper[19170]: I0313 01:21:32.124066 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 01:21:32.125903 master-0 kubenswrapper[19170]: I0313 01:21:32.125838 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 13 01:21:32.168765 master-0 kubenswrapper[19170]: I0313 01:21:32.168696 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 13 01:21:32.207271 master-0 kubenswrapper[19170]: I0313 01:21:32.207157 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 13 01:21:32.244357 master-0 kubenswrapper[19170]: I0313 01:21:32.244309 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 13 01:21:32.259858 master-0 kubenswrapper[19170]: I0313 01:21:32.259805 19170 patch_prober.go:28] interesting pod/console-575758dfc4-r6mb4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: 
connection refused" start-of-body= Mar 13 01:21:32.260024 master-0 kubenswrapper[19170]: I0313 01:21:32.259870 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-575758dfc4-r6mb4" podUID="78354ba8-21d5-4774-aa7f-8c72fee1995d" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 13 01:21:32.309121 master-0 kubenswrapper[19170]: I0313 01:21:32.309026 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 13 01:21:32.369611 master-0 kubenswrapper[19170]: I0313 01:21:32.369537 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 01:21:32.384003 master-0 kubenswrapper[19170]: I0313 01:21:32.383918 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 13 01:21:32.437046 master-0 kubenswrapper[19170]: I0313 01:21:32.437002 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt" Mar 13 01:21:32.469695 master-0 kubenswrapper[19170]: I0313 01:21:32.469580 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 13 01:21:32.473466 master-0 kubenswrapper[19170]: I0313 01:21:32.473412 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Mar 13 01:21:32.516910 master-0 kubenswrapper[19170]: I0313 01:21:32.516857 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 13 01:21:32.584200 master-0 kubenswrapper[19170]: I0313 01:21:32.584156 19170 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 13 01:21:32.661491 master-0 kubenswrapper[19170]: I0313 01:21:32.661453 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 13 01:21:32.798483 master-0 kubenswrapper[19170]: I0313 01:21:32.798314 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 13 01:21:32.817686 master-0 kubenswrapper[19170]: I0313 01:21:32.817585 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 13 01:21:32.825541 master-0 kubenswrapper[19170]: I0313 01:21:32.825463 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 13 01:21:32.835224 master-0 kubenswrapper[19170]: I0313 01:21:32.834390 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 13 01:21:32.951847 master-0 kubenswrapper[19170]: I0313 01:21:32.951725 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Mar 13 01:21:33.087460 master-0 kubenswrapper[19170]: I0313 01:21:33.087284 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 13 01:21:33.098297 master-0 kubenswrapper[19170]: I0313 01:21:33.098218 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Mar 13 01:21:33.136361 master-0 kubenswrapper[19170]: I0313 01:21:33.136281 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-tkjm8" Mar 13 01:21:33.154205 master-0 kubenswrapper[19170]: I0313 01:21:33.154127 19170 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 13 01:21:33.205222 master-0 kubenswrapper[19170]: I0313 01:21:33.205154 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Mar 13 01:21:33.321958 master-0 kubenswrapper[19170]: I0313 01:21:33.321822 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qdr99" Mar 13 01:21:33.334101 master-0 kubenswrapper[19170]: I0313 01:21:33.334037 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 13 01:21:33.348193 master-0 kubenswrapper[19170]: I0313 01:21:33.348075 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 13 01:21:33.383538 master-0 kubenswrapper[19170]: I0313 01:21:33.383467 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 13 01:21:33.384999 master-0 kubenswrapper[19170]: I0313 01:21:33.384944 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Mar 13 01:21:33.417874 master-0 kubenswrapper[19170]: I0313 01:21:33.417776 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 13 01:21:33.448100 master-0 kubenswrapper[19170]: I0313 01:21:33.448047 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 13 01:21:33.520606 master-0 kubenswrapper[19170]: I0313 01:21:33.520535 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 13 01:21:33.564137 master-0 
kubenswrapper[19170]: I0313 01:21:33.564064 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 13 01:21:33.660035 master-0 kubenswrapper[19170]: I0313 01:21:33.659882 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Mar 13 01:21:33.832568 master-0 kubenswrapper[19170]: I0313 01:21:33.832498 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 13 01:21:33.876290 master-0 kubenswrapper[19170]: I0313 01:21:33.876216 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 13 01:21:33.887797 master-0 kubenswrapper[19170]: I0313 01:21:33.887715 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 13 01:21:33.905521 master-0 kubenswrapper[19170]: I0313 01:21:33.905451 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 13 01:21:33.914238 master-0 kubenswrapper[19170]: I0313 01:21:33.914178 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 13 01:21:33.929293 master-0 kubenswrapper[19170]: I0313 01:21:33.929211 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 13 01:21:33.938153 master-0 kubenswrapper[19170]: I0313 01:21:33.938109 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 13 01:21:33.960287 master-0 kubenswrapper[19170]: I0313 01:21:33.960212 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 13 01:21:34.016387 master-0 kubenswrapper[19170]: I0313 01:21:34.016318 
19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Mar 13 01:21:34.217796 master-0 kubenswrapper[19170]: I0313 01:21:34.217603 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-6f5hw" Mar 13 01:21:34.245543 master-0 kubenswrapper[19170]: I0313 01:21:34.245474 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-6b6t5" Mar 13 01:21:34.261169 master-0 kubenswrapper[19170]: I0313 01:21:34.261091 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 13 01:21:34.271798 master-0 kubenswrapper[19170]: I0313 01:21:34.271520 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Mar 13 01:21:34.287790 master-0 kubenswrapper[19170]: I0313 01:21:34.287743 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 13 01:21:34.357989 master-0 kubenswrapper[19170]: I0313 01:21:34.357930 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-gtb4f" Mar 13 01:21:34.388236 master-0 kubenswrapper[19170]: I0313 01:21:34.388167 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Mar 13 01:21:34.402261 master-0 kubenswrapper[19170]: I0313 01:21:34.402212 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt" Mar 13 01:21:34.411247 master-0 kubenswrapper[19170]: I0313 01:21:34.411210 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Mar 13 01:21:34.534452 master-0 kubenswrapper[19170]: I0313 01:21:34.534330 19170 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 13 01:21:34.608933 master-0 kubenswrapper[19170]: I0313 01:21:34.608816 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 13 01:21:34.609479 master-0 kubenswrapper[19170]: I0313 01:21:34.609376 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 13 01:21:34.613250 master-0 kubenswrapper[19170]: I0313 01:21:34.613195 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Mar 13 01:21:34.666272 master-0 kubenswrapper[19170]: I0313 01:21:34.666195 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Mar 13 01:21:34.675982 master-0 kubenswrapper[19170]: I0313 01:21:34.675937 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-4lcm6" Mar 13 01:21:34.683908 master-0 kubenswrapper[19170]: I0313 01:21:34.683850 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Mar 13 01:21:34.726125 master-0 kubenswrapper[19170]: I0313 01:21:34.726048 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 13 01:21:34.777470 master-0 kubenswrapper[19170]: I0313 01:21:34.777396 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 13 01:21:34.947245 master-0 kubenswrapper[19170]: I0313 01:21:34.947172 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Mar 13 01:21:35.048654 master-0 kubenswrapper[19170]: I0313 01:21:35.048541 19170 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Mar 13 01:21:35.081925 master-0 kubenswrapper[19170]: I0313 01:21:35.081823 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 13 01:21:35.122738 master-0 kubenswrapper[19170]: I0313 01:21:35.121577 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 13 01:21:35.177257 master-0 kubenswrapper[19170]: I0313 01:21:35.177192 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Mar 13 01:21:35.217553 master-0 kubenswrapper[19170]: I0313 01:21:35.217386 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Mar 13 01:21:35.238105 master-0 kubenswrapper[19170]: I0313 01:21:35.238009 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 13 01:21:35.366753 master-0 kubenswrapper[19170]: I0313 01:21:35.366628 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 13 01:21:35.430772 master-0 kubenswrapper[19170]: I0313 01:21:35.430706 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt" Mar 13 01:21:35.516116 master-0 kubenswrapper[19170]: I0313 01:21:35.515943 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config" Mar 13 01:21:35.560144 master-0 kubenswrapper[19170]: I0313 01:21:35.560059 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-56ljs" Mar 13 01:21:35.611256 master-0 kubenswrapper[19170]: I0313 01:21:35.611186 19170 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 13 01:21:35.630818 master-0 kubenswrapper[19170]: I0313 01:21:35.630579 19170 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 13 01:21:35.645490 master-0 kubenswrapper[19170]: I0313 01:21:35.645399 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 13 01:21:35.645490 master-0 kubenswrapper[19170]: I0313 01:21:35.645495 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 13 01:21:35.652690 master-0 kubenswrapper[19170]: I0313 01:21:35.652109 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:21:35.652690 master-0 kubenswrapper[19170]: I0313 01:21:35.652607 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:21:35.658028 master-0 kubenswrapper[19170]: I0313 01:21:35.657990 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 13 01:21:35.676033 master-0 kubenswrapper[19170]: I0313 01:21:35.675618 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=22.675590847 podStartE2EDuration="22.675590847s" podCreationTimestamp="2026-03-13 01:21:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:21:35.670133317 +0000 UTC m=+156.478254337" watchObservedRunningTime="2026-03-13 01:21:35.675590847 +0000 UTC m=+156.483711827" Mar 13 01:21:35.688449 master-0 kubenswrapper[19170]: I0313 01:21:35.688383 19170 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 13 01:21:35.716908 master-0 kubenswrapper[19170]: I0313 01:21:35.716849 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 13 01:21:35.731999 master-0 kubenswrapper[19170]: I0313 01:21:35.731962 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 13 01:21:35.737182 master-0 kubenswrapper[19170]: I0313 01:21:35.737144 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-qcwkf" Mar 13 01:21:35.738991 master-0 kubenswrapper[19170]: I0313 01:21:35.738966 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 13 01:21:35.786453 master-0 kubenswrapper[19170]: I0313 01:21:35.786337 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 13 01:21:35.860616 master-0 kubenswrapper[19170]: I0313 01:21:35.860540 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 13 01:21:35.961748 master-0 kubenswrapper[19170]: I0313 01:21:35.961616 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 13 01:21:36.026559 master-0 kubenswrapper[19170]: I0313 01:21:36.026487 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 13 01:21:36.052975 master-0 kubenswrapper[19170]: I0313 01:21:36.052789 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Mar 13 01:21:36.083457 master-0 kubenswrapper[19170]: I0313 01:21:36.083363 19170 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle" Mar 13 01:21:36.106123 master-0 kubenswrapper[19170]: I0313 01:21:36.106048 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Mar 13 01:21:36.114337 master-0 kubenswrapper[19170]: I0313 01:21:36.114270 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 13 01:21:36.129717 master-0 kubenswrapper[19170]: I0313 01:21:36.129670 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 13 01:21:36.343669 master-0 kubenswrapper[19170]: I0313 01:21:36.343507 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-jf79b" Mar 13 01:21:36.373731 master-0 kubenswrapper[19170]: I0313 01:21:36.366356 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Mar 13 01:21:36.474563 master-0 kubenswrapper[19170]: I0313 01:21:36.474475 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 13 01:21:36.533977 master-0 kubenswrapper[19170]: I0313 01:21:36.533929 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Mar 13 01:21:36.679754 master-0 kubenswrapper[19170]: I0313 01:21:36.679688 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert" Mar 13 01:21:36.694455 master-0 kubenswrapper[19170]: I0313 01:21:36.694409 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator" Mar 13 01:21:36.761572 master-0 kubenswrapper[19170]: I0313 01:21:36.761493 19170 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 13 01:21:36.771356 master-0 kubenswrapper[19170]: I0313 01:21:36.771295 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-bjp2n" Mar 13 01:21:36.837475 master-0 kubenswrapper[19170]: I0313 01:21:36.837392 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 13 01:21:36.934879 master-0 kubenswrapper[19170]: I0313 01:21:36.934707 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 13 01:21:36.938328 master-0 kubenswrapper[19170]: I0313 01:21:36.938276 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-h6wj2" Mar 13 01:21:36.981888 master-0 kubenswrapper[19170]: I0313 01:21:36.981801 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 01:21:36.992364 master-0 kubenswrapper[19170]: I0313 01:21:36.992285 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 13 01:21:37.012352 master-0 kubenswrapper[19170]: I0313 01:21:37.012271 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 13 01:21:37.235233 master-0 kubenswrapper[19170]: I0313 01:21:37.235087 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 13 01:21:37.313878 master-0 kubenswrapper[19170]: I0313 01:21:37.313788 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 13 01:21:37.398411 master-0 kubenswrapper[19170]: I0313 
01:21:37.398353 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle" Mar 13 01:21:37.444942 master-0 kubenswrapper[19170]: I0313 01:21:37.444883 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 13 01:21:37.477582 master-0 kubenswrapper[19170]: I0313 01:21:37.477528 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 13 01:21:37.556155 master-0 kubenswrapper[19170]: I0313 01:21:37.556066 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 13 01:21:37.571872 master-0 kubenswrapper[19170]: I0313 01:21:37.571817 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Mar 13 01:21:37.582329 master-0 kubenswrapper[19170]: I0313 01:21:37.582282 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 13 01:21:37.582444 master-0 kubenswrapper[19170]: I0313 01:21:37.582348 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 13 01:21:37.652766 master-0 kubenswrapper[19170]: I0313 01:21:37.652684 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 01:21:37.690532 master-0 kubenswrapper[19170]: I0313 01:21:37.690484 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images" Mar 13 01:21:37.714401 master-0 kubenswrapper[19170]: I0313 01:21:37.714350 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 13 01:21:37.762063 
master-0 kubenswrapper[19170]: I0313 01:21:37.762003 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 13 01:21:37.838380 master-0 kubenswrapper[19170]: I0313 01:21:37.838203 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 13 01:21:37.885427 master-0 kubenswrapper[19170]: I0313 01:21:37.885372 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-zdrdj" Mar 13 01:21:37.897827 master-0 kubenswrapper[19170]: I0313 01:21:37.897778 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 13 01:21:37.902983 master-0 kubenswrapper[19170]: I0313 01:21:37.902942 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 13 01:21:37.911648 master-0 kubenswrapper[19170]: I0313 01:21:37.911591 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 13 01:21:38.026893 master-0 kubenswrapper[19170]: I0313 01:21:38.026836 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-sb45h" Mar 13 01:21:38.118671 master-0 kubenswrapper[19170]: I0313 01:21:38.118487 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 13 01:21:38.281589 master-0 kubenswrapper[19170]: I0313 01:21:38.281529 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-vdkmw" Mar 13 01:21:38.379901 master-0 kubenswrapper[19170]: I0313 01:21:38.379746 19170 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-controller"/"kube-root-ca.crt" Mar 13 01:21:38.683446 master-0 kubenswrapper[19170]: I0313 01:21:38.683391 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 13 01:21:38.773903 master-0 kubenswrapper[19170]: I0313 01:21:38.773849 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 13 01:21:38.920972 master-0 kubenswrapper[19170]: I0313 01:21:38.920893 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 13 01:21:38.947140 master-0 kubenswrapper[19170]: I0313 01:21:38.946995 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-n4b44" Mar 13 01:21:39.299871 master-0 kubenswrapper[19170]: I0313 01:21:39.299709 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 13 01:21:39.303565 master-0 kubenswrapper[19170]: I0313 01:21:39.303473 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Mar 13 01:21:39.324170 master-0 kubenswrapper[19170]: I0313 01:21:39.324112 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-7q6zv" Mar 13 01:21:39.943988 master-0 kubenswrapper[19170]: I0313 01:21:39.943915 19170 patch_prober.go:28] interesting pod/console-68ccfc6c58-cjm5c container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused" start-of-body= Mar 13 01:21:39.944759 master-0 kubenswrapper[19170]: I0313 01:21:39.943998 19170 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-console/console-68ccfc6c58-cjm5c" podUID="e6c9432a-bfa0-4725-8cd1-8cf2967535f5" containerName="console" probeResult="failure" output="Get \"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused" Mar 13 01:21:40.840774 master-0 kubenswrapper[19170]: I0313 01:21:40.840704 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 13 01:21:41.081620 master-0 kubenswrapper[19170]: I0313 01:21:41.081470 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 13 01:21:41.423758 master-0 kubenswrapper[19170]: I0313 01:21:41.423705 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle" Mar 13 01:21:41.424120 master-0 kubenswrapper[19170]: I0313 01:21:41.423715 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-sjhm5" Mar 13 01:21:41.632476 master-0 kubenswrapper[19170]: I0313 01:21:41.632396 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 13 01:21:42.260834 master-0 kubenswrapper[19170]: I0313 01:21:42.260752 19170 patch_prober.go:28] interesting pod/console-575758dfc4-r6mb4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 13 01:21:42.260834 master-0 kubenswrapper[19170]: I0313 01:21:42.260824 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-575758dfc4-r6mb4" podUID="78354ba8-21d5-4774-aa7f-8c72fee1995d" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 13 01:21:45.472493 
master-0 kubenswrapper[19170]: I0313 01:21:45.472392 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cbd49d755-g25zk"] Mar 13 01:21:45.473617 master-0 kubenswrapper[19170]: E0313 01:21:45.472899 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ec6670-5dff-4e77-8ea1-2ca0e3538a0d" containerName="installer" Mar 13 01:21:45.473617 master-0 kubenswrapper[19170]: I0313 01:21:45.472923 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ec6670-5dff-4e77-8ea1-2ca0e3538a0d" containerName="installer" Mar 13 01:21:45.473617 master-0 kubenswrapper[19170]: I0313 01:21:45.473178 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="33ec6670-5dff-4e77-8ea1-2ca0e3538a0d" containerName="installer" Mar 13 01:21:45.473996 master-0 kubenswrapper[19170]: I0313 01:21:45.473819 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cbd49d755-g25zk" Mar 13 01:21:45.476303 master-0 kubenswrapper[19170]: I0313 01:21:45.476261 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 13 01:21:45.476487 master-0 kubenswrapper[19170]: I0313 01:21:45.476437 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 13 01:21:45.503390 master-0 kubenswrapper[19170]: I0313 01:21:45.503320 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cbd49d755-g25zk"] Mar 13 01:21:45.504207 master-0 kubenswrapper[19170]: I0313 01:21:45.504142 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cc931c49-1d8d-4a90-845e-fe09ca00a56a-networking-console-plugin-cert\") pod 
\"networking-console-plugin-5cbd49d755-g25zk\" (UID: \"cc931c49-1d8d-4a90-845e-fe09ca00a56a\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-g25zk" Mar 13 01:21:45.504428 master-0 kubenswrapper[19170]: I0313 01:21:45.504368 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/cc931c49-1d8d-4a90-845e-fe09ca00a56a-nginx-conf\") pod \"networking-console-plugin-5cbd49d755-g25zk\" (UID: \"cc931c49-1d8d-4a90-845e-fe09ca00a56a\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-g25zk" Mar 13 01:21:45.606022 master-0 kubenswrapper[19170]: I0313 01:21:45.605946 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cc931c49-1d8d-4a90-845e-fe09ca00a56a-networking-console-plugin-cert\") pod \"networking-console-plugin-5cbd49d755-g25zk\" (UID: \"cc931c49-1d8d-4a90-845e-fe09ca00a56a\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-g25zk" Mar 13 01:21:45.606284 master-0 kubenswrapper[19170]: E0313 01:21:45.606119 19170 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Mar 13 01:21:45.606284 master-0 kubenswrapper[19170]: E0313 01:21:45.606215 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc931c49-1d8d-4a90-845e-fe09ca00a56a-networking-console-plugin-cert podName:cc931c49-1d8d-4a90-845e-fe09ca00a56a nodeName:}" failed. No retries permitted until 2026-03-13 01:21:46.106187274 +0000 UTC m=+166.914308274 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/cc931c49-1d8d-4a90-845e-fe09ca00a56a-networking-console-plugin-cert") pod "networking-console-plugin-5cbd49d755-g25zk" (UID: "cc931c49-1d8d-4a90-845e-fe09ca00a56a") : secret "networking-console-plugin-cert" not found Mar 13 01:21:45.606424 master-0 kubenswrapper[19170]: I0313 01:21:45.606266 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/cc931c49-1d8d-4a90-845e-fe09ca00a56a-nginx-conf\") pod \"networking-console-plugin-5cbd49d755-g25zk\" (UID: \"cc931c49-1d8d-4a90-845e-fe09ca00a56a\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-g25zk" Mar 13 01:21:45.608098 master-0 kubenswrapper[19170]: I0313 01:21:45.608033 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/cc931c49-1d8d-4a90-845e-fe09ca00a56a-nginx-conf\") pod \"networking-console-plugin-5cbd49d755-g25zk\" (UID: \"cc931c49-1d8d-4a90-845e-fe09ca00a56a\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-g25zk" Mar 13 01:21:46.114670 master-0 kubenswrapper[19170]: I0313 01:21:46.114496 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cc931c49-1d8d-4a90-845e-fe09ca00a56a-networking-console-plugin-cert\") pod \"networking-console-plugin-5cbd49d755-g25zk\" (UID: \"cc931c49-1d8d-4a90-845e-fe09ca00a56a\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-g25zk" Mar 13 01:21:46.115007 master-0 kubenswrapper[19170]: E0313 01:21:46.114845 19170 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Mar 13 01:21:46.115007 master-0 kubenswrapper[19170]: E0313 01:21:46.115001 19170 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc931c49-1d8d-4a90-845e-fe09ca00a56a-networking-console-plugin-cert podName:cc931c49-1d8d-4a90-845e-fe09ca00a56a nodeName:}" failed. No retries permitted until 2026-03-13 01:21:47.114958633 +0000 UTC m=+167.923079623 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/cc931c49-1d8d-4a90-845e-fe09ca00a56a-networking-console-plugin-cert") pod "networking-console-plugin-5cbd49d755-g25zk" (UID: "cc931c49-1d8d-4a90-845e-fe09ca00a56a") : secret "networking-console-plugin-cert" not found Mar 13 01:21:46.967753 master-0 kubenswrapper[19170]: I0313 01:21:46.966977 19170 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 13 01:21:46.967753 master-0 kubenswrapper[19170]: I0313 01:21:46.967317 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="899242a15b2bdf3b4a04fb323647ca94" containerName="startup-monitor" containerID="cri-o://235a13b04b9dbc747e5d8d2d8403d2908c7307c11854d776d6ea514a70233940" gracePeriod=5 Mar 13 01:21:47.133159 master-0 kubenswrapper[19170]: I0313 01:21:47.133047 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cc931c49-1d8d-4a90-845e-fe09ca00a56a-networking-console-plugin-cert\") pod \"networking-console-plugin-5cbd49d755-g25zk\" (UID: \"cc931c49-1d8d-4a90-845e-fe09ca00a56a\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-g25zk" Mar 13 01:21:47.133523 master-0 kubenswrapper[19170]: E0313 01:21:47.133348 19170 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Mar 13 01:21:47.133523 master-0 
kubenswrapper[19170]: E0313 01:21:47.133484 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc931c49-1d8d-4a90-845e-fe09ca00a56a-networking-console-plugin-cert podName:cc931c49-1d8d-4a90-845e-fe09ca00a56a nodeName:}" failed. No retries permitted until 2026-03-13 01:21:49.133453857 +0000 UTC m=+169.941574857 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/cc931c49-1d8d-4a90-845e-fe09ca00a56a-networking-console-plugin-cert") pod "networking-console-plugin-5cbd49d755-g25zk" (UID: "cc931c49-1d8d-4a90-845e-fe09ca00a56a") : secret "networking-console-plugin-cert" not found Mar 13 01:21:49.198137 master-0 kubenswrapper[19170]: I0313 01:21:49.198048 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cc931c49-1d8d-4a90-845e-fe09ca00a56a-networking-console-plugin-cert\") pod \"networking-console-plugin-5cbd49d755-g25zk\" (UID: \"cc931c49-1d8d-4a90-845e-fe09ca00a56a\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-g25zk" Mar 13 01:21:49.198740 master-0 kubenswrapper[19170]: E0313 01:21:49.198287 19170 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Mar 13 01:21:49.198740 master-0 kubenswrapper[19170]: E0313 01:21:49.198386 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc931c49-1d8d-4a90-845e-fe09ca00a56a-networking-console-plugin-cert podName:cc931c49-1d8d-4a90-845e-fe09ca00a56a nodeName:}" failed. No retries permitted until 2026-03-13 01:21:53.198359885 +0000 UTC m=+174.006480885 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/cc931c49-1d8d-4a90-845e-fe09ca00a56a-networking-console-plugin-cert") pod "networking-console-plugin-5cbd49d755-g25zk" (UID: "cc931c49-1d8d-4a90-845e-fe09ca00a56a") : secret "networking-console-plugin-cert" not found Mar 13 01:21:49.943510 master-0 kubenswrapper[19170]: I0313 01:21:49.943414 19170 patch_prober.go:28] interesting pod/console-68ccfc6c58-cjm5c container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused" start-of-body= Mar 13 01:21:49.943510 master-0 kubenswrapper[19170]: I0313 01:21:49.943489 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-68ccfc6c58-cjm5c" podUID="e6c9432a-bfa0-4725-8cd1-8cf2967535f5" containerName="console" probeResult="failure" output="Get \"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused" Mar 13 01:21:52.261145 master-0 kubenswrapper[19170]: I0313 01:21:52.261017 19170 patch_prober.go:28] interesting pod/console-575758dfc4-r6mb4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 13 01:21:52.261924 master-0 kubenswrapper[19170]: I0313 01:21:52.261852 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-575758dfc4-r6mb4" podUID="78354ba8-21d5-4774-aa7f-8c72fee1995d" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 13 01:21:52.440243 master-0 kubenswrapper[19170]: I0313 01:21:52.440197 19170 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_899242a15b2bdf3b4a04fb323647ca94/startup-monitor/0.log" Mar 13 01:21:52.440504 master-0 kubenswrapper[19170]: I0313 01:21:52.440368 19170 generic.go:334] "Generic (PLEG): container finished" podID="899242a15b2bdf3b4a04fb323647ca94" containerID="235a13b04b9dbc747e5d8d2d8403d2908c7307c11854d776d6ea514a70233940" exitCode=137 Mar 13 01:21:52.565793 master-0 kubenswrapper[19170]: I0313 01:21:52.565727 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_899242a15b2bdf3b4a04fb323647ca94/startup-monitor/0.log" Mar 13 01:21:52.566044 master-0 kubenswrapper[19170]: I0313 01:21:52.565834 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:21:52.753917 master-0 kubenswrapper[19170]: I0313 01:21:52.753828 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-resource-dir\") pod \"899242a15b2bdf3b4a04fb323647ca94\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " Mar 13 01:21:52.754180 master-0 kubenswrapper[19170]: I0313 01:21:52.753935 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-manifests\") pod \"899242a15b2bdf3b4a04fb323647ca94\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " Mar 13 01:21:52.754180 master-0 kubenswrapper[19170]: I0313 01:21:52.753985 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-log\") pod \"899242a15b2bdf3b4a04fb323647ca94\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " Mar 13 01:21:52.754180 master-0 kubenswrapper[19170]: 
I0313 01:21:52.754026 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-lock\") pod \"899242a15b2bdf3b4a04fb323647ca94\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " Mar 13 01:21:52.754180 master-0 kubenswrapper[19170]: I0313 01:21:52.754083 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-pod-resource-dir\") pod \"899242a15b2bdf3b4a04fb323647ca94\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " Mar 13 01:21:52.754600 master-0 kubenswrapper[19170]: I0313 01:21:52.754453 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-manifests" (OuterVolumeSpecName: "manifests") pod "899242a15b2bdf3b4a04fb323647ca94" (UID: "899242a15b2bdf3b4a04fb323647ca94"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:21:52.754600 master-0 kubenswrapper[19170]: I0313 01:21:52.754505 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "899242a15b2bdf3b4a04fb323647ca94" (UID: "899242a15b2bdf3b4a04fb323647ca94"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:21:52.754600 master-0 kubenswrapper[19170]: I0313 01:21:52.754527 19170 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-manifests\") on node \"master-0\" DevicePath \"\"" Mar 13 01:21:52.754600 master-0 kubenswrapper[19170]: I0313 01:21:52.754555 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-log" (OuterVolumeSpecName: "var-log") pod "899242a15b2bdf3b4a04fb323647ca94" (UID: "899242a15b2bdf3b4a04fb323647ca94"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:21:52.754966 master-0 kubenswrapper[19170]: I0313 01:21:52.754609 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-lock" (OuterVolumeSpecName: "var-lock") pod "899242a15b2bdf3b4a04fb323647ca94" (UID: "899242a15b2bdf3b4a04fb323647ca94"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:21:52.762429 master-0 kubenswrapper[19170]: I0313 01:21:52.762366 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "899242a15b2bdf3b4a04fb323647ca94" (UID: "899242a15b2bdf3b4a04fb323647ca94"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:21:52.855894 master-0 kubenswrapper[19170]: I0313 01:21:52.855794 19170 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:21:52.855894 master-0 kubenswrapper[19170]: I0313 01:21:52.855862 19170 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-log\") on node \"master-0\" DevicePath \"\"" Mar 13 01:21:52.855894 master-0 kubenswrapper[19170]: I0313 01:21:52.855888 19170 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 01:21:52.855894 master-0 kubenswrapper[19170]: I0313 01:21:52.855910 19170 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-pod-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:21:53.262014 master-0 kubenswrapper[19170]: I0313 01:21:53.261925 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cc931c49-1d8d-4a90-845e-fe09ca00a56a-networking-console-plugin-cert\") pod \"networking-console-plugin-5cbd49d755-g25zk\" (UID: \"cc931c49-1d8d-4a90-845e-fe09ca00a56a\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-g25zk" Mar 13 01:21:53.262574 master-0 kubenswrapper[19170]: E0313 01:21:53.262131 19170 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Mar 13 01:21:53.262574 master-0 kubenswrapper[19170]: E0313 01:21:53.262245 19170 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/cc931c49-1d8d-4a90-845e-fe09ca00a56a-networking-console-plugin-cert podName:cc931c49-1d8d-4a90-845e-fe09ca00a56a nodeName:}" failed. No retries permitted until 2026-03-13 01:22:01.262217084 +0000 UTC m=+182.070338064 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/cc931c49-1d8d-4a90-845e-fe09ca00a56a-networking-console-plugin-cert") pod "networking-console-plugin-5cbd49d755-g25zk" (UID: "cc931c49-1d8d-4a90-845e-fe09ca00a56a") : secret "networking-console-plugin-cert" not found Mar 13 01:21:53.434286 master-0 kubenswrapper[19170]: I0313 01:21:53.434184 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="899242a15b2bdf3b4a04fb323647ca94" path="/var/lib/kubelet/pods/899242a15b2bdf3b4a04fb323647ca94/volumes" Mar 13 01:21:53.454142 master-0 kubenswrapper[19170]: I0313 01:21:53.454063 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_899242a15b2bdf3b4a04fb323647ca94/startup-monitor/0.log" Mar 13 01:21:53.454353 master-0 kubenswrapper[19170]: I0313 01:21:53.454179 19170 scope.go:117] "RemoveContainer" containerID="235a13b04b9dbc747e5d8d2d8403d2908c7307c11854d776d6ea514a70233940" Mar 13 01:21:53.454353 master-0 kubenswrapper[19170]: I0313 01:21:53.454254 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:21:59.944579 master-0 kubenswrapper[19170]: I0313 01:21:59.944466 19170 patch_prober.go:28] interesting pod/console-68ccfc6c58-cjm5c container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused" start-of-body= Mar 13 01:21:59.945782 master-0 kubenswrapper[19170]: I0313 01:21:59.944578 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-68ccfc6c58-cjm5c" podUID="e6c9432a-bfa0-4725-8cd1-8cf2967535f5" containerName="console" probeResult="failure" output="Get \"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused" Mar 13 01:22:01.325811 master-0 kubenswrapper[19170]: I0313 01:22:01.325733 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cc931c49-1d8d-4a90-845e-fe09ca00a56a-networking-console-plugin-cert\") pod \"networking-console-plugin-5cbd49d755-g25zk\" (UID: \"cc931c49-1d8d-4a90-845e-fe09ca00a56a\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-g25zk" Mar 13 01:22:01.326604 master-0 kubenswrapper[19170]: E0313 01:22:01.326000 19170 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Mar 13 01:22:01.326604 master-0 kubenswrapper[19170]: E0313 01:22:01.326136 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc931c49-1d8d-4a90-845e-fe09ca00a56a-networking-console-plugin-cert podName:cc931c49-1d8d-4a90-845e-fe09ca00a56a nodeName:}" failed. No retries permitted until 2026-03-13 01:22:17.32610411 +0000 UTC m=+198.134225110 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/cc931c49-1d8d-4a90-845e-fe09ca00a56a-networking-console-plugin-cert") pod "networking-console-plugin-5cbd49d755-g25zk" (UID: "cc931c49-1d8d-4a90-845e-fe09ca00a56a") : secret "networking-console-plugin-cert" not found Mar 13 01:22:02.260887 master-0 kubenswrapper[19170]: I0313 01:22:02.260806 19170 patch_prober.go:28] interesting pod/console-575758dfc4-r6mb4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 13 01:22:02.261169 master-0 kubenswrapper[19170]: I0313 01:22:02.260890 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-575758dfc4-r6mb4" podUID="78354ba8-21d5-4774-aa7f-8c72fee1995d" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 13 01:22:09.944208 master-0 kubenswrapper[19170]: I0313 01:22:09.944126 19170 patch_prober.go:28] interesting pod/console-68ccfc6c58-cjm5c container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused" start-of-body= Mar 13 01:22:09.945138 master-0 kubenswrapper[19170]: I0313 01:22:09.944214 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-68ccfc6c58-cjm5c" podUID="e6c9432a-bfa0-4725-8cd1-8cf2967535f5" containerName="console" probeResult="failure" output="Get \"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused" Mar 13 01:22:12.260764 master-0 kubenswrapper[19170]: I0313 01:22:12.260685 19170 patch_prober.go:28] interesting pod/console-575758dfc4-r6mb4 container/console namespace/openshift-console: Startup probe status=failure output="Get 
\"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 13 01:22:12.260764 master-0 kubenswrapper[19170]: I0313 01:22:12.260749 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-575758dfc4-r6mb4" podUID="78354ba8-21d5-4774-aa7f-8c72fee1995d" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 13 01:22:17.382521 master-0 kubenswrapper[19170]: I0313 01:22:17.382411 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cc931c49-1d8d-4a90-845e-fe09ca00a56a-networking-console-plugin-cert\") pod \"networking-console-plugin-5cbd49d755-g25zk\" (UID: \"cc931c49-1d8d-4a90-845e-fe09ca00a56a\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-g25zk" Mar 13 01:22:17.387393 master-0 kubenswrapper[19170]: I0313 01:22:17.387311 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cc931c49-1d8d-4a90-845e-fe09ca00a56a-networking-console-plugin-cert\") pod \"networking-console-plugin-5cbd49d755-g25zk\" (UID: \"cc931c49-1d8d-4a90-845e-fe09ca00a56a\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-g25zk" Mar 13 01:22:17.616896 master-0 kubenswrapper[19170]: I0313 01:22:17.616791 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cbd49d755-g25zk" Mar 13 01:22:18.145326 master-0 kubenswrapper[19170]: I0313 01:22:18.145234 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cbd49d755-g25zk"] Mar 13 01:22:18.697268 master-0 kubenswrapper[19170]: I0313 01:22:18.697210 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cbd49d755-g25zk" event={"ID":"cc931c49-1d8d-4a90-845e-fe09ca00a56a","Type":"ContainerStarted","Data":"f1e07f6db5209ea0da880c1c497c10ab4443998cd50458a7069dd1c75f6c1609"} Mar 13 01:22:19.944150 master-0 kubenswrapper[19170]: I0313 01:22:19.944071 19170 patch_prober.go:28] interesting pod/console-68ccfc6c58-cjm5c container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused" start-of-body= Mar 13 01:22:19.945031 master-0 kubenswrapper[19170]: I0313 01:22:19.944152 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-68ccfc6c58-cjm5c" podUID="e6c9432a-bfa0-4725-8cd1-8cf2967535f5" containerName="console" probeResult="failure" output="Get \"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused" Mar 13 01:22:20.715985 master-0 kubenswrapper[19170]: I0313 01:22:20.715033 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cbd49d755-g25zk" event={"ID":"cc931c49-1d8d-4a90-845e-fe09ca00a56a","Type":"ContainerStarted","Data":"e0716e78237eb48d395dc64e442fb9a92d7e37ffa20b9f92587329caeb6b6774"} Mar 13 01:22:20.742270 master-0 kubenswrapper[19170]: I0313 01:22:20.742197 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-5cbd49d755-g25zk" 
podStartSLOduration=34.362908002 podStartE2EDuration="35.74217579s" podCreationTimestamp="2026-03-13 01:21:45 +0000 UTC" firstStartedPulling="2026-03-13 01:22:18.146699927 +0000 UTC m=+198.954820897" lastFinishedPulling="2026-03-13 01:22:19.525967695 +0000 UTC m=+200.334088685" observedRunningTime="2026-03-13 01:22:20.736276707 +0000 UTC m=+201.544397677" watchObservedRunningTime="2026-03-13 01:22:20.74217579 +0000 UTC m=+201.550296760" Mar 13 01:22:22.260559 master-0 kubenswrapper[19170]: I0313 01:22:22.260461 19170 patch_prober.go:28] interesting pod/console-575758dfc4-r6mb4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 13 01:22:22.261418 master-0 kubenswrapper[19170]: I0313 01:22:22.260576 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-575758dfc4-r6mb4" podUID="78354ba8-21d5-4774-aa7f-8c72fee1995d" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 13 01:22:29.944292 master-0 kubenswrapper[19170]: I0313 01:22:29.944168 19170 patch_prober.go:28] interesting pod/console-68ccfc6c58-cjm5c container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused" start-of-body= Mar 13 01:22:29.944292 master-0 kubenswrapper[19170]: I0313 01:22:29.944257 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-68ccfc6c58-cjm5c" podUID="e6c9432a-bfa0-4725-8cd1-8cf2967535f5" containerName="console" probeResult="failure" output="Get \"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused" Mar 13 01:22:32.260270 master-0 kubenswrapper[19170]: I0313 01:22:32.260184 19170 patch_prober.go:28] interesting 
pod/console-575758dfc4-r6mb4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 13 01:22:32.260270 master-0 kubenswrapper[19170]: I0313 01:22:32.260259 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-575758dfc4-r6mb4" podUID="78354ba8-21d5-4774-aa7f-8c72fee1995d" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 13 01:22:39.943798 master-0 kubenswrapper[19170]: I0313 01:22:39.943727 19170 patch_prober.go:28] interesting pod/console-68ccfc6c58-cjm5c container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused" start-of-body= Mar 13 01:22:39.944835 master-0 kubenswrapper[19170]: I0313 01:22:39.943818 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-68ccfc6c58-cjm5c" podUID="e6c9432a-bfa0-4725-8cd1-8cf2967535f5" containerName="console" probeResult="failure" output="Get \"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused" Mar 13 01:22:42.259991 master-0 kubenswrapper[19170]: I0313 01:22:42.259873 19170 patch_prober.go:28] interesting pod/console-575758dfc4-r6mb4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 13 01:22:42.259991 master-0 kubenswrapper[19170]: I0313 01:22:42.259950 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-575758dfc4-r6mb4" podUID="78354ba8-21d5-4774-aa7f-8c72fee1995d" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 
10.128.0.91:8443: connect: connection refused" Mar 13 01:22:49.944420 master-0 kubenswrapper[19170]: I0313 01:22:49.944369 19170 patch_prober.go:28] interesting pod/console-68ccfc6c58-cjm5c container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused" start-of-body= Mar 13 01:22:49.945550 master-0 kubenswrapper[19170]: I0313 01:22:49.945501 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-68ccfc6c58-cjm5c" podUID="e6c9432a-bfa0-4725-8cd1-8cf2967535f5" containerName="console" probeResult="failure" output="Get \"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused" Mar 13 01:22:52.260962 master-0 kubenswrapper[19170]: I0313 01:22:52.260865 19170 patch_prober.go:28] interesting pod/console-575758dfc4-r6mb4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 13 01:22:52.261924 master-0 kubenswrapper[19170]: I0313 01:22:52.260960 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-575758dfc4-r6mb4" podUID="78354ba8-21d5-4774-aa7f-8c72fee1995d" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 13 01:22:59.944476 master-0 kubenswrapper[19170]: I0313 01:22:59.944364 19170 patch_prober.go:28] interesting pod/console-68ccfc6c58-cjm5c container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused" start-of-body= Mar 13 01:22:59.944476 master-0 kubenswrapper[19170]: I0313 01:22:59.944451 19170 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-console/console-68ccfc6c58-cjm5c" podUID="e6c9432a-bfa0-4725-8cd1-8cf2967535f5" containerName="console" probeResult="failure" output="Get \"https://10.128.0.90:8443/health\": dial tcp 10.128.0.90:8443: connect: connection refused" Mar 13 01:23:01.101893 master-0 kubenswrapper[19170]: I0313 01:23:01.101835 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68ccfc6c58-cjm5c"] Mar 13 01:23:01.207328 master-0 kubenswrapper[19170]: I0313 01:23:01.207260 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-864f84b8db-z7bgh"] Mar 13 01:23:01.207551 master-0 kubenswrapper[19170]: E0313 01:23:01.207536 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="899242a15b2bdf3b4a04fb323647ca94" containerName="startup-monitor" Mar 13 01:23:01.207551 master-0 kubenswrapper[19170]: I0313 01:23:01.207549 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="899242a15b2bdf3b4a04fb323647ca94" containerName="startup-monitor" Mar 13 01:23:01.207724 master-0 kubenswrapper[19170]: I0313 01:23:01.207698 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="899242a15b2bdf3b4a04fb323647ca94" containerName="startup-monitor" Mar 13 01:23:01.208309 master-0 kubenswrapper[19170]: I0313 01:23:01.208243 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-864f84b8db-z7bgh" Mar 13 01:23:01.222514 master-0 kubenswrapper[19170]: I0313 01:23:01.222478 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-864f84b8db-z7bgh"] Mar 13 01:23:01.397682 master-0 kubenswrapper[19170]: I0313 01:23:01.397468 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdc3c693-5b70-44ef-b53d-7a546edd268c-trusted-ca-bundle\") pod \"console-864f84b8db-z7bgh\" (UID: \"cdc3c693-5b70-44ef-b53d-7a546edd268c\") " pod="openshift-console/console-864f84b8db-z7bgh" Mar 13 01:23:01.397939 master-0 kubenswrapper[19170]: I0313 01:23:01.397677 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cdc3c693-5b70-44ef-b53d-7a546edd268c-oauth-serving-cert\") pod \"console-864f84b8db-z7bgh\" (UID: \"cdc3c693-5b70-44ef-b53d-7a546edd268c\") " pod="openshift-console/console-864f84b8db-z7bgh" Mar 13 01:23:01.397939 master-0 kubenswrapper[19170]: I0313 01:23:01.397895 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cdc3c693-5b70-44ef-b53d-7a546edd268c-service-ca\") pod \"console-864f84b8db-z7bgh\" (UID: \"cdc3c693-5b70-44ef-b53d-7a546edd268c\") " pod="openshift-console/console-864f84b8db-z7bgh" Mar 13 01:23:01.398075 master-0 kubenswrapper[19170]: I0313 01:23:01.397970 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cdc3c693-5b70-44ef-b53d-7a546edd268c-console-serving-cert\") pod \"console-864f84b8db-z7bgh\" (UID: \"cdc3c693-5b70-44ef-b53d-7a546edd268c\") " pod="openshift-console/console-864f84b8db-z7bgh" Mar 13 01:23:01.398075 master-0 
kubenswrapper[19170]: I0313 01:23:01.398038 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cdc3c693-5b70-44ef-b53d-7a546edd268c-console-config\") pod \"console-864f84b8db-z7bgh\" (UID: \"cdc3c693-5b70-44ef-b53d-7a546edd268c\") " pod="openshift-console/console-864f84b8db-z7bgh" Mar 13 01:23:01.398209 master-0 kubenswrapper[19170]: I0313 01:23:01.398150 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6mxx\" (UniqueName: \"kubernetes.io/projected/cdc3c693-5b70-44ef-b53d-7a546edd268c-kube-api-access-r6mxx\") pod \"console-864f84b8db-z7bgh\" (UID: \"cdc3c693-5b70-44ef-b53d-7a546edd268c\") " pod="openshift-console/console-864f84b8db-z7bgh" Mar 13 01:23:01.398209 master-0 kubenswrapper[19170]: I0313 01:23:01.398187 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cdc3c693-5b70-44ef-b53d-7a546edd268c-console-oauth-config\") pod \"console-864f84b8db-z7bgh\" (UID: \"cdc3c693-5b70-44ef-b53d-7a546edd268c\") " pod="openshift-console/console-864f84b8db-z7bgh" Mar 13 01:23:01.499058 master-0 kubenswrapper[19170]: I0313 01:23:01.498980 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6mxx\" (UniqueName: \"kubernetes.io/projected/cdc3c693-5b70-44ef-b53d-7a546edd268c-kube-api-access-r6mxx\") pod \"console-864f84b8db-z7bgh\" (UID: \"cdc3c693-5b70-44ef-b53d-7a546edd268c\") " pod="openshift-console/console-864f84b8db-z7bgh" Mar 13 01:23:01.499058 master-0 kubenswrapper[19170]: I0313 01:23:01.499061 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cdc3c693-5b70-44ef-b53d-7a546edd268c-console-oauth-config\") pod \"console-864f84b8db-z7bgh\" (UID: 
\"cdc3c693-5b70-44ef-b53d-7a546edd268c\") " pod="openshift-console/console-864f84b8db-z7bgh" Mar 13 01:23:01.499387 master-0 kubenswrapper[19170]: I0313 01:23:01.499161 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdc3c693-5b70-44ef-b53d-7a546edd268c-trusted-ca-bundle\") pod \"console-864f84b8db-z7bgh\" (UID: \"cdc3c693-5b70-44ef-b53d-7a546edd268c\") " pod="openshift-console/console-864f84b8db-z7bgh" Mar 13 01:23:01.499387 master-0 kubenswrapper[19170]: I0313 01:23:01.499202 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cdc3c693-5b70-44ef-b53d-7a546edd268c-oauth-serving-cert\") pod \"console-864f84b8db-z7bgh\" (UID: \"cdc3c693-5b70-44ef-b53d-7a546edd268c\") " pod="openshift-console/console-864f84b8db-z7bgh" Mar 13 01:23:01.499387 master-0 kubenswrapper[19170]: I0313 01:23:01.499275 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cdc3c693-5b70-44ef-b53d-7a546edd268c-service-ca\") pod \"console-864f84b8db-z7bgh\" (UID: \"cdc3c693-5b70-44ef-b53d-7a546edd268c\") " pod="openshift-console/console-864f84b8db-z7bgh" Mar 13 01:23:01.499620 master-0 kubenswrapper[19170]: I0313 01:23:01.499538 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cdc3c693-5b70-44ef-b53d-7a546edd268c-console-serving-cert\") pod \"console-864f84b8db-z7bgh\" (UID: \"cdc3c693-5b70-44ef-b53d-7a546edd268c\") " pod="openshift-console/console-864f84b8db-z7bgh" Mar 13 01:23:01.499844 master-0 kubenswrapper[19170]: I0313 01:23:01.499759 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cdc3c693-5b70-44ef-b53d-7a546edd268c-console-config\") pod 
\"console-864f84b8db-z7bgh\" (UID: \"cdc3c693-5b70-44ef-b53d-7a546edd268c\") " pod="openshift-console/console-864f84b8db-z7bgh" Mar 13 01:23:01.501334 master-0 kubenswrapper[19170]: I0313 01:23:01.501274 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cdc3c693-5b70-44ef-b53d-7a546edd268c-oauth-serving-cert\") pod \"console-864f84b8db-z7bgh\" (UID: \"cdc3c693-5b70-44ef-b53d-7a546edd268c\") " pod="openshift-console/console-864f84b8db-z7bgh" Mar 13 01:23:01.501621 master-0 kubenswrapper[19170]: I0313 01:23:01.501557 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cdc3c693-5b70-44ef-b53d-7a546edd268c-service-ca\") pod \"console-864f84b8db-z7bgh\" (UID: \"cdc3c693-5b70-44ef-b53d-7a546edd268c\") " pod="openshift-console/console-864f84b8db-z7bgh" Mar 13 01:23:01.501790 master-0 kubenswrapper[19170]: I0313 01:23:01.501739 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cdc3c693-5b70-44ef-b53d-7a546edd268c-console-config\") pod \"console-864f84b8db-z7bgh\" (UID: \"cdc3c693-5b70-44ef-b53d-7a546edd268c\") " pod="openshift-console/console-864f84b8db-z7bgh" Mar 13 01:23:01.502729 master-0 kubenswrapper[19170]: I0313 01:23:01.502676 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdc3c693-5b70-44ef-b53d-7a546edd268c-trusted-ca-bundle\") pod \"console-864f84b8db-z7bgh\" (UID: \"cdc3c693-5b70-44ef-b53d-7a546edd268c\") " pod="openshift-console/console-864f84b8db-z7bgh" Mar 13 01:23:01.505750 master-0 kubenswrapper[19170]: I0313 01:23:01.505685 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cdc3c693-5b70-44ef-b53d-7a546edd268c-console-oauth-config\") pod 
\"console-864f84b8db-z7bgh\" (UID: \"cdc3c693-5b70-44ef-b53d-7a546edd268c\") " pod="openshift-console/console-864f84b8db-z7bgh" Mar 13 01:23:01.508823 master-0 kubenswrapper[19170]: I0313 01:23:01.508765 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cdc3c693-5b70-44ef-b53d-7a546edd268c-console-serving-cert\") pod \"console-864f84b8db-z7bgh\" (UID: \"cdc3c693-5b70-44ef-b53d-7a546edd268c\") " pod="openshift-console/console-864f84b8db-z7bgh" Mar 13 01:23:01.534305 master-0 kubenswrapper[19170]: I0313 01:23:01.534260 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6mxx\" (UniqueName: \"kubernetes.io/projected/cdc3c693-5b70-44ef-b53d-7a546edd268c-kube-api-access-r6mxx\") pod \"console-864f84b8db-z7bgh\" (UID: \"cdc3c693-5b70-44ef-b53d-7a546edd268c\") " pod="openshift-console/console-864f84b8db-z7bgh" Mar 13 01:23:01.827985 master-0 kubenswrapper[19170]: I0313 01:23:01.827905 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-864f84b8db-z7bgh"
Mar 13 01:23:02.261003 master-0 kubenswrapper[19170]: I0313 01:23:02.260903 19170 patch_prober.go:28] interesting pod/console-575758dfc4-r6mb4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body=
Mar 13 01:23:02.261562 master-0 kubenswrapper[19170]: I0313 01:23:02.261056 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-575758dfc4-r6mb4" podUID="78354ba8-21d5-4774-aa7f-8c72fee1995d" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused"
Mar 13 01:23:02.335869 master-0 kubenswrapper[19170]: I0313 01:23:02.335818 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-864f84b8db-z7bgh"]
Mar 13 01:23:02.337360 master-0 kubenswrapper[19170]: W0313 01:23:02.337293 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdc3c693_5b70_44ef_b53d_7a546edd268c.slice/crio-c02664bd4b8cec5a0f55c19a86a682b2128142272590fc43e989472c25afa98d WatchSource:0}: Error finding container c02664bd4b8cec5a0f55c19a86a682b2128142272590fc43e989472c25afa98d: Status 404 returned error can't find the container with id c02664bd4b8cec5a0f55c19a86a682b2128142272590fc43e989472c25afa98d
Mar 13 01:23:03.085997 master-0 kubenswrapper[19170]: I0313 01:23:03.085932 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-864f84b8db-z7bgh" event={"ID":"cdc3c693-5b70-44ef-b53d-7a546edd268c","Type":"ContainerStarted","Data":"a9b418f438d2c9a6f0a7da4115cd11e67669dbf3f642959fccdba4b61048c8b2"}
Mar 13 01:23:03.085997 master-0 kubenswrapper[19170]: I0313 01:23:03.085993 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-864f84b8db-z7bgh" event={"ID":"cdc3c693-5b70-44ef-b53d-7a546edd268c","Type":"ContainerStarted","Data":"c02664bd4b8cec5a0f55c19a86a682b2128142272590fc43e989472c25afa98d"}
Mar 13 01:23:03.127699 master-0 kubenswrapper[19170]: I0313 01:23:03.127593 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-864f84b8db-z7bgh" podStartSLOduration=2.1275716940000002 podStartE2EDuration="2.127571694s" podCreationTimestamp="2026-03-13 01:23:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:23:03.120862135 +0000 UTC m=+243.928983105" watchObservedRunningTime="2026-03-13 01:23:03.127571694 +0000 UTC m=+243.935692664"
Mar 13 01:23:11.829181 master-0 kubenswrapper[19170]: I0313 01:23:11.829113 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-864f84b8db-z7bgh"
Mar 13 01:23:11.830238 master-0 kubenswrapper[19170]: I0313 01:23:11.829850 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-864f84b8db-z7bgh"
Mar 13 01:23:11.832216 master-0 kubenswrapper[19170]: I0313 01:23:11.832120 19170 patch_prober.go:28] interesting pod/console-864f84b8db-z7bgh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body=
Mar 13 01:23:11.832355 master-0 kubenswrapper[19170]: I0313 01:23:11.832265 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused"
Mar 13 01:23:12.260242 master-0 kubenswrapper[19170]: I0313 01:23:12.260149 19170 patch_prober.go:28] interesting pod/console-575758dfc4-r6mb4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body=
Mar 13 01:23:12.260550 master-0 kubenswrapper[19170]: I0313 01:23:12.260239 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-575758dfc4-r6mb4" podUID="78354ba8-21d5-4774-aa7f-8c72fee1995d" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused"
Mar 13 01:23:18.847969 master-0 kubenswrapper[19170]: I0313 01:23:18.847877 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/revision-pruner-7-master-0"]
Mar 13 01:23:18.850414 master-0 kubenswrapper[19170]: I0313 01:23:18.850194 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-7-master-0"
Mar 13 01:23:18.855393 master-0 kubenswrapper[19170]: I0313 01:23:18.855336 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt"
Mar 13 01:23:18.861912 master-0 kubenswrapper[19170]: I0313 01:23:18.861850 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-wvk2x"
Mar 13 01:23:18.862979 master-0 kubenswrapper[19170]: I0313 01:23:18.861848 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/revision-pruner-7-master-0"]
Mar 13 01:23:18.897856 master-0 kubenswrapper[19170]: I0313 01:23:18.897686 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b54e82b-e085-4722-ac15-75a08b46315b-kube-api-access\") pod \"revision-pruner-7-master-0\" (UID: \"3b54e82b-e085-4722-ac15-75a08b46315b\") " pod="openshift-kube-scheduler/revision-pruner-7-master-0"
Mar 13 01:23:18.898251 master-0 kubenswrapper[19170]: I0313 01:23:18.897885 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b54e82b-e085-4722-ac15-75a08b46315b-kubelet-dir\") pod \"revision-pruner-7-master-0\" (UID: \"3b54e82b-e085-4722-ac15-75a08b46315b\") " pod="openshift-kube-scheduler/revision-pruner-7-master-0"
Mar 13 01:23:18.999442 master-0 kubenswrapper[19170]: I0313 01:23:18.999340 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b54e82b-e085-4722-ac15-75a08b46315b-kubelet-dir\") pod \"revision-pruner-7-master-0\" (UID: \"3b54e82b-e085-4722-ac15-75a08b46315b\") " pod="openshift-kube-scheduler/revision-pruner-7-master-0"
Mar 13 01:23:18.999793 master-0 kubenswrapper[19170]: I0313 01:23:18.999528 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b54e82b-e085-4722-ac15-75a08b46315b-kube-api-access\") pod \"revision-pruner-7-master-0\" (UID: \"3b54e82b-e085-4722-ac15-75a08b46315b\") " pod="openshift-kube-scheduler/revision-pruner-7-master-0"
Mar 13 01:23:18.999793 master-0 kubenswrapper[19170]: I0313 01:23:18.999538 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b54e82b-e085-4722-ac15-75a08b46315b-kubelet-dir\") pod \"revision-pruner-7-master-0\" (UID: \"3b54e82b-e085-4722-ac15-75a08b46315b\") " pod="openshift-kube-scheduler/revision-pruner-7-master-0"
Mar 13 01:23:19.031419 master-0 kubenswrapper[19170]: I0313 01:23:19.031339 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b54e82b-e085-4722-ac15-75a08b46315b-kube-api-access\") pod \"revision-pruner-7-master-0\" (UID: \"3b54e82b-e085-4722-ac15-75a08b46315b\") " pod="openshift-kube-scheduler/revision-pruner-7-master-0"
Mar 13 01:23:19.200211 master-0 kubenswrapper[19170]: I0313 01:23:19.200126 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-7-master-0"
Mar 13 01:23:19.762982 master-0 kubenswrapper[19170]: I0313 01:23:19.762894 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/revision-pruner-7-master-0"]
Mar 13 01:23:20.240949 master-0 kubenswrapper[19170]: I0313 01:23:20.240867 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/revision-pruner-7-master-0" event={"ID":"3b54e82b-e085-4722-ac15-75a08b46315b","Type":"ContainerStarted","Data":"7da3ed281acef9fd3883c072cbea25da37fec5ab4dbaf02a25d2780eb7f96c09"}
Mar 13 01:23:20.643574 master-0 kubenswrapper[19170]: I0313 01:23:20.643214 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-7-master-0"]
Mar 13 01:23:20.644878 master-0 kubenswrapper[19170]: I0313 01:23:20.644822 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-7-master-0"
Mar 13 01:23:20.656663 master-0 kubenswrapper[19170]: I0313 01:23:20.656582 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-7-master-0"]
Mar 13 01:23:20.738575 master-0 kubenswrapper[19170]: I0313 01:23:20.738495 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7801d684-f91c-4f55-93e7-04a104759c08-kubelet-dir\") pod \"installer-7-master-0\" (UID: \"7801d684-f91c-4f55-93e7-04a104759c08\") " pod="openshift-kube-scheduler/installer-7-master-0"
Mar 13 01:23:20.740596 master-0 kubenswrapper[19170]: I0313 01:23:20.738614 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7801d684-f91c-4f55-93e7-04a104759c08-var-lock\") pod \"installer-7-master-0\" (UID: \"7801d684-f91c-4f55-93e7-04a104759c08\") " pod="openshift-kube-scheduler/installer-7-master-0"
Mar 13 01:23:20.740596 master-0 kubenswrapper[19170]: I0313 01:23:20.739740 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7801d684-f91c-4f55-93e7-04a104759c08-kube-api-access\") pod \"installer-7-master-0\" (UID: \"7801d684-f91c-4f55-93e7-04a104759c08\") " pod="openshift-kube-scheduler/installer-7-master-0"
Mar 13 01:23:20.840903 master-0 kubenswrapper[19170]: I0313 01:23:20.840809 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7801d684-f91c-4f55-93e7-04a104759c08-kube-api-access\") pod \"installer-7-master-0\" (UID: \"7801d684-f91c-4f55-93e7-04a104759c08\") " pod="openshift-kube-scheduler/installer-7-master-0"
Mar 13 01:23:20.841233 master-0 kubenswrapper[19170]: I0313 01:23:20.840983 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7801d684-f91c-4f55-93e7-04a104759c08-kubelet-dir\") pod \"installer-7-master-0\" (UID: \"7801d684-f91c-4f55-93e7-04a104759c08\") " pod="openshift-kube-scheduler/installer-7-master-0"
Mar 13 01:23:20.841233 master-0 kubenswrapper[19170]: I0313 01:23:20.841063 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7801d684-f91c-4f55-93e7-04a104759c08-var-lock\") pod \"installer-7-master-0\" (UID: \"7801d684-f91c-4f55-93e7-04a104759c08\") " pod="openshift-kube-scheduler/installer-7-master-0"
Mar 13 01:23:20.841233 master-0 kubenswrapper[19170]: I0313 01:23:20.841205 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7801d684-f91c-4f55-93e7-04a104759c08-var-lock\") pod \"installer-7-master-0\" (UID: \"7801d684-f91c-4f55-93e7-04a104759c08\") " pod="openshift-kube-scheduler/installer-7-master-0"
Mar 13 01:23:20.841550 master-0 kubenswrapper[19170]: I0313 01:23:20.841261 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7801d684-f91c-4f55-93e7-04a104759c08-kubelet-dir\") pod \"installer-7-master-0\" (UID: \"7801d684-f91c-4f55-93e7-04a104759c08\") " pod="openshift-kube-scheduler/installer-7-master-0"
Mar 13 01:23:20.872739 master-0 kubenswrapper[19170]: I0313 01:23:20.872679 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7801d684-f91c-4f55-93e7-04a104759c08-kube-api-access\") pod \"installer-7-master-0\" (UID: \"7801d684-f91c-4f55-93e7-04a104759c08\") " pod="openshift-kube-scheduler/installer-7-master-0"
Mar 13 01:23:20.985116 master-0 kubenswrapper[19170]: I0313 01:23:20.985023 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-7-master-0"
Mar 13 01:23:21.257420 master-0 kubenswrapper[19170]: I0313 01:23:21.257214 19170 generic.go:334] "Generic (PLEG): container finished" podID="3b54e82b-e085-4722-ac15-75a08b46315b" containerID="7121dd3cf835dff2508d15e93d384e7a810a06c0eb4c31ea8ee185ed55b47c54" exitCode=0
Mar 13 01:23:21.257420 master-0 kubenswrapper[19170]: I0313 01:23:21.257266 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/revision-pruner-7-master-0" event={"ID":"3b54e82b-e085-4722-ac15-75a08b46315b","Type":"ContainerDied","Data":"7121dd3cf835dff2508d15e93d384e7a810a06c0eb4c31ea8ee185ed55b47c54"}
Mar 13 01:23:21.271097 master-0 kubenswrapper[19170]: I0313 01:23:21.271004 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-7-master-0"]
Mar 13 01:23:21.277419 master-0 kubenswrapper[19170]: W0313 01:23:21.277305 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7801d684_f91c_4f55_93e7_04a104759c08.slice/crio-16e7f0e9baa0723b89d4d6650dca26b5c8e25f54377ceaf3108474cd2c15cd31 WatchSource:0}: Error finding container 16e7f0e9baa0723b89d4d6650dca26b5c8e25f54377ceaf3108474cd2c15cd31: Status 404 returned error can't find the container with id 16e7f0e9baa0723b89d4d6650dca26b5c8e25f54377ceaf3108474cd2c15cd31
Mar 13 01:23:21.830044 master-0 kubenswrapper[19170]: I0313 01:23:21.829840 19170 patch_prober.go:28] interesting pod/console-864f84b8db-z7bgh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body=
Mar 13 01:23:21.830283 master-0 kubenswrapper[19170]: I0313 01:23:21.830038 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused"
Mar 13 01:23:22.260256 master-0 kubenswrapper[19170]: I0313 01:23:22.260192 19170 patch_prober.go:28] interesting pod/console-575758dfc4-r6mb4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body=
Mar 13 01:23:22.261037 master-0 kubenswrapper[19170]: I0313 01:23:22.260266 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-575758dfc4-r6mb4" podUID="78354ba8-21d5-4774-aa7f-8c72fee1995d" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused"
Mar 13 01:23:22.266968 master-0 kubenswrapper[19170]: I0313 01:23:22.266815 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-7-master-0" event={"ID":"7801d684-f91c-4f55-93e7-04a104759c08","Type":"ContainerStarted","Data":"71856a725dc2facf87c076cbf57fe15640bc593db7bac2e62004e3375cc92a3a"}
Mar 13 01:23:22.266968 master-0 kubenswrapper[19170]: I0313 01:23:22.266893 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-7-master-0" event={"ID":"7801d684-f91c-4f55-93e7-04a104759c08","Type":"ContainerStarted","Data":"16e7f0e9baa0723b89d4d6650dca26b5c8e25f54377ceaf3108474cd2c15cd31"}
Mar 13 01:23:22.299169 master-0 kubenswrapper[19170]: I0313 01:23:22.299034 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-7-master-0" podStartSLOduration=2.2990024350000002 podStartE2EDuration="2.299002435s" podCreationTimestamp="2026-03-13 01:23:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:23:22.295369483 +0000 UTC m=+263.103490483" watchObservedRunningTime="2026-03-13 01:23:22.299002435 +0000 UTC m=+263.107123425"
Mar 13 01:23:22.627389 master-0 kubenswrapper[19170]: I0313 01:23:22.627357 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-7-master-0"
Mar 13 01:23:22.666740 master-0 kubenswrapper[19170]: I0313 01:23:22.666608 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b54e82b-e085-4722-ac15-75a08b46315b-kube-api-access\") pod \"3b54e82b-e085-4722-ac15-75a08b46315b\" (UID: \"3b54e82b-e085-4722-ac15-75a08b46315b\") "
Mar 13 01:23:22.666987 master-0 kubenswrapper[19170]: I0313 01:23:22.666950 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b54e82b-e085-4722-ac15-75a08b46315b-kubelet-dir\") pod \"3b54e82b-e085-4722-ac15-75a08b46315b\" (UID: \"3b54e82b-e085-4722-ac15-75a08b46315b\") "
Mar 13 01:23:22.667123 master-0 kubenswrapper[19170]: I0313 01:23:22.667081 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b54e82b-e085-4722-ac15-75a08b46315b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3b54e82b-e085-4722-ac15-75a08b46315b" (UID: "3b54e82b-e085-4722-ac15-75a08b46315b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 01:23:22.667649 master-0 kubenswrapper[19170]: I0313 01:23:22.667589 19170 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b54e82b-e085-4722-ac15-75a08b46315b-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 13 01:23:22.669949 master-0 kubenswrapper[19170]: I0313 01:23:22.669918 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b54e82b-e085-4722-ac15-75a08b46315b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3b54e82b-e085-4722-ac15-75a08b46315b" (UID: "3b54e82b-e085-4722-ac15-75a08b46315b"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 01:23:22.768916 master-0 kubenswrapper[19170]: I0313 01:23:22.768844 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b54e82b-e085-4722-ac15-75a08b46315b-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 13 01:23:23.278083 master-0 kubenswrapper[19170]: I0313 01:23:23.278015 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-7-master-0"
Mar 13 01:23:23.278921 master-0 kubenswrapper[19170]: I0313 01:23:23.278144 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/revision-pruner-7-master-0" event={"ID":"3b54e82b-e085-4722-ac15-75a08b46315b","Type":"ContainerDied","Data":"7da3ed281acef9fd3883c072cbea25da37fec5ab4dbaf02a25d2780eb7f96c09"}
Mar 13 01:23:23.278921 master-0 kubenswrapper[19170]: I0313 01:23:23.278213 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7da3ed281acef9fd3883c072cbea25da37fec5ab4dbaf02a25d2780eb7f96c09"
Mar 13 01:23:26.145529 master-0 kubenswrapper[19170]: I0313 01:23:26.145366 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-68ccfc6c58-cjm5c" podUID="e6c9432a-bfa0-4725-8cd1-8cf2967535f5" containerName="console" containerID="cri-o://34345f6c94e362f61167d8ab04d3f70a5c2ba66641f5d70d19ac7218528fc827" gracePeriod=15
Mar 13 01:23:26.305465 master-0 kubenswrapper[19170]: I0313 01:23:26.305383 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68ccfc6c58-cjm5c_e6c9432a-bfa0-4725-8cd1-8cf2967535f5/console/0.log"
Mar 13 01:23:26.305465 master-0 kubenswrapper[19170]: I0313 01:23:26.305466 19170 generic.go:334] "Generic (PLEG): container finished" podID="e6c9432a-bfa0-4725-8cd1-8cf2967535f5" containerID="34345f6c94e362f61167d8ab04d3f70a5c2ba66641f5d70d19ac7218528fc827" exitCode=2
Mar 13 01:23:26.306124 master-0 kubenswrapper[19170]: I0313 01:23:26.305513 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68ccfc6c58-cjm5c" event={"ID":"e6c9432a-bfa0-4725-8cd1-8cf2967535f5","Type":"ContainerDied","Data":"34345f6c94e362f61167d8ab04d3f70a5c2ba66641f5d70d19ac7218528fc827"}
Mar 13 01:23:26.697527 master-0 kubenswrapper[19170]: I0313 01:23:26.697467 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68ccfc6c58-cjm5c_e6c9432a-bfa0-4725-8cd1-8cf2967535f5/console/0.log"
Mar 13 01:23:26.697829 master-0 kubenswrapper[19170]: I0313 01:23:26.697557 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68ccfc6c58-cjm5c"
Mar 13 01:23:26.737712 master-0 kubenswrapper[19170]: I0313 01:23:26.737593 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e6c9432a-bfa0-4725-8cd1-8cf2967535f5-oauth-serving-cert\") pod \"e6c9432a-bfa0-4725-8cd1-8cf2967535f5\" (UID: \"e6c9432a-bfa0-4725-8cd1-8cf2967535f5\") "
Mar 13 01:23:26.737712 master-0 kubenswrapper[19170]: I0313 01:23:26.737702 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e6c9432a-bfa0-4725-8cd1-8cf2967535f5-console-oauth-config\") pod \"e6c9432a-bfa0-4725-8cd1-8cf2967535f5\" (UID: \"e6c9432a-bfa0-4725-8cd1-8cf2967535f5\") "
Mar 13 01:23:26.738127 master-0 kubenswrapper[19170]: I0313 01:23:26.737762 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e6c9432a-bfa0-4725-8cd1-8cf2967535f5-service-ca\") pod \"e6c9432a-bfa0-4725-8cd1-8cf2967535f5\" (UID: \"e6c9432a-bfa0-4725-8cd1-8cf2967535f5\") "
Mar 13 01:23:26.738127 master-0 kubenswrapper[19170]: I0313 01:23:26.737811 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhrtt\" (UniqueName: \"kubernetes.io/projected/e6c9432a-bfa0-4725-8cd1-8cf2967535f5-kube-api-access-nhrtt\") pod \"e6c9432a-bfa0-4725-8cd1-8cf2967535f5\" (UID: \"e6c9432a-bfa0-4725-8cd1-8cf2967535f5\") "
Mar 13 01:23:26.738127 master-0 kubenswrapper[19170]: I0313 01:23:26.737880 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e6c9432a-bfa0-4725-8cd1-8cf2967535f5-console-serving-cert\") pod \"e6c9432a-bfa0-4725-8cd1-8cf2967535f5\" (UID: \"e6c9432a-bfa0-4725-8cd1-8cf2967535f5\") "
Mar 13 01:23:26.738127 master-0 kubenswrapper[19170]: I0313 01:23:26.738031 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e6c9432a-bfa0-4725-8cd1-8cf2967535f5-console-config\") pod \"e6c9432a-bfa0-4725-8cd1-8cf2967535f5\" (UID: \"e6c9432a-bfa0-4725-8cd1-8cf2967535f5\") "
Mar 13 01:23:26.741215 master-0 kubenswrapper[19170]: I0313 01:23:26.741064 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6c9432a-bfa0-4725-8cd1-8cf2967535f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "e6c9432a-bfa0-4725-8cd1-8cf2967535f5" (UID: "e6c9432a-bfa0-4725-8cd1-8cf2967535f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 01:23:26.744538 master-0 kubenswrapper[19170]: I0313 01:23:26.743711 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6c9432a-bfa0-4725-8cd1-8cf2967535f5-console-config" (OuterVolumeSpecName: "console-config") pod "e6c9432a-bfa0-4725-8cd1-8cf2967535f5" (UID: "e6c9432a-bfa0-4725-8cd1-8cf2967535f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 01:23:26.749170 master-0 kubenswrapper[19170]: I0313 01:23:26.749065 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6c9432a-bfa0-4725-8cd1-8cf2967535f5-kube-api-access-nhrtt" (OuterVolumeSpecName: "kube-api-access-nhrtt") pod "e6c9432a-bfa0-4725-8cd1-8cf2967535f5" (UID: "e6c9432a-bfa0-4725-8cd1-8cf2967535f5"). InnerVolumeSpecName "kube-api-access-nhrtt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 01:23:26.751098 master-0 kubenswrapper[19170]: I0313 01:23:26.751041 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6c9432a-bfa0-4725-8cd1-8cf2967535f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e6c9432a-bfa0-4725-8cd1-8cf2967535f5" (UID: "e6c9432a-bfa0-4725-8cd1-8cf2967535f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 01:23:26.752773 master-0 kubenswrapper[19170]: I0313 01:23:26.752711 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6c9432a-bfa0-4725-8cd1-8cf2967535f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e6c9432a-bfa0-4725-8cd1-8cf2967535f5" (UID: "e6c9432a-bfa0-4725-8cd1-8cf2967535f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 01:23:26.753096 master-0 kubenswrapper[19170]: I0313 01:23:26.752377 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6c9432a-bfa0-4725-8cd1-8cf2967535f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e6c9432a-bfa0-4725-8cd1-8cf2967535f5" (UID: "e6c9432a-bfa0-4725-8cd1-8cf2967535f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 01:23:26.854652 master-0 kubenswrapper[19170]: I0313 01:23:26.854570 19170 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e6c9432a-bfa0-4725-8cd1-8cf2967535f5-console-config\") on node \"master-0\" DevicePath \"\""
Mar 13 01:23:26.854848 master-0 kubenswrapper[19170]: I0313 01:23:26.854660 19170 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e6c9432a-bfa0-4725-8cd1-8cf2967535f5-oauth-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 13 01:23:26.854848 master-0 kubenswrapper[19170]: I0313 01:23:26.854685 19170 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e6c9432a-bfa0-4725-8cd1-8cf2967535f5-console-oauth-config\") on node \"master-0\" DevicePath \"\""
Mar 13 01:23:26.854848 master-0 kubenswrapper[19170]: I0313 01:23:26.854703 19170 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e6c9432a-bfa0-4725-8cd1-8cf2967535f5-service-ca\") on node \"master-0\" DevicePath \"\""
Mar 13 01:23:26.854848 master-0 kubenswrapper[19170]: I0313 01:23:26.854721 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhrtt\" (UniqueName: \"kubernetes.io/projected/e6c9432a-bfa0-4725-8cd1-8cf2967535f5-kube-api-access-nhrtt\") on node \"master-0\" DevicePath \"\""
Mar 13 01:23:26.854848 master-0 kubenswrapper[19170]: I0313 01:23:26.854738 19170 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e6c9432a-bfa0-4725-8cd1-8cf2967535f5-console-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 13 01:23:27.316906 master-0 kubenswrapper[19170]: I0313 01:23:27.316833 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68ccfc6c58-cjm5c_e6c9432a-bfa0-4725-8cd1-8cf2967535f5/console/0.log"
Mar 13 01:23:27.317891 master-0 kubenswrapper[19170]: I0313 01:23:27.316943 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68ccfc6c58-cjm5c" event={"ID":"e6c9432a-bfa0-4725-8cd1-8cf2967535f5","Type":"ContainerDied","Data":"ebcc7af8dcaf791a1a0e9aee557f3ea2b39086b684a51e6692020026e7866742"}
Mar 13 01:23:27.317891 master-0 kubenswrapper[19170]: I0313 01:23:27.317028 19170 scope.go:117] "RemoveContainer" containerID="34345f6c94e362f61167d8ab04d3f70a5c2ba66641f5d70d19ac7218528fc827"
Mar 13 01:23:27.317891 master-0 kubenswrapper[19170]: I0313 01:23:27.317057 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68ccfc6c58-cjm5c"
Mar 13 01:23:27.363708 master-0 kubenswrapper[19170]: I0313 01:23:27.363606 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68ccfc6c58-cjm5c"]
Mar 13 01:23:27.373575 master-0 kubenswrapper[19170]: I0313 01:23:27.373512 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-68ccfc6c58-cjm5c"]
Mar 13 01:23:27.428859 master-0 kubenswrapper[19170]: I0313 01:23:27.428789 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6c9432a-bfa0-4725-8cd1-8cf2967535f5" path="/var/lib/kubelet/pods/e6c9432a-bfa0-4725-8cd1-8cf2967535f5/volumes"
Mar 13 01:23:31.828893 master-0 kubenswrapper[19170]: I0313 01:23:31.828825 19170 patch_prober.go:28] interesting pod/console-864f84b8db-z7bgh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body=
Mar 13 01:23:31.830242 master-0 kubenswrapper[19170]: I0313 01:23:31.829960 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused"
Mar 13 01:23:32.260092 master-0 kubenswrapper[19170]: I0313 01:23:32.260027 19170 patch_prober.go:28] interesting pod/console-575758dfc4-r6mb4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body=
Mar 13 01:23:32.260092 master-0 kubenswrapper[19170]: I0313 01:23:32.260105 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-575758dfc4-r6mb4" podUID="78354ba8-21d5-4774-aa7f-8c72fee1995d" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused"
Mar 13 01:23:33.294729 master-0 kubenswrapper[19170]: I0313 01:23:33.294681 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Mar 13 01:23:33.295200 master-0 kubenswrapper[19170]: E0313 01:23:33.294920 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6c9432a-bfa0-4725-8cd1-8cf2967535f5" containerName="console"
Mar 13 01:23:33.295200 master-0 kubenswrapper[19170]: I0313 01:23:33.294933 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6c9432a-bfa0-4725-8cd1-8cf2967535f5" containerName="console"
Mar 13 01:23:33.295200 master-0 kubenswrapper[19170]: E0313 01:23:33.294947 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b54e82b-e085-4722-ac15-75a08b46315b" containerName="pruner"
Mar 13 01:23:33.295200 master-0 kubenswrapper[19170]: I0313 01:23:33.294953 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b54e82b-e085-4722-ac15-75a08b46315b" containerName="pruner"
Mar 13 01:23:33.295200 master-0 kubenswrapper[19170]: I0313 01:23:33.295072 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b54e82b-e085-4722-ac15-75a08b46315b" containerName="pruner"
Mar 13 01:23:33.295200 master-0 kubenswrapper[19170]: I0313 01:23:33.295103 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6c9432a-bfa0-4725-8cd1-8cf2967535f5" containerName="console"
Mar 13 01:23:33.296697 master-0 kubenswrapper[19170]: I0313 01:23:33.296676 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Mar 13 01:23:33.298680 master-0 kubenswrapper[19170]: I0313 01:23:33.298614 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric"
Mar 13 01:23:33.299725 master-0 kubenswrapper[19170]: I0313 01:23:33.299708 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0"
Mar 13 01:23:33.299879 master-0 kubenswrapper[19170]: I0313 01:23:33.299862 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config"
Mar 13 01:23:33.300014 master-0 kubenswrapper[19170]: I0313 01:23:33.299999 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web"
Mar 13 01:23:33.300208 master-0 kubenswrapper[19170]: I0313 01:23:33.300196 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls"
Mar 13 01:23:33.300355 master-0 kubenswrapper[19170]: I0313 01:23:33.300342 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated"
Mar 13 01:23:33.301668 master-0 kubenswrapper[19170]: I0313 01:23:33.301645 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy"
Mar 13 01:23:33.308509 master-0 kubenswrapper[19170]: I0313 01:23:33.308468 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle"
Mar 13 01:23:33.321056 master-0 kubenswrapper[19170]: I0313 01:23:33.321008 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Mar 13 01:23:33.354607 master-0 kubenswrapper[19170]: I0313 01:23:33.354539 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/587a3378-9d36-4151-8caa-959199396bf2-config-volume\") pod \"alertmanager-main-0\" (UID: \"587a3378-9d36-4151-8caa-959199396bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 01:23:33.354607 master-0 kubenswrapper[19170]: I0313 01:23:33.354590 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/587a3378-9d36-4151-8caa-959199396bf2-tls-assets\") pod \"alertmanager-main-0\" (UID: \"587a3378-9d36-4151-8caa-959199396bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 01:23:33.354974 master-0 kubenswrapper[19170]: I0313 01:23:33.354647 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/587a3378-9d36-4151-8caa-959199396bf2-web-config\") pod \"alertmanager-main-0\" (UID: \"587a3378-9d36-4151-8caa-959199396bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 01:23:33.354974 master-0 kubenswrapper[19170]: I0313 01:23:33.354674 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/587a3378-9d36-4151-8caa-959199396bf2-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"587a3378-9d36-4151-8caa-959199396bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 01:23:33.354974 master-0 kubenswrapper[19170]: I0313 01:23:33.354719 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/587a3378-9d36-4151-8caa-959199396bf2-config-out\") pod \"alertmanager-main-0\" (UID: \"587a3378-9d36-4151-8caa-959199396bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 01:23:33.354974 master-0 kubenswrapper[19170]: I0313 01:23:33.354740 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/587a3378-9d36-4151-8caa-959199396bf2-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"587a3378-9d36-4151-8caa-959199396bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 01:23:33.354974 master-0 kubenswrapper[19170]: I0313 01:23:33.354765 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/587a3378-9d36-4151-8caa-959199396bf2-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"587a3378-9d36-4151-8caa-959199396bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 01:23:33.354974 master-0 kubenswrapper[19170]: I0313 01:23:33.354805 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/587a3378-9d36-4151-8caa-959199396bf2-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"587a3378-9d36-4151-8caa-959199396bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 01:23:33.354974 master-0 kubenswrapper[19170]: I0313 01:23:33.354840 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/587a3378-9d36-4151-8caa-959199396bf2-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"587a3378-9d36-4151-8caa-959199396bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 01:23:33.354974 master-0 kubenswrapper[19170]: I0313 01:23:33.354861 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/587a3378-9d36-4151-8caa-959199396bf2-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"587a3378-9d36-4151-8caa-959199396bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 01:23:33.354974 master-0 kubenswrapper[19170]: I0313 01:23:33.354891 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/587a3378-9d36-4151-8caa-959199396bf2-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"587a3378-9d36-4151-8caa-959199396bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 01:23:33.354974 master-0 kubenswrapper[19170]: I0313 01:23:33.354926 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6wlk\" (UniqueName: \"kubernetes.io/projected/587a3378-9d36-4151-8caa-959199396bf2-kube-api-access-w6wlk\") pod \"alertmanager-main-0\" (UID: \"587a3378-9d36-4151-8caa-959199396bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 01:23:33.456039 master-0 kubenswrapper[19170]: I0313 01:23:33.455988 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/587a3378-9d36-4151-8caa-959199396bf2-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"587a3378-9d36-4151-8caa-959199396bf2\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 13 01:23:33.456039 master-0
kubenswrapper[19170]: I0313 01:23:33.456035 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/587a3378-9d36-4151-8caa-959199396bf2-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"587a3378-9d36-4151-8caa-959199396bf2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 01:23:33.456260 master-0 kubenswrapper[19170]: I0313 01:23:33.456061 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/587a3378-9d36-4151-8caa-959199396bf2-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"587a3378-9d36-4151-8caa-959199396bf2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 01:23:33.456355 master-0 kubenswrapper[19170]: I0313 01:23:33.456335 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6wlk\" (UniqueName: \"kubernetes.io/projected/587a3378-9d36-4151-8caa-959199396bf2-kube-api-access-w6wlk\") pod \"alertmanager-main-0\" (UID: \"587a3378-9d36-4151-8caa-959199396bf2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 01:23:33.456394 master-0 kubenswrapper[19170]: I0313 01:23:33.456364 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/587a3378-9d36-4151-8caa-959199396bf2-config-volume\") pod \"alertmanager-main-0\" (UID: \"587a3378-9d36-4151-8caa-959199396bf2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 01:23:33.456394 master-0 kubenswrapper[19170]: I0313 01:23:33.456380 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/587a3378-9d36-4151-8caa-959199396bf2-tls-assets\") pod \"alertmanager-main-0\" (UID: \"587a3378-9d36-4151-8caa-959199396bf2\") " 
pod="openshift-monitoring/alertmanager-main-0" Mar 13 01:23:33.456449 master-0 kubenswrapper[19170]: I0313 01:23:33.456405 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/587a3378-9d36-4151-8caa-959199396bf2-web-config\") pod \"alertmanager-main-0\" (UID: \"587a3378-9d36-4151-8caa-959199396bf2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 01:23:33.456449 master-0 kubenswrapper[19170]: I0313 01:23:33.456422 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/587a3378-9d36-4151-8caa-959199396bf2-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"587a3378-9d36-4151-8caa-959199396bf2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 01:23:33.456513 master-0 kubenswrapper[19170]: I0313 01:23:33.456454 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/587a3378-9d36-4151-8caa-959199396bf2-config-out\") pod \"alertmanager-main-0\" (UID: \"587a3378-9d36-4151-8caa-959199396bf2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 01:23:33.456626 master-0 kubenswrapper[19170]: I0313 01:23:33.456589 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/587a3378-9d36-4151-8caa-959199396bf2-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"587a3378-9d36-4151-8caa-959199396bf2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 01:23:33.456682 master-0 kubenswrapper[19170]: I0313 01:23:33.456645 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/587a3378-9d36-4151-8caa-959199396bf2-alertmanager-trusted-ca-bundle\") pod 
\"alertmanager-main-0\" (UID: \"587a3378-9d36-4151-8caa-959199396bf2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 01:23:33.457283 master-0 kubenswrapper[19170]: I0313 01:23:33.457072 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/587a3378-9d36-4151-8caa-959199396bf2-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"587a3378-9d36-4151-8caa-959199396bf2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 01:23:33.457283 master-0 kubenswrapper[19170]: I0313 01:23:33.457136 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/587a3378-9d36-4151-8caa-959199396bf2-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"587a3378-9d36-4151-8caa-959199396bf2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 01:23:33.457383 master-0 kubenswrapper[19170]: I0313 01:23:33.457328 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/587a3378-9d36-4151-8caa-959199396bf2-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"587a3378-9d36-4151-8caa-959199396bf2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 01:23:33.458219 master-0 kubenswrapper[19170]: I0313 01:23:33.458197 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/587a3378-9d36-4151-8caa-959199396bf2-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"587a3378-9d36-4151-8caa-959199396bf2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 01:23:33.459908 master-0 kubenswrapper[19170]: I0313 01:23:33.459468 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/587a3378-9d36-4151-8caa-959199396bf2-web-config\") pod \"alertmanager-main-0\" (UID: \"587a3378-9d36-4151-8caa-959199396bf2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 01:23:33.459908 master-0 kubenswrapper[19170]: I0313 01:23:33.459578 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/587a3378-9d36-4151-8caa-959199396bf2-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"587a3378-9d36-4151-8caa-959199396bf2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 01:23:33.459908 master-0 kubenswrapper[19170]: I0313 01:23:33.459859 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/587a3378-9d36-4151-8caa-959199396bf2-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"587a3378-9d36-4151-8caa-959199396bf2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 01:23:33.460424 master-0 kubenswrapper[19170]: I0313 01:23:33.460390 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/587a3378-9d36-4151-8caa-959199396bf2-config-volume\") pod \"alertmanager-main-0\" (UID: \"587a3378-9d36-4151-8caa-959199396bf2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 01:23:33.460603 master-0 kubenswrapper[19170]: I0313 01:23:33.460577 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/587a3378-9d36-4151-8caa-959199396bf2-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"587a3378-9d36-4151-8caa-959199396bf2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 01:23:33.461143 master-0 kubenswrapper[19170]: I0313 01:23:33.461116 19170 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/587a3378-9d36-4151-8caa-959199396bf2-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"587a3378-9d36-4151-8caa-959199396bf2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 01:23:33.461210 master-0 kubenswrapper[19170]: I0313 01:23:33.461117 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/587a3378-9d36-4151-8caa-959199396bf2-tls-assets\") pod \"alertmanager-main-0\" (UID: \"587a3378-9d36-4151-8caa-959199396bf2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 01:23:33.461445 master-0 kubenswrapper[19170]: I0313 01:23:33.461414 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/587a3378-9d36-4151-8caa-959199396bf2-config-out\") pod \"alertmanager-main-0\" (UID: \"587a3378-9d36-4151-8caa-959199396bf2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 01:23:33.480708 master-0 kubenswrapper[19170]: I0313 01:23:33.480677 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6wlk\" (UniqueName: \"kubernetes.io/projected/587a3378-9d36-4151-8caa-959199396bf2-kube-api-access-w6wlk\") pod \"alertmanager-main-0\" (UID: \"587a3378-9d36-4151-8caa-959199396bf2\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 01:23:33.615019 master-0 kubenswrapper[19170]: I0313 01:23:33.614916 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 13 01:23:34.374229 master-0 kubenswrapper[19170]: I0313 01:23:34.374166 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-7f6d58f575-sz96g"] Mar 13 01:23:34.376075 master-0 kubenswrapper[19170]: I0313 01:23:34.376044 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7f6d58f575-sz96g" Mar 13 01:23:34.379570 master-0 kubenswrapper[19170]: I0313 01:23:34.379336 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Mar 13 01:23:34.379570 master-0 kubenswrapper[19170]: I0313 01:23:34.379356 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Mar 13 01:23:34.379570 master-0 kubenswrapper[19170]: I0313 01:23:34.379425 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-3ahq1q95btnqo" Mar 13 01:23:34.379570 master-0 kubenswrapper[19170]: I0313 01:23:34.379354 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Mar 13 01:23:34.379802 master-0 kubenswrapper[19170]: I0313 01:23:34.379610 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Mar 13 01:23:34.379802 master-0 kubenswrapper[19170]: I0313 01:23:34.379744 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Mar 13 01:23:34.397501 master-0 kubenswrapper[19170]: I0313 01:23:34.397466 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7f6d58f575-sz96g"] Mar 13 01:23:34.571036 master-0 kubenswrapper[19170]: I0313 01:23:34.570979 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/62c2569e-0491-45c4-b34a-84d9f8b971e9-secret-grpc-tls\") pod \"thanos-querier-7f6d58f575-sz96g\" (UID: \"62c2569e-0491-45c4-b34a-84d9f8b971e9\") " pod="openshift-monitoring/thanos-querier-7f6d58f575-sz96g" Mar 13 01:23:34.571572 master-0 kubenswrapper[19170]: I0313 
01:23:34.571513 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/62c2569e-0491-45c4-b34a-84d9f8b971e9-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7f6d58f575-sz96g\" (UID: \"62c2569e-0491-45c4-b34a-84d9f8b971e9\") " pod="openshift-monitoring/thanos-querier-7f6d58f575-sz96g" Mar 13 01:23:34.571651 master-0 kubenswrapper[19170]: I0313 01:23:34.571613 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/62c2569e-0491-45c4-b34a-84d9f8b971e9-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7f6d58f575-sz96g\" (UID: \"62c2569e-0491-45c4-b34a-84d9f8b971e9\") " pod="openshift-monitoring/thanos-querier-7f6d58f575-sz96g" Mar 13 01:23:34.571795 master-0 kubenswrapper[19170]: I0313 01:23:34.571762 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/62c2569e-0491-45c4-b34a-84d9f8b971e9-metrics-client-ca\") pod \"thanos-querier-7f6d58f575-sz96g\" (UID: \"62c2569e-0491-45c4-b34a-84d9f8b971e9\") " pod="openshift-monitoring/thanos-querier-7f6d58f575-sz96g" Mar 13 01:23:34.571865 master-0 kubenswrapper[19170]: I0313 01:23:34.571809 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xfjb\" (UniqueName: \"kubernetes.io/projected/62c2569e-0491-45c4-b34a-84d9f8b971e9-kube-api-access-2xfjb\") pod \"thanos-querier-7f6d58f575-sz96g\" (UID: \"62c2569e-0491-45c4-b34a-84d9f8b971e9\") " pod="openshift-monitoring/thanos-querier-7f6d58f575-sz96g" Mar 13 01:23:34.571865 master-0 kubenswrapper[19170]: I0313 01:23:34.571840 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/62c2569e-0491-45c4-b34a-84d9f8b971e9-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7f6d58f575-sz96g\" (UID: \"62c2569e-0491-45c4-b34a-84d9f8b971e9\") " pod="openshift-monitoring/thanos-querier-7f6d58f575-sz96g" Mar 13 01:23:34.571952 master-0 kubenswrapper[19170]: I0313 01:23:34.571901 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/62c2569e-0491-45c4-b34a-84d9f8b971e9-secret-thanos-querier-tls\") pod \"thanos-querier-7f6d58f575-sz96g\" (UID: \"62c2569e-0491-45c4-b34a-84d9f8b971e9\") " pod="openshift-monitoring/thanos-querier-7f6d58f575-sz96g" Mar 13 01:23:34.572383 master-0 kubenswrapper[19170]: I0313 01:23:34.572249 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/62c2569e-0491-45c4-b34a-84d9f8b971e9-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7f6d58f575-sz96g\" (UID: \"62c2569e-0491-45c4-b34a-84d9f8b971e9\") " pod="openshift-monitoring/thanos-querier-7f6d58f575-sz96g" Mar 13 01:23:34.626245 master-0 kubenswrapper[19170]: I0313 01:23:34.626121 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 13 01:23:34.640090 master-0 kubenswrapper[19170]: W0313 01:23:34.640035 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod587a3378_9d36_4151_8caa_959199396bf2.slice/crio-7a97b2c552b429d166f04e6704c06067ad642e02c7e676908e2b291ddd39182e WatchSource:0}: Error finding container 7a97b2c552b429d166f04e6704c06067ad642e02c7e676908e2b291ddd39182e: Status 404 returned error can't find the container with id 7a97b2c552b429d166f04e6704c06067ad642e02c7e676908e2b291ddd39182e Mar 
13 01:23:34.673988 master-0 kubenswrapper[19170]: I0313 01:23:34.673916 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/62c2569e-0491-45c4-b34a-84d9f8b971e9-secret-thanos-querier-tls\") pod \"thanos-querier-7f6d58f575-sz96g\" (UID: \"62c2569e-0491-45c4-b34a-84d9f8b971e9\") " pod="openshift-monitoring/thanos-querier-7f6d58f575-sz96g" Mar 13 01:23:34.674169 master-0 kubenswrapper[19170]: I0313 01:23:34.674030 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/62c2569e-0491-45c4-b34a-84d9f8b971e9-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7f6d58f575-sz96g\" (UID: \"62c2569e-0491-45c4-b34a-84d9f8b971e9\") " pod="openshift-monitoring/thanos-querier-7f6d58f575-sz96g" Mar 13 01:23:34.674169 master-0 kubenswrapper[19170]: I0313 01:23:34.674115 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/62c2569e-0491-45c4-b34a-84d9f8b971e9-secret-grpc-tls\") pod \"thanos-querier-7f6d58f575-sz96g\" (UID: \"62c2569e-0491-45c4-b34a-84d9f8b971e9\") " pod="openshift-monitoring/thanos-querier-7f6d58f575-sz96g" Mar 13 01:23:34.674255 master-0 kubenswrapper[19170]: I0313 01:23:34.674186 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/62c2569e-0491-45c4-b34a-84d9f8b971e9-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7f6d58f575-sz96g\" (UID: \"62c2569e-0491-45c4-b34a-84d9f8b971e9\") " pod="openshift-monitoring/thanos-querier-7f6d58f575-sz96g" Mar 13 01:23:34.674255 master-0 kubenswrapper[19170]: I0313 01:23:34.674226 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/62c2569e-0491-45c4-b34a-84d9f8b971e9-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7f6d58f575-sz96g\" (UID: \"62c2569e-0491-45c4-b34a-84d9f8b971e9\") " pod="openshift-monitoring/thanos-querier-7f6d58f575-sz96g" Mar 13 01:23:34.674322 master-0 kubenswrapper[19170]: I0313 01:23:34.674287 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/62c2569e-0491-45c4-b34a-84d9f8b971e9-metrics-client-ca\") pod \"thanos-querier-7f6d58f575-sz96g\" (UID: \"62c2569e-0491-45c4-b34a-84d9f8b971e9\") " pod="openshift-monitoring/thanos-querier-7f6d58f575-sz96g" Mar 13 01:23:34.674355 master-0 kubenswrapper[19170]: I0313 01:23:34.674330 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xfjb\" (UniqueName: \"kubernetes.io/projected/62c2569e-0491-45c4-b34a-84d9f8b971e9-kube-api-access-2xfjb\") pod \"thanos-querier-7f6d58f575-sz96g\" (UID: \"62c2569e-0491-45c4-b34a-84d9f8b971e9\") " pod="openshift-monitoring/thanos-querier-7f6d58f575-sz96g" Mar 13 01:23:34.674388 master-0 kubenswrapper[19170]: I0313 01:23:34.674366 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/62c2569e-0491-45c4-b34a-84d9f8b971e9-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7f6d58f575-sz96g\" (UID: \"62c2569e-0491-45c4-b34a-84d9f8b971e9\") " pod="openshift-monitoring/thanos-querier-7f6d58f575-sz96g" Mar 13 01:23:34.676339 master-0 kubenswrapper[19170]: I0313 01:23:34.676303 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/62c2569e-0491-45c4-b34a-84d9f8b971e9-metrics-client-ca\") pod \"thanos-querier-7f6d58f575-sz96g\" (UID: \"62c2569e-0491-45c4-b34a-84d9f8b971e9\") " 
pod="openshift-monitoring/thanos-querier-7f6d58f575-sz96g" Mar 13 01:23:34.678184 master-0 kubenswrapper[19170]: I0313 01:23:34.678153 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/62c2569e-0491-45c4-b34a-84d9f8b971e9-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7f6d58f575-sz96g\" (UID: \"62c2569e-0491-45c4-b34a-84d9f8b971e9\") " pod="openshift-monitoring/thanos-querier-7f6d58f575-sz96g" Mar 13 01:23:34.679460 master-0 kubenswrapper[19170]: I0313 01:23:34.679389 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/62c2569e-0491-45c4-b34a-84d9f8b971e9-secret-grpc-tls\") pod \"thanos-querier-7f6d58f575-sz96g\" (UID: \"62c2569e-0491-45c4-b34a-84d9f8b971e9\") " pod="openshift-monitoring/thanos-querier-7f6d58f575-sz96g" Mar 13 01:23:34.679868 master-0 kubenswrapper[19170]: I0313 01:23:34.679819 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/62c2569e-0491-45c4-b34a-84d9f8b971e9-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7f6d58f575-sz96g\" (UID: \"62c2569e-0491-45c4-b34a-84d9f8b971e9\") " pod="openshift-monitoring/thanos-querier-7f6d58f575-sz96g" Mar 13 01:23:34.680923 master-0 kubenswrapper[19170]: I0313 01:23:34.680520 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/62c2569e-0491-45c4-b34a-84d9f8b971e9-secret-thanos-querier-tls\") pod \"thanos-querier-7f6d58f575-sz96g\" (UID: \"62c2569e-0491-45c4-b34a-84d9f8b971e9\") " pod="openshift-monitoring/thanos-querier-7f6d58f575-sz96g" Mar 13 01:23:34.681686 master-0 kubenswrapper[19170]: I0313 01:23:34.681661 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/62c2569e-0491-45c4-b34a-84d9f8b971e9-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7f6d58f575-sz96g\" (UID: \"62c2569e-0491-45c4-b34a-84d9f8b971e9\") " pod="openshift-monitoring/thanos-querier-7f6d58f575-sz96g" Mar 13 01:23:34.682988 master-0 kubenswrapper[19170]: I0313 01:23:34.682934 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/62c2569e-0491-45c4-b34a-84d9f8b971e9-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7f6d58f575-sz96g\" (UID: \"62c2569e-0491-45c4-b34a-84d9f8b971e9\") " pod="openshift-monitoring/thanos-querier-7f6d58f575-sz96g" Mar 13 01:23:34.703258 master-0 kubenswrapper[19170]: I0313 01:23:34.703199 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xfjb\" (UniqueName: \"kubernetes.io/projected/62c2569e-0491-45c4-b34a-84d9f8b971e9-kube-api-access-2xfjb\") pod \"thanos-querier-7f6d58f575-sz96g\" (UID: \"62c2569e-0491-45c4-b34a-84d9f8b971e9\") " pod="openshift-monitoring/thanos-querier-7f6d58f575-sz96g" Mar 13 01:23:35.001272 master-0 kubenswrapper[19170]: I0313 01:23:35.001189 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7f6d58f575-sz96g" Mar 13 01:23:35.448946 master-0 kubenswrapper[19170]: I0313 01:23:35.448891 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"587a3378-9d36-4151-8caa-959199396bf2","Type":"ContainerStarted","Data":"7a97b2c552b429d166f04e6704c06067ad642e02c7e676908e2b291ddd39182e"} Mar 13 01:23:36.278705 master-0 kubenswrapper[19170]: I0313 01:23:36.278154 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7f6d58f575-sz96g"] Mar 13 01:23:36.457351 master-0 kubenswrapper[19170]: I0313 01:23:36.457284 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f6d58f575-sz96g" event={"ID":"62c2569e-0491-45c4-b34a-84d9f8b971e9","Type":"ContainerStarted","Data":"d7001cb342e20622eff40af73cf64cd083d350fc37e31edee7a1baac62de3fe6"} Mar 13 01:23:37.204557 master-0 kubenswrapper[19170]: I0313 01:23:37.204526 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-56874ddc8c-r9wp8"] Mar 13 01:23:37.205876 master-0 kubenswrapper[19170]: I0313 01:23:37.205861 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-56874ddc8c-r9wp8" Mar 13 01:23:37.210453 master-0 kubenswrapper[19170]: I0313 01:23:37.209194 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-kube-rbac-proxy-config" Mar 13 01:23:37.210528 master-0 kubenswrapper[19170]: I0313 01:23:37.210508 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-dockercfg-ntqww" Mar 13 01:23:37.210687 master-0 kubenswrapper[19170]: I0313 01:23:37.210669 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-tls" Mar 13 01:23:37.210770 master-0 kubenswrapper[19170]: I0313 01:23:37.210697 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-client-serving-certs-ca-bundle" Mar 13 01:23:37.210900 master-0 kubenswrapper[19170]: I0313 01:23:37.210860 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"federate-client-certs" Mar 13 01:23:37.214349 master-0 kubenswrapper[19170]: I0313 01:23:37.214302 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client" Mar 13 01:23:37.218215 master-0 kubenswrapper[19170]: I0313 01:23:37.218197 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-trusted-ca-bundle-8i12ta5c71j38" Mar 13 01:23:37.255657 master-0 kubenswrapper[19170]: I0313 01:23:37.255444 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-56874ddc8c-r9wp8"] Mar 13 01:23:37.339345 master-0 kubenswrapper[19170]: I0313 01:23:37.339264 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c108cdf5-97c6-472f-b52e-b2cae013d0e8-telemeter-trusted-ca-bundle\") pod \"telemeter-client-56874ddc8c-r9wp8\" (UID: \"c108cdf5-97c6-472f-b52e-b2cae013d0e8\") " pod="openshift-monitoring/telemeter-client-56874ddc8c-r9wp8" Mar 13 01:23:37.341699 master-0 kubenswrapper[19170]: I0313 01:23:37.339442 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/c108cdf5-97c6-472f-b52e-b2cae013d0e8-telemeter-client-tls\") pod \"telemeter-client-56874ddc8c-r9wp8\" (UID: \"c108cdf5-97c6-472f-b52e-b2cae013d0e8\") " pod="openshift-monitoring/telemeter-client-56874ddc8c-r9wp8" Mar 13 01:23:37.341786 master-0 kubenswrapper[19170]: I0313 01:23:37.341767 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c108cdf5-97c6-472f-b52e-b2cae013d0e8-serving-certs-ca-bundle\") pod \"telemeter-client-56874ddc8c-r9wp8\" (UID: \"c108cdf5-97c6-472f-b52e-b2cae013d0e8\") " pod="openshift-monitoring/telemeter-client-56874ddc8c-r9wp8" Mar 13 01:23:37.341852 master-0 kubenswrapper[19170]: I0313 01:23:37.341823 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c108cdf5-97c6-472f-b52e-b2cae013d0e8-metrics-client-ca\") pod \"telemeter-client-56874ddc8c-r9wp8\" (UID: \"c108cdf5-97c6-472f-b52e-b2cae013d0e8\") " pod="openshift-monitoring/telemeter-client-56874ddc8c-r9wp8" Mar 13 01:23:37.342052 master-0 kubenswrapper[19170]: I0313 01:23:37.342017 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/c108cdf5-97c6-472f-b52e-b2cae013d0e8-secret-telemeter-client\") pod \"telemeter-client-56874ddc8c-r9wp8\" (UID: 
\"c108cdf5-97c6-472f-b52e-b2cae013d0e8\") " pod="openshift-monitoring/telemeter-client-56874ddc8c-r9wp8" Mar 13 01:23:37.342117 master-0 kubenswrapper[19170]: I0313 01:23:37.342087 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/c108cdf5-97c6-472f-b52e-b2cae013d0e8-federate-client-tls\") pod \"telemeter-client-56874ddc8c-r9wp8\" (UID: \"c108cdf5-97c6-472f-b52e-b2cae013d0e8\") " pod="openshift-monitoring/telemeter-client-56874ddc8c-r9wp8" Mar 13 01:23:37.342161 master-0 kubenswrapper[19170]: I0313 01:23:37.342130 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c108cdf5-97c6-472f-b52e-b2cae013d0e8-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-56874ddc8c-r9wp8\" (UID: \"c108cdf5-97c6-472f-b52e-b2cae013d0e8\") " pod="openshift-monitoring/telemeter-client-56874ddc8c-r9wp8" Mar 13 01:23:37.342247 master-0 kubenswrapper[19170]: I0313 01:23:37.342211 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxp6z\" (UniqueName: \"kubernetes.io/projected/c108cdf5-97c6-472f-b52e-b2cae013d0e8-kube-api-access-xxp6z\") pod \"telemeter-client-56874ddc8c-r9wp8\" (UID: \"c108cdf5-97c6-472f-b52e-b2cae013d0e8\") " pod="openshift-monitoring/telemeter-client-56874ddc8c-r9wp8" Mar 13 01:23:37.362325 master-0 kubenswrapper[19170]: I0313 01:23:37.362268 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-5dcbdc8c89-87sck"] Mar 13 01:23:37.363329 master-0 kubenswrapper[19170]: I0313 01:23:37.363207 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-5dcbdc8c89-87sck" Mar 13 01:23:37.365292 master-0 kubenswrapper[19170]: I0313 01:23:37.365252 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-1fft5pqda64sn" Mar 13 01:23:37.372259 master-0 kubenswrapper[19170]: I0313 01:23:37.371800 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/metrics-server-5575f756f4-hqr5q"] Mar 13 01:23:37.372259 master-0 kubenswrapper[19170]: I0313 01:23:37.372058 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" podUID="9db888f0-51b6-43cf-8337-69d2d5cc2b0a" containerName="metrics-server" containerID="cri-o://177c5ae11bdea7716ced4a8d9a5025a0f7f5545fda78c1cbb79962beaa511cbb" gracePeriod=170 Mar 13 01:23:37.377948 master-0 kubenswrapper[19170]: I0313 01:23:37.377685 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5dcbdc8c89-87sck"] Mar 13 01:23:37.443549 master-0 kubenswrapper[19170]: I0313 01:23:37.443468 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/c108cdf5-97c6-472f-b52e-b2cae013d0e8-secret-telemeter-client\") pod \"telemeter-client-56874ddc8c-r9wp8\" (UID: \"c108cdf5-97c6-472f-b52e-b2cae013d0e8\") " pod="openshift-monitoring/telemeter-client-56874ddc8c-r9wp8" Mar 13 01:23:37.443659 master-0 kubenswrapper[19170]: I0313 01:23:37.443583 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/c108cdf5-97c6-472f-b52e-b2cae013d0e8-federate-client-tls\") pod \"telemeter-client-56874ddc8c-r9wp8\" (UID: \"c108cdf5-97c6-472f-b52e-b2cae013d0e8\") " pod="openshift-monitoring/telemeter-client-56874ddc8c-r9wp8" Mar 13 01:23:37.443659 master-0 kubenswrapper[19170]: I0313 
01:23:37.443613 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c108cdf5-97c6-472f-b52e-b2cae013d0e8-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-56874ddc8c-r9wp8\" (UID: \"c108cdf5-97c6-472f-b52e-b2cae013d0e8\") " pod="openshift-monitoring/telemeter-client-56874ddc8c-r9wp8" Mar 13 01:23:37.444330 master-0 kubenswrapper[19170]: I0313 01:23:37.444308 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxp6z\" (UniqueName: \"kubernetes.io/projected/c108cdf5-97c6-472f-b52e-b2cae013d0e8-kube-api-access-xxp6z\") pod \"telemeter-client-56874ddc8c-r9wp8\" (UID: \"c108cdf5-97c6-472f-b52e-b2cae013d0e8\") " pod="openshift-monitoring/telemeter-client-56874ddc8c-r9wp8" Mar 13 01:23:37.446067 master-0 kubenswrapper[19170]: I0313 01:23:37.444435 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c108cdf5-97c6-472f-b52e-b2cae013d0e8-telemeter-trusted-ca-bundle\") pod \"telemeter-client-56874ddc8c-r9wp8\" (UID: \"c108cdf5-97c6-472f-b52e-b2cae013d0e8\") " pod="openshift-monitoring/telemeter-client-56874ddc8c-r9wp8" Mar 13 01:23:37.446067 master-0 kubenswrapper[19170]: I0313 01:23:37.444492 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/c108cdf5-97c6-472f-b52e-b2cae013d0e8-telemeter-client-tls\") pod \"telemeter-client-56874ddc8c-r9wp8\" (UID: \"c108cdf5-97c6-472f-b52e-b2cae013d0e8\") " pod="openshift-monitoring/telemeter-client-56874ddc8c-r9wp8" Mar 13 01:23:37.446067 master-0 kubenswrapper[19170]: I0313 01:23:37.444526 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c108cdf5-97c6-472f-b52e-b2cae013d0e8-serving-certs-ca-bundle\") pod \"telemeter-client-56874ddc8c-r9wp8\" (UID: \"c108cdf5-97c6-472f-b52e-b2cae013d0e8\") " pod="openshift-monitoring/telemeter-client-56874ddc8c-r9wp8" Mar 13 01:23:37.446067 master-0 kubenswrapper[19170]: I0313 01:23:37.444555 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c108cdf5-97c6-472f-b52e-b2cae013d0e8-metrics-client-ca\") pod \"telemeter-client-56874ddc8c-r9wp8\" (UID: \"c108cdf5-97c6-472f-b52e-b2cae013d0e8\") " pod="openshift-monitoring/telemeter-client-56874ddc8c-r9wp8" Mar 13 01:23:37.446287 master-0 kubenswrapper[19170]: I0313 01:23:37.446213 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c108cdf5-97c6-472f-b52e-b2cae013d0e8-serving-certs-ca-bundle\") pod \"telemeter-client-56874ddc8c-r9wp8\" (UID: \"c108cdf5-97c6-472f-b52e-b2cae013d0e8\") " pod="openshift-monitoring/telemeter-client-56874ddc8c-r9wp8" Mar 13 01:23:37.446496 master-0 kubenswrapper[19170]: I0313 01:23:37.446437 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c108cdf5-97c6-472f-b52e-b2cae013d0e8-telemeter-trusted-ca-bundle\") pod \"telemeter-client-56874ddc8c-r9wp8\" (UID: \"c108cdf5-97c6-472f-b52e-b2cae013d0e8\") " pod="openshift-monitoring/telemeter-client-56874ddc8c-r9wp8" Mar 13 01:23:37.448734 master-0 kubenswrapper[19170]: I0313 01:23:37.447135 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/c108cdf5-97c6-472f-b52e-b2cae013d0e8-federate-client-tls\") pod \"telemeter-client-56874ddc8c-r9wp8\" (UID: \"c108cdf5-97c6-472f-b52e-b2cae013d0e8\") " pod="openshift-monitoring/telemeter-client-56874ddc8c-r9wp8" Mar 13 01:23:37.448734 
master-0 kubenswrapper[19170]: I0313 01:23:37.447345 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c108cdf5-97c6-472f-b52e-b2cae013d0e8-metrics-client-ca\") pod \"telemeter-client-56874ddc8c-r9wp8\" (UID: \"c108cdf5-97c6-472f-b52e-b2cae013d0e8\") " pod="openshift-monitoring/telemeter-client-56874ddc8c-r9wp8" Mar 13 01:23:37.448734 master-0 kubenswrapper[19170]: I0313 01:23:37.448017 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/c108cdf5-97c6-472f-b52e-b2cae013d0e8-telemeter-client-tls\") pod \"telemeter-client-56874ddc8c-r9wp8\" (UID: \"c108cdf5-97c6-472f-b52e-b2cae013d0e8\") " pod="openshift-monitoring/telemeter-client-56874ddc8c-r9wp8" Mar 13 01:23:37.448734 master-0 kubenswrapper[19170]: I0313 01:23:37.448699 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c108cdf5-97c6-472f-b52e-b2cae013d0e8-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-56874ddc8c-r9wp8\" (UID: \"c108cdf5-97c6-472f-b52e-b2cae013d0e8\") " pod="openshift-monitoring/telemeter-client-56874ddc8c-r9wp8" Mar 13 01:23:37.450275 master-0 kubenswrapper[19170]: I0313 01:23:37.449773 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/c108cdf5-97c6-472f-b52e-b2cae013d0e8-secret-telemeter-client\") pod \"telemeter-client-56874ddc8c-r9wp8\" (UID: \"c108cdf5-97c6-472f-b52e-b2cae013d0e8\") " pod="openshift-monitoring/telemeter-client-56874ddc8c-r9wp8" Mar 13 01:23:37.459531 master-0 kubenswrapper[19170]: I0313 01:23:37.459491 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxp6z\" (UniqueName: 
\"kubernetes.io/projected/c108cdf5-97c6-472f-b52e-b2cae013d0e8-kube-api-access-xxp6z\") pod \"telemeter-client-56874ddc8c-r9wp8\" (UID: \"c108cdf5-97c6-472f-b52e-b2cae013d0e8\") " pod="openshift-monitoring/telemeter-client-56874ddc8c-r9wp8" Mar 13 01:23:37.466270 master-0 kubenswrapper[19170]: I0313 01:23:37.466209 19170 generic.go:334] "Generic (PLEG): container finished" podID="587a3378-9d36-4151-8caa-959199396bf2" containerID="9762577164b2b55ae710a39a07116196d983437ac3193719074040306fc16001" exitCode=0 Mar 13 01:23:37.466270 master-0 kubenswrapper[19170]: I0313 01:23:37.466262 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"587a3378-9d36-4151-8caa-959199396bf2","Type":"ContainerDied","Data":"9762577164b2b55ae710a39a07116196d983437ac3193719074040306fc16001"} Mar 13 01:23:37.537745 master-0 kubenswrapper[19170]: I0313 01:23:37.537680 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-56874ddc8c-r9wp8" Mar 13 01:23:37.545805 master-0 kubenswrapper[19170]: I0313 01:23:37.545772 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/25a82f62-aee8-4b1b-84aa-c119d0351412-metrics-server-audit-profiles\") pod \"metrics-server-5dcbdc8c89-87sck\" (UID: \"25a82f62-aee8-4b1b-84aa-c119d0351412\") " pod="openshift-monitoring/metrics-server-5dcbdc8c89-87sck" Mar 13 01:23:37.546220 master-0 kubenswrapper[19170]: I0313 01:23:37.546179 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25a82f62-aee8-4b1b-84aa-c119d0351412-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5dcbdc8c89-87sck\" (UID: \"25a82f62-aee8-4b1b-84aa-c119d0351412\") " 
pod="openshift-monitoring/metrics-server-5dcbdc8c89-87sck" Mar 13 01:23:37.546575 master-0 kubenswrapper[19170]: I0313 01:23:37.546539 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/25a82f62-aee8-4b1b-84aa-c119d0351412-audit-log\") pod \"metrics-server-5dcbdc8c89-87sck\" (UID: \"25a82f62-aee8-4b1b-84aa-c119d0351412\") " pod="openshift-monitoring/metrics-server-5dcbdc8c89-87sck" Mar 13 01:23:37.546663 master-0 kubenswrapper[19170]: I0313 01:23:37.546617 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgsvc\" (UniqueName: \"kubernetes.io/projected/25a82f62-aee8-4b1b-84aa-c119d0351412-kube-api-access-vgsvc\") pod \"metrics-server-5dcbdc8c89-87sck\" (UID: \"25a82f62-aee8-4b1b-84aa-c119d0351412\") " pod="openshift-monitoring/metrics-server-5dcbdc8c89-87sck" Mar 13 01:23:37.546734 master-0 kubenswrapper[19170]: I0313 01:23:37.546711 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/25a82f62-aee8-4b1b-84aa-c119d0351412-secret-metrics-server-tls\") pod \"metrics-server-5dcbdc8c89-87sck\" (UID: \"25a82f62-aee8-4b1b-84aa-c119d0351412\") " pod="openshift-monitoring/metrics-server-5dcbdc8c89-87sck" Mar 13 01:23:37.546775 master-0 kubenswrapper[19170]: I0313 01:23:37.546752 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/25a82f62-aee8-4b1b-84aa-c119d0351412-secret-metrics-client-certs\") pod \"metrics-server-5dcbdc8c89-87sck\" (UID: \"25a82f62-aee8-4b1b-84aa-c119d0351412\") " pod="openshift-monitoring/metrics-server-5dcbdc8c89-87sck" Mar 13 01:23:37.546988 master-0 kubenswrapper[19170]: I0313 01:23:37.546961 19170 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25a82f62-aee8-4b1b-84aa-c119d0351412-client-ca-bundle\") pod \"metrics-server-5dcbdc8c89-87sck\" (UID: \"25a82f62-aee8-4b1b-84aa-c119d0351412\") " pod="openshift-monitoring/metrics-server-5dcbdc8c89-87sck" Mar 13 01:23:37.648227 master-0 kubenswrapper[19170]: I0313 01:23:37.648180 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/25a82f62-aee8-4b1b-84aa-c119d0351412-metrics-server-audit-profiles\") pod \"metrics-server-5dcbdc8c89-87sck\" (UID: \"25a82f62-aee8-4b1b-84aa-c119d0351412\") " pod="openshift-monitoring/metrics-server-5dcbdc8c89-87sck" Mar 13 01:23:37.648420 master-0 kubenswrapper[19170]: I0313 01:23:37.648246 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25a82f62-aee8-4b1b-84aa-c119d0351412-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5dcbdc8c89-87sck\" (UID: \"25a82f62-aee8-4b1b-84aa-c119d0351412\") " pod="openshift-monitoring/metrics-server-5dcbdc8c89-87sck" Mar 13 01:23:37.648420 master-0 kubenswrapper[19170]: I0313 01:23:37.648301 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/25a82f62-aee8-4b1b-84aa-c119d0351412-audit-log\") pod \"metrics-server-5dcbdc8c89-87sck\" (UID: \"25a82f62-aee8-4b1b-84aa-c119d0351412\") " pod="openshift-monitoring/metrics-server-5dcbdc8c89-87sck" Mar 13 01:23:37.648420 master-0 kubenswrapper[19170]: I0313 01:23:37.648345 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgsvc\" (UniqueName: \"kubernetes.io/projected/25a82f62-aee8-4b1b-84aa-c119d0351412-kube-api-access-vgsvc\") pod \"metrics-server-5dcbdc8c89-87sck\" (UID: 
\"25a82f62-aee8-4b1b-84aa-c119d0351412\") " pod="openshift-monitoring/metrics-server-5dcbdc8c89-87sck" Mar 13 01:23:37.648420 master-0 kubenswrapper[19170]: I0313 01:23:37.648387 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/25a82f62-aee8-4b1b-84aa-c119d0351412-secret-metrics-server-tls\") pod \"metrics-server-5dcbdc8c89-87sck\" (UID: \"25a82f62-aee8-4b1b-84aa-c119d0351412\") " pod="openshift-monitoring/metrics-server-5dcbdc8c89-87sck" Mar 13 01:23:37.648420 master-0 kubenswrapper[19170]: I0313 01:23:37.648417 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/25a82f62-aee8-4b1b-84aa-c119d0351412-secret-metrics-client-certs\") pod \"metrics-server-5dcbdc8c89-87sck\" (UID: \"25a82f62-aee8-4b1b-84aa-c119d0351412\") " pod="openshift-monitoring/metrics-server-5dcbdc8c89-87sck" Mar 13 01:23:37.648626 master-0 kubenswrapper[19170]: I0313 01:23:37.648456 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25a82f62-aee8-4b1b-84aa-c119d0351412-client-ca-bundle\") pod \"metrics-server-5dcbdc8c89-87sck\" (UID: \"25a82f62-aee8-4b1b-84aa-c119d0351412\") " pod="openshift-monitoring/metrics-server-5dcbdc8c89-87sck" Mar 13 01:23:37.649176 master-0 kubenswrapper[19170]: I0313 01:23:37.649128 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25a82f62-aee8-4b1b-84aa-c119d0351412-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5dcbdc8c89-87sck\" (UID: \"25a82f62-aee8-4b1b-84aa-c119d0351412\") " pod="openshift-monitoring/metrics-server-5dcbdc8c89-87sck" Mar 13 01:23:37.649542 master-0 kubenswrapper[19170]: I0313 01:23:37.649492 19170 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/25a82f62-aee8-4b1b-84aa-c119d0351412-metrics-server-audit-profiles\") pod \"metrics-server-5dcbdc8c89-87sck\" (UID: \"25a82f62-aee8-4b1b-84aa-c119d0351412\") " pod="openshift-monitoring/metrics-server-5dcbdc8c89-87sck" Mar 13 01:23:37.649954 master-0 kubenswrapper[19170]: I0313 01:23:37.649880 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/25a82f62-aee8-4b1b-84aa-c119d0351412-audit-log\") pod \"metrics-server-5dcbdc8c89-87sck\" (UID: \"25a82f62-aee8-4b1b-84aa-c119d0351412\") " pod="openshift-monitoring/metrics-server-5dcbdc8c89-87sck" Mar 13 01:23:37.659696 master-0 kubenswrapper[19170]: I0313 01:23:37.658183 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/25a82f62-aee8-4b1b-84aa-c119d0351412-secret-metrics-server-tls\") pod \"metrics-server-5dcbdc8c89-87sck\" (UID: \"25a82f62-aee8-4b1b-84aa-c119d0351412\") " pod="openshift-monitoring/metrics-server-5dcbdc8c89-87sck" Mar 13 01:23:37.661254 master-0 kubenswrapper[19170]: I0313 01:23:37.661225 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25a82f62-aee8-4b1b-84aa-c119d0351412-client-ca-bundle\") pod \"metrics-server-5dcbdc8c89-87sck\" (UID: \"25a82f62-aee8-4b1b-84aa-c119d0351412\") " pod="openshift-monitoring/metrics-server-5dcbdc8c89-87sck" Mar 13 01:23:37.661330 master-0 kubenswrapper[19170]: I0313 01:23:37.661271 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/25a82f62-aee8-4b1b-84aa-c119d0351412-secret-metrics-client-certs\") pod \"metrics-server-5dcbdc8c89-87sck\" (UID: \"25a82f62-aee8-4b1b-84aa-c119d0351412\") " pod="openshift-monitoring/metrics-server-5dcbdc8c89-87sck" Mar 13 
01:23:37.664755 master-0 kubenswrapper[19170]: I0313 01:23:37.663995 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgsvc\" (UniqueName: \"kubernetes.io/projected/25a82f62-aee8-4b1b-84aa-c119d0351412-kube-api-access-vgsvc\") pod \"metrics-server-5dcbdc8c89-87sck\" (UID: \"25a82f62-aee8-4b1b-84aa-c119d0351412\") " pod="openshift-monitoring/metrics-server-5dcbdc8c89-87sck" Mar 13 01:23:37.689680 master-0 kubenswrapper[19170]: I0313 01:23:37.689585 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-5dcbdc8c89-87sck" Mar 13 01:23:37.964874 master-0 kubenswrapper[19170]: I0313 01:23:37.964800 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-56874ddc8c-r9wp8"] Mar 13 01:23:38.077168 master-0 kubenswrapper[19170]: W0313 01:23:38.076616 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25a82f62_aee8_4b1b_84aa_c119d0351412.slice/crio-2cc3be2d3ddebc58a2a914d605008e08d52bae4a8d4fba49cb42d12a604afbda WatchSource:0}: Error finding container 2cc3be2d3ddebc58a2a914d605008e08d52bae4a8d4fba49cb42d12a604afbda: Status 404 returned error can't find the container with id 2cc3be2d3ddebc58a2a914d605008e08d52bae4a8d4fba49cb42d12a604afbda Mar 13 01:23:38.077168 master-0 kubenswrapper[19170]: I0313 01:23:38.076936 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5dcbdc8c89-87sck"] Mar 13 01:23:38.474544 master-0 kubenswrapper[19170]: I0313 01:23:38.474476 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-56874ddc8c-r9wp8" event={"ID":"c108cdf5-97c6-472f-b52e-b2cae013d0e8","Type":"ContainerStarted","Data":"3101475f17e0b35b5cf13f316bf36a1f6db51ef2469296734f56b0d4674921f0"} Mar 13 01:23:38.476304 master-0 kubenswrapper[19170]: I0313 01:23:38.476265 19170 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5dcbdc8c89-87sck" event={"ID":"25a82f62-aee8-4b1b-84aa-c119d0351412","Type":"ContainerStarted","Data":"3d6f35a80ae6c9dbc0227fee707be4c870ee05e57ad6714a85f8769c687747e5"} Mar 13 01:23:38.476304 master-0 kubenswrapper[19170]: I0313 01:23:38.476300 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5dcbdc8c89-87sck" event={"ID":"25a82f62-aee8-4b1b-84aa-c119d0351412","Type":"ContainerStarted","Data":"2cc3be2d3ddebc58a2a914d605008e08d52bae4a8d4fba49cb42d12a604afbda"} Mar 13 01:23:38.508492 master-0 kubenswrapper[19170]: I0313 01:23:38.508370 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-5dcbdc8c89-87sck" podStartSLOduration=1.5083455190000001 podStartE2EDuration="1.508345519s" podCreationTimestamp="2026-03-13 01:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:23:38.503724919 +0000 UTC m=+279.311845879" watchObservedRunningTime="2026-03-13 01:23:38.508345519 +0000 UTC m=+279.316466489" Mar 13 01:23:39.050053 master-0 kubenswrapper[19170]: I0313 01:23:39.049974 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 13 01:23:39.052962 master-0 kubenswrapper[19170]: I0313 01:23:39.052929 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.056273 master-0 kubenswrapper[19170]: I0313 01:23:39.056236 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Mar 13 01:23:39.057606 master-0 kubenswrapper[19170]: I0313 01:23:39.057580 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Mar 13 01:23:39.057762 master-0 kubenswrapper[19170]: I0313 01:23:39.057717 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Mar 13 01:23:39.057893 master-0 kubenswrapper[19170]: I0313 01:23:39.057850 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Mar 13 01:23:39.058120 master-0 kubenswrapper[19170]: I0313 01:23:39.058090 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Mar 13 01:23:39.058169 master-0 kubenswrapper[19170]: I0313 01:23:39.058091 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Mar 13 01:23:39.058549 master-0 kubenswrapper[19170]: I0313 01:23:39.058511 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Mar 13 01:23:39.058889 master-0 kubenswrapper[19170]: I0313 01:23:39.058850 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Mar 13 01:23:39.058945 master-0 kubenswrapper[19170]: I0313 01:23:39.058892 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-cg3teed5h7t4o" Mar 13 01:23:39.059178 master-0 kubenswrapper[19170]: I0313 01:23:39.059152 19170 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-monitoring"/"prometheus-k8s" Mar 13 01:23:39.062671 master-0 kubenswrapper[19170]: I0313 01:23:39.062641 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Mar 13 01:23:39.064182 master-0 kubenswrapper[19170]: I0313 01:23:39.064115 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Mar 13 01:23:39.072397 master-0 kubenswrapper[19170]: I0313 01:23:39.072133 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 13 01:23:39.076807 master-0 kubenswrapper[19170]: I0313 01:23:39.076548 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/57aeb432-8fc2-442d-9d95-0e86689ee923-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.076807 master-0 kubenswrapper[19170]: I0313 01:23:39.076600 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/57aeb432-8fc2-442d-9d95-0e86689ee923-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.076807 master-0 kubenswrapper[19170]: I0313 01:23:39.076623 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/57aeb432-8fc2-442d-9d95-0e86689ee923-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.076807 
master-0 kubenswrapper[19170]: I0313 01:23:39.076663 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/57aeb432-8fc2-442d-9d95-0e86689ee923-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.076807 master-0 kubenswrapper[19170]: I0313 01:23:39.076687 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/57aeb432-8fc2-442d-9d95-0e86689ee923-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.076807 master-0 kubenswrapper[19170]: I0313 01:23:39.076726 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/57aeb432-8fc2-442d-9d95-0e86689ee923-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.076807 master-0 kubenswrapper[19170]: I0313 01:23:39.076756 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57aeb432-8fc2-442d-9d95-0e86689ee923-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.076807 master-0 kubenswrapper[19170]: I0313 01:23:39.076773 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/57aeb432-8fc2-442d-9d95-0e86689ee923-tls-assets\") pod 
\"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.076807 master-0 kubenswrapper[19170]: I0313 01:23:39.076805 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/57aeb432-8fc2-442d-9d95-0e86689ee923-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.077223 master-0 kubenswrapper[19170]: I0313 01:23:39.076824 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl4bf\" (UniqueName: \"kubernetes.io/projected/57aeb432-8fc2-442d-9d95-0e86689ee923-kube-api-access-tl4bf\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.077223 master-0 kubenswrapper[19170]: I0313 01:23:39.076843 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/57aeb432-8fc2-442d-9d95-0e86689ee923-config-out\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.077223 master-0 kubenswrapper[19170]: I0313 01:23:39.076859 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/57aeb432-8fc2-442d-9d95-0e86689ee923-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.077223 master-0 kubenswrapper[19170]: I0313 01:23:39.076903 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/57aeb432-8fc2-442d-9d95-0e86689ee923-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.077223 master-0 kubenswrapper[19170]: I0313 01:23:39.076924 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/57aeb432-8fc2-442d-9d95-0e86689ee923-web-config\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.077223 master-0 kubenswrapper[19170]: I0313 01:23:39.076940 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57aeb432-8fc2-442d-9d95-0e86689ee923-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.077223 master-0 kubenswrapper[19170]: I0313 01:23:39.076974 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/57aeb432-8fc2-442d-9d95-0e86689ee923-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.077223 master-0 kubenswrapper[19170]: I0313 01:23:39.076990 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57aeb432-8fc2-442d-9d95-0e86689ee923-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.077223 master-0 
kubenswrapper[19170]: I0313 01:23:39.077009 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/57aeb432-8fc2-442d-9d95-0e86689ee923-config\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.179011 master-0 kubenswrapper[19170]: I0313 01:23:39.178933 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/57aeb432-8fc2-442d-9d95-0e86689ee923-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.179251 master-0 kubenswrapper[19170]: I0313 01:23:39.179102 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57aeb432-8fc2-442d-9d95-0e86689ee923-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.179251 master-0 kubenswrapper[19170]: I0313 01:23:39.179141 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/57aeb432-8fc2-442d-9d95-0e86689ee923-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.179251 master-0 kubenswrapper[19170]: I0313 01:23:39.179173 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/57aeb432-8fc2-442d-9d95-0e86689ee923-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.179251 master-0 
kubenswrapper[19170]: I0313 01:23:39.179203 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl4bf\" (UniqueName: \"kubernetes.io/projected/57aeb432-8fc2-442d-9d95-0e86689ee923-kube-api-access-tl4bf\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.179251 master-0 kubenswrapper[19170]: I0313 01:23:39.179232 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/57aeb432-8fc2-442d-9d95-0e86689ee923-config-out\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.179459 master-0 kubenswrapper[19170]: I0313 01:23:39.179258 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/57aeb432-8fc2-442d-9d95-0e86689ee923-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.179459 master-0 kubenswrapper[19170]: I0313 01:23:39.179304 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/57aeb432-8fc2-442d-9d95-0e86689ee923-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.179459 master-0 kubenswrapper[19170]: I0313 01:23:39.179338 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/57aeb432-8fc2-442d-9d95-0e86689ee923-web-config\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.179459 
master-0 kubenswrapper[19170]: I0313 01:23:39.179359 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57aeb432-8fc2-442d-9d95-0e86689ee923-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.179459 master-0 kubenswrapper[19170]: I0313 01:23:39.179388 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/57aeb432-8fc2-442d-9d95-0e86689ee923-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.179459 master-0 kubenswrapper[19170]: I0313 01:23:39.179412 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57aeb432-8fc2-442d-9d95-0e86689ee923-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.179459 master-0 kubenswrapper[19170]: I0313 01:23:39.179442 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/57aeb432-8fc2-442d-9d95-0e86689ee923-config\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.180792 master-0 kubenswrapper[19170]: I0313 01:23:39.180754 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57aeb432-8fc2-442d-9d95-0e86689ee923-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.181653 master-0 kubenswrapper[19170]: I0313 01:23:39.181555 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57aeb432-8fc2-442d-9d95-0e86689ee923-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.182027 master-0 kubenswrapper[19170]: I0313 01:23:39.181920 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/57aeb432-8fc2-442d-9d95-0e86689ee923-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.182027 master-0 kubenswrapper[19170]: I0313 01:23:39.182002 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/57aeb432-8fc2-442d-9d95-0e86689ee923-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.182583 master-0 kubenswrapper[19170]: I0313 01:23:39.182037 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/57aeb432-8fc2-442d-9d95-0e86689ee923-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.182583 master-0 kubenswrapper[19170]: I0313 01:23:39.182073 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/57aeb432-8fc2-442d-9d95-0e86689ee923-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.182583 master-0 kubenswrapper[19170]: I0313 01:23:39.182132 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/57aeb432-8fc2-442d-9d95-0e86689ee923-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.182583 master-0 kubenswrapper[19170]: I0313 01:23:39.182347 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/57aeb432-8fc2-442d-9d95-0e86689ee923-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.182886 master-0 kubenswrapper[19170]: I0313 01:23:39.182765 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/57aeb432-8fc2-442d-9d95-0e86689ee923-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.183386 master-0 kubenswrapper[19170]: I0313 01:23:39.183305 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/57aeb432-8fc2-442d-9d95-0e86689ee923-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.183565 master-0 kubenswrapper[19170]: I0313 01:23:39.183538 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/57aeb432-8fc2-442d-9d95-0e86689ee923-config-out\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.184020 master-0 kubenswrapper[19170]: I0313 01:23:39.183992 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57aeb432-8fc2-442d-9d95-0e86689ee923-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.185898 master-0 kubenswrapper[19170]: I0313 01:23:39.185869 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/57aeb432-8fc2-442d-9d95-0e86689ee923-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.186285 master-0 kubenswrapper[19170]: I0313 01:23:39.186250 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/57aeb432-8fc2-442d-9d95-0e86689ee923-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.187464 master-0 kubenswrapper[19170]: I0313 01:23:39.187036 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/57aeb432-8fc2-442d-9d95-0e86689ee923-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.187464 master-0 kubenswrapper[19170]: I0313 01:23:39.187082 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/57aeb432-8fc2-442d-9d95-0e86689ee923-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.187464 master-0 kubenswrapper[19170]: I0313 01:23:39.187411 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/57aeb432-8fc2-442d-9d95-0e86689ee923-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.187919 master-0 kubenswrapper[19170]: I0313 01:23:39.187899 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/57aeb432-8fc2-442d-9d95-0e86689ee923-web-config\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.191998 master-0 kubenswrapper[19170]: I0313 01:23:39.191959 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/57aeb432-8fc2-442d-9d95-0e86689ee923-config\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.195391 master-0 kubenswrapper[19170]: I0313 01:23:39.195351 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/57aeb432-8fc2-442d-9d95-0e86689ee923-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.197299 master-0 kubenswrapper[19170]: I0313 01:23:39.197230 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/57aeb432-8fc2-442d-9d95-0e86689ee923-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.199220 master-0 kubenswrapper[19170]: I0313 01:23:39.199198 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl4bf\" (UniqueName: \"kubernetes.io/projected/57aeb432-8fc2-442d-9d95-0e86689ee923-kube-api-access-tl4bf\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.205620 master-0 kubenswrapper[19170]: I0313 01:23:39.205562 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/57aeb432-8fc2-442d-9d95-0e86689ee923-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"57aeb432-8fc2-442d-9d95-0e86689ee923\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:39.382091 master-0 kubenswrapper[19170]: I0313 01:23:39.381948 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:40.508556 master-0 kubenswrapper[19170]: I0313 01:23:40.508500 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"587a3378-9d36-4151-8caa-959199396bf2","Type":"ContainerStarted","Data":"50cf94da4bfbb1e268cb2d5a05e82474bdca063e1ed31f69cc9c55549717510f"} Mar 13 01:23:40.514123 master-0 kubenswrapper[19170]: I0313 01:23:40.513940 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f6d58f575-sz96g" event={"ID":"62c2569e-0491-45c4-b34a-84d9f8b971e9","Type":"ContainerStarted","Data":"bebab5f65bc41aa94df270b51d52c151625647a48e204ba9d66d6d14e6416283"} Mar 13 01:23:40.538145 master-0 kubenswrapper[19170]: I0313 01:23:40.537900 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 13 01:23:41.522581 master-0 kubenswrapper[19170]: I0313 01:23:41.522527 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f6d58f575-sz96g" event={"ID":"62c2569e-0491-45c4-b34a-84d9f8b971e9","Type":"ContainerStarted","Data":"abd6fdc6bf140114bb48afeed78e63382d8df88f827bbcb9f169364053c5d684"} Mar 13 01:23:41.522581 master-0 kubenswrapper[19170]: I0313 01:23:41.522580 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f6d58f575-sz96g" event={"ID":"62c2569e-0491-45c4-b34a-84d9f8b971e9","Type":"ContainerStarted","Data":"c816f0421237214ff041a136380221d68d8ce24c62068f10a9fa12f9fcb35f20"} Mar 13 01:23:41.524108 master-0 kubenswrapper[19170]: I0313 01:23:41.524074 19170 generic.go:334] "Generic (PLEG): container finished" podID="57aeb432-8fc2-442d-9d95-0e86689ee923" containerID="293f06e9e51ec83f095f2e012838493dbcbd6a87fdf88c608e14c096e669ddf9" exitCode=0 Mar 13 01:23:41.524226 master-0 kubenswrapper[19170]: I0313 01:23:41.524186 19170 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"57aeb432-8fc2-442d-9d95-0e86689ee923","Type":"ContainerDied","Data":"293f06e9e51ec83f095f2e012838493dbcbd6a87fdf88c608e14c096e669ddf9"} Mar 13 01:23:41.524327 master-0 kubenswrapper[19170]: I0313 01:23:41.524313 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"57aeb432-8fc2-442d-9d95-0e86689ee923","Type":"ContainerStarted","Data":"d843c2f74231f44fc70a51ea71f75d3252eab0767c6fc71c37505f42c22e7731"} Mar 13 01:23:41.531118 master-0 kubenswrapper[19170]: I0313 01:23:41.531069 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"587a3378-9d36-4151-8caa-959199396bf2","Type":"ContainerStarted","Data":"a977d589648218932503c6ce7574470eb748a650d126421c6455f71ae8e52c33"} Mar 13 01:23:41.531203 master-0 kubenswrapper[19170]: I0313 01:23:41.531121 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"587a3378-9d36-4151-8caa-959199396bf2","Type":"ContainerStarted","Data":"cb1e44143de5165ad2a6566e30da02fbd4763a0ac65a4efc43ad9913c089fd0b"} Mar 13 01:23:41.531203 master-0 kubenswrapper[19170]: I0313 01:23:41.531138 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"587a3378-9d36-4151-8caa-959199396bf2","Type":"ContainerStarted","Data":"e7603bc7188ae5b64d4d85df1f525dffbb2b0acfb551b432e6b29184e5dfd667"} Mar 13 01:23:41.531203 master-0 kubenswrapper[19170]: I0313 01:23:41.531151 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"587a3378-9d36-4151-8caa-959199396bf2","Type":"ContainerStarted","Data":"2930a5b44e06aa8deb13102bac5d4e9847fc80a9a1b11d5b8f8a2fce4dcf7c43"} Mar 13 01:23:41.830469 master-0 kubenswrapper[19170]: I0313 01:23:41.830407 19170 patch_prober.go:28] interesting pod/console-864f84b8db-z7bgh 
container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 13 01:23:41.830682 master-0 kubenswrapper[19170]: I0313 01:23:41.830483 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 13 01:23:42.260027 master-0 kubenswrapper[19170]: I0313 01:23:42.259958 19170 patch_prober.go:28] interesting pod/console-575758dfc4-r6mb4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 13 01:23:42.260177 master-0 kubenswrapper[19170]: I0313 01:23:42.260032 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-575758dfc4-r6mb4" podUID="78354ba8-21d5-4774-aa7f-8c72fee1995d" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 13 01:23:42.551536 master-0 kubenswrapper[19170]: I0313 01:23:42.551479 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f6d58f575-sz96g" event={"ID":"62c2569e-0491-45c4-b34a-84d9f8b971e9","Type":"ContainerStarted","Data":"76713fd1bc566715f1f1c9e80c088fa547d66d95f8816d38f59e74478bec6660"} Mar 13 01:23:42.555746 master-0 kubenswrapper[19170]: I0313 01:23:42.555446 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"587a3378-9d36-4151-8caa-959199396bf2","Type":"ContainerStarted","Data":"5c31ec0f1b2961119212eb3426c0612fa2492f4af13a7f8e8a5646a4d1f396e5"} Mar 13 01:23:42.560714 master-0 
kubenswrapper[19170]: I0313 01:23:42.559210 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-56874ddc8c-r9wp8" event={"ID":"c108cdf5-97c6-472f-b52e-b2cae013d0e8","Type":"ContainerStarted","Data":"b0f244cae4f178931656b01f2086a7617ba1c949660bc092bbfe12e720865221"} Mar 13 01:23:42.560714 master-0 kubenswrapper[19170]: I0313 01:23:42.559272 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-56874ddc8c-r9wp8" event={"ID":"c108cdf5-97c6-472f-b52e-b2cae013d0e8","Type":"ContainerStarted","Data":"038967fdbddc680e28bee54035d76c09691f8b7510be86f001962b6814b1895c"} Mar 13 01:23:42.560714 master-0 kubenswrapper[19170]: I0313 01:23:42.559285 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-56874ddc8c-r9wp8" event={"ID":"c108cdf5-97c6-472f-b52e-b2cae013d0e8","Type":"ContainerStarted","Data":"a0112c5afa6b09c5fbb0c057b3375c7badeb30916b3afad8cd87b703766cafe1"} Mar 13 01:23:42.598755 master-0 kubenswrapper[19170]: I0313 01:23:42.596577 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.984550584 podStartE2EDuration="9.596554263s" podCreationTimestamp="2026-03-13 01:23:33 +0000 UTC" firstStartedPulling="2026-03-13 01:23:34.644718512 +0000 UTC m=+275.452839482" lastFinishedPulling="2026-03-13 01:23:42.256722181 +0000 UTC m=+283.064843161" observedRunningTime="2026-03-13 01:23:42.590235905 +0000 UTC m=+283.398356865" watchObservedRunningTime="2026-03-13 01:23:42.596554263 +0000 UTC m=+283.404675223" Mar 13 01:23:42.646671 master-0 kubenswrapper[19170]: I0313 01:23:42.646577 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-56874ddc8c-r9wp8" podStartSLOduration=2.006529281 podStartE2EDuration="5.646557252s" podCreationTimestamp="2026-03-13 01:23:37 +0000 UTC" 
firstStartedPulling="2026-03-13 01:23:37.982509898 +0000 UTC m=+278.790630898" lastFinishedPulling="2026-03-13 01:23:41.622537909 +0000 UTC m=+282.430658869" observedRunningTime="2026-03-13 01:23:42.641443928 +0000 UTC m=+283.449564898" watchObservedRunningTime="2026-03-13 01:23:42.646557252 +0000 UTC m=+283.454678222" Mar 13 01:23:43.594647 master-0 kubenswrapper[19170]: I0313 01:23:43.593683 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-575758dfc4-r6mb4"] Mar 13 01:23:43.601435 master-0 kubenswrapper[19170]: I0313 01:23:43.601342 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f6d58f575-sz96g" event={"ID":"62c2569e-0491-45c4-b34a-84d9f8b971e9","Type":"ContainerStarted","Data":"3f938e8516cfb3b2cecdedd945db345c20ee9132306755ed6bb616533e5691f2"} Mar 13 01:23:43.601598 master-0 kubenswrapper[19170]: I0313 01:23:43.601461 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f6d58f575-sz96g" event={"ID":"62c2569e-0491-45c4-b34a-84d9f8b971e9","Type":"ContainerStarted","Data":"21cc2145f7e31a36c7fbc08f42dfa58ce1368545c449c856c09fd86d59630b83"} Mar 13 01:23:43.640675 master-0 kubenswrapper[19170]: I0313 01:23:43.640602 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6c969fc7db-l2cgv"] Mar 13 01:23:43.641608 master-0 kubenswrapper[19170]: I0313 01:23:43.641579 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c969fc7db-l2cgv" Mar 13 01:23:43.692597 master-0 kubenswrapper[19170]: I0313 01:23:43.645774 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c969fc7db-l2cgv"] Mar 13 01:23:43.744351 master-0 kubenswrapper[19170]: I0313 01:23:43.737835 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-7f6d58f575-sz96g" podStartSLOduration=3.771840874 podStartE2EDuration="9.7378131s" podCreationTimestamp="2026-03-13 01:23:34 +0000 UTC" firstStartedPulling="2026-03-13 01:23:36.29838279 +0000 UTC m=+277.106503790" lastFinishedPulling="2026-03-13 01:23:42.264355056 +0000 UTC m=+283.072476016" observedRunningTime="2026-03-13 01:23:43.711061277 +0000 UTC m=+284.519182237" watchObservedRunningTime="2026-03-13 01:23:43.7378131 +0000 UTC m=+284.545934060" Mar 13 01:23:43.803560 master-0 kubenswrapper[19170]: I0313 01:23:43.803507 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5517e790-8931-405e-a113-b4d76156775c-trusted-ca-bundle\") pod \"console-6c969fc7db-l2cgv\" (UID: \"5517e790-8931-405e-a113-b4d76156775c\") " pod="openshift-console/console-6c969fc7db-l2cgv" Mar 13 01:23:43.803806 master-0 kubenswrapper[19170]: I0313 01:23:43.803590 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5517e790-8931-405e-a113-b4d76156775c-oauth-serving-cert\") pod \"console-6c969fc7db-l2cgv\" (UID: \"5517e790-8931-405e-a113-b4d76156775c\") " pod="openshift-console/console-6c969fc7db-l2cgv" Mar 13 01:23:43.803806 master-0 kubenswrapper[19170]: I0313 01:23:43.803621 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/5517e790-8931-405e-a113-b4d76156775c-console-oauth-config\") pod \"console-6c969fc7db-l2cgv\" (UID: \"5517e790-8931-405e-a113-b4d76156775c\") " pod="openshift-console/console-6c969fc7db-l2cgv" Mar 13 01:23:43.803806 master-0 kubenswrapper[19170]: I0313 01:23:43.803758 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8q25\" (UniqueName: \"kubernetes.io/projected/5517e790-8931-405e-a113-b4d76156775c-kube-api-access-l8q25\") pod \"console-6c969fc7db-l2cgv\" (UID: \"5517e790-8931-405e-a113-b4d76156775c\") " pod="openshift-console/console-6c969fc7db-l2cgv" Mar 13 01:23:43.803806 master-0 kubenswrapper[19170]: I0313 01:23:43.803794 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5517e790-8931-405e-a113-b4d76156775c-console-serving-cert\") pod \"console-6c969fc7db-l2cgv\" (UID: \"5517e790-8931-405e-a113-b4d76156775c\") " pod="openshift-console/console-6c969fc7db-l2cgv" Mar 13 01:23:43.804137 master-0 kubenswrapper[19170]: I0313 01:23:43.804110 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5517e790-8931-405e-a113-b4d76156775c-service-ca\") pod \"console-6c969fc7db-l2cgv\" (UID: \"5517e790-8931-405e-a113-b4d76156775c\") " pod="openshift-console/console-6c969fc7db-l2cgv" Mar 13 01:23:43.804189 master-0 kubenswrapper[19170]: I0313 01:23:43.804144 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5517e790-8931-405e-a113-b4d76156775c-console-config\") pod \"console-6c969fc7db-l2cgv\" (UID: \"5517e790-8931-405e-a113-b4d76156775c\") " pod="openshift-console/console-6c969fc7db-l2cgv" Mar 13 01:23:43.907754 master-0 kubenswrapper[19170]: I0313 01:23:43.905433 
19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5517e790-8931-405e-a113-b4d76156775c-trusted-ca-bundle\") pod \"console-6c969fc7db-l2cgv\" (UID: \"5517e790-8931-405e-a113-b4d76156775c\") " pod="openshift-console/console-6c969fc7db-l2cgv" Mar 13 01:23:43.907754 master-0 kubenswrapper[19170]: I0313 01:23:43.905500 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5517e790-8931-405e-a113-b4d76156775c-oauth-serving-cert\") pod \"console-6c969fc7db-l2cgv\" (UID: \"5517e790-8931-405e-a113-b4d76156775c\") " pod="openshift-console/console-6c969fc7db-l2cgv" Mar 13 01:23:43.907754 master-0 kubenswrapper[19170]: I0313 01:23:43.905519 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5517e790-8931-405e-a113-b4d76156775c-console-oauth-config\") pod \"console-6c969fc7db-l2cgv\" (UID: \"5517e790-8931-405e-a113-b4d76156775c\") " pod="openshift-console/console-6c969fc7db-l2cgv" Mar 13 01:23:43.907754 master-0 kubenswrapper[19170]: I0313 01:23:43.905555 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8q25\" (UniqueName: \"kubernetes.io/projected/5517e790-8931-405e-a113-b4d76156775c-kube-api-access-l8q25\") pod \"console-6c969fc7db-l2cgv\" (UID: \"5517e790-8931-405e-a113-b4d76156775c\") " pod="openshift-console/console-6c969fc7db-l2cgv" Mar 13 01:23:43.907754 master-0 kubenswrapper[19170]: I0313 01:23:43.905576 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5517e790-8931-405e-a113-b4d76156775c-console-serving-cert\") pod \"console-6c969fc7db-l2cgv\" (UID: \"5517e790-8931-405e-a113-b4d76156775c\") " pod="openshift-console/console-6c969fc7db-l2cgv" Mar 13 
01:23:43.907754 master-0 kubenswrapper[19170]: I0313 01:23:43.906379 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5517e790-8931-405e-a113-b4d76156775c-oauth-serving-cert\") pod \"console-6c969fc7db-l2cgv\" (UID: \"5517e790-8931-405e-a113-b4d76156775c\") " pod="openshift-console/console-6c969fc7db-l2cgv" Mar 13 01:23:43.907754 master-0 kubenswrapper[19170]: I0313 01:23:43.906435 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5517e790-8931-405e-a113-b4d76156775c-service-ca\") pod \"console-6c969fc7db-l2cgv\" (UID: \"5517e790-8931-405e-a113-b4d76156775c\") " pod="openshift-console/console-6c969fc7db-l2cgv" Mar 13 01:23:43.907754 master-0 kubenswrapper[19170]: I0313 01:23:43.906457 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5517e790-8931-405e-a113-b4d76156775c-console-config\") pod \"console-6c969fc7db-l2cgv\" (UID: \"5517e790-8931-405e-a113-b4d76156775c\") " pod="openshift-console/console-6c969fc7db-l2cgv" Mar 13 01:23:43.907754 master-0 kubenswrapper[19170]: I0313 01:23:43.907051 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5517e790-8931-405e-a113-b4d76156775c-console-config\") pod \"console-6c969fc7db-l2cgv\" (UID: \"5517e790-8931-405e-a113-b4d76156775c\") " pod="openshift-console/console-6c969fc7db-l2cgv" Mar 13 01:23:43.910283 master-0 kubenswrapper[19170]: I0313 01:23:43.908599 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5517e790-8931-405e-a113-b4d76156775c-trusted-ca-bundle\") pod \"console-6c969fc7db-l2cgv\" (UID: \"5517e790-8931-405e-a113-b4d76156775c\") " pod="openshift-console/console-6c969fc7db-l2cgv" Mar 13 
01:23:43.910283 master-0 kubenswrapper[19170]: I0313 01:23:43.909407 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5517e790-8931-405e-a113-b4d76156775c-service-ca\") pod \"console-6c969fc7db-l2cgv\" (UID: \"5517e790-8931-405e-a113-b4d76156775c\") " pod="openshift-console/console-6c969fc7db-l2cgv" Mar 13 01:23:43.920700 master-0 kubenswrapper[19170]: I0313 01:23:43.912312 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5517e790-8931-405e-a113-b4d76156775c-console-oauth-config\") pod \"console-6c969fc7db-l2cgv\" (UID: \"5517e790-8931-405e-a113-b4d76156775c\") " pod="openshift-console/console-6c969fc7db-l2cgv" Mar 13 01:23:43.920700 master-0 kubenswrapper[19170]: I0313 01:23:43.914308 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5517e790-8931-405e-a113-b4d76156775c-console-serving-cert\") pod \"console-6c969fc7db-l2cgv\" (UID: \"5517e790-8931-405e-a113-b4d76156775c\") " pod="openshift-console/console-6c969fc7db-l2cgv" Mar 13 01:23:44.613387 master-0 kubenswrapper[19170]: I0313 01:23:44.613318 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-7f6d58f575-sz96g" Mar 13 01:23:44.729692 master-0 kubenswrapper[19170]: I0313 01:23:44.727345 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8q25\" (UniqueName: \"kubernetes.io/projected/5517e790-8931-405e-a113-b4d76156775c-kube-api-access-l8q25\") pod \"console-6c969fc7db-l2cgv\" (UID: \"5517e790-8931-405e-a113-b4d76156775c\") " pod="openshift-console/console-6c969fc7db-l2cgv" Mar 13 01:23:44.918107 master-0 kubenswrapper[19170]: I0313 01:23:44.918029 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c969fc7db-l2cgv" Mar 13 01:23:45.012399 master-0 kubenswrapper[19170]: I0313 01:23:45.012353 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-7f6d58f575-sz96g" Mar 13 01:23:47.011995 master-0 kubenswrapper[19170]: I0313 01:23:47.010528 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c969fc7db-l2cgv"] Mar 13 01:23:47.631978 master-0 kubenswrapper[19170]: I0313 01:23:47.631926 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"57aeb432-8fc2-442d-9d95-0e86689ee923","Type":"ContainerStarted","Data":"dae8fb90a04c0492d8c80b6470664d6e16407dc1200ec4ed708b6c6153efd5b0"} Mar 13 01:23:47.631978 master-0 kubenswrapper[19170]: I0313 01:23:47.631982 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"57aeb432-8fc2-442d-9d95-0e86689ee923","Type":"ContainerStarted","Data":"029088598edd61816752147fade1d6a1b05073be874c52f2faf6fc921fc76354"} Mar 13 01:23:47.632200 master-0 kubenswrapper[19170]: I0313 01:23:47.631996 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"57aeb432-8fc2-442d-9d95-0e86689ee923","Type":"ContainerStarted","Data":"d0dbf364f0ed210e0de5c8b77c654127d74cb688688fc9bb518fa0e352cb1ec6"} Mar 13 01:23:47.632200 master-0 kubenswrapper[19170]: I0313 01:23:47.632008 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"57aeb432-8fc2-442d-9d95-0e86689ee923","Type":"ContainerStarted","Data":"5e598f38cfae134dd28642325d5ab0ff3a8f66f74de5629b78717849efdbcd77"} Mar 13 01:23:47.632200 master-0 kubenswrapper[19170]: I0313 01:23:47.632019 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"57aeb432-8fc2-442d-9d95-0e86689ee923","Type":"ContainerStarted","Data":"0149b13d23d80ee7a7ffc619cfa7bd8d932a697a290b89e05b354fdecf9dd376"} Mar 13 01:23:47.632200 master-0 kubenswrapper[19170]: I0313 01:23:47.632030 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"57aeb432-8fc2-442d-9d95-0e86689ee923","Type":"ContainerStarted","Data":"60068ace9cc02714bce5e9e22b9361acee96e91b008b55fd35041c0d6f38e553"} Mar 13 01:23:47.634278 master-0 kubenswrapper[19170]: I0313 01:23:47.634221 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c969fc7db-l2cgv" event={"ID":"5517e790-8931-405e-a113-b4d76156775c","Type":"ContainerStarted","Data":"4b70241fd7a26ba09f8c0188412cee31344d4af634ea5235fa3b30080a5f8a36"} Mar 13 01:23:47.634355 master-0 kubenswrapper[19170]: I0313 01:23:47.634290 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c969fc7db-l2cgv" event={"ID":"5517e790-8931-405e-a113-b4d76156775c","Type":"ContainerStarted","Data":"5774d7ee0c2fc3d6f3bef6c5ff141866de5419716d8c9089b9bad9faae4abb6f"} Mar 13 01:23:47.679066 master-0 kubenswrapper[19170]: I0313 01:23:47.678945 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.538497269 podStartE2EDuration="8.678928981s" podCreationTimestamp="2026-03-13 01:23:39 +0000 UTC" firstStartedPulling="2026-03-13 01:23:41.526157394 +0000 UTC m=+282.334278354" lastFinishedPulling="2026-03-13 01:23:46.666589106 +0000 UTC m=+287.474710066" observedRunningTime="2026-03-13 01:23:47.676344138 +0000 UTC m=+288.484465148" watchObservedRunningTime="2026-03-13 01:23:47.678928981 +0000 UTC m=+288.487049941" Mar 13 01:23:47.711092 master-0 kubenswrapper[19170]: I0313 01:23:47.711003 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6c969fc7db-l2cgv" 
podStartSLOduration=4.710980274 podStartE2EDuration="4.710980274s" podCreationTimestamp="2026-03-13 01:23:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:23:47.705919201 +0000 UTC m=+288.514040181" watchObservedRunningTime="2026-03-13 01:23:47.710980274 +0000 UTC m=+288.519101234" Mar 13 01:23:49.383339 master-0 kubenswrapper[19170]: I0313 01:23:49.383253 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:23:51.829512 master-0 kubenswrapper[19170]: I0313 01:23:51.829452 19170 patch_prober.go:28] interesting pod/console-864f84b8db-z7bgh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 13 01:23:51.830031 master-0 kubenswrapper[19170]: I0313 01:23:51.829528 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 13 01:23:52.357025 master-0 kubenswrapper[19170]: I0313 01:23:52.356971 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-c55c5ddb4-565wg"] Mar 13 01:23:52.358443 master-0 kubenswrapper[19170]: I0313 01:23:52.358381 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:52.367254 master-0 kubenswrapper[19170]: I0313 01:23:52.360919 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 13 01:23:52.367254 master-0 kubenswrapper[19170]: I0313 01:23:52.361672 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 13 01:23:52.367254 master-0 kubenswrapper[19170]: I0313 01:23:52.362736 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 13 01:23:52.367254 master-0 kubenswrapper[19170]: I0313 01:23:52.362847 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 13 01:23:52.367254 master-0 kubenswrapper[19170]: I0313 01:23:52.364157 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 13 01:23:52.367254 master-0 kubenswrapper[19170]: I0313 01:23:52.364301 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 13 01:23:52.367254 master-0 kubenswrapper[19170]: I0313 01:23:52.364383 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 13 01:23:52.367254 master-0 kubenswrapper[19170]: I0313 01:23:52.364537 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 13 01:23:52.370403 master-0 kubenswrapper[19170]: I0313 01:23:52.368800 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 13 01:23:52.370403 master-0 kubenswrapper[19170]: I0313 01:23:52.368863 
19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-lkt5b" Mar 13 01:23:52.371080 master-0 kubenswrapper[19170]: I0313 01:23:52.371036 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 13 01:23:52.418485 master-0 kubenswrapper[19170]: I0313 01:23:52.416528 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 13 01:23:52.419983 master-0 kubenswrapper[19170]: I0313 01:23:52.419894 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 13 01:23:52.420073 master-0 kubenswrapper[19170]: I0313 01:23:52.419873 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 13 01:23:52.434306 master-0 kubenswrapper[19170]: I0313 01:23:52.433582 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-c55c5ddb4-565wg"] Mar 13 01:23:52.559392 master-0 kubenswrapper[19170]: I0313 01:23:52.559217 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-service-ca\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:52.560243 master-0 kubenswrapper[19170]: I0313 01:23:52.560194 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-c55c5ddb4-565wg\" 
(UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:52.560243 master-0 kubenswrapper[19170]: I0313 01:23:52.560228 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-audit-policies\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:52.560943 master-0 kubenswrapper[19170]: I0313 01:23:52.560435 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z9cr\" (UniqueName: \"kubernetes.io/projected/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-kube-api-access-2z9cr\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:52.560943 master-0 kubenswrapper[19170]: I0313 01:23:52.560456 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:52.560943 master-0 kubenswrapper[19170]: I0313 01:23:52.560477 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-user-template-login\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 
01:23:52.560943 master-0 kubenswrapper[19170]: I0313 01:23:52.560499 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-router-certs\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:52.560943 master-0 kubenswrapper[19170]: I0313 01:23:52.560535 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-serving-cert\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:52.560943 master-0 kubenswrapper[19170]: I0313 01:23:52.560578 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-session\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:52.560943 master-0 kubenswrapper[19170]: I0313 01:23:52.560595 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-user-template-error\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:52.560943 master-0 kubenswrapper[19170]: I0313 01:23:52.560616 19170 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-cliconfig\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:52.560943 master-0 kubenswrapper[19170]: I0313 01:23:52.560652 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-audit-dir\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:52.560943 master-0 kubenswrapper[19170]: I0313 01:23:52.560687 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:52.662603 master-0 kubenswrapper[19170]: I0313 01:23:52.662503 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-serving-cert\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:52.662603 master-0 kubenswrapper[19170]: I0313 01:23:52.662573 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-session\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:52.662603 master-0 kubenswrapper[19170]: I0313 01:23:52.662596 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-user-template-error\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:52.662603 master-0 kubenswrapper[19170]: I0313 01:23:52.662612 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-cliconfig\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:52.663065 master-0 kubenswrapper[19170]: I0313 01:23:52.662653 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-audit-dir\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:52.663065 master-0 kubenswrapper[19170]: I0313 01:23:52.662688 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " 
pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:52.663065 master-0 kubenswrapper[19170]: I0313 01:23:52.662713 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-service-ca\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:52.663065 master-0 kubenswrapper[19170]: I0313 01:23:52.662747 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:52.663065 master-0 kubenswrapper[19170]: I0313 01:23:52.662766 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-audit-policies\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:52.663065 master-0 kubenswrapper[19170]: I0313 01:23:52.662798 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z9cr\" (UniqueName: \"kubernetes.io/projected/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-kube-api-access-2z9cr\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:52.663065 master-0 kubenswrapper[19170]: I0313 01:23:52.662814 19170 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:52.663065 master-0 kubenswrapper[19170]: I0313 01:23:52.662833 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-user-template-login\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:52.663065 master-0 kubenswrapper[19170]: I0313 01:23:52.662854 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-router-certs\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:52.664108 master-0 kubenswrapper[19170]: E0313 01:23:52.663752 19170 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: configmap "v4-0-config-system-cliconfig" not found Mar 13 01:23:52.664108 master-0 kubenswrapper[19170]: E0313 01:23:52.663865 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-cliconfig podName:a1453e2c-1b5d-435b-b6d4-d2cdb99b5608 nodeName:}" failed. No retries permitted until 2026-03-13 01:23:53.163835852 +0000 UTC m=+293.971956912 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-cliconfig") pod "oauth-openshift-c55c5ddb4-565wg" (UID: "a1453e2c-1b5d-435b-b6d4-d2cdb99b5608") : configmap "v4-0-config-system-cliconfig" not found Mar 13 01:23:52.664290 master-0 kubenswrapper[19170]: I0313 01:23:52.664116 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-audit-dir\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:52.664290 master-0 kubenswrapper[19170]: E0313 01:23:52.664163 19170 secret.go:189] Couldn't get secret openshift-authentication/v4-0-config-system-session: secret "v4-0-config-system-session" not found Mar 13 01:23:52.664290 master-0 kubenswrapper[19170]: E0313 01:23:52.664246 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-session podName:a1453e2c-1b5d-435b-b6d4-d2cdb99b5608 nodeName:}" failed. No retries permitted until 2026-03-13 01:23:53.164218373 +0000 UTC m=+293.972339363 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-session" (UniqueName: "kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-session") pod "oauth-openshift-c55c5ddb4-565wg" (UID: "a1453e2c-1b5d-435b-b6d4-d2cdb99b5608") : secret "v4-0-config-system-session" not found Mar 13 01:23:52.665671 master-0 kubenswrapper[19170]: I0313 01:23:52.665280 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-service-ca\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:52.666310 master-0 kubenswrapper[19170]: I0313 01:23:52.666132 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-audit-policies\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:52.666310 master-0 kubenswrapper[19170]: I0313 01:23:52.666213 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:52.667669 master-0 kubenswrapper[19170]: I0313 01:23:52.667593 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-router-certs\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: 
\"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:52.668043 master-0 kubenswrapper[19170]: I0313 01:23:52.668003 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-serving-cert\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:52.668685 master-0 kubenswrapper[19170]: I0313 01:23:52.668552 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:52.680251 master-0 kubenswrapper[19170]: I0313 01:23:52.680198 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:52.680431 master-0 kubenswrapper[19170]: I0313 01:23:52.680368 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-user-template-login\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:52.680866 master-0 kubenswrapper[19170]: I0313 
01:23:52.680821 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-user-template-error\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:52.709689 master-0 kubenswrapper[19170]: I0313 01:23:52.709595 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z9cr\" (UniqueName: \"kubernetes.io/projected/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-kube-api-access-2z9cr\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:52.954451 master-0 kubenswrapper[19170]: I0313 01:23:52.954268 19170 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Mar 13 01:23:52.955592 master-0 kubenswrapper[19170]: I0313 01:23:52.954665 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" containerID="cri-o://1c8e674976731a2c93a50660b066eae9c4fd5ada4c595229db11941c5c7b1510" gracePeriod=30 Mar 13 01:23:52.955592 master-0 kubenswrapper[19170]: I0313 01:23:52.955524 19170 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 13 01:23:52.956596 master-0 kubenswrapper[19170]: E0313 01:23:52.955954 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 13 01:23:52.956596 master-0 kubenswrapper[19170]: I0313 01:23:52.955980 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 13 
01:23:52.956596 master-0 kubenswrapper[19170]: E0313 01:23:52.956002 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 13 01:23:52.956596 master-0 kubenswrapper[19170]: I0313 01:23:52.956016 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 13 01:23:52.956596 master-0 kubenswrapper[19170]: I0313 01:23:52.956332 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 13 01:23:52.956596 master-0 kubenswrapper[19170]: I0313 01:23:52.956397 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 13 01:23:52.956596 master-0 kubenswrapper[19170]: I0313 01:23:52.956430 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 13 01:23:52.957292 master-0 kubenswrapper[19170]: E0313 01:23:52.956695 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 13 01:23:52.957292 master-0 kubenswrapper[19170]: I0313 01:23:52.956711 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 13 01:23:52.958920 master-0 kubenswrapper[19170]: I0313 01:23:52.958858 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 01:23:53.070315 master-0 kubenswrapper[19170]: I0313 01:23:53.070237 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/782d5ee25fbcc8a1fef3f1955932cf63-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"782d5ee25fbcc8a1fef3f1955932cf63\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 01:23:53.070544 master-0 kubenswrapper[19170]: I0313 01:23:53.070379 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/782d5ee25fbcc8a1fef3f1955932cf63-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"782d5ee25fbcc8a1fef3f1955932cf63\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 01:23:53.174978 master-0 kubenswrapper[19170]: I0313 01:23:53.173574 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-session\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:53.174978 master-0 kubenswrapper[19170]: E0313 01:23:53.173823 19170 secret.go:189] Couldn't get secret openshift-authentication/v4-0-config-system-session: secret "v4-0-config-system-session" not found Mar 13 01:23:53.174978 master-0 kubenswrapper[19170]: E0313 01:23:53.173924 19170 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: configmap "v4-0-config-system-cliconfig" not found Mar 13 01:23:53.174978 master-0 kubenswrapper[19170]: I0313 01:23:53.173837 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-cliconfig\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:53.174978 master-0 kubenswrapper[19170]: E0313 01:23:53.173925 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-session podName:a1453e2c-1b5d-435b-b6d4-d2cdb99b5608 nodeName:}" failed. No retries permitted until 2026-03-13 01:23:54.17389997 +0000 UTC m=+294.982020960 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "v4-0-config-system-session" (UniqueName: "kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-session") pod "oauth-openshift-c55c5ddb4-565wg" (UID: "a1453e2c-1b5d-435b-b6d4-d2cdb99b5608") : secret "v4-0-config-system-session" not found Mar 13 01:23:53.174978 master-0 kubenswrapper[19170]: E0313 01:23:53.174004 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-cliconfig podName:a1453e2c-1b5d-435b-b6d4-d2cdb99b5608 nodeName:}" failed. No retries permitted until 2026-03-13 01:23:54.173984542 +0000 UTC m=+294.982105512 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-cliconfig") pod "oauth-openshift-c55c5ddb4-565wg" (UID: "a1453e2c-1b5d-435b-b6d4-d2cdb99b5608") : configmap "v4-0-config-system-cliconfig" not found Mar 13 01:23:53.174978 master-0 kubenswrapper[19170]: I0313 01:23:53.174083 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/782d5ee25fbcc8a1fef3f1955932cf63-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"782d5ee25fbcc8a1fef3f1955932cf63\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 01:23:53.174978 master-0 kubenswrapper[19170]: I0313 01:23:53.174316 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/782d5ee25fbcc8a1fef3f1955932cf63-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"782d5ee25fbcc8a1fef3f1955932cf63\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 01:23:53.176719 master-0 kubenswrapper[19170]: I0313 01:23:53.175371 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/782d5ee25fbcc8a1fef3f1955932cf63-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"782d5ee25fbcc8a1fef3f1955932cf63\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 01:23:53.176719 master-0 kubenswrapper[19170]: I0313 01:23:53.175818 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/782d5ee25fbcc8a1fef3f1955932cf63-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"782d5ee25fbcc8a1fef3f1955932cf63\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 01:23:53.181877 master-0 
kubenswrapper[19170]: I0313 01:23:53.181746 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 13 01:23:53.190115 master-0 kubenswrapper[19170]: I0313 01:23:53.189424 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 13 01:23:53.224078 master-0 kubenswrapper[19170]: I0313 01:23:53.223915 19170 kubelet.go:2706] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="8dcb48dd-c1e7-430b-a159-d3de9283aeff" Mar 13 01:23:53.378041 master-0 kubenswrapper[19170]: I0313 01:23:53.377999 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"a1a56802af72ce1aac6b5077f1695ac0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " Mar 13 01:23:53.378267 master-0 kubenswrapper[19170]: I0313 01:23:53.378175 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"a1a56802af72ce1aac6b5077f1695ac0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " Mar 13 01:23:53.378267 master-0 kubenswrapper[19170]: I0313 01:23:53.378229 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets" (OuterVolumeSpecName: "secrets") pod "a1a56802af72ce1aac6b5077f1695ac0" (UID: "a1a56802af72ce1aac6b5077f1695ac0"). InnerVolumeSpecName "secrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:23:53.378267 master-0 kubenswrapper[19170]: I0313 01:23:53.378257 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs" (OuterVolumeSpecName: "logs") pod "a1a56802af72ce1aac6b5077f1695ac0" (UID: "a1a56802af72ce1aac6b5077f1695ac0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:23:53.378613 master-0 kubenswrapper[19170]: I0313 01:23:53.378578 19170 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") on node \"master-0\" DevicePath \"\"" Mar 13 01:23:53.378613 master-0 kubenswrapper[19170]: I0313 01:23:53.378608 19170 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") on node \"master-0\" DevicePath \"\"" Mar 13 01:23:53.429577 master-0 kubenswrapper[19170]: I0313 01:23:53.429538 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1a56802af72ce1aac6b5077f1695ac0" path="/var/lib/kubelet/pods/a1a56802af72ce1aac6b5077f1695ac0/volumes" Mar 13 01:23:53.429958 master-0 kubenswrapper[19170]: I0313 01:23:53.429928 19170 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="" Mar 13 01:23:53.443266 master-0 kubenswrapper[19170]: I0313 01:23:53.443197 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Mar 13 01:23:53.443266 master-0 kubenswrapper[19170]: I0313 01:23:53.443258 19170 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="8dcb48dd-c1e7-430b-a159-d3de9283aeff" Mar 13 01:23:53.448677 master-0 kubenswrapper[19170]: I0313 01:23:53.448599 19170 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Mar 13 01:23:53.448677 master-0 kubenswrapper[19170]: I0313 01:23:53.448654 19170 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="8dcb48dd-c1e7-430b-a159-d3de9283aeff" Mar 13 01:23:53.467693 master-0 kubenswrapper[19170]: I0313 01:23:53.467657 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 01:23:53.503956 master-0 kubenswrapper[19170]: W0313 01:23:53.503873 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod782d5ee25fbcc8a1fef3f1955932cf63.slice/crio-925f47dfe2469be61dba6680d02603f6f29f5ca4fe33aca61f52ae49284b6313 WatchSource:0}: Error finding container 925f47dfe2469be61dba6680d02603f6f29f5ca4fe33aca61f52ae49284b6313: Status 404 returned error can't find the container with id 925f47dfe2469be61dba6680d02603f6f29f5ca4fe33aca61f52ae49284b6313 Mar 13 01:23:53.702814 master-0 kubenswrapper[19170]: I0313 01:23:53.702757 19170 generic.go:334] "Generic (PLEG): container finished" podID="a1a56802af72ce1aac6b5077f1695ac0" containerID="1c8e674976731a2c93a50660b066eae9c4fd5ada4c595229db11941c5c7b1510" exitCode=0 Mar 13 01:23:53.702939 master-0 kubenswrapper[19170]: I0313 01:23:53.702832 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 13 01:23:53.702990 master-0 kubenswrapper[19170]: I0313 01:23:53.702842 19170 scope.go:117] "RemoveContainer" containerID="1c8e674976731a2c93a50660b066eae9c4fd5ada4c595229db11941c5c7b1510" Mar 13 01:23:53.704681 master-0 kubenswrapper[19170]: I0313 01:23:53.704553 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"782d5ee25fbcc8a1fef3f1955932cf63","Type":"ContainerStarted","Data":"925f47dfe2469be61dba6680d02603f6f29f5ca4fe33aca61f52ae49284b6313"} Mar 13 01:23:53.709261 master-0 kubenswrapper[19170]: I0313 01:23:53.709228 19170 generic.go:334] "Generic (PLEG): container finished" podID="7801d684-f91c-4f55-93e7-04a104759c08" containerID="71856a725dc2facf87c076cbf57fe15640bc593db7bac2e62004e3375cc92a3a" exitCode=0 Mar 13 01:23:53.709261 master-0 kubenswrapper[19170]: I0313 01:23:53.709266 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-7-master-0" event={"ID":"7801d684-f91c-4f55-93e7-04a104759c08","Type":"ContainerDied","Data":"71856a725dc2facf87c076cbf57fe15640bc593db7bac2e62004e3375cc92a3a"} Mar 13 01:23:53.753840 master-0 kubenswrapper[19170]: I0313 01:23:53.753772 19170 scope.go:117] "RemoveContainer" containerID="d71b1cc6761a45082d16174057a8ba98300f0e56574d0c45301dacf3ba713ba9" Mar 13 01:23:53.780144 master-0 kubenswrapper[19170]: I0313 01:23:53.780089 19170 scope.go:117] "RemoveContainer" containerID="1c8e674976731a2c93a50660b066eae9c4fd5ada4c595229db11941c5c7b1510" Mar 13 01:23:53.780565 master-0 kubenswrapper[19170]: E0313 01:23:53.780513 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c8e674976731a2c93a50660b066eae9c4fd5ada4c595229db11941c5c7b1510\": container with ID starting with 1c8e674976731a2c93a50660b066eae9c4fd5ada4c595229db11941c5c7b1510 not found: ID does not 
exist" containerID="1c8e674976731a2c93a50660b066eae9c4fd5ada4c595229db11941c5c7b1510" Mar 13 01:23:53.780626 master-0 kubenswrapper[19170]: I0313 01:23:53.780568 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c8e674976731a2c93a50660b066eae9c4fd5ada4c595229db11941c5c7b1510"} err="failed to get container status \"1c8e674976731a2c93a50660b066eae9c4fd5ada4c595229db11941c5c7b1510\": rpc error: code = NotFound desc = could not find container \"1c8e674976731a2c93a50660b066eae9c4fd5ada4c595229db11941c5c7b1510\": container with ID starting with 1c8e674976731a2c93a50660b066eae9c4fd5ada4c595229db11941c5c7b1510 not found: ID does not exist" Mar 13 01:23:53.780626 master-0 kubenswrapper[19170]: I0313 01:23:53.780597 19170 scope.go:117] "RemoveContainer" containerID="d71b1cc6761a45082d16174057a8ba98300f0e56574d0c45301dacf3ba713ba9" Mar 13 01:23:53.780920 master-0 kubenswrapper[19170]: E0313 01:23:53.780889 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d71b1cc6761a45082d16174057a8ba98300f0e56574d0c45301dacf3ba713ba9\": container with ID starting with d71b1cc6761a45082d16174057a8ba98300f0e56574d0c45301dacf3ba713ba9 not found: ID does not exist" containerID="d71b1cc6761a45082d16174057a8ba98300f0e56574d0c45301dacf3ba713ba9" Mar 13 01:23:53.780920 master-0 kubenswrapper[19170]: I0313 01:23:53.780909 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d71b1cc6761a45082d16174057a8ba98300f0e56574d0c45301dacf3ba713ba9"} err="failed to get container status \"d71b1cc6761a45082d16174057a8ba98300f0e56574d0c45301dacf3ba713ba9\": rpc error: code = NotFound desc = could not find container \"d71b1cc6761a45082d16174057a8ba98300f0e56574d0c45301dacf3ba713ba9\": container with ID starting with d71b1cc6761a45082d16174057a8ba98300f0e56574d0c45301dacf3ba713ba9 not found: ID does not exist" Mar 13 01:23:54.193912 
master-0 kubenswrapper[19170]: I0313 01:23:54.193824 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-session\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:54.194880 master-0 kubenswrapper[19170]: I0313 01:23:54.193952 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-cliconfig\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:54.194880 master-0 kubenswrapper[19170]: E0313 01:23:54.194069 19170 secret.go:189] Couldn't get secret openshift-authentication/v4-0-config-system-session: secret "v4-0-config-system-session" not found Mar 13 01:23:54.194880 master-0 kubenswrapper[19170]: E0313 01:23:54.194165 19170 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: configmap "v4-0-config-system-cliconfig" not found Mar 13 01:23:54.194880 master-0 kubenswrapper[19170]: E0313 01:23:54.194214 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-session podName:a1453e2c-1b5d-435b-b6d4-d2cdb99b5608 nodeName:}" failed. No retries permitted until 2026-03-13 01:23:56.194171768 +0000 UTC m=+297.002292778 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-session" (UniqueName: "kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-session") pod "oauth-openshift-c55c5ddb4-565wg" (UID: "a1453e2c-1b5d-435b-b6d4-d2cdb99b5608") : secret "v4-0-config-system-session" not found Mar 13 01:23:54.194880 master-0 kubenswrapper[19170]: E0313 01:23:54.194259 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-cliconfig podName:a1453e2c-1b5d-435b-b6d4-d2cdb99b5608 nodeName:}" failed. No retries permitted until 2026-03-13 01:23:56.1942332 +0000 UTC m=+297.002354200 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-cliconfig") pod "oauth-openshift-c55c5ddb4-565wg" (UID: "a1453e2c-1b5d-435b-b6d4-d2cdb99b5608") : configmap "v4-0-config-system-cliconfig" not found Mar 13 01:23:54.722500 master-0 kubenswrapper[19170]: I0313 01:23:54.722405 19170 generic.go:334] "Generic (PLEG): container finished" podID="782d5ee25fbcc8a1fef3f1955932cf63" containerID="b97c87df2c6960bdd1ae95d8e2fbeb87460406e84b751b92be07e6ff3fcbb92f" exitCode=0 Mar 13 01:23:54.722971 master-0 kubenswrapper[19170]: I0313 01:23:54.722525 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"782d5ee25fbcc8a1fef3f1955932cf63","Type":"ContainerDied","Data":"b97c87df2c6960bdd1ae95d8e2fbeb87460406e84b751b92be07e6ff3fcbb92f"} Mar 13 01:23:54.923682 master-0 kubenswrapper[19170]: I0313 01:23:54.918350 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6c969fc7db-l2cgv" Mar 13 01:23:54.923682 master-0 kubenswrapper[19170]: I0313 01:23:54.918427 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-console/console-6c969fc7db-l2cgv" Mar 13 01:23:54.923682 master-0 kubenswrapper[19170]: I0313 01:23:54.920361 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body= Mar 13 01:23:54.923682 master-0 kubenswrapper[19170]: I0313 01:23:54.920415 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" Mar 13 01:23:55.153598 master-0 kubenswrapper[19170]: I0313 01:23:55.153308 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-7-master-0" Mar 13 01:23:55.324143 master-0 kubenswrapper[19170]: I0313 01:23:55.321569 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7801d684-f91c-4f55-93e7-04a104759c08-kube-api-access\") pod \"7801d684-f91c-4f55-93e7-04a104759c08\" (UID: \"7801d684-f91c-4f55-93e7-04a104759c08\") " Mar 13 01:23:55.324143 master-0 kubenswrapper[19170]: I0313 01:23:55.321705 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7801d684-f91c-4f55-93e7-04a104759c08-kubelet-dir\") pod \"7801d684-f91c-4f55-93e7-04a104759c08\" (UID: \"7801d684-f91c-4f55-93e7-04a104759c08\") " Mar 13 01:23:55.324143 master-0 kubenswrapper[19170]: I0313 01:23:55.321777 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7801d684-f91c-4f55-93e7-04a104759c08-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod 
"7801d684-f91c-4f55-93e7-04a104759c08" (UID: "7801d684-f91c-4f55-93e7-04a104759c08"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:23:55.324143 master-0 kubenswrapper[19170]: I0313 01:23:55.321815 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7801d684-f91c-4f55-93e7-04a104759c08-var-lock\") pod \"7801d684-f91c-4f55-93e7-04a104759c08\" (UID: \"7801d684-f91c-4f55-93e7-04a104759c08\") " Mar 13 01:23:55.324143 master-0 kubenswrapper[19170]: I0313 01:23:55.321972 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7801d684-f91c-4f55-93e7-04a104759c08-var-lock" (OuterVolumeSpecName: "var-lock") pod "7801d684-f91c-4f55-93e7-04a104759c08" (UID: "7801d684-f91c-4f55-93e7-04a104759c08"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:23:55.324143 master-0 kubenswrapper[19170]: I0313 01:23:55.322329 19170 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7801d684-f91c-4f55-93e7-04a104759c08-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:23:55.324143 master-0 kubenswrapper[19170]: I0313 01:23:55.322345 19170 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7801d684-f91c-4f55-93e7-04a104759c08-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 01:23:55.334913 master-0 kubenswrapper[19170]: I0313 01:23:55.334883 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7801d684-f91c-4f55-93e7-04a104759c08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7801d684-f91c-4f55-93e7-04a104759c08" (UID: "7801d684-f91c-4f55-93e7-04a104759c08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:23:55.423970 master-0 kubenswrapper[19170]: I0313 01:23:55.423724 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7801d684-f91c-4f55-93e7-04a104759c08-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 13 01:23:55.737154 master-0 kubenswrapper[19170]: I0313 01:23:55.737098 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"782d5ee25fbcc8a1fef3f1955932cf63","Type":"ContainerStarted","Data":"41e63bc2063ea4552388c81d47d54a419737fa5aebc04beebe656e179825f66f"} Mar 13 01:23:55.737154 master-0 kubenswrapper[19170]: I0313 01:23:55.737145 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"782d5ee25fbcc8a1fef3f1955932cf63","Type":"ContainerStarted","Data":"770356e4ae02115731f6cd66a447c5876d4c2f8ece672d11699b3d9076755f60"} Mar 13 01:23:55.737154 master-0 kubenswrapper[19170]: I0313 01:23:55.737156 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"782d5ee25fbcc8a1fef3f1955932cf63","Type":"ContainerStarted","Data":"44a5d925ffd004246a4349660cad574513d9d5098376b6ab6e4cf763813f9a22"} Mar 13 01:23:55.738062 master-0 kubenswrapper[19170]: I0313 01:23:55.738026 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 01:23:55.742044 master-0 kubenswrapper[19170]: I0313 01:23:55.742000 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-7-master-0" event={"ID":"7801d684-f91c-4f55-93e7-04a104759c08","Type":"ContainerDied","Data":"16e7f0e9baa0723b89d4d6650dca26b5c8e25f54377ceaf3108474cd2c15cd31"} Mar 13 01:23:55.742044 master-0 kubenswrapper[19170]: I0313 01:23:55.742040 
19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16e7f0e9baa0723b89d4d6650dca26b5c8e25f54377ceaf3108474cd2c15cd31" Mar 13 01:23:55.742169 master-0 kubenswrapper[19170]: I0313 01:23:55.742128 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-7-master-0" Mar 13 01:23:55.759282 master-0 kubenswrapper[19170]: I0313 01:23:55.759188 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podStartSLOduration=2.759166661 podStartE2EDuration="2.759166661s" podCreationTimestamp="2026-03-13 01:23:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:23:55.754620313 +0000 UTC m=+296.562741273" watchObservedRunningTime="2026-03-13 01:23:55.759166661 +0000 UTC m=+296.567287641" Mar 13 01:23:56.237556 master-0 kubenswrapper[19170]: I0313 01:23:56.237475 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-session\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:56.237556 master-0 kubenswrapper[19170]: I0313 01:23:56.237544 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-cliconfig\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:56.237993 master-0 kubenswrapper[19170]: E0313 01:23:56.237733 19170 configmap.go:193] Couldn't get configMap 
openshift-authentication/v4-0-config-system-cliconfig: configmap "v4-0-config-system-cliconfig" not found Mar 13 01:23:56.237993 master-0 kubenswrapper[19170]: E0313 01:23:56.237812 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-cliconfig podName:a1453e2c-1b5d-435b-b6d4-d2cdb99b5608 nodeName:}" failed. No retries permitted until 2026-03-13 01:24:00.237791422 +0000 UTC m=+301.045912392 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-cliconfig") pod "oauth-openshift-c55c5ddb4-565wg" (UID: "a1453e2c-1b5d-435b-b6d4-d2cdb99b5608") : configmap "v4-0-config-system-cliconfig" not found Mar 13 01:23:56.241299 master-0 kubenswrapper[19170]: I0313 01:23:56.241232 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-session\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:57.690544 master-0 kubenswrapper[19170]: I0313 01:23:57.690456 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-5dcbdc8c89-87sck" Mar 13 01:23:57.690544 master-0 kubenswrapper[19170]: I0313 01:23:57.690539 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-5dcbdc8c89-87sck" Mar 13 01:23:59.378169 master-0 kubenswrapper[19170]: I0313 01:23:59.378042 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-c55c5ddb4-565wg"] Mar 13 01:23:59.382455 master-0 kubenswrapper[19170]: E0313 01:23:59.382400 19170 pod_workers.go:1301] "Error syncing 
pod, skipping" err="unmounted volumes=[v4-0-config-system-cliconfig], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" podUID="a1453e2c-1b5d-435b-b6d4-d2cdb99b5608" Mar 13 01:23:59.411767 master-0 kubenswrapper[19170]: I0313 01:23:59.408902 19170 kubelet.go:1505] "Image garbage collection succeeded" Mar 13 01:23:59.779208 master-0 kubenswrapper[19170]: I0313 01:23:59.779143 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:59.793040 master-0 kubenswrapper[19170]: I0313 01:23:59.792976 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:23:59.808624 master-0 kubenswrapper[19170]: I0313 01:23:59.808546 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-router-certs\") pod \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " Mar 13 01:23:59.808624 master-0 kubenswrapper[19170]: I0313 01:23:59.808628 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-serving-cert\") pod \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " Mar 13 01:23:59.808988 master-0 kubenswrapper[19170]: I0313 01:23:59.808762 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-service-ca\") pod \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\" (UID: 
\"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " Mar 13 01:23:59.808988 master-0 kubenswrapper[19170]: I0313 01:23:59.808819 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-trusted-ca-bundle\") pod \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " Mar 13 01:23:59.808988 master-0 kubenswrapper[19170]: I0313 01:23:59.808900 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-user-template-provider-selection\") pod \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " Mar 13 01:23:59.808988 master-0 kubenswrapper[19170]: I0313 01:23:59.808953 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2z9cr\" (UniqueName: \"kubernetes.io/projected/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-kube-api-access-2z9cr\") pod \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " Mar 13 01:23:59.809353 master-0 kubenswrapper[19170]: I0313 01:23:59.808996 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-audit-dir\") pod \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " Mar 13 01:23:59.809353 master-0 kubenswrapper[19170]: I0313 01:23:59.809044 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-user-template-error\") pod \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\" (UID: 
\"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " Mar 13 01:23:59.809353 master-0 kubenswrapper[19170]: I0313 01:23:59.809124 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-ocp-branding-template\") pod \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " Mar 13 01:23:59.810217 master-0 kubenswrapper[19170]: I0313 01:23:59.809179 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-audit-policies\") pod \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " Mar 13 01:23:59.810553 master-0 kubenswrapper[19170]: I0313 01:23:59.809187 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "a1453e2c-1b5d-435b-b6d4-d2cdb99b5608" (UID: "a1453e2c-1b5d-435b-b6d4-d2cdb99b5608"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:23:59.810553 master-0 kubenswrapper[19170]: I0313 01:23:59.809752 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "a1453e2c-1b5d-435b-b6d4-d2cdb99b5608" (UID: "a1453e2c-1b5d-435b-b6d4-d2cdb99b5608"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:23:59.810846 master-0 kubenswrapper[19170]: I0313 01:23:59.810123 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "a1453e2c-1b5d-435b-b6d4-d2cdb99b5608" (UID: "a1453e2c-1b5d-435b-b6d4-d2cdb99b5608"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:23:59.810846 master-0 kubenswrapper[19170]: I0313 01:23:59.810592 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "a1453e2c-1b5d-435b-b6d4-d2cdb99b5608" (UID: "a1453e2c-1b5d-435b-b6d4-d2cdb99b5608"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:23:59.810846 master-0 kubenswrapper[19170]: I0313 01:23:59.810609 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-session\") pod \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " Mar 13 01:23:59.810846 master-0 kubenswrapper[19170]: I0313 01:23:59.810788 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-user-template-login\") pod \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " Mar 13 01:23:59.811458 master-0 kubenswrapper[19170]: I0313 01:23:59.811399 19170 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 13 01:23:59.811589 master-0 kubenswrapper[19170]: I0313 01:23:59.811454 19170 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:23:59.811589 master-0 kubenswrapper[19170]: I0313 01:23:59.811486 19170 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:23:59.811589 master-0 kubenswrapper[19170]: I0313 01:23:59.811516 19170 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-audit-policies\") on node \"master-0\" DevicePath \"\"" Mar 13 01:23:59.815131 master-0 kubenswrapper[19170]: I0313 01:23:59.814164 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "a1453e2c-1b5d-435b-b6d4-d2cdb99b5608" (UID: "a1453e2c-1b5d-435b-b6d4-d2cdb99b5608"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:23:59.815131 master-0 kubenswrapper[19170]: I0313 01:23:59.814614 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "a1453e2c-1b5d-435b-b6d4-d2cdb99b5608" (UID: "a1453e2c-1b5d-435b-b6d4-d2cdb99b5608"). 
InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:23:59.815131 master-0 kubenswrapper[19170]: I0313 01:23:59.814992 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "a1453e2c-1b5d-435b-b6d4-d2cdb99b5608" (UID: "a1453e2c-1b5d-435b-b6d4-d2cdb99b5608"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:23:59.817911 master-0 kubenswrapper[19170]: I0313 01:23:59.816220 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "a1453e2c-1b5d-435b-b6d4-d2cdb99b5608" (UID: "a1453e2c-1b5d-435b-b6d4-d2cdb99b5608"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:23:59.817911 master-0 kubenswrapper[19170]: I0313 01:23:59.816384 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "a1453e2c-1b5d-435b-b6d4-d2cdb99b5608" (UID: "a1453e2c-1b5d-435b-b6d4-d2cdb99b5608"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:23:59.817911 master-0 kubenswrapper[19170]: I0313 01:23:59.816597 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "a1453e2c-1b5d-435b-b6d4-d2cdb99b5608" (UID: "a1453e2c-1b5d-435b-b6d4-d2cdb99b5608"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:23:59.817911 master-0 kubenswrapper[19170]: I0313 01:23:59.816754 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-kube-api-access-2z9cr" (OuterVolumeSpecName: "kube-api-access-2z9cr") pod "a1453e2c-1b5d-435b-b6d4-d2cdb99b5608" (UID: "a1453e2c-1b5d-435b-b6d4-d2cdb99b5608"). InnerVolumeSpecName "kube-api-access-2z9cr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:23:59.819324 master-0 kubenswrapper[19170]: I0313 01:23:59.819258 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "a1453e2c-1b5d-435b-b6d4-d2cdb99b5608" (UID: "a1453e2c-1b5d-435b-b6d4-d2cdb99b5608"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:23:59.913052 master-0 kubenswrapper[19170]: I0313 01:23:59.912981 19170 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-user-template-provider-selection\") on node \"master-0\" DevicePath \"\"" Mar 13 01:23:59.913052 master-0 kubenswrapper[19170]: I0313 01:23:59.913032 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2z9cr\" (UniqueName: \"kubernetes.io/projected/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-kube-api-access-2z9cr\") on node \"master-0\" DevicePath \"\"" Mar 13 01:23:59.913052 master-0 kubenswrapper[19170]: I0313 01:23:59.913054 19170 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-user-template-error\") on node \"master-0\" DevicePath \"\"" Mar 13 01:23:59.913412 master-0 kubenswrapper[19170]: I0313 01:23:59.913075 19170 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-ocp-branding-template\") on node \"master-0\" DevicePath \"\"" Mar 13 01:23:59.913412 master-0 kubenswrapper[19170]: I0313 01:23:59.913096 19170 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-session\") on node \"master-0\" DevicePath \"\"" Mar 13 01:23:59.913412 master-0 kubenswrapper[19170]: I0313 01:23:59.913116 19170 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-user-template-login\") on node \"master-0\" DevicePath \"\"" Mar 13 
01:23:59.913412 master-0 kubenswrapper[19170]: I0313 01:23:59.913135 19170 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-router-certs\") on node \"master-0\" DevicePath \"\"" Mar 13 01:23:59.913412 master-0 kubenswrapper[19170]: I0313 01:23:59.913157 19170 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 13 01:24:00.321485 master-0 kubenswrapper[19170]: I0313 01:24:00.321394 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-cliconfig\") pod \"oauth-openshift-c55c5ddb4-565wg\" (UID: \"a1453e2c-1b5d-435b-b6d4-d2cdb99b5608\") " pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:24:00.321749 master-0 kubenswrapper[19170]: E0313 01:24:00.321657 19170 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: configmap "v4-0-config-system-cliconfig" not found Mar 13 01:24:00.321859 master-0 kubenswrapper[19170]: E0313 01:24:00.321758 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-cliconfig podName:a1453e2c-1b5d-435b-b6d4-d2cdb99b5608 nodeName:}" failed. No retries permitted until 2026-03-13 01:24:08.321731936 +0000 UTC m=+309.129852926 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-cliconfig") pod "oauth-openshift-c55c5ddb4-565wg" (UID: "a1453e2c-1b5d-435b-b6d4-d2cdb99b5608") : configmap "v4-0-config-system-cliconfig" not found Mar 13 01:24:00.788861 master-0 kubenswrapper[19170]: I0313 01:24:00.788806 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-c55c5ddb4-565wg" Mar 13 01:24:00.865442 master-0 kubenswrapper[19170]: I0313 01:24:00.865356 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-db987b46b-l4pxc"] Mar 13 01:24:00.866104 master-0 kubenswrapper[19170]: E0313 01:24:00.866065 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7801d684-f91c-4f55-93e7-04a104759c08" containerName="installer" Mar 13 01:24:00.866183 master-0 kubenswrapper[19170]: I0313 01:24:00.866105 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="7801d684-f91c-4f55-93e7-04a104759c08" containerName="installer" Mar 13 01:24:00.866551 master-0 kubenswrapper[19170]: I0313 01:24:00.866516 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="7801d684-f91c-4f55-93e7-04a104759c08" containerName="installer" Mar 13 01:24:00.868307 master-0 kubenswrapper[19170]: I0313 01:24:00.867698 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:00.875485 master-0 kubenswrapper[19170]: I0313 01:24:00.875177 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-c55c5ddb4-565wg"] Mar 13 01:24:00.886086 master-0 kubenswrapper[19170]: I0313 01:24:00.886015 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 13 01:24:00.886375 master-0 kubenswrapper[19170]: I0313 01:24:00.886342 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 13 01:24:00.886659 master-0 kubenswrapper[19170]: I0313 01:24:00.886619 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 13 01:24:00.886961 master-0 kubenswrapper[19170]: I0313 01:24:00.886929 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-lkt5b" Mar 13 01:24:00.889456 master-0 kubenswrapper[19170]: I0313 01:24:00.889393 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-c55c5ddb4-565wg"] Mar 13 01:24:00.900340 master-0 kubenswrapper[19170]: I0313 01:24:00.900278 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 13 01:24:00.904387 master-0 kubenswrapper[19170]: I0313 01:24:00.902341 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 13 01:24:00.904387 master-0 kubenswrapper[19170]: I0313 01:24:00.902754 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 13 01:24:00.904387 master-0 kubenswrapper[19170]: I0313 01:24:00.903296 19170 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 13 01:24:00.904387 master-0 kubenswrapper[19170]: I0313 01:24:00.903448 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 13 01:24:00.904387 master-0 kubenswrapper[19170]: I0313 01:24:00.903586 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 13 01:24:00.904387 master-0 kubenswrapper[19170]: I0313 01:24:00.903765 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 13 01:24:00.914712 master-0 kubenswrapper[19170]: I0313 01:24:00.914612 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-db987b46b-l4pxc"] Mar 13 01:24:00.915076 master-0 kubenswrapper[19170]: I0313 01:24:00.915006 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 13 01:24:00.924954 master-0 kubenswrapper[19170]: I0313 01:24:00.924897 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 13 01:24:00.924954 master-0 kubenswrapper[19170]: I0313 01:24:00.924938 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 13 01:24:00.945202 master-0 kubenswrapper[19170]: I0313 01:24:00.945144 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-user-template-error\") pod \"oauth-openshift-db987b46b-l4pxc\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") " 
pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:00.945413 master-0 kubenswrapper[19170]: I0313 01:24:00.945232 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-session\") pod \"oauth-openshift-db987b46b-l4pxc\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") " pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:00.945413 master-0 kubenswrapper[19170]: I0313 01:24:00.945289 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-db987b46b-l4pxc\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") " pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:00.945413 master-0 kubenswrapper[19170]: I0313 01:24:00.945344 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-service-ca\") pod \"oauth-openshift-db987b46b-l4pxc\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") " pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:00.945579 master-0 kubenswrapper[19170]: I0313 01:24:00.945423 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-serving-cert\") pod \"oauth-openshift-db987b46b-l4pxc\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") " pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:00.945579 master-0 
kubenswrapper[19170]: I0313 01:24:00.945545 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-db987b46b-l4pxc\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") " pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:00.945689 master-0 kubenswrapper[19170]: I0313 01:24:00.945591 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-user-template-login\") pod \"oauth-openshift-db987b46b-l4pxc\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") " pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:00.945689 master-0 kubenswrapper[19170]: I0313 01:24:00.945656 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-router-certs\") pod \"oauth-openshift-db987b46b-l4pxc\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") " pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:00.945779 master-0 kubenswrapper[19170]: I0313 01:24:00.945723 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dc96262c-7c20-490b-b90e-d1fba7a26a46-audit-dir\") pod \"oauth-openshift-db987b46b-l4pxc\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") " pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:00.945779 master-0 kubenswrapper[19170]: I0313 01:24:00.945759 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-db987b46b-l4pxc\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") " pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:00.945854 master-0 kubenswrapper[19170]: I0313 01:24:00.945801 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dc96262c-7c20-490b-b90e-d1fba7a26a46-audit-policies\") pod \"oauth-openshift-db987b46b-l4pxc\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") " pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:00.945906 master-0 kubenswrapper[19170]: I0313 01:24:00.945853 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55t7g\" (UniqueName: \"kubernetes.io/projected/dc96262c-7c20-490b-b90e-d1fba7a26a46-kube-api-access-55t7g\") pod \"oauth-openshift-db987b46b-l4pxc\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") " pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:00.945983 master-0 kubenswrapper[19170]: I0313 01:24:00.945951 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-cliconfig\") pod \"oauth-openshift-db987b46b-l4pxc\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") " pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:00.946309 master-0 kubenswrapper[19170]: I0313 01:24:00.946037 19170 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608-v4-0-config-system-cliconfig\") on node 
\"master-0\" DevicePath \"\"" Mar 13 01:24:01.047062 master-0 kubenswrapper[19170]: I0313 01:24:01.046925 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-cliconfig\") pod \"oauth-openshift-db987b46b-l4pxc\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") " pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:01.047280 master-0 kubenswrapper[19170]: E0313 01:24:01.047113 19170 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: configmap "v4-0-config-system-cliconfig" not found Mar 13 01:24:01.047280 master-0 kubenswrapper[19170]: I0313 01:24:01.047128 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-user-template-error\") pod \"oauth-openshift-db987b46b-l4pxc\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") " pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:01.047280 master-0 kubenswrapper[19170]: E0313 01:24:01.047216 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-cliconfig podName:dc96262c-7c20-490b-b90e-d1fba7a26a46 nodeName:}" failed. No retries permitted until 2026-03-13 01:24:01.5471927 +0000 UTC m=+302.355313670 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-cliconfig") pod "oauth-openshift-db987b46b-l4pxc" (UID: "dc96262c-7c20-490b-b90e-d1fba7a26a46") : configmap "v4-0-config-system-cliconfig" not found Mar 13 01:24:01.047280 master-0 kubenswrapper[19170]: I0313 01:24:01.047236 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-session\") pod \"oauth-openshift-db987b46b-l4pxc\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") " pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:01.047280 master-0 kubenswrapper[19170]: I0313 01:24:01.047275 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-db987b46b-l4pxc\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") " pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:01.047623 master-0 kubenswrapper[19170]: I0313 01:24:01.047311 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-service-ca\") pod \"oauth-openshift-db987b46b-l4pxc\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") " pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:01.047623 master-0 kubenswrapper[19170]: I0313 01:24:01.047350 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-db987b46b-l4pxc\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") " pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:01.047623 master-0 kubenswrapper[19170]: I0313 01:24:01.047580 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-db987b46b-l4pxc\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") " pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:01.047850 master-0 kubenswrapper[19170]: I0313 01:24:01.047666 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-user-template-login\") pod \"oauth-openshift-db987b46b-l4pxc\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") " pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:01.047850 master-0 kubenswrapper[19170]: I0313 01:24:01.047706 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-router-certs\") pod \"oauth-openshift-db987b46b-l4pxc\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") " pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:01.047850 master-0 kubenswrapper[19170]: I0313 01:24:01.047749 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dc96262c-7c20-490b-b90e-d1fba7a26a46-audit-dir\") pod \"oauth-openshift-db987b46b-l4pxc\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") " pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:01.048345 master-0 kubenswrapper[19170]: 
I0313 01:24:01.048144 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-db987b46b-l4pxc\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") " pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:01.048345 master-0 kubenswrapper[19170]: I0313 01:24:01.048185 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dc96262c-7c20-490b-b90e-d1fba7a26a46-audit-dir\") pod \"oauth-openshift-db987b46b-l4pxc\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") " pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:01.048345 master-0 kubenswrapper[19170]: I0313 01:24:01.048255 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dc96262c-7c20-490b-b90e-d1fba7a26a46-audit-policies\") pod \"oauth-openshift-db987b46b-l4pxc\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") " pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:01.048345 master-0 kubenswrapper[19170]: I0313 01:24:01.048338 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55t7g\" (UniqueName: \"kubernetes.io/projected/dc96262c-7c20-490b-b90e-d1fba7a26a46-kube-api-access-55t7g\") pod \"oauth-openshift-db987b46b-l4pxc\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") " pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:01.049127 master-0 kubenswrapper[19170]: I0313 01:24:01.048659 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-service-ca\") pod 
\"oauth-openshift-db987b46b-l4pxc\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") " pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:01.049127 master-0 kubenswrapper[19170]: I0313 01:24:01.048983 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dc96262c-7c20-490b-b90e-d1fba7a26a46-audit-policies\") pod \"oauth-openshift-db987b46b-l4pxc\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") " pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:01.049127 master-0 kubenswrapper[19170]: I0313 01:24:01.049026 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-db987b46b-l4pxc\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") " pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:01.051161 master-0 kubenswrapper[19170]: I0313 01:24:01.051114 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-user-template-error\") pod \"oauth-openshift-db987b46b-l4pxc\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") " pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:01.051462 master-0 kubenswrapper[19170]: I0313 01:24:01.051422 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-db987b46b-l4pxc\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") " pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:01.051674 master-0 kubenswrapper[19170]: 
I0313 01:24:01.051614 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-user-template-login\") pod \"oauth-openshift-db987b46b-l4pxc\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") " pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:01.051674 master-0 kubenswrapper[19170]: I0313 01:24:01.051669 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-db987b46b-l4pxc\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") " pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:01.052624 master-0 kubenswrapper[19170]: I0313 01:24:01.052582 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-serving-cert\") pod \"oauth-openshift-db987b46b-l4pxc\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") " pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:01.054151 master-0 kubenswrapper[19170]: I0313 01:24:01.053669 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-session\") pod \"oauth-openshift-db987b46b-l4pxc\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") " pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:01.054151 master-0 kubenswrapper[19170]: I0313 01:24:01.054050 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-router-certs\") pod \"oauth-openshift-db987b46b-l4pxc\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") " pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:01.068441 master-0 kubenswrapper[19170]: I0313 01:24:01.068396 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55t7g\" (UniqueName: \"kubernetes.io/projected/dc96262c-7c20-490b-b90e-d1fba7a26a46-kube-api-access-55t7g\") pod \"oauth-openshift-db987b46b-l4pxc\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") " pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:01.437138 master-0 kubenswrapper[19170]: I0313 01:24:01.437086 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1453e2c-1b5d-435b-b6d4-d2cdb99b5608" path="/var/lib/kubelet/pods/a1453e2c-1b5d-435b-b6d4-d2cdb99b5608/volumes" Mar 13 01:24:01.559452 master-0 kubenswrapper[19170]: I0313 01:24:01.559340 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-cliconfig\") pod \"oauth-openshift-db987b46b-l4pxc\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") " pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:01.560586 master-0 kubenswrapper[19170]: I0313 01:24:01.560513 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-cliconfig\") pod \"oauth-openshift-db987b46b-l4pxc\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") " pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:01.581337 master-0 kubenswrapper[19170]: I0313 01:24:01.581262 19170 scope.go:117] "RemoveContainer" 
containerID="ae1c74ac713339ebe951cea485ddb317986dccb644eb4d3021ce0d21c709fe41" Mar 13 01:24:01.808756 master-0 kubenswrapper[19170]: I0313 01:24:01.808587 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:01.828955 master-0 kubenswrapper[19170]: I0313 01:24:01.828875 19170 patch_prober.go:28] interesting pod/console-864f84b8db-z7bgh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 13 01:24:01.829125 master-0 kubenswrapper[19170]: I0313 01:24:01.828948 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 13 01:24:02.366919 master-0 kubenswrapper[19170]: I0313 01:24:02.366874 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-db987b46b-l4pxc"] Mar 13 01:24:02.367576 master-0 kubenswrapper[19170]: W0313 01:24:02.367530 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc96262c_7c20_490b_b90e_d1fba7a26a46.slice/crio-06b33bfe8f677cbc17516783d3a2ea455d65925f812456d7b8d09c94a01f4a42 WatchSource:0}: Error finding container 06b33bfe8f677cbc17516783d3a2ea455d65925f812456d7b8d09c94a01f4a42: Status 404 returned error can't find the container with id 06b33bfe8f677cbc17516783d3a2ea455d65925f812456d7b8d09c94a01f4a42 Mar 13 01:24:02.813034 master-0 kubenswrapper[19170]: I0313 01:24:02.812954 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" 
event={"ID":"dc96262c-7c20-490b-b90e-d1fba7a26a46","Type":"ContainerStarted","Data":"06b33bfe8f677cbc17516783d3a2ea455d65925f812456d7b8d09c94a01f4a42"} Mar 13 01:24:04.148033 master-0 kubenswrapper[19170]: I0313 01:24:04.147949 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-2-master-0"] Mar 13 01:24:04.149355 master-0 kubenswrapper[19170]: I0313 01:24:04.149329 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0" Mar 13 01:24:04.154983 master-0 kubenswrapper[19170]: I0313 01:24:04.154349 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"kube-root-ca.crt" Mar 13 01:24:04.154983 master-0 kubenswrapper[19170]: I0313 01:24:04.154479 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd"/"installer-sa-dockercfg-lnd87" Mar 13 01:24:04.158136 master-0 kubenswrapper[19170]: I0313 01:24:04.158095 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-2-master-0"] Mar 13 01:24:04.214054 master-0 kubenswrapper[19170]: I0313 01:24:04.213989 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a8c9ec23-a5af-44a1-859a-86629153ae8c-kube-api-access\") pod \"installer-2-master-0\" (UID: \"a8c9ec23-a5af-44a1-859a-86629153ae8c\") " pod="openshift-etcd/installer-2-master-0" Mar 13 01:24:04.214392 master-0 kubenswrapper[19170]: I0313 01:24:04.214365 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a8c9ec23-a5af-44a1-859a-86629153ae8c-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"a8c9ec23-a5af-44a1-859a-86629153ae8c\") " pod="openshift-etcd/installer-2-master-0" Mar 13 01:24:04.214507 master-0 kubenswrapper[19170]: I0313 01:24:04.214472 19170 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a8c9ec23-a5af-44a1-859a-86629153ae8c-var-lock\") pod \"installer-2-master-0\" (UID: \"a8c9ec23-a5af-44a1-859a-86629153ae8c\") " pod="openshift-etcd/installer-2-master-0" Mar 13 01:24:04.315621 master-0 kubenswrapper[19170]: I0313 01:24:04.315565 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a8c9ec23-a5af-44a1-859a-86629153ae8c-kube-api-access\") pod \"installer-2-master-0\" (UID: \"a8c9ec23-a5af-44a1-859a-86629153ae8c\") " pod="openshift-etcd/installer-2-master-0" Mar 13 01:24:04.315829 master-0 kubenswrapper[19170]: I0313 01:24:04.315669 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a8c9ec23-a5af-44a1-859a-86629153ae8c-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"a8c9ec23-a5af-44a1-859a-86629153ae8c\") " pod="openshift-etcd/installer-2-master-0" Mar 13 01:24:04.315829 master-0 kubenswrapper[19170]: I0313 01:24:04.315719 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a8c9ec23-a5af-44a1-859a-86629153ae8c-var-lock\") pod \"installer-2-master-0\" (UID: \"a8c9ec23-a5af-44a1-859a-86629153ae8c\") " pod="openshift-etcd/installer-2-master-0" Mar 13 01:24:04.315829 master-0 kubenswrapper[19170]: I0313 01:24:04.315808 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a8c9ec23-a5af-44a1-859a-86629153ae8c-var-lock\") pod \"installer-2-master-0\" (UID: \"a8c9ec23-a5af-44a1-859a-86629153ae8c\") " pod="openshift-etcd/installer-2-master-0" Mar 13 01:24:04.316113 master-0 kubenswrapper[19170]: I0313 01:24:04.316046 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/a8c9ec23-a5af-44a1-859a-86629153ae8c-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"a8c9ec23-a5af-44a1-859a-86629153ae8c\") " pod="openshift-etcd/installer-2-master-0" Mar 13 01:24:04.336140 master-0 kubenswrapper[19170]: I0313 01:24:04.336078 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a8c9ec23-a5af-44a1-859a-86629153ae8c-kube-api-access\") pod \"installer-2-master-0\" (UID: \"a8c9ec23-a5af-44a1-859a-86629153ae8c\") " pod="openshift-etcd/installer-2-master-0" Mar 13 01:24:04.486492 master-0 kubenswrapper[19170]: I0313 01:24:04.486428 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0" Mar 13 01:24:04.920062 master-0 kubenswrapper[19170]: I0313 01:24:04.919925 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body= Mar 13 01:24:04.920062 master-0 kubenswrapper[19170]: I0313 01:24:04.920009 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" Mar 13 01:24:05.017676 master-0 kubenswrapper[19170]: I0313 01:24:05.014836 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-2-master-0"] Mar 13 01:24:05.025241 master-0 kubenswrapper[19170]: W0313 01:24:05.025125 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda8c9ec23_a5af_44a1_859a_86629153ae8c.slice/crio-0cd1519ceb2bd0e29ce61088d5327806ef7ea565e1a1a54bef9b9739444adb2d WatchSource:0}: Error 
finding container 0cd1519ceb2bd0e29ce61088d5327806ef7ea565e1a1a54bef9b9739444adb2d: Status 404 returned error can't find the container with id 0cd1519ceb2bd0e29ce61088d5327806ef7ea565e1a1a54bef9b9739444adb2d Mar 13 01:24:05.844056 master-0 kubenswrapper[19170]: I0313 01:24:05.843821 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"a8c9ec23-a5af-44a1-859a-86629153ae8c","Type":"ContainerStarted","Data":"ede426974ce9adbfb3642c936de09459b7d89a52b0e5577a2c374ff413fc084f"} Mar 13 01:24:05.844056 master-0 kubenswrapper[19170]: I0313 01:24:05.843948 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"a8c9ec23-a5af-44a1-859a-86629153ae8c","Type":"ContainerStarted","Data":"0cd1519ceb2bd0e29ce61088d5327806ef7ea565e1a1a54bef9b9739444adb2d"} Mar 13 01:24:05.846353 master-0 kubenswrapper[19170]: I0313 01:24:05.846297 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" event={"ID":"dc96262c-7c20-490b-b90e-d1fba7a26a46","Type":"ContainerStarted","Data":"2c4aa5af4a8c63a3c9c2d75fe3e26ec69d431597f1535a695a705d218c7f603e"} Mar 13 01:24:05.846798 master-0 kubenswrapper[19170]: I0313 01:24:05.846711 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:05.855775 master-0 kubenswrapper[19170]: I0313 01:24:05.855714 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:05.880597 master-0 kubenswrapper[19170]: I0313 01:24:05.877479 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/installer-2-master-0" podStartSLOduration=1.877445126 podStartE2EDuration="1.877445126s" podCreationTimestamp="2026-03-13 01:24:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:24:05.871148199 +0000 UTC m=+306.679269189" watchObservedRunningTime="2026-03-13 01:24:05.877445126 +0000 UTC m=+306.685566126" Mar 13 01:24:05.912783 master-0 kubenswrapper[19170]: I0313 01:24:05.912684 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" podStartSLOduration=4.781802518 podStartE2EDuration="6.912667458s" podCreationTimestamp="2026-03-13 01:23:59 +0000 UTC" firstStartedPulling="2026-03-13 01:24:02.370393052 +0000 UTC m=+303.178514012" lastFinishedPulling="2026-03-13 01:24:04.501257952 +0000 UTC m=+305.309378952" observedRunningTime="2026-03-13 01:24:05.909385156 +0000 UTC m=+306.717506136" watchObservedRunningTime="2026-03-13 01:24:05.912667458 +0000 UTC m=+306.720788428" Mar 13 01:24:08.630522 master-0 kubenswrapper[19170]: I0313 01:24:08.630334 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-575758dfc4-r6mb4" podUID="78354ba8-21d5-4774-aa7f-8c72fee1995d" containerName="console" containerID="cri-o://8f8d1117fb3a13d425005f9269dde217d7bd9db550efcafcd9b5d352dea722d9" gracePeriod=15 Mar 13 01:24:08.874669 master-0 kubenswrapper[19170]: I0313 01:24:08.874589 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-575758dfc4-r6mb4_78354ba8-21d5-4774-aa7f-8c72fee1995d/console/0.log" Mar 13 01:24:08.874914 master-0 kubenswrapper[19170]: I0313 01:24:08.874697 19170 generic.go:334] "Generic (PLEG): container finished" podID="78354ba8-21d5-4774-aa7f-8c72fee1995d" containerID="8f8d1117fb3a13d425005f9269dde217d7bd9db550efcafcd9b5d352dea722d9" exitCode=2 Mar 13 01:24:08.874914 master-0 kubenswrapper[19170]: I0313 01:24:08.874743 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-575758dfc4-r6mb4" 
event={"ID":"78354ba8-21d5-4774-aa7f-8c72fee1995d","Type":"ContainerDied","Data":"8f8d1117fb3a13d425005f9269dde217d7bd9db550efcafcd9b5d352dea722d9"} Mar 13 01:24:09.236123 master-0 kubenswrapper[19170]: I0313 01:24:09.236080 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-575758dfc4-r6mb4_78354ba8-21d5-4774-aa7f-8c72fee1995d/console/0.log" Mar 13 01:24:09.236299 master-0 kubenswrapper[19170]: I0313 01:24:09.236158 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-575758dfc4-r6mb4" Mar 13 01:24:09.328764 master-0 kubenswrapper[19170]: I0313 01:24:09.328709 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/78354ba8-21d5-4774-aa7f-8c72fee1995d-oauth-serving-cert\") pod \"78354ba8-21d5-4774-aa7f-8c72fee1995d\" (UID: \"78354ba8-21d5-4774-aa7f-8c72fee1995d\") " Mar 13 01:24:09.328957 master-0 kubenswrapper[19170]: I0313 01:24:09.328867 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/78354ba8-21d5-4774-aa7f-8c72fee1995d-console-oauth-config\") pod \"78354ba8-21d5-4774-aa7f-8c72fee1995d\" (UID: \"78354ba8-21d5-4774-aa7f-8c72fee1995d\") " Mar 13 01:24:09.328957 master-0 kubenswrapper[19170]: I0313 01:24:09.328933 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78354ba8-21d5-4774-aa7f-8c72fee1995d-trusted-ca-bundle\") pod \"78354ba8-21d5-4774-aa7f-8c72fee1995d\" (UID: \"78354ba8-21d5-4774-aa7f-8c72fee1995d\") " Mar 13 01:24:09.329019 master-0 kubenswrapper[19170]: I0313 01:24:09.328974 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmjlf\" (UniqueName: 
\"kubernetes.io/projected/78354ba8-21d5-4774-aa7f-8c72fee1995d-kube-api-access-tmjlf\") pod \"78354ba8-21d5-4774-aa7f-8c72fee1995d\" (UID: \"78354ba8-21d5-4774-aa7f-8c72fee1995d\") " Mar 13 01:24:09.329100 master-0 kubenswrapper[19170]: I0313 01:24:09.329079 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/78354ba8-21d5-4774-aa7f-8c72fee1995d-console-config\") pod \"78354ba8-21d5-4774-aa7f-8c72fee1995d\" (UID: \"78354ba8-21d5-4774-aa7f-8c72fee1995d\") " Mar 13 01:24:09.329139 master-0 kubenswrapper[19170]: I0313 01:24:09.329116 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/78354ba8-21d5-4774-aa7f-8c72fee1995d-console-serving-cert\") pod \"78354ba8-21d5-4774-aa7f-8c72fee1995d\" (UID: \"78354ba8-21d5-4774-aa7f-8c72fee1995d\") " Mar 13 01:24:09.329223 master-0 kubenswrapper[19170]: I0313 01:24:09.329201 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/78354ba8-21d5-4774-aa7f-8c72fee1995d-service-ca\") pod \"78354ba8-21d5-4774-aa7f-8c72fee1995d\" (UID: \"78354ba8-21d5-4774-aa7f-8c72fee1995d\") " Mar 13 01:24:09.345663 master-0 kubenswrapper[19170]: I0313 01:24:09.342568 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78354ba8-21d5-4774-aa7f-8c72fee1995d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "78354ba8-21d5-4774-aa7f-8c72fee1995d" (UID: "78354ba8-21d5-4774-aa7f-8c72fee1995d"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:24:09.345663 master-0 kubenswrapper[19170]: I0313 01:24:09.343185 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78354ba8-21d5-4774-aa7f-8c72fee1995d-service-ca" (OuterVolumeSpecName: "service-ca") pod "78354ba8-21d5-4774-aa7f-8c72fee1995d" (UID: "78354ba8-21d5-4774-aa7f-8c72fee1995d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:24:09.345663 master-0 kubenswrapper[19170]: I0313 01:24:09.343815 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78354ba8-21d5-4774-aa7f-8c72fee1995d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "78354ba8-21d5-4774-aa7f-8c72fee1995d" (UID: "78354ba8-21d5-4774-aa7f-8c72fee1995d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:24:09.345663 master-0 kubenswrapper[19170]: I0313 01:24:09.345154 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78354ba8-21d5-4774-aa7f-8c72fee1995d-console-config" (OuterVolumeSpecName: "console-config") pod "78354ba8-21d5-4774-aa7f-8c72fee1995d" (UID: "78354ba8-21d5-4774-aa7f-8c72fee1995d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:24:09.349656 master-0 kubenswrapper[19170]: I0313 01:24:09.346175 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78354ba8-21d5-4774-aa7f-8c72fee1995d-kube-api-access-tmjlf" (OuterVolumeSpecName: "kube-api-access-tmjlf") pod "78354ba8-21d5-4774-aa7f-8c72fee1995d" (UID: "78354ba8-21d5-4774-aa7f-8c72fee1995d"). InnerVolumeSpecName "kube-api-access-tmjlf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:24:09.363952 master-0 kubenswrapper[19170]: I0313 01:24:09.362738 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78354ba8-21d5-4774-aa7f-8c72fee1995d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "78354ba8-21d5-4774-aa7f-8c72fee1995d" (UID: "78354ba8-21d5-4774-aa7f-8c72fee1995d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:24:09.368732 master-0 kubenswrapper[19170]: I0313 01:24:09.368705 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78354ba8-21d5-4774-aa7f-8c72fee1995d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "78354ba8-21d5-4774-aa7f-8c72fee1995d" (UID: "78354ba8-21d5-4774-aa7f-8c72fee1995d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:24:09.431870 master-0 kubenswrapper[19170]: I0313 01:24:09.431286 19170 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/78354ba8-21d5-4774-aa7f-8c72fee1995d-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 13 01:24:09.431870 master-0 kubenswrapper[19170]: I0313 01:24:09.431324 19170 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/78354ba8-21d5-4774-aa7f-8c72fee1995d-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Mar 13 01:24:09.431870 master-0 kubenswrapper[19170]: I0313 01:24:09.431338 19170 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78354ba8-21d5-4774-aa7f-8c72fee1995d-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:24:09.431870 master-0 kubenswrapper[19170]: I0313 01:24:09.431351 19170 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-tmjlf\" (UniqueName: \"kubernetes.io/projected/78354ba8-21d5-4774-aa7f-8c72fee1995d-kube-api-access-tmjlf\") on node \"master-0\" DevicePath \"\"" Mar 13 01:24:09.431870 master-0 kubenswrapper[19170]: I0313 01:24:09.431363 19170 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/78354ba8-21d5-4774-aa7f-8c72fee1995d-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 13 01:24:09.431870 master-0 kubenswrapper[19170]: I0313 01:24:09.431376 19170 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/78354ba8-21d5-4774-aa7f-8c72fee1995d-console-config\") on node \"master-0\" DevicePath \"\"" Mar 13 01:24:09.431870 master-0 kubenswrapper[19170]: I0313 01:24:09.431387 19170 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/78354ba8-21d5-4774-aa7f-8c72fee1995d-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 13 01:24:09.890294 master-0 kubenswrapper[19170]: I0313 01:24:09.890156 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-575758dfc4-r6mb4_78354ba8-21d5-4774-aa7f-8c72fee1995d/console/0.log" Mar 13 01:24:09.890294 master-0 kubenswrapper[19170]: I0313 01:24:09.890254 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-575758dfc4-r6mb4" event={"ID":"78354ba8-21d5-4774-aa7f-8c72fee1995d","Type":"ContainerDied","Data":"90bd91b509cf7d4f07d52aa508350d4502f27f957a22cb734d93d41581720044"} Mar 13 01:24:09.890858 master-0 kubenswrapper[19170]: I0313 01:24:09.890320 19170 scope.go:117] "RemoveContainer" containerID="8f8d1117fb3a13d425005f9269dde217d7bd9db550efcafcd9b5d352dea722d9" Mar 13 01:24:09.890858 master-0 kubenswrapper[19170]: I0313 01:24:09.890393 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-575758dfc4-r6mb4" Mar 13 01:24:09.921436 master-0 kubenswrapper[19170]: I0313 01:24:09.921372 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-575758dfc4-r6mb4"] Mar 13 01:24:09.930738 master-0 kubenswrapper[19170]: I0313 01:24:09.930539 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-575758dfc4-r6mb4"] Mar 13 01:24:11.438736 master-0 kubenswrapper[19170]: I0313 01:24:11.438663 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78354ba8-21d5-4774-aa7f-8c72fee1995d" path="/var/lib/kubelet/pods/78354ba8-21d5-4774-aa7f-8c72fee1995d/volumes" Mar 13 01:24:11.773591 master-0 kubenswrapper[19170]: I0313 01:24:11.773428 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-db987b46b-l4pxc"] Mar 13 01:24:11.868700 master-0 kubenswrapper[19170]: I0313 01:24:11.866107 19170 patch_prober.go:28] interesting pod/console-864f84b8db-z7bgh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 13 01:24:11.868700 master-0 kubenswrapper[19170]: I0313 01:24:11.866181 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 13 01:24:14.919842 master-0 kubenswrapper[19170]: I0313 01:24:14.919759 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body= Mar 13 01:24:14.920543 master-0 
kubenswrapper[19170]: I0313 01:24:14.919859 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" Mar 13 01:24:17.699817 master-0 kubenswrapper[19170]: I0313 01:24:17.699717 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-5dcbdc8c89-87sck" Mar 13 01:24:17.706378 master-0 kubenswrapper[19170]: I0313 01:24:17.706274 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-5dcbdc8c89-87sck" Mar 13 01:24:18.765699 master-0 kubenswrapper[19170]: I0313 01:24:18.765606 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Mar 13 01:24:18.766541 master-0 kubenswrapper[19170]: E0313 01:24:18.766128 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78354ba8-21d5-4774-aa7f-8c72fee1995d" containerName="console" Mar 13 01:24:18.766541 master-0 kubenswrapper[19170]: I0313 01:24:18.766153 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="78354ba8-21d5-4774-aa7f-8c72fee1995d" containerName="console" Mar 13 01:24:18.766541 master-0 kubenswrapper[19170]: I0313 01:24:18.766423 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="78354ba8-21d5-4774-aa7f-8c72fee1995d" containerName="console" Mar 13 01:24:18.768232 master-0 kubenswrapper[19170]: I0313 01:24:18.768167 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0"
Mar 13 01:24:18.771429 master-0 kubenswrapper[19170]: I0313 01:24:18.771347 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-k6d2j"
Mar 13 01:24:18.772970 master-0 kubenswrapper[19170]: I0313 01:24:18.772916 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 13 01:24:18.792228 master-0 kubenswrapper[19170]: I0313 01:24:18.792174 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"]
Mar 13 01:24:18.808618 master-0 kubenswrapper[19170]: I0313 01:24:18.808548 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cbbab052-3cba-476d-a74f-edb7f738a73d-kube-api-access\") pod \"installer-4-master-0\" (UID: \"cbbab052-3cba-476d-a74f-edb7f738a73d\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 13 01:24:18.808618 master-0 kubenswrapper[19170]: I0313 01:24:18.808625 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cbbab052-3cba-476d-a74f-edb7f738a73d-var-lock\") pod \"installer-4-master-0\" (UID: \"cbbab052-3cba-476d-a74f-edb7f738a73d\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 13 01:24:18.808927 master-0 kubenswrapper[19170]: I0313 01:24:18.808739 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cbbab052-3cba-476d-a74f-edb7f738a73d-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"cbbab052-3cba-476d-a74f-edb7f738a73d\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 13 01:24:18.911804 master-0 kubenswrapper[19170]: I0313 01:24:18.910709 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cbbab052-3cba-476d-a74f-edb7f738a73d-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"cbbab052-3cba-476d-a74f-edb7f738a73d\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 13 01:24:18.911804 master-0 kubenswrapper[19170]: I0313 01:24:18.910824 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cbbab052-3cba-476d-a74f-edb7f738a73d-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"cbbab052-3cba-476d-a74f-edb7f738a73d\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 13 01:24:18.911804 master-0 kubenswrapper[19170]: I0313 01:24:18.910845 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cbbab052-3cba-476d-a74f-edb7f738a73d-kube-api-access\") pod \"installer-4-master-0\" (UID: \"cbbab052-3cba-476d-a74f-edb7f738a73d\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 13 01:24:18.911804 master-0 kubenswrapper[19170]: I0313 01:24:18.910926 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cbbab052-3cba-476d-a74f-edb7f738a73d-var-lock\") pod \"installer-4-master-0\" (UID: \"cbbab052-3cba-476d-a74f-edb7f738a73d\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 13 01:24:18.911804 master-0 kubenswrapper[19170]: I0313 01:24:18.911083 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cbbab052-3cba-476d-a74f-edb7f738a73d-var-lock\") pod \"installer-4-master-0\" (UID: \"cbbab052-3cba-476d-a74f-edb7f738a73d\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 13 01:24:18.928740 master-0 kubenswrapper[19170]: I0313 01:24:18.926164 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cbbab052-3cba-476d-a74f-edb7f738a73d-kube-api-access\") pod \"installer-4-master-0\" (UID: \"cbbab052-3cba-476d-a74f-edb7f738a73d\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 13 01:24:19.106714 master-0 kubenswrapper[19170]: I0313 01:24:19.106527 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0"
Mar 13 01:24:19.626766 master-0 kubenswrapper[19170]: I0313 01:24:19.626731 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"]
Mar 13 01:24:19.633025 master-0 kubenswrapper[19170]: W0313 01:24:19.633002 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podcbbab052_3cba_476d_a74f_edb7f738a73d.slice/crio-6f924cf47e3cfd7a4a0fd0f3e73b8ac47cb23dd4bd3db05f5031dd1a24065866 WatchSource:0}: Error finding container 6f924cf47e3cfd7a4a0fd0f3e73b8ac47cb23dd4bd3db05f5031dd1a24065866: Status 404 returned error can't find the container with id 6f924cf47e3cfd7a4a0fd0f3e73b8ac47cb23dd4bd3db05f5031dd1a24065866
Mar 13 01:24:19.984966 master-0 kubenswrapper[19170]: I0313 01:24:19.984897 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"cbbab052-3cba-476d-a74f-edb7f738a73d","Type":"ContainerStarted","Data":"6f924cf47e3cfd7a4a0fd0f3e73b8ac47cb23dd4bd3db05f5031dd1a24065866"}
Mar 13 01:24:20.994401 master-0 kubenswrapper[19170]: I0313 01:24:20.994313 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"cbbab052-3cba-476d-a74f-edb7f738a73d","Type":"ContainerStarted","Data":"baa48640462f0be2da5e515e74f8d651ac9002e75301b39eb1cde0cd3495eec2"}
Mar 13 01:24:21.015487 master-0 kubenswrapper[19170]: I0313 01:24:21.015397 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-4-master-0" podStartSLOduration=3.015370003 podStartE2EDuration="3.015370003s" podCreationTimestamp="2026-03-13 01:24:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:24:21.015104425 +0000 UTC m=+321.823225405" watchObservedRunningTime="2026-03-13 01:24:21.015370003 +0000 UTC m=+321.823490993"
Mar 13 01:24:21.829630 master-0 kubenswrapper[19170]: I0313 01:24:21.829531 19170 patch_prober.go:28] interesting pod/console-864f84b8db-z7bgh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body=
Mar 13 01:24:21.829941 master-0 kubenswrapper[19170]: I0313 01:24:21.829707 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused"
Mar 13 01:24:24.918980 master-0 kubenswrapper[19170]: I0313 01:24:24.918902 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body=
Mar 13 01:24:24.920080 master-0 kubenswrapper[19170]: I0313 01:24:24.919008 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused"
Mar 13 01:24:31.104981 master-0 kubenswrapper[19170]: I0313 01:24:31.104906 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-5-master-0"]
Mar 13 01:24:31.107321 master-0 kubenswrapper[19170]: I0313 01:24:31.107279 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-5-master-0"
Mar 13 01:24:31.109553 master-0 kubenswrapper[19170]: I0313 01:24:31.109508 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Mar 13 01:24:31.109843 master-0 kubenswrapper[19170]: I0313 01:24:31.109815 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-ctg9b"
Mar 13 01:24:31.117391 master-0 kubenswrapper[19170]: I0313 01:24:31.117331 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-5-master-0"]
Mar 13 01:24:31.232610 master-0 kubenswrapper[19170]: I0313 01:24:31.232534 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/349df320-5872-4352-8495-d7a5f9c4fc51-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"349df320-5872-4352-8495-d7a5f9c4fc51\") " pod="openshift-kube-controller-manager/installer-5-master-0"
Mar 13 01:24:31.232610 master-0 kubenswrapper[19170]: I0313 01:24:31.232620 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/349df320-5872-4352-8495-d7a5f9c4fc51-var-lock\") pod \"installer-5-master-0\" (UID: \"349df320-5872-4352-8495-d7a5f9c4fc51\") " pod="openshift-kube-controller-manager/installer-5-master-0"
Mar 13 01:24:31.232902 master-0 kubenswrapper[19170]: I0313 01:24:31.232838 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/349df320-5872-4352-8495-d7a5f9c4fc51-kube-api-access\") pod \"installer-5-master-0\" (UID: \"349df320-5872-4352-8495-d7a5f9c4fc51\") " pod="openshift-kube-controller-manager/installer-5-master-0"
Mar 13 01:24:31.335178 master-0 kubenswrapper[19170]: I0313 01:24:31.335106 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/349df320-5872-4352-8495-d7a5f9c4fc51-kube-api-access\") pod \"installer-5-master-0\" (UID: \"349df320-5872-4352-8495-d7a5f9c4fc51\") " pod="openshift-kube-controller-manager/installer-5-master-0"
Mar 13 01:24:31.335428 master-0 kubenswrapper[19170]: I0313 01:24:31.335308 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/349df320-5872-4352-8495-d7a5f9c4fc51-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"349df320-5872-4352-8495-d7a5f9c4fc51\") " pod="openshift-kube-controller-manager/installer-5-master-0"
Mar 13 01:24:31.335428 master-0 kubenswrapper[19170]: I0313 01:24:31.335352 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/349df320-5872-4352-8495-d7a5f9c4fc51-var-lock\") pod \"installer-5-master-0\" (UID: \"349df320-5872-4352-8495-d7a5f9c4fc51\") " pod="openshift-kube-controller-manager/installer-5-master-0"
Mar 13 01:24:31.335594 master-0 kubenswrapper[19170]: I0313 01:24:31.335531 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/349df320-5872-4352-8495-d7a5f9c4fc51-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"349df320-5872-4352-8495-d7a5f9c4fc51\") " pod="openshift-kube-controller-manager/installer-5-master-0"
Mar 13 01:24:31.335594 master-0 kubenswrapper[19170]: I0313 01:24:31.335579 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/349df320-5872-4352-8495-d7a5f9c4fc51-var-lock\") pod \"installer-5-master-0\" (UID: \"349df320-5872-4352-8495-d7a5f9c4fc51\") " pod="openshift-kube-controller-manager/installer-5-master-0"
Mar 13 01:24:31.360879 master-0 kubenswrapper[19170]: I0313 01:24:31.360754 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/349df320-5872-4352-8495-d7a5f9c4fc51-kube-api-access\") pod \"installer-5-master-0\" (UID: \"349df320-5872-4352-8495-d7a5f9c4fc51\") " pod="openshift-kube-controller-manager/installer-5-master-0"
Mar 13 01:24:31.449841 master-0 kubenswrapper[19170]: I0313 01:24:31.449714 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-5-master-0"
Mar 13 01:24:31.829258 master-0 kubenswrapper[19170]: I0313 01:24:31.829174 19170 patch_prober.go:28] interesting pod/console-864f84b8db-z7bgh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body=
Mar 13 01:24:31.829989 master-0 kubenswrapper[19170]: I0313 01:24:31.829349 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused"
Mar 13 01:24:31.940677 master-0 kubenswrapper[19170]: I0313 01:24:31.940578 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-5-master-0"]
Mar 13 01:24:31.949726 master-0 kubenswrapper[19170]: W0313 01:24:31.949596 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod349df320_5872_4352_8495_d7a5f9c4fc51.slice/crio-fd055a301b36642375e50dcc5dd9cfaee924a7826f5b2dfec9aab7ebc5036f7a WatchSource:0}: Error finding container fd055a301b36642375e50dcc5dd9cfaee924a7826f5b2dfec9aab7ebc5036f7a: Status 404 returned error can't find the container with id fd055a301b36642375e50dcc5dd9cfaee924a7826f5b2dfec9aab7ebc5036f7a
Mar 13 01:24:32.096820 master-0 kubenswrapper[19170]: I0313 01:24:32.094999 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-5-master-0" event={"ID":"349df320-5872-4352-8495-d7a5f9c4fc51","Type":"ContainerStarted","Data":"fd055a301b36642375e50dcc5dd9cfaee924a7826f5b2dfec9aab7ebc5036f7a"}
Mar 13 01:24:33.105770 master-0 kubenswrapper[19170]: I0313 01:24:33.105685 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-5-master-0" event={"ID":"349df320-5872-4352-8495-d7a5f9c4fc51","Type":"ContainerStarted","Data":"06301ee757d90cae6d7abf2771be6bb7c16b59735e2a44669e532a824d862fb5"}
Mar 13 01:24:33.130744 master-0 kubenswrapper[19170]: I0313 01:24:33.130658 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-5-master-0" podStartSLOduration=2.130623669 podStartE2EDuration="2.130623669s" podCreationTimestamp="2026-03-13 01:24:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:24:33.129893168 +0000 UTC m=+333.938014168" watchObservedRunningTime="2026-03-13 01:24:33.130623669 +0000 UTC m=+333.938744649"
Mar 13 01:24:34.920046 master-0 kubenswrapper[19170]: I0313 01:24:34.919946 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body=
Mar 13 01:24:34.920046 master-0 kubenswrapper[19170]: I0313 01:24:34.920044 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused"
Mar 13 01:24:35.155821 master-0 kubenswrapper[19170]: I0313 01:24:35.155734 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"]
Mar 13 01:24:35.156198 master-0 kubenswrapper[19170]: I0313 01:24:35.156140 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/installer-4-master-0" podUID="cbbab052-3cba-476d-a74f-edb7f738a73d" containerName="installer" containerID="cri-o://baa48640462f0be2da5e515e74f8d651ac9002e75301b39eb1cde0cd3495eec2" gracePeriod=30
Mar 13 01:24:36.544331 master-0 kubenswrapper[19170]: E0313 01:24:36.544254 19170 file.go:109] "Unable to process watch event" err="can't process config file \"/etc/kubernetes/manifests/etcd-pod.yaml\": /etc/kubernetes/manifests/etcd-pod.yaml: couldn't parse as pod(Object 'Kind' is missing in 'null'), please check config file"
Mar 13 01:24:36.545441 master-0 kubenswrapper[19170]: I0313 01:24:36.544431 19170 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-etcd/etcd-master-0"]
Mar 13 01:24:36.545441 master-0 kubenswrapper[19170]: I0313 01:24:36.544906 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcdctl" containerID="cri-o://094f4a86fe59b920ac10166d9e8ac01032d74617f321831a2a2f931f8b6e5719" gracePeriod=30
Mar 13 01:24:36.545441 master-0 kubenswrapper[19170]: I0313 01:24:36.544945 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-rev" containerID="cri-o://945467a15df4a8d4dc173e105fd4e0e21428d428ca18f15c847be9161b3b730b" gracePeriod=30
Mar 13 01:24:36.545441 master-0 kubenswrapper[19170]: I0313 01:24:36.544960 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-metrics" containerID="cri-o://74e7eddcdd11f244988198476d048b74bb9bbeb97aa14b3f52b59d34eb6ac111" gracePeriod=30
Mar 13 01:24:36.545441 master-0 kubenswrapper[19170]: I0313 01:24:36.544950 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-readyz" containerID="cri-o://5686b7e065921e4a505694fd8d39d3cf4e50087c1b65d84bd736ed81d6e1a8bd" gracePeriod=30
Mar 13 01:24:36.545441 master-0 kubenswrapper[19170]: I0313 01:24:36.545026 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd" containerID="cri-o://102131ace48d9559e7bd10df51cf1337602d058abc022e98bf79c9ff37b7ad8e" gracePeriod=30
Mar 13 01:24:36.547920 master-0 kubenswrapper[19170]: I0313 01:24:36.547877 19170 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-master-0"]
Mar 13 01:24:36.548251 master-0 kubenswrapper[19170]: E0313 01:24:36.548216 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-resources-copy"
Mar 13 01:24:36.548251 master-0 kubenswrapper[19170]: I0313 01:24:36.548239 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-resources-copy"
Mar 13 01:24:36.548518 master-0 kubenswrapper[19170]: E0313 01:24:36.548264 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-readyz"
Mar 13 01:24:36.548518 master-0 kubenswrapper[19170]: I0313 01:24:36.548274 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-readyz"
Mar 13 01:24:36.548518 master-0 kubenswrapper[19170]: E0313 01:24:36.548286 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcdctl"
Mar 13 01:24:36.548518 master-0 kubenswrapper[19170]: I0313 01:24:36.548294 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcdctl"
Mar 13 01:24:36.548518 master-0 kubenswrapper[19170]: E0313 01:24:36.548319 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd"
Mar 13 01:24:36.548518 master-0 kubenswrapper[19170]: I0313 01:24:36.548327 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd"
Mar 13 01:24:36.548518 master-0 kubenswrapper[19170]: E0313 01:24:36.548343 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-ensure-env-vars"
Mar 13 01:24:36.548518 master-0 kubenswrapper[19170]: I0313 01:24:36.548351 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-ensure-env-vars"
Mar 13 01:24:36.548518 master-0 kubenswrapper[19170]: E0313 01:24:36.548363 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="setup"
Mar 13 01:24:36.548518 master-0 kubenswrapper[19170]: I0313 01:24:36.548371 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="setup"
Mar 13 01:24:36.548518 master-0 kubenswrapper[19170]: E0313 01:24:36.548391 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-rev"
Mar 13 01:24:36.548518 master-0 kubenswrapper[19170]: I0313 01:24:36.548399 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-rev"
Mar 13 01:24:36.548518 master-0 kubenswrapper[19170]: E0313 01:24:36.548412 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-metrics"
Mar 13 01:24:36.548518 master-0 kubenswrapper[19170]: I0313 01:24:36.548420 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-metrics"
Mar 13 01:24:36.550395 master-0 kubenswrapper[19170]: I0313 01:24:36.548595 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-resources-copy"
Mar 13 01:24:36.550395 master-0 kubenswrapper[19170]: I0313 01:24:36.548662 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd"
Mar 13 01:24:36.550395 master-0 kubenswrapper[19170]: I0313 01:24:36.548709 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcdctl"
Mar 13 01:24:36.550395 master-0 kubenswrapper[19170]: I0313 01:24:36.548742 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-ensure-env-vars"
Mar 13 01:24:36.550395 master-0 kubenswrapper[19170]: I0313 01:24:36.548770 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-readyz"
Mar 13 01:24:36.550395 master-0 kubenswrapper[19170]: I0313 01:24:36.548791 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-metrics"
Mar 13 01:24:36.550395 master-0 kubenswrapper[19170]: I0313 01:24:36.548833 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="setup"
Mar 13 01:24:36.550395 master-0 kubenswrapper[19170]: I0313 01:24:36.548850 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-rev"
Mar 13 01:24:36.637871 master-0 kubenswrapper[19170]: I0313 01:24:36.637833 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-static-pod-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0"
Mar 13 01:24:36.637971 master-0 kubenswrapper[19170]: I0313 01:24:36.637874 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-resource-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0"
Mar 13 01:24:36.637971 master-0 kubenswrapper[19170]: I0313 01:24:36.637920 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-cert-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0"
Mar 13 01:24:36.637971 master-0 kubenswrapper[19170]: I0313 01:24:36.637946 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-log-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0"
Mar 13 01:24:36.638065 master-0 kubenswrapper[19170]: I0313 01:24:36.637983 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-usr-local-bin\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0"
Mar 13 01:24:36.638065 master-0 kubenswrapper[19170]: I0313 01:24:36.638002 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-data-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0"
Mar 13 01:24:36.739099 master-0 kubenswrapper[19170]: I0313 01:24:36.739043 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-cert-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0"
Mar 13 01:24:36.739320 master-0 kubenswrapper[19170]: I0313 01:24:36.739118 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-log-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0"
Mar 13 01:24:36.739320 master-0 kubenswrapper[19170]: I0313 01:24:36.739176 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-usr-local-bin\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0"
Mar 13 01:24:36.739320 master-0 kubenswrapper[19170]: I0313 01:24:36.739180 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-cert-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0"
Mar 13 01:24:36.739320 master-0 kubenswrapper[19170]: I0313 01:24:36.739209 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-data-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0"
Mar 13 01:24:36.739320 master-0 kubenswrapper[19170]: I0313 01:24:36.739241 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-usr-local-bin\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0"
Mar 13 01:24:36.739535 master-0 kubenswrapper[19170]: I0313 01:24:36.739377 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-log-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0"
Mar 13 01:24:36.739587 master-0 kubenswrapper[19170]: I0313 01:24:36.739528 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-data-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0"
Mar 13 01:24:36.739587 master-0 kubenswrapper[19170]: I0313 01:24:36.739519 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-static-pod-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0"
Mar 13 01:24:36.739734 master-0 kubenswrapper[19170]: I0313 01:24:36.739589 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-static-pod-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0"
Mar 13 01:24:36.739734 master-0 kubenswrapper[19170]: I0313 01:24:36.739620 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-resource-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0"
Mar 13 01:24:36.739734 master-0 kubenswrapper[19170]: I0313 01:24:36.739708 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-resource-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0"
Mar 13 01:24:36.800761 master-0 kubenswrapper[19170]: I0313 01:24:36.800261 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" podUID="dc96262c-7c20-490b-b90e-d1fba7a26a46" containerName="oauth-openshift" containerID="cri-o://2c4aa5af4a8c63a3c9c2d75fe3e26ec69d431597f1535a695a705d218c7f603e" gracePeriod=15
Mar 13 01:24:37.145344 master-0 kubenswrapper[19170]: I0313 01:24:37.145248 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-rev/0.log"
Mar 13 01:24:37.147034 master-0 kubenswrapper[19170]: I0313 01:24:37.146987 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-metrics/0.log"
Mar 13 01:24:37.149859 master-0 kubenswrapper[19170]: I0313 01:24:37.149771 19170 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="945467a15df4a8d4dc173e105fd4e0e21428d428ca18f15c847be9161b3b730b" exitCode=2
Mar 13 01:24:37.149859 master-0 kubenswrapper[19170]: I0313 01:24:37.149835 19170 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="5686b7e065921e4a505694fd8d39d3cf4e50087c1b65d84bd736ed81d6e1a8bd" exitCode=0
Mar 13 01:24:37.149859 master-0 kubenswrapper[19170]: I0313 01:24:37.149856 19170 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="74e7eddcdd11f244988198476d048b74bb9bbeb97aa14b3f52b59d34eb6ac111" exitCode=2
Mar 13 01:24:37.152953 master-0 kubenswrapper[19170]: I0313 01:24:37.152888 19170 generic.go:334] "Generic (PLEG): container finished" podID="dc96262c-7c20-490b-b90e-d1fba7a26a46" containerID="2c4aa5af4a8c63a3c9c2d75fe3e26ec69d431597f1535a695a705d218c7f603e" exitCode=0
Mar 13 01:24:37.153104 master-0 kubenswrapper[19170]: I0313 01:24:37.152951 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" event={"ID":"dc96262c-7c20-490b-b90e-d1fba7a26a46","Type":"ContainerDied","Data":"2c4aa5af4a8c63a3c9c2d75fe3e26ec69d431597f1535a695a705d218c7f603e"}
Mar 13 01:24:37.409301 master-0 kubenswrapper[19170]: I0313 01:24:37.409235 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc"
Mar 13 01:24:37.452959 master-0 kubenswrapper[19170]: I0313 01:24:37.452919 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-session\") pod \"dc96262c-7c20-490b-b90e-d1fba7a26a46\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") "
Mar 13 01:24:37.453140 master-0 kubenswrapper[19170]: I0313 01:24:37.452971 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-router-certs\") pod \"dc96262c-7c20-490b-b90e-d1fba7a26a46\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") "
Mar 13 01:24:37.453140 master-0 kubenswrapper[19170]: I0313 01:24:37.452995 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dc96262c-7c20-490b-b90e-d1fba7a26a46-audit-dir\") pod \"dc96262c-7c20-490b-b90e-d1fba7a26a46\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") "
Mar 13 01:24:37.453140 master-0 kubenswrapper[19170]: I0313 01:24:37.453017 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-cliconfig\") pod \"dc96262c-7c20-490b-b90e-d1fba7a26a46\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") "
Mar 13 01:24:37.453140 master-0 kubenswrapper[19170]: I0313 01:24:37.453058 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-trusted-ca-bundle\") pod \"dc96262c-7c20-490b-b90e-d1fba7a26a46\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") "
Mar 13 01:24:37.453140 master-0 kubenswrapper[19170]: I0313 01:24:37.453106 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-serving-cert\") pod \"dc96262c-7c20-490b-b90e-d1fba7a26a46\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") "
Mar 13 01:24:37.453278 master-0 kubenswrapper[19170]: I0313 01:24:37.453147 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-service-ca\") pod \"dc96262c-7c20-490b-b90e-d1fba7a26a46\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") "
Mar 13 01:24:37.453278 master-0 kubenswrapper[19170]: I0313 01:24:37.453218 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-ocp-branding-template\") pod \"dc96262c-7c20-490b-b90e-d1fba7a26a46\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") "
Mar 13 01:24:37.453278 master-0 kubenswrapper[19170]: I0313 01:24:37.453237 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-user-template-error\") pod \"dc96262c-7c20-490b-b90e-d1fba7a26a46\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") "
Mar 13 01:24:37.453278 master-0 kubenswrapper[19170]: I0313 01:24:37.453256 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-user-template-provider-selection\") pod \"dc96262c-7c20-490b-b90e-d1fba7a26a46\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") "
Mar 13 01:24:37.453278 master-0 kubenswrapper[19170]: I0313 01:24:37.453275 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dc96262c-7c20-490b-b90e-d1fba7a26a46-audit-policies\") pod \"dc96262c-7c20-490b-b90e-d1fba7a26a46\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") "
Mar 13 01:24:37.453424 master-0 kubenswrapper[19170]: I0313 01:24:37.453294 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55t7g\" (UniqueName: \"kubernetes.io/projected/dc96262c-7c20-490b-b90e-d1fba7a26a46-kube-api-access-55t7g\") pod \"dc96262c-7c20-490b-b90e-d1fba7a26a46\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") "
Mar 13 01:24:37.453424 master-0 kubenswrapper[19170]: I0313 01:24:37.453359 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-user-template-login\") pod \"dc96262c-7c20-490b-b90e-d1fba7a26a46\" (UID: \"dc96262c-7c20-490b-b90e-d1fba7a26a46\") "
Mar 13 01:24:37.454258 master-0 kubenswrapper[19170]: I0313 01:24:37.454199 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc96262c-7c20-490b-b90e-d1fba7a26a46-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "dc96262c-7c20-490b-b90e-d1fba7a26a46" (UID: "dc96262c-7c20-490b-b90e-d1fba7a26a46"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 01:24:37.455453 master-0 kubenswrapper[19170]: I0313 01:24:37.455366 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "dc96262c-7c20-490b-b90e-d1fba7a26a46" (UID: "dc96262c-7c20-490b-b90e-d1fba7a26a46"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 01:24:37.455498 master-0 kubenswrapper[19170]: I0313 01:24:37.455442 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "dc96262c-7c20-490b-b90e-d1fba7a26a46" (UID: "dc96262c-7c20-490b-b90e-d1fba7a26a46"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 01:24:37.455902 master-0 kubenswrapper[19170]: I0313 01:24:37.455855 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc96262c-7c20-490b-b90e-d1fba7a26a46-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "dc96262c-7c20-490b-b90e-d1fba7a26a46" (UID: "dc96262c-7c20-490b-b90e-d1fba7a26a46"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 01:24:37.456076 master-0 kubenswrapper[19170]: I0313 01:24:37.456019 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "dc96262c-7c20-490b-b90e-d1fba7a26a46" (UID: "dc96262c-7c20-490b-b90e-d1fba7a26a46"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:24:37.456736 master-0 kubenswrapper[19170]: I0313 01:24:37.456714 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "dc96262c-7c20-490b-b90e-d1fba7a26a46" (UID: "dc96262c-7c20-490b-b90e-d1fba7a26a46"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:24:37.458400 master-0 kubenswrapper[19170]: I0313 01:24:37.458353 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "dc96262c-7c20-490b-b90e-d1fba7a26a46" (UID: "dc96262c-7c20-490b-b90e-d1fba7a26a46"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:24:37.458489 master-0 kubenswrapper[19170]: I0313 01:24:37.458402 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "dc96262c-7c20-490b-b90e-d1fba7a26a46" (UID: "dc96262c-7c20-490b-b90e-d1fba7a26a46"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:24:37.459090 master-0 kubenswrapper[19170]: I0313 01:24:37.459038 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "dc96262c-7c20-490b-b90e-d1fba7a26a46" (UID: "dc96262c-7c20-490b-b90e-d1fba7a26a46"). 
InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:24:37.459286 master-0 kubenswrapper[19170]: I0313 01:24:37.459249 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "dc96262c-7c20-490b-b90e-d1fba7a26a46" (UID: "dc96262c-7c20-490b-b90e-d1fba7a26a46"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:24:37.459603 master-0 kubenswrapper[19170]: I0313 01:24:37.459554 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "dc96262c-7c20-490b-b90e-d1fba7a26a46" (UID: "dc96262c-7c20-490b-b90e-d1fba7a26a46"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:24:37.459653 master-0 kubenswrapper[19170]: I0313 01:24:37.459552 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc96262c-7c20-490b-b90e-d1fba7a26a46-kube-api-access-55t7g" (OuterVolumeSpecName: "kube-api-access-55t7g") pod "dc96262c-7c20-490b-b90e-d1fba7a26a46" (UID: "dc96262c-7c20-490b-b90e-d1fba7a26a46"). InnerVolumeSpecName "kube-api-access-55t7g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:24:37.460505 master-0 kubenswrapper[19170]: I0313 01:24:37.460435 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "dc96262c-7c20-490b-b90e-d1fba7a26a46" (UID: "dc96262c-7c20-490b-b90e-d1fba7a26a46"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:24:37.555030 master-0 kubenswrapper[19170]: I0313 01:24:37.554950 19170 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-cliconfig\") on node \"master-0\" DevicePath \"\"" Mar 13 01:24:37.555030 master-0 kubenswrapper[19170]: I0313 01:24:37.555002 19170 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:24:37.555030 master-0 kubenswrapper[19170]: I0313 01:24:37.555017 19170 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 13 01:24:37.555030 master-0 kubenswrapper[19170]: I0313 01:24:37.555030 19170 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 13 01:24:37.555030 master-0 kubenswrapper[19170]: I0313 01:24:37.555042 19170 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-ocp-branding-template\") on node \"master-0\" DevicePath \"\"" Mar 13 01:24:37.556169 master-0 kubenswrapper[19170]: I0313 01:24:37.555058 19170 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-user-template-error\") on node \"master-0\" DevicePath \"\"" Mar 13 01:24:37.556169 master-0 kubenswrapper[19170]: I0313 01:24:37.555072 19170 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dc96262c-7c20-490b-b90e-d1fba7a26a46-audit-policies\") on node \"master-0\" DevicePath \"\"" Mar 13 01:24:37.556169 master-0 kubenswrapper[19170]: I0313 01:24:37.555088 19170 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-user-template-provider-selection\") on node \"master-0\" DevicePath \"\"" Mar 13 01:24:37.556169 master-0 kubenswrapper[19170]: I0313 01:24:37.555103 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55t7g\" (UniqueName: \"kubernetes.io/projected/dc96262c-7c20-490b-b90e-d1fba7a26a46-kube-api-access-55t7g\") on node \"master-0\" DevicePath \"\"" Mar 13 01:24:37.556169 master-0 kubenswrapper[19170]: I0313 01:24:37.555115 19170 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-user-template-login\") on node \"master-0\" DevicePath \"\"" Mar 13 01:24:37.556169 master-0 kubenswrapper[19170]: I0313 01:24:37.555127 19170 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-session\") on node \"master-0\" DevicePath \"\"" Mar 13 01:24:37.556169 master-0 kubenswrapper[19170]: I0313 01:24:37.555140 19170 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/dc96262c-7c20-490b-b90e-d1fba7a26a46-v4-0-config-system-router-certs\") on node \"master-0\" DevicePath \"\"" Mar 13 01:24:37.556169 master-0 kubenswrapper[19170]: I0313 01:24:37.555152 19170 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dc96262c-7c20-490b-b90e-d1fba7a26a46-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:24:38.164894 master-0 kubenswrapper[19170]: I0313 01:24:38.164826 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" event={"ID":"dc96262c-7c20-490b-b90e-d1fba7a26a46","Type":"ContainerDied","Data":"06b33bfe8f677cbc17516783d3a2ea455d65925f812456d7b8d09c94a01f4a42"} Mar 13 01:24:38.164894 master-0 kubenswrapper[19170]: I0313 01:24:38.164892 19170 scope.go:117] "RemoveContainer" containerID="2c4aa5af4a8c63a3c9c2d75fe3e26ec69d431597f1535a695a705d218c7f603e" Mar 13 01:24:38.165256 master-0 kubenswrapper[19170]: I0313 01:24:38.164914 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" Mar 13 01:24:39.397717 master-0 kubenswrapper[19170]: I0313 01:24:39.382895 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:24:39.455237 master-0 kubenswrapper[19170]: I0313 01:24:39.455141 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:24:40.233498 master-0 kubenswrapper[19170]: I0313 01:24:40.233404 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 13 01:24:41.829004 master-0 kubenswrapper[19170]: I0313 01:24:41.828914 19170 patch_prober.go:28] interesting pod/console-864f84b8db-z7bgh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 13 01:24:41.829889 master-0 kubenswrapper[19170]: I0313 01:24:41.829006 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 13 01:24:43.477417 master-0 kubenswrapper[19170]: I0313 01:24:43.477356 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 01:24:44.919839 master-0 kubenswrapper[19170]: I0313 01:24:44.919755 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body= Mar 13 01:24:44.920835 master-0 
kubenswrapper[19170]: I0313 01:24:44.919836 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" Mar 13 01:24:47.854614 master-0 kubenswrapper[19170]: E0313 01:24:47.854530 19170 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded" Mar 13 01:24:51.294493 master-0 kubenswrapper[19170]: I0313 01:24:51.294320 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_43028f0e2cfc9ffb600b4d08ad84e12d/kube-controller-manager/1.log" Mar 13 01:24:51.297073 master-0 kubenswrapper[19170]: I0313 01:24:51.297009 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_43028f0e2cfc9ffb600b4d08ad84e12d/kube-controller-manager/0.log" Mar 13 01:24:51.297265 master-0 kubenswrapper[19170]: I0313 01:24:51.297102 19170 generic.go:334] "Generic (PLEG): container finished" podID="43028f0e2cfc9ffb600b4d08ad84e12d" containerID="46c26b717b497322454dcf7c105249ed590c5f0f850f5c9e1de33f73e6f55637" exitCode=1 Mar 13 01:24:51.297265 master-0 kubenswrapper[19170]: I0313 01:24:51.297211 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"43028f0e2cfc9ffb600b4d08ad84e12d","Type":"ContainerDied","Data":"46c26b717b497322454dcf7c105249ed590c5f0f850f5c9e1de33f73e6f55637"} Mar 13 01:24:51.297265 master-0 kubenswrapper[19170]: I0313 01:24:51.297264 19170 scope.go:117] "RemoveContainer" containerID="c54d358f4c1b4f029bbd5121747727fca60685060770ce8f314c1f52c6412116" Mar 13 
01:24:51.298494 master-0 kubenswrapper[19170]: I0313 01:24:51.298218 19170 scope.go:117] "RemoveContainer" containerID="46c26b717b497322454dcf7c105249ed590c5f0f850f5c9e1de33f73e6f55637" Mar 13 01:24:51.301901 master-0 kubenswrapper[19170]: I0313 01:24:51.301085 19170 generic.go:334] "Generic (PLEG): container finished" podID="a8c9ec23-a5af-44a1-859a-86629153ae8c" containerID="ede426974ce9adbfb3642c936de09459b7d89a52b0e5577a2c374ff413fc084f" exitCode=0 Mar 13 01:24:51.301901 master-0 kubenswrapper[19170]: I0313 01:24:51.301174 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"a8c9ec23-a5af-44a1-859a-86629153ae8c","Type":"ContainerDied","Data":"ede426974ce9adbfb3642c936de09459b7d89a52b0e5577a2c374ff413fc084f"} Mar 13 01:24:51.305031 master-0 kubenswrapper[19170]: I0313 01:24:51.304620 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-4-master-0_cbbab052-3cba-476d-a74f-edb7f738a73d/installer/0.log" Mar 13 01:24:51.305031 master-0 kubenswrapper[19170]: I0313 01:24:51.304735 19170 generic.go:334] "Generic (PLEG): container finished" podID="cbbab052-3cba-476d-a74f-edb7f738a73d" containerID="baa48640462f0be2da5e515e74f8d651ac9002e75301b39eb1cde0cd3495eec2" exitCode=1 Mar 13 01:24:51.305031 master-0 kubenswrapper[19170]: I0313 01:24:51.304774 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"cbbab052-3cba-476d-a74f-edb7f738a73d","Type":"ContainerDied","Data":"baa48640462f0be2da5e515e74f8d651ac9002e75301b39eb1cde0cd3495eec2"} Mar 13 01:24:51.425597 master-0 kubenswrapper[19170]: I0313 01:24:51.425548 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-4-master-0_cbbab052-3cba-476d-a74f-edb7f738a73d/installer/0.log" Mar 13 01:24:51.425835 master-0 kubenswrapper[19170]: I0313 01:24:51.425679 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Mar 13 01:24:51.503690 master-0 kubenswrapper[19170]: I0313 01:24:51.503612 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cbbab052-3cba-476d-a74f-edb7f738a73d-var-lock\") pod \"cbbab052-3cba-476d-a74f-edb7f738a73d\" (UID: \"cbbab052-3cba-476d-a74f-edb7f738a73d\") " Mar 13 01:24:51.503888 master-0 kubenswrapper[19170]: I0313 01:24:51.503742 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cbbab052-3cba-476d-a74f-edb7f738a73d-kubelet-dir\") pod \"cbbab052-3cba-476d-a74f-edb7f738a73d\" (UID: \"cbbab052-3cba-476d-a74f-edb7f738a73d\") " Mar 13 01:24:51.503888 master-0 kubenswrapper[19170]: I0313 01:24:51.503745 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cbbab052-3cba-476d-a74f-edb7f738a73d-var-lock" (OuterVolumeSpecName: "var-lock") pod "cbbab052-3cba-476d-a74f-edb7f738a73d" (UID: "cbbab052-3cba-476d-a74f-edb7f738a73d"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:24:51.503888 master-0 kubenswrapper[19170]: I0313 01:24:51.503830 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cbbab052-3cba-476d-a74f-edb7f738a73d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "cbbab052-3cba-476d-a74f-edb7f738a73d" (UID: "cbbab052-3cba-476d-a74f-edb7f738a73d"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:24:51.504044 master-0 kubenswrapper[19170]: I0313 01:24:51.504011 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cbbab052-3cba-476d-a74f-edb7f738a73d-kube-api-access\") pod \"cbbab052-3cba-476d-a74f-edb7f738a73d\" (UID: \"cbbab052-3cba-476d-a74f-edb7f738a73d\") " Mar 13 01:24:51.504720 master-0 kubenswrapper[19170]: I0313 01:24:51.504688 19170 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cbbab052-3cba-476d-a74f-edb7f738a73d-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 01:24:51.504762 master-0 kubenswrapper[19170]: I0313 01:24:51.504725 19170 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cbbab052-3cba-476d-a74f-edb7f738a73d-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:24:51.509807 master-0 kubenswrapper[19170]: I0313 01:24:51.509747 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbbab052-3cba-476d-a74f-edb7f738a73d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "cbbab052-3cba-476d-a74f-edb7f738a73d" (UID: "cbbab052-3cba-476d-a74f-edb7f738a73d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:24:51.607813 master-0 kubenswrapper[19170]: I0313 01:24:51.607741 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cbbab052-3cba-476d-a74f-edb7f738a73d-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 13 01:24:51.829845 master-0 kubenswrapper[19170]: I0313 01:24:51.829499 19170 patch_prober.go:28] interesting pod/console-864f84b8db-z7bgh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 13 01:24:51.829845 master-0 kubenswrapper[19170]: I0313 01:24:51.829594 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 13 01:24:52.317351 master-0 kubenswrapper[19170]: I0313 01:24:52.317281 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-4-master-0_cbbab052-3cba-476d-a74f-edb7f738a73d/installer/0.log" Mar 13 01:24:52.318245 master-0 kubenswrapper[19170]: I0313 01:24:52.317422 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"cbbab052-3cba-476d-a74f-edb7f738a73d","Type":"ContainerDied","Data":"6f924cf47e3cfd7a4a0fd0f3e73b8ac47cb23dd4bd3db05f5031dd1a24065866"} Mar 13 01:24:52.318245 master-0 kubenswrapper[19170]: I0313 01:24:52.317476 19170 scope.go:117] "RemoveContainer" containerID="baa48640462f0be2da5e515e74f8d651ac9002e75301b39eb1cde0cd3495eec2" Mar 13 01:24:52.318245 master-0 kubenswrapper[19170]: I0313 01:24:52.317473 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Mar 13 01:24:52.322190 master-0 kubenswrapper[19170]: I0313 01:24:52.322110 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_43028f0e2cfc9ffb600b4d08ad84e12d/kube-controller-manager/1.log" Mar 13 01:24:52.324495 master-0 kubenswrapper[19170]: I0313 01:24:52.324424 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"43028f0e2cfc9ffb600b4d08ad84e12d","Type":"ContainerStarted","Data":"7ce47737a8d2d9676279ddb2a4db2557a0fbf6f6c9fa25284625c89fdd87d08d"} Mar 13 01:24:52.803905 master-0 kubenswrapper[19170]: I0313 01:24:52.803869 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0" Mar 13 01:24:52.949534 master-0 kubenswrapper[19170]: I0313 01:24:52.949481 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a8c9ec23-a5af-44a1-859a-86629153ae8c-kubelet-dir\") pod \"a8c9ec23-a5af-44a1-859a-86629153ae8c\" (UID: \"a8c9ec23-a5af-44a1-859a-86629153ae8c\") " Mar 13 01:24:52.949534 master-0 kubenswrapper[19170]: I0313 01:24:52.949585 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a8c9ec23-a5af-44a1-859a-86629153ae8c-var-lock\") pod \"a8c9ec23-a5af-44a1-859a-86629153ae8c\" (UID: \"a8c9ec23-a5af-44a1-859a-86629153ae8c\") " Mar 13 01:24:52.949961 master-0 kubenswrapper[19170]: I0313 01:24:52.949760 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a8c9ec23-a5af-44a1-859a-86629153ae8c-kube-api-access\") pod \"a8c9ec23-a5af-44a1-859a-86629153ae8c\" (UID: \"a8c9ec23-a5af-44a1-859a-86629153ae8c\") " Mar 13 
01:24:52.950172 master-0 kubenswrapper[19170]: I0313 01:24:52.950124 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a8c9ec23-a5af-44a1-859a-86629153ae8c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a8c9ec23-a5af-44a1-859a-86629153ae8c" (UID: "a8c9ec23-a5af-44a1-859a-86629153ae8c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:24:52.950336 master-0 kubenswrapper[19170]: I0313 01:24:52.950159 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a8c9ec23-a5af-44a1-859a-86629153ae8c-var-lock" (OuterVolumeSpecName: "var-lock") pod "a8c9ec23-a5af-44a1-859a-86629153ae8c" (UID: "a8c9ec23-a5af-44a1-859a-86629153ae8c"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:24:52.954500 master-0 kubenswrapper[19170]: I0313 01:24:52.954448 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8c9ec23-a5af-44a1-859a-86629153ae8c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a8c9ec23-a5af-44a1-859a-86629153ae8c" (UID: "a8c9ec23-a5af-44a1-859a-86629153ae8c"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:24:53.052787 master-0 kubenswrapper[19170]: I0313 01:24:53.052715 19170 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a8c9ec23-a5af-44a1-859a-86629153ae8c-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:24:53.052787 master-0 kubenswrapper[19170]: I0313 01:24:53.052780 19170 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a8c9ec23-a5af-44a1-859a-86629153ae8c-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 01:24:53.052985 master-0 kubenswrapper[19170]: I0313 01:24:53.052802 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a8c9ec23-a5af-44a1-859a-86629153ae8c-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 13 01:24:53.338812 master-0 kubenswrapper[19170]: I0313 01:24:53.338620 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-2-master-0" Mar 13 01:24:53.339562 master-0 kubenswrapper[19170]: I0313 01:24:53.338612 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"a8c9ec23-a5af-44a1-859a-86629153ae8c","Type":"ContainerDied","Data":"0cd1519ceb2bd0e29ce61088d5327806ef7ea565e1a1a54bef9b9739444adb2d"} Mar 13 01:24:53.339562 master-0 kubenswrapper[19170]: I0313 01:24:53.338863 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0cd1519ceb2bd0e29ce61088d5327806ef7ea565e1a1a54bef9b9739444adb2d" Mar 13 01:24:54.919286 master-0 kubenswrapper[19170]: I0313 01:24:54.919169 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body= Mar 13 01:24:54.919286 master-0 kubenswrapper[19170]: I0313 01:24:54.919263 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" Mar 13 01:24:56.483559 master-0 kubenswrapper[19170]: I0313 01:24:56.483318 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:24:56.486344 master-0 kubenswrapper[19170]: I0313 01:24:56.483587 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:24:56.490305 master-0 kubenswrapper[19170]: I0313 01:24:56.490220 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:24:57.856777 master-0 kubenswrapper[19170]: E0313 01:24:57.856543 19170 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 01:25:01.829786 master-0 kubenswrapper[19170]: I0313 01:25:01.829692 19170 patch_prober.go:28] interesting pod/console-864f84b8db-z7bgh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 13 01:25:01.830727 master-0 kubenswrapper[19170]: I0313 01:25:01.829781 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 13 01:25:04.918742 master-0 kubenswrapper[19170]: I0313 01:25:04.918686 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body= Mar 13 01:25:04.920010 master-0 kubenswrapper[19170]: I0313 01:25:04.919962 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" Mar 13 01:25:06.489734 master-0 kubenswrapper[19170]: I0313 01:25:06.489667 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:25:06.716760 master-0 kubenswrapper[19170]: I0313 01:25:06.716688 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-rev/0.log" Mar 13 01:25:06.717866 master-0 kubenswrapper[19170]: I0313 01:25:06.717837 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-metrics/0.log" Mar 13 01:25:06.719156 master-0 kubenswrapper[19170]: I0313 01:25:06.719104 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd/0.log" Mar 13 01:25:06.719743 master-0 kubenswrapper[19170]: I0313 01:25:06.719693 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcdctl/0.log" Mar 13 01:25:06.721515 master-0 kubenswrapper[19170]: I0313 01:25:06.721485 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 13 01:25:06.801316 master-0 kubenswrapper[19170]: I0313 01:25:06.801137 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"8e52bef89f4b50e4590a1719bcc5d7e5\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " Mar 13 01:25:06.801316 master-0 kubenswrapper[19170]: I0313 01:25:06.801242 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"8e52bef89f4b50e4590a1719bcc5d7e5\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " Mar 13 01:25:06.801316 master-0 kubenswrapper[19170]: I0313 01:25:06.801317 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"8e52bef89f4b50e4590a1719bcc5d7e5\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " Mar 13 01:25:06.801671 master-0 kubenswrapper[19170]: I0313 01:25:06.801313 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir" (OuterVolumeSpecName: "static-pod-dir") pod "8e52bef89f4b50e4590a1719bcc5d7e5" (UID: "8e52bef89f4b50e4590a1719bcc5d7e5"). InnerVolumeSpecName "static-pod-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:25:06.801671 master-0 kubenswrapper[19170]: I0313 01:25:06.801447 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"8e52bef89f4b50e4590a1719bcc5d7e5\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " Mar 13 01:25:06.801671 master-0 kubenswrapper[19170]: I0313 01:25:06.801461 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "8e52bef89f4b50e4590a1719bcc5d7e5" (UID: "8e52bef89f4b50e4590a1719bcc5d7e5"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:25:06.801671 master-0 kubenswrapper[19170]: I0313 01:25:06.801537 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin" (OuterVolumeSpecName: "usr-local-bin") pod "8e52bef89f4b50e4590a1719bcc5d7e5" (UID: "8e52bef89f4b50e4590a1719bcc5d7e5"). InnerVolumeSpecName "usr-local-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:25:06.801671 master-0 kubenswrapper[19170]: I0313 01:25:06.801508 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"8e52bef89f4b50e4590a1719bcc5d7e5\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " Mar 13 01:25:06.801671 master-0 kubenswrapper[19170]: I0313 01:25:06.801583 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "8e52bef89f4b50e4590a1719bcc5d7e5" (UID: "8e52bef89f4b50e4590a1719bcc5d7e5"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:25:06.801671 master-0 kubenswrapper[19170]: I0313 01:25:06.801600 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir" (OuterVolumeSpecName: "log-dir") pod "8e52bef89f4b50e4590a1719bcc5d7e5" (UID: "8e52bef89f4b50e4590a1719bcc5d7e5"). InnerVolumeSpecName "log-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:25:06.802109 master-0 kubenswrapper[19170]: I0313 01:25:06.801823 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"8e52bef89f4b50e4590a1719bcc5d7e5\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " Mar 13 01:25:06.802707 master-0 kubenswrapper[19170]: I0313 01:25:06.802399 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir" (OuterVolumeSpecName: "data-dir") pod "8e52bef89f4b50e4590a1719bcc5d7e5" (UID: "8e52bef89f4b50e4590a1719bcc5d7e5"). InnerVolumeSpecName "data-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:25:06.803150 master-0 kubenswrapper[19170]: I0313 01:25:06.803100 19170 reconciler_common.go:293] "Volume detached for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:25:06.803150 master-0 kubenswrapper[19170]: I0313 01:25:06.803142 19170 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:25:06.803329 master-0 kubenswrapper[19170]: I0313 01:25:06.803161 19170 reconciler_common.go:293] "Volume detached for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") on node \"master-0\" DevicePath \"\"" Mar 13 01:25:06.803329 master-0 kubenswrapper[19170]: I0313 01:25:06.803180 19170 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:25:06.803329 master-0 kubenswrapper[19170]: I0313 01:25:06.803197 19170 reconciler_common.go:293] "Volume detached for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:25:06.803329 master-0 kubenswrapper[19170]: I0313 01:25:06.803213 19170 reconciler_common.go:293] "Volume detached for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:25:07.434452 master-0 kubenswrapper[19170]: I0313 01:25:07.434353 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" path="/var/lib/kubelet/pods/8e52bef89f4b50e4590a1719bcc5d7e5/volumes" Mar 13 01:25:07.485717 master-0 
kubenswrapper[19170]: I0313 01:25:07.485648 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-rev/0.log" Mar 13 01:25:07.486762 master-0 kubenswrapper[19170]: I0313 01:25:07.486704 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-metrics/0.log" Mar 13 01:25:07.487617 master-0 kubenswrapper[19170]: I0313 01:25:07.487570 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd/0.log" Mar 13 01:25:07.488010 master-0 kubenswrapper[19170]: I0313 01:25:07.487971 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcdctl/0.log" Mar 13 01:25:07.489478 master-0 kubenswrapper[19170]: I0313 01:25:07.489426 19170 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="102131ace48d9559e7bd10df51cf1337602d058abc022e98bf79c9ff37b7ad8e" exitCode=137 Mar 13 01:25:07.489478 master-0 kubenswrapper[19170]: I0313 01:25:07.489453 19170 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="094f4a86fe59b920ac10166d9e8ac01032d74617f321831a2a2f931f8b6e5719" exitCode=137 Mar 13 01:25:07.489735 master-0 kubenswrapper[19170]: I0313 01:25:07.489496 19170 scope.go:117] "RemoveContainer" containerID="945467a15df4a8d4dc173e105fd4e0e21428d428ca18f15c847be9161b3b730b" Mar 13 01:25:07.489735 master-0 kubenswrapper[19170]: I0313 01:25:07.489670 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 13 01:25:07.510302 master-0 kubenswrapper[19170]: I0313 01:25:07.510249 19170 scope.go:117] "RemoveContainer" containerID="5686b7e065921e4a505694fd8d39d3cf4e50087c1b65d84bd736ed81d6e1a8bd" Mar 13 01:25:07.528009 master-0 kubenswrapper[19170]: E0313 01:25:07.527403 19170 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e52bef89f4b50e4590a1719bcc5d7e5.slice/crio-e850e3b221564f84a8bf622e78e98b3c458e66820fad70083408ddbec82a1cd3\": RecentStats: unable to find data in memory cache]" Mar 13 01:25:07.531423 master-0 kubenswrapper[19170]: I0313 01:25:07.531381 19170 scope.go:117] "RemoveContainer" containerID="74e7eddcdd11f244988198476d048b74bb9bbeb97aa14b3f52b59d34eb6ac111" Mar 13 01:25:07.545372 master-0 kubenswrapper[19170]: I0313 01:25:07.545332 19170 scope.go:117] "RemoveContainer" containerID="102131ace48d9559e7bd10df51cf1337602d058abc022e98bf79c9ff37b7ad8e" Mar 13 01:25:07.559547 master-0 kubenswrapper[19170]: I0313 01:25:07.558974 19170 scope.go:117] "RemoveContainer" containerID="094f4a86fe59b920ac10166d9e8ac01032d74617f321831a2a2f931f8b6e5719" Mar 13 01:25:07.574028 master-0 kubenswrapper[19170]: I0313 01:25:07.574003 19170 scope.go:117] "RemoveContainer" containerID="b828b9875ee5a4c86417590e983cd65cc09701fdf10c008a5d18f75844d13abc" Mar 13 01:25:07.594202 master-0 kubenswrapper[19170]: I0313 01:25:07.594162 19170 scope.go:117] "RemoveContainer" containerID="77c3585c80eaa5d7f0873a20062967927c8ada28f17899fd783cafcf3b80dbee" Mar 13 01:25:07.613930 master-0 kubenswrapper[19170]: I0313 01:25:07.613873 19170 scope.go:117] "RemoveContainer" containerID="3acba3e036ccbe49f5868539abac9a81fe12130cf6faa99400b25eb27b4d707a" Mar 13 01:25:07.636698 master-0 kubenswrapper[19170]: I0313 01:25:07.636609 19170 scope.go:117] "RemoveContainer" 
containerID="945467a15df4a8d4dc173e105fd4e0e21428d428ca18f15c847be9161b3b730b" Mar 13 01:25:07.637313 master-0 kubenswrapper[19170]: E0313 01:25:07.637256 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"945467a15df4a8d4dc173e105fd4e0e21428d428ca18f15c847be9161b3b730b\": container with ID starting with 945467a15df4a8d4dc173e105fd4e0e21428d428ca18f15c847be9161b3b730b not found: ID does not exist" containerID="945467a15df4a8d4dc173e105fd4e0e21428d428ca18f15c847be9161b3b730b" Mar 13 01:25:07.637313 master-0 kubenswrapper[19170]: I0313 01:25:07.637295 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"945467a15df4a8d4dc173e105fd4e0e21428d428ca18f15c847be9161b3b730b"} err="failed to get container status \"945467a15df4a8d4dc173e105fd4e0e21428d428ca18f15c847be9161b3b730b\": rpc error: code = NotFound desc = could not find container \"945467a15df4a8d4dc173e105fd4e0e21428d428ca18f15c847be9161b3b730b\": container with ID starting with 945467a15df4a8d4dc173e105fd4e0e21428d428ca18f15c847be9161b3b730b not found: ID does not exist" Mar 13 01:25:07.637583 master-0 kubenswrapper[19170]: I0313 01:25:07.637321 19170 scope.go:117] "RemoveContainer" containerID="5686b7e065921e4a505694fd8d39d3cf4e50087c1b65d84bd736ed81d6e1a8bd" Mar 13 01:25:07.637884 master-0 kubenswrapper[19170]: E0313 01:25:07.637765 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5686b7e065921e4a505694fd8d39d3cf4e50087c1b65d84bd736ed81d6e1a8bd\": container with ID starting with 5686b7e065921e4a505694fd8d39d3cf4e50087c1b65d84bd736ed81d6e1a8bd not found: ID does not exist" containerID="5686b7e065921e4a505694fd8d39d3cf4e50087c1b65d84bd736ed81d6e1a8bd" Mar 13 01:25:07.638033 master-0 kubenswrapper[19170]: I0313 01:25:07.637864 19170 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5686b7e065921e4a505694fd8d39d3cf4e50087c1b65d84bd736ed81d6e1a8bd"} err="failed to get container status \"5686b7e065921e4a505694fd8d39d3cf4e50087c1b65d84bd736ed81d6e1a8bd\": rpc error: code = NotFound desc = could not find container \"5686b7e065921e4a505694fd8d39d3cf4e50087c1b65d84bd736ed81d6e1a8bd\": container with ID starting with 5686b7e065921e4a505694fd8d39d3cf4e50087c1b65d84bd736ed81d6e1a8bd not found: ID does not exist" Mar 13 01:25:07.638033 master-0 kubenswrapper[19170]: I0313 01:25:07.637907 19170 scope.go:117] "RemoveContainer" containerID="74e7eddcdd11f244988198476d048b74bb9bbeb97aa14b3f52b59d34eb6ac111" Mar 13 01:25:07.638363 master-0 kubenswrapper[19170]: E0313 01:25:07.638296 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74e7eddcdd11f244988198476d048b74bb9bbeb97aa14b3f52b59d34eb6ac111\": container with ID starting with 74e7eddcdd11f244988198476d048b74bb9bbeb97aa14b3f52b59d34eb6ac111 not found: ID does not exist" containerID="74e7eddcdd11f244988198476d048b74bb9bbeb97aa14b3f52b59d34eb6ac111" Mar 13 01:25:07.638363 master-0 kubenswrapper[19170]: I0313 01:25:07.638325 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74e7eddcdd11f244988198476d048b74bb9bbeb97aa14b3f52b59d34eb6ac111"} err="failed to get container status \"74e7eddcdd11f244988198476d048b74bb9bbeb97aa14b3f52b59d34eb6ac111\": rpc error: code = NotFound desc = could not find container \"74e7eddcdd11f244988198476d048b74bb9bbeb97aa14b3f52b59d34eb6ac111\": container with ID starting with 74e7eddcdd11f244988198476d048b74bb9bbeb97aa14b3f52b59d34eb6ac111 not found: ID does not exist" Mar 13 01:25:07.638363 master-0 kubenswrapper[19170]: I0313 01:25:07.638344 19170 scope.go:117] "RemoveContainer" containerID="102131ace48d9559e7bd10df51cf1337602d058abc022e98bf79c9ff37b7ad8e" Mar 13 01:25:07.638843 master-0 kubenswrapper[19170]: E0313 
01:25:07.638799 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"102131ace48d9559e7bd10df51cf1337602d058abc022e98bf79c9ff37b7ad8e\": container with ID starting with 102131ace48d9559e7bd10df51cf1337602d058abc022e98bf79c9ff37b7ad8e not found: ID does not exist" containerID="102131ace48d9559e7bd10df51cf1337602d058abc022e98bf79c9ff37b7ad8e" Mar 13 01:25:07.638843 master-0 kubenswrapper[19170]: I0313 01:25:07.638822 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"102131ace48d9559e7bd10df51cf1337602d058abc022e98bf79c9ff37b7ad8e"} err="failed to get container status \"102131ace48d9559e7bd10df51cf1337602d058abc022e98bf79c9ff37b7ad8e\": rpc error: code = NotFound desc = could not find container \"102131ace48d9559e7bd10df51cf1337602d058abc022e98bf79c9ff37b7ad8e\": container with ID starting with 102131ace48d9559e7bd10df51cf1337602d058abc022e98bf79c9ff37b7ad8e not found: ID does not exist" Mar 13 01:25:07.638843 master-0 kubenswrapper[19170]: I0313 01:25:07.638837 19170 scope.go:117] "RemoveContainer" containerID="094f4a86fe59b920ac10166d9e8ac01032d74617f321831a2a2f931f8b6e5719" Mar 13 01:25:07.639432 master-0 kubenswrapper[19170]: E0313 01:25:07.639228 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"094f4a86fe59b920ac10166d9e8ac01032d74617f321831a2a2f931f8b6e5719\": container with ID starting with 094f4a86fe59b920ac10166d9e8ac01032d74617f321831a2a2f931f8b6e5719 not found: ID does not exist" containerID="094f4a86fe59b920ac10166d9e8ac01032d74617f321831a2a2f931f8b6e5719" Mar 13 01:25:07.639432 master-0 kubenswrapper[19170]: I0313 01:25:07.639281 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"094f4a86fe59b920ac10166d9e8ac01032d74617f321831a2a2f931f8b6e5719"} err="failed to get container status 
\"094f4a86fe59b920ac10166d9e8ac01032d74617f321831a2a2f931f8b6e5719\": rpc error: code = NotFound desc = could not find container \"094f4a86fe59b920ac10166d9e8ac01032d74617f321831a2a2f931f8b6e5719\": container with ID starting with 094f4a86fe59b920ac10166d9e8ac01032d74617f321831a2a2f931f8b6e5719 not found: ID does not exist" Mar 13 01:25:07.639432 master-0 kubenswrapper[19170]: I0313 01:25:07.639315 19170 scope.go:117] "RemoveContainer" containerID="b828b9875ee5a4c86417590e983cd65cc09701fdf10c008a5d18f75844d13abc" Mar 13 01:25:07.639786 master-0 kubenswrapper[19170]: E0313 01:25:07.639745 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b828b9875ee5a4c86417590e983cd65cc09701fdf10c008a5d18f75844d13abc\": container with ID starting with b828b9875ee5a4c86417590e983cd65cc09701fdf10c008a5d18f75844d13abc not found: ID does not exist" containerID="b828b9875ee5a4c86417590e983cd65cc09701fdf10c008a5d18f75844d13abc" Mar 13 01:25:07.639786 master-0 kubenswrapper[19170]: I0313 01:25:07.639768 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b828b9875ee5a4c86417590e983cd65cc09701fdf10c008a5d18f75844d13abc"} err="failed to get container status \"b828b9875ee5a4c86417590e983cd65cc09701fdf10c008a5d18f75844d13abc\": rpc error: code = NotFound desc = could not find container \"b828b9875ee5a4c86417590e983cd65cc09701fdf10c008a5d18f75844d13abc\": container with ID starting with b828b9875ee5a4c86417590e983cd65cc09701fdf10c008a5d18f75844d13abc not found: ID does not exist" Mar 13 01:25:07.639786 master-0 kubenswrapper[19170]: I0313 01:25:07.639784 19170 scope.go:117] "RemoveContainer" containerID="77c3585c80eaa5d7f0873a20062967927c8ada28f17899fd783cafcf3b80dbee" Mar 13 01:25:07.640321 master-0 kubenswrapper[19170]: E0313 01:25:07.640132 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"77c3585c80eaa5d7f0873a20062967927c8ada28f17899fd783cafcf3b80dbee\": container with ID starting with 77c3585c80eaa5d7f0873a20062967927c8ada28f17899fd783cafcf3b80dbee not found: ID does not exist" containerID="77c3585c80eaa5d7f0873a20062967927c8ada28f17899fd783cafcf3b80dbee" Mar 13 01:25:07.640321 master-0 kubenswrapper[19170]: I0313 01:25:07.640178 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77c3585c80eaa5d7f0873a20062967927c8ada28f17899fd783cafcf3b80dbee"} err="failed to get container status \"77c3585c80eaa5d7f0873a20062967927c8ada28f17899fd783cafcf3b80dbee\": rpc error: code = NotFound desc = could not find container \"77c3585c80eaa5d7f0873a20062967927c8ada28f17899fd783cafcf3b80dbee\": container with ID starting with 77c3585c80eaa5d7f0873a20062967927c8ada28f17899fd783cafcf3b80dbee not found: ID does not exist" Mar 13 01:25:07.640321 master-0 kubenswrapper[19170]: I0313 01:25:07.640206 19170 scope.go:117] "RemoveContainer" containerID="3acba3e036ccbe49f5868539abac9a81fe12130cf6faa99400b25eb27b4d707a" Mar 13 01:25:07.640661 master-0 kubenswrapper[19170]: E0313 01:25:07.640564 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3acba3e036ccbe49f5868539abac9a81fe12130cf6faa99400b25eb27b4d707a\": container with ID starting with 3acba3e036ccbe49f5868539abac9a81fe12130cf6faa99400b25eb27b4d707a not found: ID does not exist" containerID="3acba3e036ccbe49f5868539abac9a81fe12130cf6faa99400b25eb27b4d707a" Mar 13 01:25:07.640661 master-0 kubenswrapper[19170]: I0313 01:25:07.640602 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3acba3e036ccbe49f5868539abac9a81fe12130cf6faa99400b25eb27b4d707a"} err="failed to get container status \"3acba3e036ccbe49f5868539abac9a81fe12130cf6faa99400b25eb27b4d707a\": rpc error: code = NotFound desc = could not find container 
\"3acba3e036ccbe49f5868539abac9a81fe12130cf6faa99400b25eb27b4d707a\": container with ID starting with 3acba3e036ccbe49f5868539abac9a81fe12130cf6faa99400b25eb27b4d707a not found: ID does not exist" Mar 13 01:25:07.640661 master-0 kubenswrapper[19170]: I0313 01:25:07.640618 19170 scope.go:117] "RemoveContainer" containerID="945467a15df4a8d4dc173e105fd4e0e21428d428ca18f15c847be9161b3b730b" Mar 13 01:25:07.640940 master-0 kubenswrapper[19170]: I0313 01:25:07.640878 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"945467a15df4a8d4dc173e105fd4e0e21428d428ca18f15c847be9161b3b730b"} err="failed to get container status \"945467a15df4a8d4dc173e105fd4e0e21428d428ca18f15c847be9161b3b730b\": rpc error: code = NotFound desc = could not find container \"945467a15df4a8d4dc173e105fd4e0e21428d428ca18f15c847be9161b3b730b\": container with ID starting with 945467a15df4a8d4dc173e105fd4e0e21428d428ca18f15c847be9161b3b730b not found: ID does not exist" Mar 13 01:25:07.640940 master-0 kubenswrapper[19170]: I0313 01:25:07.640898 19170 scope.go:117] "RemoveContainer" containerID="5686b7e065921e4a505694fd8d39d3cf4e50087c1b65d84bd736ed81d6e1a8bd" Mar 13 01:25:07.641271 master-0 kubenswrapper[19170]: I0313 01:25:07.641214 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5686b7e065921e4a505694fd8d39d3cf4e50087c1b65d84bd736ed81d6e1a8bd"} err="failed to get container status \"5686b7e065921e4a505694fd8d39d3cf4e50087c1b65d84bd736ed81d6e1a8bd\": rpc error: code = NotFound desc = could not find container \"5686b7e065921e4a505694fd8d39d3cf4e50087c1b65d84bd736ed81d6e1a8bd\": container with ID starting with 5686b7e065921e4a505694fd8d39d3cf4e50087c1b65d84bd736ed81d6e1a8bd not found: ID does not exist" Mar 13 01:25:07.641271 master-0 kubenswrapper[19170]: I0313 01:25:07.641234 19170 scope.go:117] "RemoveContainer" containerID="74e7eddcdd11f244988198476d048b74bb9bbeb97aa14b3f52b59d34eb6ac111" Mar 13 
01:25:07.642122 master-0 kubenswrapper[19170]: I0313 01:25:07.641753 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74e7eddcdd11f244988198476d048b74bb9bbeb97aa14b3f52b59d34eb6ac111"} err="failed to get container status \"74e7eddcdd11f244988198476d048b74bb9bbeb97aa14b3f52b59d34eb6ac111\": rpc error: code = NotFound desc = could not find container \"74e7eddcdd11f244988198476d048b74bb9bbeb97aa14b3f52b59d34eb6ac111\": container with ID starting with 74e7eddcdd11f244988198476d048b74bb9bbeb97aa14b3f52b59d34eb6ac111 not found: ID does not exist" Mar 13 01:25:07.642122 master-0 kubenswrapper[19170]: I0313 01:25:07.641813 19170 scope.go:117] "RemoveContainer" containerID="102131ace48d9559e7bd10df51cf1337602d058abc022e98bf79c9ff37b7ad8e" Mar 13 01:25:07.642594 master-0 kubenswrapper[19170]: I0313 01:25:07.642466 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"102131ace48d9559e7bd10df51cf1337602d058abc022e98bf79c9ff37b7ad8e"} err="failed to get container status \"102131ace48d9559e7bd10df51cf1337602d058abc022e98bf79c9ff37b7ad8e\": rpc error: code = NotFound desc = could not find container \"102131ace48d9559e7bd10df51cf1337602d058abc022e98bf79c9ff37b7ad8e\": container with ID starting with 102131ace48d9559e7bd10df51cf1337602d058abc022e98bf79c9ff37b7ad8e not found: ID does not exist" Mar 13 01:25:07.642594 master-0 kubenswrapper[19170]: I0313 01:25:07.642500 19170 scope.go:117] "RemoveContainer" containerID="094f4a86fe59b920ac10166d9e8ac01032d74617f321831a2a2f931f8b6e5719" Mar 13 01:25:07.642935 master-0 kubenswrapper[19170]: I0313 01:25:07.642901 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"094f4a86fe59b920ac10166d9e8ac01032d74617f321831a2a2f931f8b6e5719"} err="failed to get container status \"094f4a86fe59b920ac10166d9e8ac01032d74617f321831a2a2f931f8b6e5719\": rpc error: code = NotFound desc = could not find 
container \"094f4a86fe59b920ac10166d9e8ac01032d74617f321831a2a2f931f8b6e5719\": container with ID starting with 094f4a86fe59b920ac10166d9e8ac01032d74617f321831a2a2f931f8b6e5719 not found: ID does not exist" Mar 13 01:25:07.642935 master-0 kubenswrapper[19170]: I0313 01:25:07.642928 19170 scope.go:117] "RemoveContainer" containerID="b828b9875ee5a4c86417590e983cd65cc09701fdf10c008a5d18f75844d13abc" Mar 13 01:25:07.643440 master-0 kubenswrapper[19170]: I0313 01:25:07.643272 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b828b9875ee5a4c86417590e983cd65cc09701fdf10c008a5d18f75844d13abc"} err="failed to get container status \"b828b9875ee5a4c86417590e983cd65cc09701fdf10c008a5d18f75844d13abc\": rpc error: code = NotFound desc = could not find container \"b828b9875ee5a4c86417590e983cd65cc09701fdf10c008a5d18f75844d13abc\": container with ID starting with b828b9875ee5a4c86417590e983cd65cc09701fdf10c008a5d18f75844d13abc not found: ID does not exist" Mar 13 01:25:07.643440 master-0 kubenswrapper[19170]: I0313 01:25:07.643313 19170 scope.go:117] "RemoveContainer" containerID="77c3585c80eaa5d7f0873a20062967927c8ada28f17899fd783cafcf3b80dbee" Mar 13 01:25:07.643797 master-0 kubenswrapper[19170]: I0313 01:25:07.643686 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77c3585c80eaa5d7f0873a20062967927c8ada28f17899fd783cafcf3b80dbee"} err="failed to get container status \"77c3585c80eaa5d7f0873a20062967927c8ada28f17899fd783cafcf3b80dbee\": rpc error: code = NotFound desc = could not find container \"77c3585c80eaa5d7f0873a20062967927c8ada28f17899fd783cafcf3b80dbee\": container with ID starting with 77c3585c80eaa5d7f0873a20062967927c8ada28f17899fd783cafcf3b80dbee not found: ID does not exist" Mar 13 01:25:07.643797 master-0 kubenswrapper[19170]: I0313 01:25:07.643715 19170 scope.go:117] "RemoveContainer" containerID="3acba3e036ccbe49f5868539abac9a81fe12130cf6faa99400b25eb27b4d707a" 
Mar 13 01:25:07.644104 master-0 kubenswrapper[19170]: I0313 01:25:07.644023 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3acba3e036ccbe49f5868539abac9a81fe12130cf6faa99400b25eb27b4d707a"} err="failed to get container status \"3acba3e036ccbe49f5868539abac9a81fe12130cf6faa99400b25eb27b4d707a\": rpc error: code = NotFound desc = could not find container \"3acba3e036ccbe49f5868539abac9a81fe12130cf6faa99400b25eb27b4d707a\": container with ID starting with 3acba3e036ccbe49f5868539abac9a81fe12130cf6faa99400b25eb27b4d707a not found: ID does not exist" Mar 13 01:25:07.858337 master-0 kubenswrapper[19170]: E0313 01:25:07.857738 19170 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 01:25:10.593272 master-0 kubenswrapper[19170]: E0313 01:25:10.593111 19170 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0.189c4224348b3f0f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:8e52bef89f4b50e4590a1719bcc5d7e5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Killing,Message:Stopping container etcd-rev,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:24:36.544929551 +0000 UTC m=+337.353050521,LastTimestamp:2026-03-13 01:24:36.544929551 +0000 UTC m=+337.353050521,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:25:11.829879 master-0 kubenswrapper[19170]: I0313 01:25:11.829799 19170 patch_prober.go:28] 
interesting pod/console-864f84b8db-z7bgh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 13 01:25:11.830703 master-0 kubenswrapper[19170]: I0313 01:25:11.829887 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 13 01:25:14.919034 master-0 kubenswrapper[19170]: I0313 01:25:14.918950 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body= Mar 13 01:25:14.919937 master-0 kubenswrapper[19170]: I0313 01:25:14.919038 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" Mar 13 01:25:16.419294 master-0 kubenswrapper[19170]: I0313 01:25:16.419191 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 13 01:25:16.446424 master-0 kubenswrapper[19170]: I0313 01:25:16.446336 19170 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="b7991d30-9ff3-425a-86fa-693c9a353b29" Mar 13 01:25:16.446424 master-0 kubenswrapper[19170]: I0313 01:25:16.446409 19170 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="b7991d30-9ff3-425a-86fa-693c9a353b29" Mar 13 01:25:17.589667 master-0 kubenswrapper[19170]: I0313 01:25:17.589537 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-5-master-0_349df320-5872-4352-8495-d7a5f9c4fc51/installer/0.log" Mar 13 01:25:17.589667 master-0 kubenswrapper[19170]: I0313 01:25:17.589615 19170 generic.go:334] "Generic (PLEG): container finished" podID="349df320-5872-4352-8495-d7a5f9c4fc51" containerID="06301ee757d90cae6d7abf2771be6bb7c16b59735e2a44669e532a824d862fb5" exitCode=1 Mar 13 01:25:17.590601 master-0 kubenswrapper[19170]: I0313 01:25:17.589722 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-5-master-0" event={"ID":"349df320-5872-4352-8495-d7a5f9c4fc51","Type":"ContainerDied","Data":"06301ee757d90cae6d7abf2771be6bb7c16b59735e2a44669e532a824d862fb5"} Mar 13 01:25:17.859704 master-0 kubenswrapper[19170]: E0313 01:25:17.859495 19170 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 01:25:19.029733 master-0 kubenswrapper[19170]: I0313 01:25:19.028430 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-5-master-0_349df320-5872-4352-8495-d7a5f9c4fc51/installer/0.log" Mar 13 01:25:19.029733 master-0 kubenswrapper[19170]: I0313 01:25:19.028532 
19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-5-master-0" Mar 13 01:25:19.134434 master-0 kubenswrapper[19170]: I0313 01:25:19.134377 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/349df320-5872-4352-8495-d7a5f9c4fc51-var-lock\") pod \"349df320-5872-4352-8495-d7a5f9c4fc51\" (UID: \"349df320-5872-4352-8495-d7a5f9c4fc51\") " Mar 13 01:25:19.134664 master-0 kubenswrapper[19170]: I0313 01:25:19.134471 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/349df320-5872-4352-8495-d7a5f9c4fc51-kubelet-dir\") pod \"349df320-5872-4352-8495-d7a5f9c4fc51\" (UID: \"349df320-5872-4352-8495-d7a5f9c4fc51\") " Mar 13 01:25:19.134664 master-0 kubenswrapper[19170]: I0313 01:25:19.134512 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/349df320-5872-4352-8495-d7a5f9c4fc51-var-lock" (OuterVolumeSpecName: "var-lock") pod "349df320-5872-4352-8495-d7a5f9c4fc51" (UID: "349df320-5872-4352-8495-d7a5f9c4fc51"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:25:19.134664 master-0 kubenswrapper[19170]: I0313 01:25:19.134559 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/349df320-5872-4352-8495-d7a5f9c4fc51-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "349df320-5872-4352-8495-d7a5f9c4fc51" (UID: "349df320-5872-4352-8495-d7a5f9c4fc51"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:25:19.134802 master-0 kubenswrapper[19170]: I0313 01:25:19.134711 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/349df320-5872-4352-8495-d7a5f9c4fc51-kube-api-access\") pod \"349df320-5872-4352-8495-d7a5f9c4fc51\" (UID: \"349df320-5872-4352-8495-d7a5f9c4fc51\") " Mar 13 01:25:19.135093 master-0 kubenswrapper[19170]: I0313 01:25:19.135064 19170 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/349df320-5872-4352-8495-d7a5f9c4fc51-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 01:25:19.135093 master-0 kubenswrapper[19170]: I0313 01:25:19.135085 19170 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/349df320-5872-4352-8495-d7a5f9c4fc51-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:25:19.139621 master-0 kubenswrapper[19170]: I0313 01:25:19.139544 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/349df320-5872-4352-8495-d7a5f9c4fc51-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "349df320-5872-4352-8495-d7a5f9c4fc51" (UID: "349df320-5872-4352-8495-d7a5f9c4fc51"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:25:19.236573 master-0 kubenswrapper[19170]: I0313 01:25:19.236414 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/349df320-5872-4352-8495-d7a5f9c4fc51-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 13 01:25:19.610570 master-0 kubenswrapper[19170]: I0313 01:25:19.610412 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-5-master-0_349df320-5872-4352-8495-d7a5f9c4fc51/installer/0.log" Mar 13 01:25:19.610570 master-0 kubenswrapper[19170]: I0313 01:25:19.610497 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-5-master-0" event={"ID":"349df320-5872-4352-8495-d7a5f9c4fc51","Type":"ContainerDied","Data":"fd055a301b36642375e50dcc5dd9cfaee924a7826f5b2dfec9aab7ebc5036f7a"} Mar 13 01:25:19.610570 master-0 kubenswrapper[19170]: I0313 01:25:19.610540 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd055a301b36642375e50dcc5dd9cfaee924a7826f5b2dfec9aab7ebc5036f7a" Mar 13 01:25:19.611173 master-0 kubenswrapper[19170]: I0313 01:25:19.610676 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-5-master-0" Mar 13 01:25:21.829746 master-0 kubenswrapper[19170]: I0313 01:25:21.829611 19170 patch_prober.go:28] interesting pod/console-864f84b8db-z7bgh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 13 01:25:21.830566 master-0 kubenswrapper[19170]: I0313 01:25:21.829752 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 13 01:25:24.674575 master-0 kubenswrapper[19170]: I0313 01:25:24.674490 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-znqwc_d1153bb3-30dd-458f-b0a4-c05358a8b3f8/approver/1.log" Mar 13 01:25:24.676452 master-0 kubenswrapper[19170]: I0313 01:25:24.675440 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-znqwc_d1153bb3-30dd-458f-b0a4-c05358a8b3f8/approver/0.log" Mar 13 01:25:24.676729 master-0 kubenswrapper[19170]: I0313 01:25:24.676386 19170 generic.go:334] "Generic (PLEG): container finished" podID="d1153bb3-30dd-458f-b0a4-c05358a8b3f8" containerID="99dc9aacbe53f2463fbb1d6c45782c44f72e7b13c67642bb7d0b4839b16638fe" exitCode=1 Mar 13 01:25:24.676729 master-0 kubenswrapper[19170]: I0313 01:25:24.676482 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-znqwc" event={"ID":"d1153bb3-30dd-458f-b0a4-c05358a8b3f8","Type":"ContainerDied","Data":"99dc9aacbe53f2463fbb1d6c45782c44f72e7b13c67642bb7d0b4839b16638fe"} Mar 13 01:25:24.676729 master-0 kubenswrapper[19170]: I0313 
01:25:24.676582 19170 scope.go:117] "RemoveContainer" containerID="8a645f3b183df337e2bb471f99ef18a88ef8e03f78dccc126ecc0415b40abdab" Mar 13 01:25:24.678070 master-0 kubenswrapper[19170]: I0313 01:25:24.678004 19170 scope.go:117] "RemoveContainer" containerID="99dc9aacbe53f2463fbb1d6c45782c44f72e7b13c67642bb7d0b4839b16638fe" Mar 13 01:25:24.919574 master-0 kubenswrapper[19170]: I0313 01:25:24.919496 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body= Mar 13 01:25:24.919862 master-0 kubenswrapper[19170]: I0313 01:25:24.919615 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" Mar 13 01:25:25.690106 master-0 kubenswrapper[19170]: I0313 01:25:25.690037 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-znqwc_d1153bb3-30dd-458f-b0a4-c05358a8b3f8/approver/1.log" Mar 13 01:25:25.691070 master-0 kubenswrapper[19170]: I0313 01:25:25.690662 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-znqwc" event={"ID":"d1153bb3-30dd-458f-b0a4-c05358a8b3f8","Type":"ContainerStarted","Data":"9cbaac463744649bb491b69b1e48bc53cbfff311dcb00c4dc51ee254ea4998f4"} Mar 13 01:25:27.860848 master-0 kubenswrapper[19170]: E0313 01:25:27.860758 19170 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 
01:25:27.861952 master-0 kubenswrapper[19170]: I0313 01:25:27.861605 19170 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 13 01:25:31.831524 master-0 kubenswrapper[19170]: I0313 01:25:31.831384 19170 patch_prober.go:28] interesting pod/console-864f84b8db-z7bgh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 13 01:25:31.831524 master-0 kubenswrapper[19170]: I0313 01:25:31.831467 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 13 01:25:34.919739 master-0 kubenswrapper[19170]: I0313 01:25:34.919660 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body= Mar 13 01:25:34.920452 master-0 kubenswrapper[19170]: I0313 01:25:34.919771 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" Mar 13 01:25:37.411758 master-0 kubenswrapper[19170]: I0313 01:25:37.411668 19170 status_manager.go:851] "Failed to get status for pod" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" pod="openshift-etcd/etcd-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods etcd-master-0)" Mar 13 
01:25:37.862964 master-0 kubenswrapper[19170]: E0313 01:25:37.862792 19170 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="200ms" Mar 13 01:25:41.829172 master-0 kubenswrapper[19170]: I0313 01:25:41.829093 19170 patch_prober.go:28] interesting pod/console-864f84b8db-z7bgh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 13 01:25:41.830062 master-0 kubenswrapper[19170]: I0313 01:25:41.829171 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 13 01:25:44.596341 master-0 kubenswrapper[19170]: E0313 01:25:44.596162 19170 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0.189c4224348b8933 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:8e52bef89f4b50e4590a1719bcc5d7e5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Killing,Message:Stopping container etcd-metrics,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:24:36.544948531 +0000 UTC m=+337.353069491,LastTimestamp:2026-03-13 01:24:36.544948531 +0000 UTC m=+337.353069491,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:25:44.919720 master-0 kubenswrapper[19170]: I0313 01:25:44.919623 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body= Mar 13 01:25:44.920015 master-0 kubenswrapper[19170]: I0313 01:25:44.919756 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" Mar 13 01:25:48.066017 master-0 kubenswrapper[19170]: E0313 01:25:48.065900 19170 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="400ms" Mar 13 01:25:50.450270 master-0 kubenswrapper[19170]: E0313 01:25:50.450204 19170 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 13 01:25:50.451461 master-0 kubenswrapper[19170]: I0313 01:25:50.451054 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 13 01:25:50.485295 master-0 kubenswrapper[19170]: W0313 01:25:50.485227 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29c709c82970b529e7b9b895aa92ef05.slice/crio-0706d47bff2b9347dbee0590ac91d7f48af48367a44cf20983cee2580b24e3b7 WatchSource:0}: Error finding container 0706d47bff2b9347dbee0590ac91d7f48af48367a44cf20983cee2580b24e3b7: Status 404 returned error can't find the container with id 0706d47bff2b9347dbee0590ac91d7f48af48367a44cf20983cee2580b24e3b7 Mar 13 01:25:50.943449 master-0 kubenswrapper[19170]: I0313 01:25:50.943348 19170 generic.go:334] "Generic (PLEG): container finished" podID="29c709c82970b529e7b9b895aa92ef05" containerID="fc9763e641842404438a6d531711432534d9f35e2fdfbd41fea65af5ef76e6e6" exitCode=0 Mar 13 01:25:50.943449 master-0 kubenswrapper[19170]: I0313 01:25:50.943419 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerDied","Data":"fc9763e641842404438a6d531711432534d9f35e2fdfbd41fea65af5ef76e6e6"} Mar 13 01:25:50.943898 master-0 kubenswrapper[19170]: I0313 01:25:50.943471 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"0706d47bff2b9347dbee0590ac91d7f48af48367a44cf20983cee2580b24e3b7"} Mar 13 01:25:50.944086 master-0 kubenswrapper[19170]: I0313 01:25:50.944023 19170 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="b7991d30-9ff3-425a-86fa-693c9a353b29" Mar 13 01:25:50.944086 master-0 kubenswrapper[19170]: I0313 01:25:50.944068 19170 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="b7991d30-9ff3-425a-86fa-693c9a353b29" Mar 13 01:25:51.829663 master-0 kubenswrapper[19170]: I0313 01:25:51.829557 19170 
patch_prober.go:28] interesting pod/console-864f84b8db-z7bgh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 13 01:25:51.830527 master-0 kubenswrapper[19170]: I0313 01:25:51.829690 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 13 01:25:54.919920 master-0 kubenswrapper[19170]: I0313 01:25:54.919816 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body= Mar 13 01:25:54.920742 master-0 kubenswrapper[19170]: I0313 01:25:54.919931 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" Mar 13 01:25:58.467050 master-0 kubenswrapper[19170]: E0313 01:25:58.466622 19170 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="800ms" Mar 13 01:26:01.829437 master-0 kubenswrapper[19170]: I0313 01:26:01.829328 19170 patch_prober.go:28] interesting pod/console-864f84b8db-z7bgh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": 
dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 13 01:26:01.830305 master-0 kubenswrapper[19170]: I0313 01:26:01.829434 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 13 01:26:04.919997 master-0 kubenswrapper[19170]: I0313 01:26:04.919905 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body= Mar 13 01:26:04.923377 master-0 kubenswrapper[19170]: I0313 01:26:04.920023 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" Mar 13 01:26:07.526330 master-0 kubenswrapper[19170]: E0313 01:26:07.526225 19170 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9db888f0_51b6_43cf_8337_69d2d5cc2b0a.slice/crio-conmon-177c5ae11bdea7716ced4a8d9a5025a0f7f5545fda78c1cbb79962beaa511cbb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9db888f0_51b6_43cf_8337_69d2d5cc2b0a.slice/crio-177c5ae11bdea7716ced4a8d9a5025a0f7f5545fda78c1cbb79962beaa511cbb.scope\": RecentStats: unable to find data in memory cache]" Mar 13 01:26:07.889844 master-0 kubenswrapper[19170]: I0313 01:26:07.889807 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:26:08.032197 master-0 kubenswrapper[19170]: I0313 01:26:08.032141 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-secret-metrics-server-tls\") pod \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " Mar 13 01:26:08.032473 master-0 kubenswrapper[19170]: I0313 01:26:08.032202 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gj56\" (UniqueName: \"kubernetes.io/projected/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-kube-api-access-8gj56\") pod \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " Mar 13 01:26:08.032473 master-0 kubenswrapper[19170]: I0313 01:26:08.032273 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-metrics-server-audit-profiles\") pod \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " Mar 13 01:26:08.032473 master-0 kubenswrapper[19170]: I0313 01:26:08.032323 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-audit-log\") pod \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " Mar 13 01:26:08.032618 master-0 kubenswrapper[19170]: I0313 01:26:08.032586 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-secret-metrics-client-certs\") pod \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " 
Mar 13 01:26:08.032773 master-0 kubenswrapper[19170]: I0313 01:26:08.032742 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-client-ca-bundle\") pod \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " Mar 13 01:26:08.032843 master-0 kubenswrapper[19170]: I0313 01:26:08.032785 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-configmap-kubelet-serving-ca-bundle\") pod \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\" (UID: \"9db888f0-51b6-43cf-8337-69d2d5cc2b0a\") " Mar 13 01:26:08.032843 master-0 kubenswrapper[19170]: I0313 01:26:08.032790 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-audit-log" (OuterVolumeSpecName: "audit-log") pod "9db888f0-51b6-43cf-8337-69d2d5cc2b0a" (UID: "9db888f0-51b6-43cf-8337-69d2d5cc2b0a"). InnerVolumeSpecName "audit-log". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 01:26:08.033061 master-0 kubenswrapper[19170]: I0313 01:26:08.033033 19170 reconciler_common.go:293] "Volume detached for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-audit-log\") on node \"master-0\" DevicePath \"\"" Mar 13 01:26:08.033533 master-0 kubenswrapper[19170]: I0313 01:26:08.033457 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-metrics-server-audit-profiles" (OuterVolumeSpecName: "metrics-server-audit-profiles") pod "9db888f0-51b6-43cf-8337-69d2d5cc2b0a" (UID: "9db888f0-51b6-43cf-8337-69d2d5cc2b0a"). InnerVolumeSpecName "metrics-server-audit-profiles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:26:08.033624 master-0 kubenswrapper[19170]: I0313 01:26:08.033575 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "9db888f0-51b6-43cf-8337-69d2d5cc2b0a" (UID: "9db888f0-51b6-43cf-8337-69d2d5cc2b0a"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:26:08.036071 master-0 kubenswrapper[19170]: I0313 01:26:08.036011 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-client-ca-bundle" (OuterVolumeSpecName: "client-ca-bundle") pod "9db888f0-51b6-43cf-8337-69d2d5cc2b0a" (UID: "9db888f0-51b6-43cf-8337-69d2d5cc2b0a"). InnerVolumeSpecName "client-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:26:08.036405 master-0 kubenswrapper[19170]: I0313 01:26:08.036370 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-secret-metrics-server-tls" (OuterVolumeSpecName: "secret-metrics-server-tls") pod "9db888f0-51b6-43cf-8337-69d2d5cc2b0a" (UID: "9db888f0-51b6-43cf-8337-69d2d5cc2b0a"). InnerVolumeSpecName "secret-metrics-server-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:26:08.037697 master-0 kubenswrapper[19170]: I0313 01:26:08.037659 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-kube-api-access-8gj56" (OuterVolumeSpecName: "kube-api-access-8gj56") pod "9db888f0-51b6-43cf-8337-69d2d5cc2b0a" (UID: "9db888f0-51b6-43cf-8337-69d2d5cc2b0a"). InnerVolumeSpecName "kube-api-access-8gj56". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:26:08.037914 master-0 kubenswrapper[19170]: I0313 01:26:08.037858 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "9db888f0-51b6-43cf-8337-69d2d5cc2b0a" (UID: "9db888f0-51b6-43cf-8337-69d2d5cc2b0a"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:26:08.111272 master-0 kubenswrapper[19170]: I0313 01:26:08.111147 19170 generic.go:334] "Generic (PLEG): container finished" podID="9db888f0-51b6-43cf-8337-69d2d5cc2b0a" containerID="177c5ae11bdea7716ced4a8d9a5025a0f7f5545fda78c1cbb79962beaa511cbb" exitCode=0 Mar 13 01:26:08.111272 master-0 kubenswrapper[19170]: I0313 01:26:08.111204 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" event={"ID":"9db888f0-51b6-43cf-8337-69d2d5cc2b0a","Type":"ContainerDied","Data":"177c5ae11bdea7716ced4a8d9a5025a0f7f5545fda78c1cbb79962beaa511cbb"} Mar 13 01:26:08.111272 master-0 kubenswrapper[19170]: I0313 01:26:08.111235 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" event={"ID":"9db888f0-51b6-43cf-8337-69d2d5cc2b0a","Type":"ContainerDied","Data":"2df89949d75d6b332f6e6e3505de7ef88eb15830a7e3e30680b63b8f7fc0d9ff"} Mar 13 01:26:08.111272 master-0 kubenswrapper[19170]: I0313 01:26:08.111261 19170 scope.go:117] "RemoveContainer" containerID="177c5ae11bdea7716ced4a8d9a5025a0f7f5545fda78c1cbb79962beaa511cbb" Mar 13 01:26:08.111668 master-0 kubenswrapper[19170]: I0313 01:26:08.111615 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-5575f756f4-hqr5q" Mar 13 01:26:08.133567 master-0 kubenswrapper[19170]: I0313 01:26:08.133522 19170 scope.go:117] "RemoveContainer" containerID="177c5ae11bdea7716ced4a8d9a5025a0f7f5545fda78c1cbb79962beaa511cbb" Mar 13 01:26:08.134356 master-0 kubenswrapper[19170]: E0313 01:26:08.134331 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"177c5ae11bdea7716ced4a8d9a5025a0f7f5545fda78c1cbb79962beaa511cbb\": container with ID starting with 177c5ae11bdea7716ced4a8d9a5025a0f7f5545fda78c1cbb79962beaa511cbb not found: ID does not exist" containerID="177c5ae11bdea7716ced4a8d9a5025a0f7f5545fda78c1cbb79962beaa511cbb" Mar 13 01:26:08.134497 master-0 kubenswrapper[19170]: I0313 01:26:08.134461 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"177c5ae11bdea7716ced4a8d9a5025a0f7f5545fda78c1cbb79962beaa511cbb"} err="failed to get container status \"177c5ae11bdea7716ced4a8d9a5025a0f7f5545fda78c1cbb79962beaa511cbb\": rpc error: code = NotFound desc = could not find container \"177c5ae11bdea7716ced4a8d9a5025a0f7f5545fda78c1cbb79962beaa511cbb\": container with ID starting with 177c5ae11bdea7716ced4a8d9a5025a0f7f5545fda78c1cbb79962beaa511cbb not found: ID does not exist" Mar 13 01:26:08.134698 master-0 kubenswrapper[19170]: I0313 01:26:08.134598 19170 reconciler_common.go:293] "Volume detached for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-metrics-server-audit-profiles\") on node \"master-0\" DevicePath \"\"" Mar 13 01:26:08.134698 master-0 kubenswrapper[19170]: I0313 01:26:08.134661 19170 reconciler_common.go:293] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-secret-metrics-client-certs\") on node \"master-0\" DevicePath \"\"" Mar 13 
01:26:08.134698 master-0 kubenswrapper[19170]: I0313 01:26:08.134688 19170 reconciler_common.go:293] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-configmap-kubelet-serving-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:26:08.134859 master-0 kubenswrapper[19170]: I0313 01:26:08.134712 19170 reconciler_common.go:293] "Volume detached for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-client-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:26:08.134859 master-0 kubenswrapper[19170]: I0313 01:26:08.134732 19170 reconciler_common.go:293] "Volume detached for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-secret-metrics-server-tls\") on node \"master-0\" DevicePath \"\"" Mar 13 01:26:08.134859 master-0 kubenswrapper[19170]: I0313 01:26:08.134754 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gj56\" (UniqueName: \"kubernetes.io/projected/9db888f0-51b6-43cf-8337-69d2d5cc2b0a-kube-api-access-8gj56\") on node \"master-0\" DevicePath \"\"" Mar 13 01:26:09.268086 master-0 kubenswrapper[19170]: E0313 01:26:09.267979 19170 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="1.6s" Mar 13 01:26:11.829072 master-0 kubenswrapper[19170]: I0313 01:26:11.829014 19170 patch_prober.go:28] interesting pod/console-864f84b8db-z7bgh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 13 01:26:11.829959 master-0 kubenswrapper[19170]: I0313 
01:26:11.829882 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 13 01:26:14.919703 master-0 kubenswrapper[19170]: I0313 01:26:14.919533 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body= Mar 13 01:26:14.919703 master-0 kubenswrapper[19170]: I0313 01:26:14.919608 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" Mar 13 01:26:16.201612 master-0 kubenswrapper[19170]: I0313 01:26:16.201502 19170 generic.go:334] "Generic (PLEG): container finished" podID="78d2cd80-23b9-426d-a7ac-1daa27668a47" containerID="7d069f7cf40ce00e10c1f0f6baa994ec7a0d37d154f8f16c691fae327fe2644d" exitCode=0 Mar 13 01:26:16.201612 master-0 kubenswrapper[19170]: I0313 01:26:16.201592 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" event={"ID":"78d2cd80-23b9-426d-a7ac-1daa27668a47","Type":"ContainerDied","Data":"7d069f7cf40ce00e10c1f0f6baa994ec7a0d37d154f8f16c691fae327fe2644d"} Mar 13 01:26:16.202568 master-0 kubenswrapper[19170]: I0313 01:26:16.201700 19170 scope.go:117] "RemoveContainer" containerID="a7777dcfd1f119b7ebee2758548cf5f87c3fa74673968a6a77a0056cee905b05" Mar 13 01:26:16.202818 master-0 kubenswrapper[19170]: I0313 01:26:16.202692 19170 scope.go:117] "RemoveContainer" 
containerID="7d069f7cf40ce00e10c1f0f6baa994ec7a0d37d154f8f16c691fae327fe2644d" Mar 13 01:26:17.218681 master-0 kubenswrapper[19170]: I0313 01:26:17.218556 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" event={"ID":"78d2cd80-23b9-426d-a7ac-1daa27668a47","Type":"ContainerStarted","Data":"bf428596283765748b73540fb0c853d0831e933540d2d1fd6147f90b5e055249"} Mar 13 01:26:17.219745 master-0 kubenswrapper[19170]: I0313 01:26:17.219085 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" Mar 13 01:26:17.221630 master-0 kubenswrapper[19170]: I0313 01:26:17.221194 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-64bf9778cb-dszg5" Mar 13 01:26:18.600411 master-0 kubenswrapper[19170]: E0313 01:26:18.600209 19170 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0.189c4224348c48ed openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:8e52bef89f4b50e4590a1719bcc5d7e5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Killing,Message:Stopping container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:24:36.544997613 +0000 UTC m=+337.353118623,LastTimestamp:2026-03-13 01:24:36.544997613 +0000 UTC m=+337.353118623,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:26:20.870275 master-0 kubenswrapper[19170]: E0313 01:26:20.870135 19170 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Mar 13 01:26:21.829318 master-0 kubenswrapper[19170]: I0313 01:26:21.829207 19170 patch_prober.go:28] interesting pod/console-864f84b8db-z7bgh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 13 01:26:21.829318 master-0 kubenswrapper[19170]: I0313 01:26:21.829288 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 13 01:26:24.920143 master-0 kubenswrapper[19170]: I0313 01:26:24.920056 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body= Mar 13 01:26:24.920933 master-0 kubenswrapper[19170]: I0313 01:26:24.920145 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" Mar 13 01:26:24.947154 master-0 kubenswrapper[19170]: E0313 01:26:24.947095 19170 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 13 01:26:26.313111 master-0 kubenswrapper[19170]: I0313 01:26:26.312931 
19170 generic.go:334] "Generic (PLEG): container finished" podID="29c709c82970b529e7b9b895aa92ef05" containerID="810c32d95ee3abd2096b5c35119672b663c69f6efc4df1a61a4016174a4ba458" exitCode=0 Mar 13 01:26:26.313111 master-0 kubenswrapper[19170]: I0313 01:26:26.312992 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerDied","Data":"810c32d95ee3abd2096b5c35119672b663c69f6efc4df1a61a4016174a4ba458"} Mar 13 01:26:26.314323 master-0 kubenswrapper[19170]: I0313 01:26:26.313977 19170 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="b7991d30-9ff3-425a-86fa-693c9a353b29" Mar 13 01:26:26.314323 master-0 kubenswrapper[19170]: I0313 01:26:26.314007 19170 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="b7991d30-9ff3-425a-86fa-693c9a353b29" Mar 13 01:26:27.320170 master-0 kubenswrapper[19170]: I0313 01:26:27.320064 19170 patch_prober.go:28] interesting pod/operator-controller-controller-manager-6598bfb6c4-2wh5w container/manager namespace/openshift-operator-controller: Readiness probe status=failure output="Get \"http://10.128.0.42:8081/readyz\": dial tcp 10.128.0.42:8081: connect: connection refused" start-of-body= Mar 13 01:26:27.320170 master-0 kubenswrapper[19170]: I0313 01:26:27.320160 19170 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" podUID="30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.42:8081/readyz\": dial tcp 10.128.0.42:8081: connect: connection refused" Mar 13 01:26:27.324574 master-0 kubenswrapper[19170]: I0313 01:26:27.324481 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-6598bfb6c4-2wh5w_30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3/manager/1.log" 
Mar 13 01:26:27.326133 master-0 kubenswrapper[19170]: I0313 01:26:27.326078 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-6598bfb6c4-2wh5w_30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3/manager/0.log" Mar 13 01:26:27.326276 master-0 kubenswrapper[19170]: I0313 01:26:27.326162 19170 generic.go:334] "Generic (PLEG): container finished" podID="30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3" containerID="7ef67c4dbd8426a3e3af7aa349a5cbaefed2fde80e4c7f48ba81fe002ea31f34" exitCode=1 Mar 13 01:26:27.326276 master-0 kubenswrapper[19170]: I0313 01:26:27.326207 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" event={"ID":"30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3","Type":"ContainerDied","Data":"7ef67c4dbd8426a3e3af7aa349a5cbaefed2fde80e4c7f48ba81fe002ea31f34"} Mar 13 01:26:27.326276 master-0 kubenswrapper[19170]: I0313 01:26:27.326257 19170 scope.go:117] "RemoveContainer" containerID="5c71af40ce92aac85f827794e9d207c4ed4bc300599f6601dc56da9d0a8b0d8b" Mar 13 01:26:27.326963 master-0 kubenswrapper[19170]: I0313 01:26:27.326903 19170 scope.go:117] "RemoveContainer" containerID="7ef67c4dbd8426a3e3af7aa349a5cbaefed2fde80e4c7f48ba81fe002ea31f34" Mar 13 01:26:28.340747 master-0 kubenswrapper[19170]: I0313 01:26:28.340619 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-6598bfb6c4-2wh5w_30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3/manager/1.log" Mar 13 01:26:28.341742 master-0 kubenswrapper[19170]: I0313 01:26:28.341417 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" event={"ID":"30a7d5de-1ca1-46c8-8fbb-f34e4e2358d3","Type":"ContainerStarted","Data":"852a4608e23d5aa329e5b39b5b2dffe70aa14efbf84e62bb09545fd308eba5b6"} Mar 13 01:26:28.342073 master-0 
kubenswrapper[19170]: I0313 01:26:28.342001 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" Mar 13 01:26:31.093916 master-0 kubenswrapper[19170]: E0313 01:26:31.093766 19170 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T01:26:21Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T01:26:21Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T01:26:21Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T01:26:21Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 01:26:31.829190 master-0 kubenswrapper[19170]: I0313 01:26:31.829106 19170 patch_prober.go:28] interesting pod/console-864f84b8db-z7bgh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 13 01:26:31.829190 master-0 kubenswrapper[19170]: I0313 01:26:31.829184 19170 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 13 01:26:33.390051 master-0 kubenswrapper[19170]: I0313 01:26:33.389962 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n_b3a9c0f6-cfde-4ae8-952a-00e2fb862482/cluster-cloud-controller-manager/0.log" Mar 13 01:26:33.390904 master-0 kubenswrapper[19170]: I0313 01:26:33.390054 19170 generic.go:334] "Generic (PLEG): container finished" podID="b3a9c0f6-cfde-4ae8-952a-00e2fb862482" containerID="c20b7880d0e62c91ace04a400f15380d02a7f587227b0e579de54f8b6b881459" exitCode=1 Mar 13 01:26:33.390904 master-0 kubenswrapper[19170]: I0313 01:26:33.390116 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" event={"ID":"b3a9c0f6-cfde-4ae8-952a-00e2fb862482","Type":"ContainerDied","Data":"c20b7880d0e62c91ace04a400f15380d02a7f587227b0e579de54f8b6b881459"} Mar 13 01:26:33.391075 master-0 kubenswrapper[19170]: I0313 01:26:33.390967 19170 scope.go:117] "RemoveContainer" containerID="c20b7880d0e62c91ace04a400f15380d02a7f587227b0e579de54f8b6b881459" Mar 13 01:26:34.072394 master-0 kubenswrapper[19170]: E0313 01:26:34.072236 19170 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="6.4s" Mar 13 01:26:34.403279 master-0 kubenswrapper[19170]: I0313 01:26:34.403122 19170 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n_b3a9c0f6-cfde-4ae8-952a-00e2fb862482/cluster-cloud-controller-manager/0.log" Mar 13 01:26:34.403279 master-0 kubenswrapper[19170]: I0313 01:26:34.403193 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" event={"ID":"b3a9c0f6-cfde-4ae8-952a-00e2fb862482","Type":"ContainerStarted","Data":"59a86a789ee4f9c521c0cdc6e8ff26c52f10f537aff54b37006fedbe93d1388f"} Mar 13 01:26:34.919101 master-0 kubenswrapper[19170]: I0313 01:26:34.919010 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body= Mar 13 01:26:34.919451 master-0 kubenswrapper[19170]: I0313 01:26:34.919096 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" Mar 13 01:26:37.322052 master-0 kubenswrapper[19170]: I0313 01:26:37.321962 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" Mar 13 01:26:37.413293 master-0 kubenswrapper[19170]: I0313 01:26:37.413188 19170 status_manager.go:851] "Failed to get status for pod" podUID="dc96262c-7c20-490b-b90e-d1fba7a26a46" pod="openshift-authentication/oauth-openshift-db987b46b-l4pxc" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods oauth-openshift-db987b46b-l4pxc)" Mar 13 01:26:37.433174 master-0 
kubenswrapper[19170]: I0313 01:26:37.433101 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7f8b8b6f4c-7fc8j_2c4c579b-0643-47ac-a729-017c326b0ecc/manager/1.log" Mar 13 01:26:37.434193 master-0 kubenswrapper[19170]: I0313 01:26:37.434133 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7f8b8b6f4c-7fc8j_2c4c579b-0643-47ac-a729-017c326b0ecc/manager/0.log" Mar 13 01:26:37.434812 master-0 kubenswrapper[19170]: I0313 01:26:37.434718 19170 generic.go:334] "Generic (PLEG): container finished" podID="2c4c579b-0643-47ac-a729-017c326b0ecc" containerID="dbffdb32298050e3d786bea05b0e0e1b7922cd3d84a8dd8e9be8f2f907195c49" exitCode=1 Mar 13 01:26:37.434812 master-0 kubenswrapper[19170]: I0313 01:26:37.434786 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" event={"ID":"2c4c579b-0643-47ac-a729-017c326b0ecc","Type":"ContainerDied","Data":"dbffdb32298050e3d786bea05b0e0e1b7922cd3d84a8dd8e9be8f2f907195c49"} Mar 13 01:26:37.435151 master-0 kubenswrapper[19170]: I0313 01:26:37.434848 19170 scope.go:117] "RemoveContainer" containerID="123227ec95c803688ea3b73355951521f6c2e9af5b64fd2cfda9372aa3bb75d9" Mar 13 01:26:37.435619 master-0 kubenswrapper[19170]: I0313 01:26:37.435554 19170 scope.go:117] "RemoveContainer" containerID="dbffdb32298050e3d786bea05b0e0e1b7922cd3d84a8dd8e9be8f2f907195c49" Mar 13 01:26:38.447959 master-0 kubenswrapper[19170]: I0313 01:26:38.447855 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n_b3a9c0f6-cfde-4ae8-952a-00e2fb862482/config-sync-controllers/0.log" Mar 13 01:26:38.449681 master-0 kubenswrapper[19170]: I0313 01:26:38.449588 19170 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n_b3a9c0f6-cfde-4ae8-952a-00e2fb862482/cluster-cloud-controller-manager/0.log" Mar 13 01:26:38.449805 master-0 kubenswrapper[19170]: I0313 01:26:38.449713 19170 generic.go:334] "Generic (PLEG): container finished" podID="b3a9c0f6-cfde-4ae8-952a-00e2fb862482" containerID="705f3b1f8f6a29f9d66d96e7e64284c86692ae92fafef78a3e7d5b5411f4c2b9" exitCode=1 Mar 13 01:26:38.449884 master-0 kubenswrapper[19170]: I0313 01:26:38.449835 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" event={"ID":"b3a9c0f6-cfde-4ae8-952a-00e2fb862482","Type":"ContainerDied","Data":"705f3b1f8f6a29f9d66d96e7e64284c86692ae92fafef78a3e7d5b5411f4c2b9"} Mar 13 01:26:38.450922 master-0 kubenswrapper[19170]: I0313 01:26:38.450850 19170 scope.go:117] "RemoveContainer" containerID="705f3b1f8f6a29f9d66d96e7e64284c86692ae92fafef78a3e7d5b5411f4c2b9" Mar 13 01:26:38.454779 master-0 kubenswrapper[19170]: I0313 01:26:38.454134 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7f8b8b6f4c-7fc8j_2c4c579b-0643-47ac-a729-017c326b0ecc/manager/1.log" Mar 13 01:26:38.455178 master-0 kubenswrapper[19170]: I0313 01:26:38.454770 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" event={"ID":"2c4c579b-0643-47ac-a729-017c326b0ecc","Type":"ContainerStarted","Data":"820f88529efbe6fe233b7018781192c80bf8c9d17f0b534dbcf2dacc145a6e07"} Mar 13 01:26:38.455248 master-0 kubenswrapper[19170]: I0313 01:26:38.455221 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:26:39.467989 master-0 kubenswrapper[19170]: I0313 01:26:39.467908 19170 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n_b3a9c0f6-cfde-4ae8-952a-00e2fb862482/config-sync-controllers/0.log" Mar 13 01:26:39.468941 master-0 kubenswrapper[19170]: I0313 01:26:39.468875 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n_b3a9c0f6-cfde-4ae8-952a-00e2fb862482/cluster-cloud-controller-manager/0.log" Mar 13 01:26:39.469073 master-0 kubenswrapper[19170]: I0313 01:26:39.469021 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-4nf8n" event={"ID":"b3a9c0f6-cfde-4ae8-952a-00e2fb862482","Type":"ContainerStarted","Data":"fea3cb6213084ddb0c353d804b4ed49870a96f0ec4e629daef1877b422b67eb0"} Mar 13 01:26:41.095609 master-0 kubenswrapper[19170]: E0313 01:26:41.095514 19170 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 01:26:41.488083 master-0 kubenswrapper[19170]: I0313 01:26:41.488038 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-2slj5_3d2e7338-a6d6-4872-ab72-a4e631075ab3/snapshot-controller/2.log" Mar 13 01:26:41.488931 master-0 kubenswrapper[19170]: I0313 01:26:41.488883 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-2slj5_3d2e7338-a6d6-4872-ab72-a4e631075ab3/snapshot-controller/1.log" Mar 13 01:26:41.489037 master-0 kubenswrapper[19170]: I0313 01:26:41.488953 19170 generic.go:334] "Generic (PLEG): container finished" podID="3d2e7338-a6d6-4872-ab72-a4e631075ab3" 
containerID="9c9bcd7c0bc2c0a8bdfb783091d79282cb414b9c8007f3f521bcaba6d62d5459" exitCode=1 Mar 13 01:26:41.489037 master-0 kubenswrapper[19170]: I0313 01:26:41.488995 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-2slj5" event={"ID":"3d2e7338-a6d6-4872-ab72-a4e631075ab3","Type":"ContainerDied","Data":"9c9bcd7c0bc2c0a8bdfb783091d79282cb414b9c8007f3f521bcaba6d62d5459"} Mar 13 01:26:41.489170 master-0 kubenswrapper[19170]: I0313 01:26:41.489042 19170 scope.go:117] "RemoveContainer" containerID="63acc12abb985dff806c0ddcd9c27446b325df44a94150a2e781b1a28ec52869" Mar 13 01:26:41.489567 master-0 kubenswrapper[19170]: I0313 01:26:41.489538 19170 scope.go:117] "RemoveContainer" containerID="9c9bcd7c0bc2c0a8bdfb783091d79282cb414b9c8007f3f521bcaba6d62d5459" Mar 13 01:26:41.829849 master-0 kubenswrapper[19170]: I0313 01:26:41.829756 19170 patch_prober.go:28] interesting pod/console-864f84b8db-z7bgh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 13 01:26:41.829849 master-0 kubenswrapper[19170]: I0313 01:26:41.829846 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 13 01:26:42.499166 master-0 kubenswrapper[19170]: I0313 01:26:42.499089 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-2slj5_3d2e7338-a6d6-4872-ab72-a4e631075ab3/snapshot-controller/2.log" Mar 13 01:26:42.500057 master-0 kubenswrapper[19170]: I0313 01:26:42.499189 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-2slj5" event={"ID":"3d2e7338-a6d6-4872-ab72-a4e631075ab3","Type":"ContainerStarted","Data":"0e73e20a8c061b10b8a7d490c2d00bb7fd7366b7509a1c085dd600ceaa7bf984"} Mar 13 01:26:44.662789 master-0 kubenswrapper[19170]: I0313 01:26:44.662715 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-7fc8j" Mar 13 01:26:44.919341 master-0 kubenswrapper[19170]: I0313 01:26:44.919152 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body= Mar 13 01:26:44.919341 master-0 kubenswrapper[19170]: I0313 01:26:44.919311 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" Mar 13 01:26:50.472759 master-0 kubenswrapper[19170]: E0313 01:26:50.472621 19170 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded" interval="7s" Mar 13 01:26:51.096954 master-0 kubenswrapper[19170]: E0313 01:26:51.096838 19170 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 01:26:51.829457 master-0 kubenswrapper[19170]: I0313 01:26:51.829369 19170 patch_prober.go:28] interesting pod/console-864f84b8db-z7bgh 
container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 13 01:26:51.830248 master-0 kubenswrapper[19170]: I0313 01:26:51.829454 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 13 01:26:52.603497 master-0 kubenswrapper[19170]: E0313 01:26:52.603314 19170 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event=< Mar 13 01:26:52.603497 master-0 kubenswrapper[19170]: &Event{ObjectMeta:{console-864f84b8db-z7bgh.189c42107b45c7a4 openshift-console 15768 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-console,Name:console-864f84b8db-z7bgh,UID:cdc3c693-5b70-44ef-b53d-7a546edd268c,APIVersion:v1,ResourceVersion:14038,FieldPath:spec.containers{console},},Reason:ProbeError,Message:Startup probe error: Get "https://10.128.0.97:8443/health": dial tcp 10.128.0.97:8443: connect: connection refused Mar 13 01:26:52.603497 master-0 kubenswrapper[19170]: body: Mar 13 01:26:52.603497 master-0 kubenswrapper[19170]: ,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:23:11 +0000 UTC,LastTimestamp:2026-03-13 01:24:41.828975269 +0000 UTC m=+342.637096259,Count:10,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,} Mar 13 01:26:52.603497 master-0 kubenswrapper[19170]: > Mar 13 01:26:54.919338 master-0 kubenswrapper[19170]: I0313 01:26:54.919273 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv 
container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body= Mar 13 01:26:54.920164 master-0 kubenswrapper[19170]: I0313 01:26:54.919348 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" Mar 13 01:27:00.318123 master-0 kubenswrapper[19170]: E0313 01:27:00.317853 19170 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 13 01:27:00.672994 master-0 kubenswrapper[19170]: I0313 01:27:00.672944 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"aee306fca9978afa8831451dd5d203b1b3b87383f590a39abdf8e6f041264e01"} Mar 13 01:27:00.673326 master-0 kubenswrapper[19170]: I0313 01:27:00.673284 19170 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="b7991d30-9ff3-425a-86fa-693c9a353b29" Mar 13 01:27:00.673326 master-0 kubenswrapper[19170]: I0313 01:27:00.673301 19170 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="b7991d30-9ff3-425a-86fa-693c9a353b29" Mar 13 01:27:01.098251 master-0 kubenswrapper[19170]: E0313 01:27:01.098147 19170 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 01:27:01.685538 master-0 kubenswrapper[19170]: I0313 01:27:01.685448 19170 
generic.go:334] "Generic (PLEG): container finished" podID="29c709c82970b529e7b9b895aa92ef05" containerID="aee306fca9978afa8831451dd5d203b1b3b87383f590a39abdf8e6f041264e01" exitCode=0 Mar 13 01:27:01.686159 master-0 kubenswrapper[19170]: I0313 01:27:01.685548 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerDied","Data":"aee306fca9978afa8831451dd5d203b1b3b87383f590a39abdf8e6f041264e01"} Mar 13 01:27:01.829515 master-0 kubenswrapper[19170]: I0313 01:27:01.829445 19170 patch_prober.go:28] interesting pod/console-864f84b8db-z7bgh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 13 01:27:01.829802 master-0 kubenswrapper[19170]: I0313 01:27:01.829535 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 13 01:27:02.700821 master-0 kubenswrapper[19170]: I0313 01:27:02.700705 19170 generic.go:334] "Generic (PLEG): container finished" podID="1308fba1-a50d-48b3-b272-7bef44727b7f" containerID="f3373c04dfe5b06ce5689672c5fa9716a2e2ff1f88c17517721cb216726a9cc3" exitCode=0 Mar 13 01:27:02.700821 master-0 kubenswrapper[19170]: I0313 01:27:02.700716 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd" event={"ID":"1308fba1-a50d-48b3-b272-7bef44727b7f","Type":"ContainerDied","Data":"f3373c04dfe5b06ce5689672c5fa9716a2e2ff1f88c17517721cb216726a9cc3"} Mar 13 01:27:02.701982 master-0 kubenswrapper[19170]: I0313 01:27:02.700904 19170 scope.go:117] "RemoveContainer" 
containerID="0743267ed603ff9ac6856c0cefcbee7aeda537c3eb2e7c92d9ca09c61963bcad" Mar 13 01:27:02.702108 master-0 kubenswrapper[19170]: I0313 01:27:02.701984 19170 scope.go:117] "RemoveContainer" containerID="f3373c04dfe5b06ce5689672c5fa9716a2e2ff1f88c17517721cb216726a9cc3" Mar 13 01:27:03.713808 master-0 kubenswrapper[19170]: I0313 01:27:03.713689 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-cjmvd" event={"ID":"1308fba1-a50d-48b3-b272-7bef44727b7f","Type":"ContainerStarted","Data":"6ac023861c613003192c9d9716da10b45dc60037f42dac5079a93bc9479d7311"} Mar 13 01:27:04.919791 master-0 kubenswrapper[19170]: I0313 01:27:04.919744 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body= Mar 13 01:27:04.920318 master-0 kubenswrapper[19170]: I0313 01:27:04.919820 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" Mar 13 01:27:06.743568 master-0 kubenswrapper[19170]: I0313 01:27:06.743435 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-47sjr_8c6bf2d5-1881-4b63-b247-7e7426707fa1/cluster-baremetal-operator/1.log" Mar 13 01:27:06.744904 master-0 kubenswrapper[19170]: I0313 01:27:06.744850 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-47sjr_8c6bf2d5-1881-4b63-b247-7e7426707fa1/cluster-baremetal-operator/0.log" Mar 13 01:27:06.745069 master-0 kubenswrapper[19170]: I0313 01:27:06.744918 19170 generic.go:334] 
"Generic (PLEG): container finished" podID="8c6bf2d5-1881-4b63-b247-7e7426707fa1" containerID="3ae103f8a6f884755c35a6a16f2094f5be91f7ff8edc4b19322c844eaa733963" exitCode=1 Mar 13 01:27:06.745069 master-0 kubenswrapper[19170]: I0313 01:27:06.744957 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" event={"ID":"8c6bf2d5-1881-4b63-b247-7e7426707fa1","Type":"ContainerDied","Data":"3ae103f8a6f884755c35a6a16f2094f5be91f7ff8edc4b19322c844eaa733963"} Mar 13 01:27:06.745069 master-0 kubenswrapper[19170]: I0313 01:27:06.745041 19170 scope.go:117] "RemoveContainer" containerID="4f6218cc31120287103af84277d2f84084c2bfa51936dccb9cf930aa141d948d" Mar 13 01:27:06.746153 master-0 kubenswrapper[19170]: I0313 01:27:06.746087 19170 scope.go:117] "RemoveContainer" containerID="3ae103f8a6f884755c35a6a16f2094f5be91f7ff8edc4b19322c844eaa733963" Mar 13 01:27:07.473588 master-0 kubenswrapper[19170]: E0313 01:27:07.473518 19170 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 13 01:27:07.757994 master-0 kubenswrapper[19170]: I0313 01:27:07.757814 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-47sjr_8c6bf2d5-1881-4b63-b247-7e7426707fa1/cluster-baremetal-operator/1.log" Mar 13 01:27:07.758886 master-0 kubenswrapper[19170]: I0313 01:27:07.758623 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" event={"ID":"8c6bf2d5-1881-4b63-b247-7e7426707fa1","Type":"ContainerStarted","Data":"82755ddd8d0a56daa9b99df9ea0695a83707b2a72c5516360ce0e95bb8437a37"} Mar 13 01:27:08.769581 master-0 kubenswrapper[19170]: I0313 01:27:08.769512 19170 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6686554ddc-w6qs7_cada5bf2-e208-4fd8-bdf5-de8cad31a665/control-plane-machine-set-operator/0.log" Mar 13 01:27:08.770264 master-0 kubenswrapper[19170]: I0313 01:27:08.769596 19170 generic.go:334] "Generic (PLEG): container finished" podID="cada5bf2-e208-4fd8-bdf5-de8cad31a665" containerID="304e8e0944434514608c776ed75bc07cb5d1c2603e8ab5214e26636517baa5e9" exitCode=1 Mar 13 01:27:08.770264 master-0 kubenswrapper[19170]: I0313 01:27:08.769673 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-w6qs7" event={"ID":"cada5bf2-e208-4fd8-bdf5-de8cad31a665","Type":"ContainerDied","Data":"304e8e0944434514608c776ed75bc07cb5d1c2603e8ab5214e26636517baa5e9"} Mar 13 01:27:08.770548 master-0 kubenswrapper[19170]: I0313 01:27:08.770410 19170 scope.go:117] "RemoveContainer" containerID="304e8e0944434514608c776ed75bc07cb5d1c2603e8ab5214e26636517baa5e9" Mar 13 01:27:09.818726 master-0 kubenswrapper[19170]: I0313 01:27:09.818611 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6686554ddc-w6qs7_cada5bf2-e208-4fd8-bdf5-de8cad31a665/control-plane-machine-set-operator/0.log" Mar 13 01:27:09.819513 master-0 kubenswrapper[19170]: I0313 01:27:09.818851 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-w6qs7" event={"ID":"cada5bf2-e208-4fd8-bdf5-de8cad31a665","Type":"ContainerStarted","Data":"62bb9e8ed8d1ed06ad85dda864cee9923c9d389d432d56466ba6c325d2b5be3b"} Mar 13 01:27:09.823745 master-0 kubenswrapper[19170]: I0313 01:27:09.823683 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_782d5ee25fbcc8a1fef3f1955932cf63/kube-scheduler/0.log" Mar 13 01:27:09.824769 master-0 
kubenswrapper[19170]: I0313 01:27:09.824698 19170 generic.go:334] "Generic (PLEG): container finished" podID="782d5ee25fbcc8a1fef3f1955932cf63" containerID="44a5d925ffd004246a4349660cad574513d9d5098376b6ab6e4cf763813f9a22" exitCode=1 Mar 13 01:27:09.824867 master-0 kubenswrapper[19170]: I0313 01:27:09.824783 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"782d5ee25fbcc8a1fef3f1955932cf63","Type":"ContainerDied","Data":"44a5d925ffd004246a4349660cad574513d9d5098376b6ab6e4cf763813f9a22"} Mar 13 01:27:09.825666 master-0 kubenswrapper[19170]: I0313 01:27:09.825600 19170 scope.go:117] "RemoveContainer" containerID="44a5d925ffd004246a4349660cad574513d9d5098376b6ab6e4cf763813f9a22" Mar 13 01:27:10.845225 master-0 kubenswrapper[19170]: I0313 01:27:10.845157 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_782d5ee25fbcc8a1fef3f1955932cf63/kube-scheduler/0.log" Mar 13 01:27:10.846015 master-0 kubenswrapper[19170]: I0313 01:27:10.845829 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"782d5ee25fbcc8a1fef3f1955932cf63","Type":"ContainerStarted","Data":"cf2d067eec1f83e1dcefe4c5c22f40ee882fab7a458f56289ba9bb093a91abb7"} Mar 13 01:27:10.846276 master-0 kubenswrapper[19170]: I0313 01:27:10.846201 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 01:27:11.099157 master-0 kubenswrapper[19170]: E0313 01:27:11.098899 19170 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 01:27:11.099157 master-0 kubenswrapper[19170]: E0313 
01:27:11.098972 19170 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 01:27:11.829389 master-0 kubenswrapper[19170]: I0313 01:27:11.829281 19170 patch_prober.go:28] interesting pod/console-864f84b8db-z7bgh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 13 01:27:11.829627 master-0 kubenswrapper[19170]: I0313 01:27:11.829472 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 13 01:27:11.863422 master-0 kubenswrapper[19170]: I0313 01:27:11.863291 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-2slj5_3d2e7338-a6d6-4872-ab72-a4e631075ab3/snapshot-controller/3.log" Mar 13 01:27:11.864331 master-0 kubenswrapper[19170]: I0313 01:27:11.864188 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-2slj5_3d2e7338-a6d6-4872-ab72-a4e631075ab3/snapshot-controller/2.log" Mar 13 01:27:11.864331 master-0 kubenswrapper[19170]: I0313 01:27:11.864239 19170 generic.go:334] "Generic (PLEG): container finished" podID="3d2e7338-a6d6-4872-ab72-a4e631075ab3" containerID="0e73e20a8c061b10b8a7d490c2d00bb7fd7366b7509a1c085dd600ceaa7bf984" exitCode=1 Mar 13 01:27:11.864331 master-0 kubenswrapper[19170]: I0313 01:27:11.864318 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-2slj5" 
event={"ID":"3d2e7338-a6d6-4872-ab72-a4e631075ab3","Type":"ContainerDied","Data":"0e73e20a8c061b10b8a7d490c2d00bb7fd7366b7509a1c085dd600ceaa7bf984"} Mar 13 01:27:11.864608 master-0 kubenswrapper[19170]: I0313 01:27:11.864370 19170 scope.go:117] "RemoveContainer" containerID="9c9bcd7c0bc2c0a8bdfb783091d79282cb414b9c8007f3f521bcaba6d62d5459" Mar 13 01:27:11.865159 master-0 kubenswrapper[19170]: I0313 01:27:11.865096 19170 scope.go:117] "RemoveContainer" containerID="0e73e20a8c061b10b8a7d490c2d00bb7fd7366b7509a1c085dd600ceaa7bf984" Mar 13 01:27:11.865505 master-0 kubenswrapper[19170]: E0313 01:27:11.865446 19170 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=snapshot-controller pod=csi-snapshot-controller-7577d6f48-2slj5_openshift-cluster-storage-operator(3d2e7338-a6d6-4872-ab72-a4e631075ab3)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-2slj5" podUID="3d2e7338-a6d6-4872-ab72-a4e631075ab3" Mar 13 01:27:11.870021 master-0 kubenswrapper[19170]: I0313 01:27:11.869961 19170 generic.go:334] "Generic (PLEG): container finished" podID="6745198c-7559-45e5-af6c-1eb493a0a496" containerID="0144943a2c5123b0b8e5c45e867c4752dde3ef22f384e51497f03888456d16e5" exitCode=0 Mar 13 01:27:11.870948 master-0 kubenswrapper[19170]: I0313 01:27:11.870837 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d8dbf7c4d-v2gdg" event={"ID":"6745198c-7559-45e5-af6c-1eb493a0a496","Type":"ContainerDied","Data":"0144943a2c5123b0b8e5c45e867c4752dde3ef22f384e51497f03888456d16e5"} Mar 13 01:27:11.872593 master-0 kubenswrapper[19170]: I0313 01:27:11.872500 19170 scope.go:117] "RemoveContainer" containerID="0144943a2c5123b0b8e5c45e867c4752dde3ef22f384e51497f03888456d16e5" Mar 13 01:27:12.881472 master-0 kubenswrapper[19170]: I0313 01:27:12.881397 19170 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-2slj5_3d2e7338-a6d6-4872-ab72-a4e631075ab3/snapshot-controller/3.log" Mar 13 01:27:12.884657 master-0 kubenswrapper[19170]: I0313 01:27:12.884551 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d8dbf7c4d-v2gdg" event={"ID":"6745198c-7559-45e5-af6c-1eb493a0a496","Type":"ContainerStarted","Data":"7266693ba64cdad9b5076393334c9ed9a9f30861300cec18c695ce15ce57c9ea"} Mar 13 01:27:12.885139 master-0 kubenswrapper[19170]: I0313 01:27:12.885077 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-d8dbf7c4d-v2gdg" Mar 13 01:27:12.891759 master-0 kubenswrapper[19170]: I0313 01:27:12.891694 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-d8dbf7c4d-v2gdg" Mar 13 01:27:14.919572 master-0 kubenswrapper[19170]: I0313 01:27:14.919471 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body= Mar 13 01:27:14.919572 master-0 kubenswrapper[19170]: I0313 01:27:14.919547 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" Mar 13 01:27:15.923563 master-0 kubenswrapper[19170]: I0313 01:27:15.923482 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_43028f0e2cfc9ffb600b4d08ad84e12d/kube-controller-manager/1.log" Mar 13 01:27:15.925994 master-0 kubenswrapper[19170]: 
I0313 01:27:15.925915 19170 generic.go:334] "Generic (PLEG): container finished" podID="43028f0e2cfc9ffb600b4d08ad84e12d" containerID="682183e6d4ff8fe19dbc6c9d30df58dc630f77f76ed3714d2f40580e0122e18e" exitCode=0 Mar 13 01:27:15.925994 master-0 kubenswrapper[19170]: I0313 01:27:15.925979 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"43028f0e2cfc9ffb600b4d08ad84e12d","Type":"ContainerDied","Data":"682183e6d4ff8fe19dbc6c9d30df58dc630f77f76ed3714d2f40580e0122e18e"} Mar 13 01:27:15.926904 master-0 kubenswrapper[19170]: I0313 01:27:15.926857 19170 scope.go:117] "RemoveContainer" containerID="682183e6d4ff8fe19dbc6c9d30df58dc630f77f76ed3714d2f40580e0122e18e" Mar 13 01:27:16.483123 master-0 kubenswrapper[19170]: I0313 01:27:16.482579 19170 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:27:16.483123 master-0 kubenswrapper[19170]: I0313 01:27:16.482706 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:27:16.483123 master-0 kubenswrapper[19170]: I0313 01:27:16.482788 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:27:16.939958 master-0 kubenswrapper[19170]: I0313 01:27:16.939849 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_43028f0e2cfc9ffb600b4d08ad84e12d/kube-controller-manager/1.log" Mar 13 01:27:16.941623 master-0 kubenswrapper[19170]: I0313 01:27:16.941542 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" 
event={"ID":"43028f0e2cfc9ffb600b4d08ad84e12d","Type":"ContainerStarted","Data":"b13d7faceb5783d6bf9752256a4876b43fc891516b2801fcb45d996d22ee86f4"} Mar 13 01:27:17.954698 master-0 kubenswrapper[19170]: I0313 01:27:17.954600 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-754bdc9f9d-knlw8_33dfdc31-54a4-4249-99ae-a15180514659/machine-approver-controller/0.log" Mar 13 01:27:17.956153 master-0 kubenswrapper[19170]: I0313 01:27:17.956086 19170 generic.go:334] "Generic (PLEG): container finished" podID="33dfdc31-54a4-4249-99ae-a15180514659" containerID="d4128612049d2903866c89ea3ac616fb89c5c7677c3ff52ca9d870714f95087e" exitCode=255 Mar 13 01:27:17.956410 master-0 kubenswrapper[19170]: I0313 01:27:17.956208 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-knlw8" event={"ID":"33dfdc31-54a4-4249-99ae-a15180514659","Type":"ContainerDied","Data":"d4128612049d2903866c89ea3ac616fb89c5c7677c3ff52ca9d870714f95087e"} Mar 13 01:27:17.957709 master-0 kubenswrapper[19170]: I0313 01:27:17.957665 19170 scope.go:117] "RemoveContainer" containerID="d4128612049d2903866c89ea3ac616fb89c5c7677c3ff52ca9d870714f95087e" Mar 13 01:27:18.969530 master-0 kubenswrapper[19170]: I0313 01:27:18.969444 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-754bdc9f9d-knlw8_33dfdc31-54a4-4249-99ae-a15180514659/machine-approver-controller/0.log" Mar 13 01:27:18.970411 master-0 kubenswrapper[19170]: I0313 01:27:18.970156 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-knlw8" event={"ID":"33dfdc31-54a4-4249-99ae-a15180514659","Type":"ContainerStarted","Data":"7ee416c87c8aaca699453795dc83623827a3d0d5c4f66409f30bad7037a816a0"} Mar 13 01:27:21.829350 master-0 kubenswrapper[19170]: I0313 01:27:21.829246 19170 patch_prober.go:28] 
interesting pod/console-864f84b8db-z7bgh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 13 01:27:21.829350 master-0 kubenswrapper[19170]: I0313 01:27:21.829331 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 13 01:27:22.420677 master-0 kubenswrapper[19170]: I0313 01:27:22.419376 19170 scope.go:117] "RemoveContainer" containerID="0e73e20a8c061b10b8a7d490c2d00bb7fd7366b7509a1c085dd600ceaa7bf984" Mar 13 01:27:23.010147 master-0 kubenswrapper[19170]: I0313 01:27:23.010019 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-2slj5_3d2e7338-a6d6-4872-ab72-a4e631075ab3/snapshot-controller/3.log" Mar 13 01:27:23.011014 master-0 kubenswrapper[19170]: I0313 01:27:23.010174 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-2slj5" event={"ID":"3d2e7338-a6d6-4872-ab72-a4e631075ab3","Type":"ContainerStarted","Data":"883ee0eea52d5ba0db2b2d3a5f7ce59b38891a24c758260d3e8baf54e7aa0f30"} Mar 13 01:27:24.474953 master-0 kubenswrapper[19170]: E0313 01:27:24.474731 19170 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 13 01:27:24.919831 master-0 kubenswrapper[19170]: I0313 01:27:24.919765 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console 
namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body= Mar 13 01:27:24.920044 master-0 kubenswrapper[19170]: I0313 01:27:24.919844 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" Mar 13 01:27:26.482590 master-0 kubenswrapper[19170]: I0313 01:27:26.482513 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:27:26.483529 master-0 kubenswrapper[19170]: I0313 01:27:26.482697 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:27:26.607430 master-0 kubenswrapper[19170]: E0313 01:27:26.607182 19170 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{console-864f84b8db-z7bgh.189c42107b47470d openshift-console 15769 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-console,Name:console-864f84b8db-z7bgh,UID:cdc3c693-5b70-44ef-b53d-7a546edd268c,APIVersion:v1,ResourceVersion:14038,FieldPath:spec.containers{console},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:23:11 +0000 UTC,LastTimestamp:2026-03-13 01:24:41.829044951 +0000 UTC m=+342.637165941,Count:10,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:27:29.482832 master-0 kubenswrapper[19170]: I0313 01:27:29.482744 19170 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 01:27:29.482832 master-0 kubenswrapper[19170]: I0313 01:27:29.482824 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 01:27:31.829468 master-0 kubenswrapper[19170]: I0313 01:27:31.829359 19170 patch_prober.go:28] interesting pod/console-864f84b8db-z7bgh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 13 01:27:31.829468 master-0 kubenswrapper[19170]: I0313 01:27:31.829458 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 13 01:27:34.676457 master-0 kubenswrapper[19170]: E0313 01:27:34.676368 19170 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 13 01:27:34.919987 
master-0 kubenswrapper[19170]: I0313 01:27:34.919882 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body= Mar 13 01:27:34.920237 master-0 kubenswrapper[19170]: I0313 01:27:34.920015 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" Mar 13 01:27:35.129243 master-0 kubenswrapper[19170]: I0313 01:27:35.129156 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"c0c827332eaa5a8d6ac97b6c39c773a59f70da4d57ce12a748fe8c5eaaddd932"} Mar 13 01:27:36.151381 master-0 kubenswrapper[19170]: I0313 01:27:36.151207 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"e7b5abd22738604e1c24fc7a424c2e2fdfedc53033207233fa8242e21795fd07"} Mar 13 01:27:36.151381 master-0 kubenswrapper[19170]: I0313 01:27:36.151268 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"3f99aecbd2ad0b75a78dfb5619d7f940275bd794146c8956a0e5dfa1055a21cd"} Mar 13 01:27:36.151381 master-0 kubenswrapper[19170]: I0313 01:27:36.151293 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"6bfdcc245538ed5e5c5682bf0a7b26977c99ed64cb819451ce278697f3c1f8cd"} Mar 13 01:27:37.171102 master-0 
kubenswrapper[19170]: I0313 01:27:37.171017 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"f59179583a4ecbdeb3fc9a357d190301f170eb392ba75ffdd246ae236ffb22cc"} Mar 13 01:27:37.171901 master-0 kubenswrapper[19170]: I0313 01:27:37.171441 19170 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="b7991d30-9ff3-425a-86fa-693c9a353b29" Mar 13 01:27:37.171901 master-0 kubenswrapper[19170]: I0313 01:27:37.171479 19170 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="b7991d30-9ff3-425a-86fa-693c9a353b29" Mar 13 01:27:37.432782 master-0 kubenswrapper[19170]: I0313 01:27:37.432488 19170 status_manager.go:851] "Failed to get status for pod" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods kube-controller-manager-master-0)" Mar 13 01:27:39.484156 master-0 kubenswrapper[19170]: I0313 01:27:39.484057 19170 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 01:27:39.484974 master-0 kubenswrapper[19170]: I0313 01:27:39.484166 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" 
Mar 13 01:27:40.451590 master-0 kubenswrapper[19170]: I0313 01:27:40.451492 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0" Mar 13 01:27:40.451590 master-0 kubenswrapper[19170]: I0313 01:27:40.451565 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Mar 13 01:27:41.475981 master-0 kubenswrapper[19170]: E0313 01:27:41.475876 19170 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 13 01:27:41.829766 master-0 kubenswrapper[19170]: I0313 01:27:41.829547 19170 patch_prober.go:28] interesting pod/console-864f84b8db-z7bgh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 13 01:27:41.829766 master-0 kubenswrapper[19170]: I0313 01:27:41.829662 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 13 01:27:45.448239 master-0 kubenswrapper[19170]: I0313 01:27:45.448171 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body= Mar 13 01:27:45.448804 master-0 kubenswrapper[19170]: I0313 01:27:45.448249 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" 
podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" Mar 13 01:27:45.450454 master-0 kubenswrapper[19170]: I0313 01:27:45.450410 19170 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 01:27:45.450535 master-0 kubenswrapper[19170]: I0313 01:27:45.450500 19170 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 01:27:45.461174 master-0 kubenswrapper[19170]: I0313 01:27:45.461004 19170 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-bqmmf container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 01:27:45.461174 master-0 kubenswrapper[19170]: I0313 01:27:45.461089 19170 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-bqmmf" podUID="d56480e0-0885-41e5-a1fc-931a068fbadb" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 01:27:46.754077 master-0 
kubenswrapper[19170]: I0313 01:27:46.754009 19170 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": read tcp 127.0.0.1:47966->127.0.0.1:10357: read: connection reset by peer" start-of-body= Mar 13 01:27:46.754847 master-0 kubenswrapper[19170]: I0313 01:27:46.754098 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": read tcp 127.0.0.1:47966->127.0.0.1:10357: read: connection reset by peer" Mar 13 01:27:46.754847 master-0 kubenswrapper[19170]: I0313 01:27:46.754175 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:27:46.755717 master-0 kubenswrapper[19170]: I0313 01:27:46.755315 19170 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"b13d7faceb5783d6bf9752256a4876b43fc891516b2801fcb45d996d22ee86f4"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 13 01:27:46.755717 master-0 kubenswrapper[19170]: I0313 01:27:46.755465 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="cluster-policy-controller" containerID="cri-o://b13d7faceb5783d6bf9752256a4876b43fc891516b2801fcb45d996d22ee86f4" gracePeriod=30 Mar 13 01:27:47.469445 master-0 kubenswrapper[19170]: I0313 01:27:47.469401 19170 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_43028f0e2cfc9ffb600b4d08ad84e12d/cluster-policy-controller/1.log" Mar 13 01:27:47.470542 master-0 kubenswrapper[19170]: I0313 01:27:47.470506 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_43028f0e2cfc9ffb600b4d08ad84e12d/kube-controller-manager/1.log" Mar 13 01:27:47.471372 master-0 kubenswrapper[19170]: I0313 01:27:47.471342 19170 generic.go:334] "Generic (PLEG): container finished" podID="43028f0e2cfc9ffb600b4d08ad84e12d" containerID="b13d7faceb5783d6bf9752256a4876b43fc891516b2801fcb45d996d22ee86f4" exitCode=255 Mar 13 01:27:47.471435 master-0 kubenswrapper[19170]: I0313 01:27:47.471381 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"43028f0e2cfc9ffb600b4d08ad84e12d","Type":"ContainerDied","Data":"b13d7faceb5783d6bf9752256a4876b43fc891516b2801fcb45d996d22ee86f4"} Mar 13 01:27:47.471435 master-0 kubenswrapper[19170]: I0313 01:27:47.471418 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"43028f0e2cfc9ffb600b4d08ad84e12d","Type":"ContainerStarted","Data":"c2b7c2cb0043a7acf4d6058427bbcbbdcf69fdb50f1988ca8bbd9592ceaee922"} Mar 13 01:27:47.471502 master-0 kubenswrapper[19170]: I0313 01:27:47.471435 19170 scope.go:117] "RemoveContainer" containerID="682183e6d4ff8fe19dbc6c9d30df58dc630f77f76ed3714d2f40580e0122e18e" Mar 13 01:27:48.482894 master-0 kubenswrapper[19170]: I0313 01:27:48.482801 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_43028f0e2cfc9ffb600b4d08ad84e12d/cluster-policy-controller/1.log" Mar 13 01:27:48.484369 master-0 kubenswrapper[19170]: I0313 01:27:48.484310 19170 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_43028f0e2cfc9ffb600b4d08ad84e12d/kube-controller-manager/1.log" Mar 13 01:27:50.490196 master-0 kubenswrapper[19170]: I0313 01:27:50.490145 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0" Mar 13 01:27:51.829324 master-0 kubenswrapper[19170]: I0313 01:27:51.829222 19170 patch_prober.go:28] interesting pod/console-864f84b8db-z7bgh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 13 01:27:51.829324 master-0 kubenswrapper[19170]: I0313 01:27:51.829301 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 13 01:27:53.529302 master-0 kubenswrapper[19170]: I0313 01:27:53.529230 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-2slj5_3d2e7338-a6d6-4872-ab72-a4e631075ab3/snapshot-controller/4.log" Mar 13 01:27:53.530048 master-0 kubenswrapper[19170]: I0313 01:27:53.529993 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-2slj5_3d2e7338-a6d6-4872-ab72-a4e631075ab3/snapshot-controller/3.log" Mar 13 01:27:53.530142 master-0 kubenswrapper[19170]: I0313 01:27:53.530107 19170 generic.go:334] "Generic (PLEG): container finished" podID="3d2e7338-a6d6-4872-ab72-a4e631075ab3" containerID="883ee0eea52d5ba0db2b2d3a5f7ce59b38891a24c758260d3e8baf54e7aa0f30" exitCode=1 Mar 13 01:27:53.530220 master-0 kubenswrapper[19170]: I0313 01:27:53.530154 19170 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-2slj5" event={"ID":"3d2e7338-a6d6-4872-ab72-a4e631075ab3","Type":"ContainerDied","Data":"883ee0eea52d5ba0db2b2d3a5f7ce59b38891a24c758260d3e8baf54e7aa0f30"} Mar 13 01:27:53.530220 master-0 kubenswrapper[19170]: I0313 01:27:53.530205 19170 scope.go:117] "RemoveContainer" containerID="0e73e20a8c061b10b8a7d490c2d00bb7fd7366b7509a1c085dd600ceaa7bf984" Mar 13 01:27:53.531565 master-0 kubenswrapper[19170]: I0313 01:27:53.531519 19170 scope.go:117] "RemoveContainer" containerID="883ee0eea52d5ba0db2b2d3a5f7ce59b38891a24c758260d3e8baf54e7aa0f30" Mar 13 01:27:53.532875 master-0 kubenswrapper[19170]: E0313 01:27:53.532819 19170 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-7577d6f48-2slj5_openshift-cluster-storage-operator(3d2e7338-a6d6-4872-ab72-a4e631075ab3)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-2slj5" podUID="3d2e7338-a6d6-4872-ab72-a4e631075ab3" Mar 13 01:27:54.541597 master-0 kubenswrapper[19170]: I0313 01:27:54.541499 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-2slj5_3d2e7338-a6d6-4872-ab72-a4e631075ab3/snapshot-controller/4.log" Mar 13 01:27:54.919195 master-0 kubenswrapper[19170]: I0313 01:27:54.919101 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body= Mar 13 01:27:54.919513 master-0 kubenswrapper[19170]: I0313 01:27:54.919198 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" 
podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" Mar 13 01:27:55.477179 master-0 kubenswrapper[19170]: I0313 01:27:55.477094 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0" Mar 13 01:27:56.483110 master-0 kubenswrapper[19170]: I0313 01:27:56.483020 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:27:56.484098 master-0 kubenswrapper[19170]: I0313 01:27:56.483131 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:27:58.888250 master-0 kubenswrapper[19170]: I0313 01:27:58.888182 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:27:58.901227 master-0 kubenswrapper[19170]: I0313 01:27:58.901173 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:28:01.828711 master-0 kubenswrapper[19170]: I0313 01:28:01.828594 19170 patch_prober.go:28] interesting pod/console-864f84b8db-z7bgh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 13 01:28:01.829581 master-0 kubenswrapper[19170]: I0313 01:28:01.828708 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 
13 01:28:01.829581 master-0 kubenswrapper[19170]: I0313 01:28:01.828786 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-864f84b8db-z7bgh" Mar 13 01:28:01.830140 master-0 kubenswrapper[19170]: I0313 01:28:01.829669 19170 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="console" containerStatusID={"Type":"cri-o","ID":"a9b418f438d2c9a6f0a7da4115cd11e67669dbf3f642959fccdba4b61048c8b2"} pod="openshift-console/console-864f84b8db-z7bgh" containerMessage="Container console failed startup probe, will be restarted" Mar 13 01:28:02.627530 master-0 kubenswrapper[19170]: E0313 01:28:02.627454 19170 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command 'sleep 25' exited with 137: " execCommand=["sleep","25"] containerName="console" pod="openshift-console/console-864f84b8db-z7bgh" message="" Mar 13 01:28:02.627530 master-0 kubenswrapper[19170]: E0313 01:28:02.627500 19170 kuberuntime_container.go:691] "PreStop hook failed" err="command 'sleep 25' exited with 137: " pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" containerID="cri-o://a9b418f438d2c9a6f0a7da4115cd11e67669dbf3f642959fccdba4b61048c8b2" Mar 13 01:28:02.627530 master-0 kubenswrapper[19170]: I0313 01:28:02.627540 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" containerID="cri-o://a9b418f438d2c9a6f0a7da4115cd11e67669dbf3f642959fccdba4b61048c8b2" gracePeriod=40 Mar 13 01:28:03.489901 master-0 kubenswrapper[19170]: I0313 01:28:03.489814 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 01:28:03.632817 master-0 kubenswrapper[19170]: I0313 01:28:03.632716 19170 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-864f84b8db-z7bgh_cdc3c693-5b70-44ef-b53d-7a546edd268c/console/0.log" Mar 13 01:28:03.632817 master-0 kubenswrapper[19170]: I0313 01:28:03.632809 19170 generic.go:334] "Generic (PLEG): container finished" podID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerID="a9b418f438d2c9a6f0a7da4115cd11e67669dbf3f642959fccdba4b61048c8b2" exitCode=255 Mar 13 01:28:03.633288 master-0 kubenswrapper[19170]: I0313 01:28:03.632945 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-864f84b8db-z7bgh" event={"ID":"cdc3c693-5b70-44ef-b53d-7a546edd268c","Type":"ContainerDied","Data":"a9b418f438d2c9a6f0a7da4115cd11e67669dbf3f642959fccdba4b61048c8b2"} Mar 13 01:28:03.633288 master-0 kubenswrapper[19170]: I0313 01:28:03.633052 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-864f84b8db-z7bgh" event={"ID":"cdc3c693-5b70-44ef-b53d-7a546edd268c","Type":"ContainerStarted","Data":"0c0f7d8cdd19d7855d3ce85ebbc190290edf109342080e100230975fea55438f"} Mar 13 01:28:04.919404 master-0 kubenswrapper[19170]: I0313 01:28:04.919311 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body= Mar 13 01:28:04.919404 master-0 kubenswrapper[19170]: I0313 01:28:04.919384 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" Mar 13 01:28:06.419509 master-0 kubenswrapper[19170]: I0313 01:28:06.419422 19170 scope.go:117] "RemoveContainer" containerID="883ee0eea52d5ba0db2b2d3a5f7ce59b38891a24c758260d3e8baf54e7aa0f30" Mar 13 01:28:06.420446 master-0 
kubenswrapper[19170]: E0313 01:28:06.419719 19170 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-7577d6f48-2slj5_openshift-cluster-storage-operator(3d2e7338-a6d6-4872-ab72-a4e631075ab3)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-2slj5" podUID="3d2e7338-a6d6-4872-ab72-a4e631075ab3" Mar 13 01:28:11.174449 master-0 kubenswrapper[19170]: E0313 01:28:11.174365 19170 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 13 01:28:11.699210 master-0 kubenswrapper[19170]: I0313 01:28:11.699159 19170 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="b7991d30-9ff3-425a-86fa-693c9a353b29" Mar 13 01:28:11.699517 master-0 kubenswrapper[19170]: I0313 01:28:11.699493 19170 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="b7991d30-9ff3-425a-86fa-693c9a353b29" Mar 13 01:28:11.828475 master-0 kubenswrapper[19170]: I0313 01:28:11.828363 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-864f84b8db-z7bgh" Mar 13 01:28:11.828475 master-0 kubenswrapper[19170]: I0313 01:28:11.828439 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-864f84b8db-z7bgh" Mar 13 01:28:11.829223 master-0 kubenswrapper[19170]: I0313 01:28:11.829172 19170 patch_prober.go:28] interesting pod/console-864f84b8db-z7bgh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 13 01:28:11.829452 master-0 kubenswrapper[19170]: I0313 01:28:11.829407 19170 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 13 01:28:14.919560 master-0 kubenswrapper[19170]: I0313 01:28:14.919466 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body= Mar 13 01:28:14.920555 master-0 kubenswrapper[19170]: I0313 01:28:14.919569 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" Mar 13 01:28:18.419151 master-0 kubenswrapper[19170]: I0313 01:28:18.419089 19170 scope.go:117] "RemoveContainer" containerID="883ee0eea52d5ba0db2b2d3a5f7ce59b38891a24c758260d3e8baf54e7aa0f30" Mar 13 01:28:18.767076 master-0 kubenswrapper[19170]: I0313 01:28:18.766907 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-2slj5_3d2e7338-a6d6-4872-ab72-a4e631075ab3/snapshot-controller/4.log" Mar 13 01:28:18.767076 master-0 kubenswrapper[19170]: I0313 01:28:18.766999 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-2slj5" event={"ID":"3d2e7338-a6d6-4872-ab72-a4e631075ab3","Type":"ContainerStarted","Data":"70ca2dba74fde0b9437ddaecc14438da64a5e920ac39a8c37312cd2708424d13"} Mar 13 01:28:21.828856 master-0 kubenswrapper[19170]: I0313 01:28:21.828777 19170 patch_prober.go:28] interesting pod/console-864f84b8db-z7bgh 
container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 13 01:28:21.829938 master-0 kubenswrapper[19170]: I0313 01:28:21.828870 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 13 01:28:22.809040 master-0 kubenswrapper[19170]: I0313 01:28:22.808944 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_43028f0e2cfc9ffb600b4d08ad84e12d/cluster-policy-controller/1.log" Mar 13 01:28:22.809908 master-0 kubenswrapper[19170]: I0313 01:28:22.809831 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_43028f0e2cfc9ffb600b4d08ad84e12d/kube-controller-manager/2.log" Mar 13 01:28:22.810929 master-0 kubenswrapper[19170]: I0313 01:28:22.810862 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_43028f0e2cfc9ffb600b4d08ad84e12d/kube-controller-manager/1.log" Mar 13 01:28:22.811897 master-0 kubenswrapper[19170]: I0313 01:28:22.811838 19170 generic.go:334] "Generic (PLEG): container finished" podID="43028f0e2cfc9ffb600b4d08ad84e12d" containerID="7ce47737a8d2d9676279ddb2a4db2557a0fbf6f6c9fa25284625c89fdd87d08d" exitCode=1 Mar 13 01:28:22.812027 master-0 kubenswrapper[19170]: I0313 01:28:22.811897 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" 
event={"ID":"43028f0e2cfc9ffb600b4d08ad84e12d","Type":"ContainerDied","Data":"7ce47737a8d2d9676279ddb2a4db2557a0fbf6f6c9fa25284625c89fdd87d08d"} Mar 13 01:28:22.812027 master-0 kubenswrapper[19170]: I0313 01:28:22.811957 19170 scope.go:117] "RemoveContainer" containerID="46c26b717b497322454dcf7c105249ed590c5f0f850f5c9e1de33f73e6f55637" Mar 13 01:28:22.812811 master-0 kubenswrapper[19170]: I0313 01:28:22.812761 19170 scope.go:117] "RemoveContainer" containerID="7ce47737a8d2d9676279ddb2a4db2557a0fbf6f6c9fa25284625c89fdd87d08d" Mar 13 01:28:22.815255 master-0 kubenswrapper[19170]: E0313 01:28:22.815127 19170 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-controller-manager pod=kube-controller-manager-master-0_openshift-kube-controller-manager(43028f0e2cfc9ffb600b4d08ad84e12d)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" Mar 13 01:28:23.824816 master-0 kubenswrapper[19170]: I0313 01:28:23.824738 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_43028f0e2cfc9ffb600b4d08ad84e12d/cluster-policy-controller/1.log" Mar 13 01:28:23.825910 master-0 kubenswrapper[19170]: I0313 01:28:23.825591 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_43028f0e2cfc9ffb600b4d08ad84e12d/kube-controller-manager/2.log" Mar 13 01:28:24.919227 master-0 kubenswrapper[19170]: I0313 01:28:24.919141 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body= Mar 13 01:28:24.919227 master-0 kubenswrapper[19170]: I0313 
01:28:24.919221 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" Mar 13 01:28:26.482922 master-0 kubenswrapper[19170]: I0313 01:28:26.482811 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:28:26.482922 master-0 kubenswrapper[19170]: I0313 01:28:26.482892 19170 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:28:26.482922 master-0 kubenswrapper[19170]: I0313 01:28:26.482917 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:28:26.483968 master-0 kubenswrapper[19170]: I0313 01:28:26.483722 19170 scope.go:117] "RemoveContainer" containerID="7ce47737a8d2d9676279ddb2a4db2557a0fbf6f6c9fa25284625c89fdd87d08d" Mar 13 01:28:26.484320 master-0 kubenswrapper[19170]: E0313 01:28:26.484272 19170 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-controller-manager pod=kube-controller-manager-master-0_openshift-kube-controller-manager(43028f0e2cfc9ffb600b4d08ad84e12d)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" Mar 13 01:28:26.854610 master-0 kubenswrapper[19170]: I0313 01:28:26.854419 19170 scope.go:117] "RemoveContainer" containerID="7ce47737a8d2d9676279ddb2a4db2557a0fbf6f6c9fa25284625c89fdd87d08d" Mar 13 01:28:26.855297 master-0 kubenswrapper[19170]: E0313 01:28:26.855238 19170 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-controller-manager pod=kube-controller-manager-master-0_openshift-kube-controller-manager(43028f0e2cfc9ffb600b4d08ad84e12d)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" Mar 13 01:28:31.828880 master-0 kubenswrapper[19170]: I0313 01:28:31.828724 19170 patch_prober.go:28] interesting pod/console-864f84b8db-z7bgh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 13 01:28:31.828880 master-0 kubenswrapper[19170]: I0313 01:28:31.828808 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 13 01:28:34.920425 master-0 kubenswrapper[19170]: I0313 01:28:34.920184 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body= Mar 13 01:28:34.920425 master-0 kubenswrapper[19170]: I0313 01:28:34.920271 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" Mar 13 01:28:37.434996 master-0 kubenswrapper[19170]: I0313 01:28:37.434911 19170 status_manager.go:851] "Failed to get status for pod" 
podUID="d1153bb3-30dd-458f-b0a4-c05358a8b3f8" pod="openshift-network-node-identity/network-node-identity-znqwc" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods network-node-identity-znqwc)" Mar 13 01:28:41.829282 master-0 kubenswrapper[19170]: I0313 01:28:41.829196 19170 patch_prober.go:28] interesting pod/console-864f84b8db-z7bgh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 13 01:28:41.829282 master-0 kubenswrapper[19170]: I0313 01:28:41.829277 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Mar 13 01:28:42.420063 master-0 kubenswrapper[19170]: I0313 01:28:42.420005 19170 scope.go:117] "RemoveContainer" containerID="7ce47737a8d2d9676279ddb2a4db2557a0fbf6f6c9fa25284625c89fdd87d08d" Mar 13 01:28:43.012132 master-0 kubenswrapper[19170]: I0313 01:28:43.011989 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_43028f0e2cfc9ffb600b4d08ad84e12d/cluster-policy-controller/1.log" Mar 13 01:28:43.012915 master-0 kubenswrapper[19170]: I0313 01:28:43.012439 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_43028f0e2cfc9ffb600b4d08ad84e12d/kube-controller-manager/2.log" Mar 13 01:28:43.013190 master-0 kubenswrapper[19170]: I0313 01:28:43.013152 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" 
event={"ID":"43028f0e2cfc9ffb600b4d08ad84e12d","Type":"ContainerStarted","Data":"755598de3b9faacaa21c1f7d1498ac6b7d167334023e5ad7e7cf8e4e0c71e46e"} Mar 13 01:28:44.919884 master-0 kubenswrapper[19170]: I0313 01:28:44.919782 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body= Mar 13 01:28:44.920754 master-0 kubenswrapper[19170]: I0313 01:28:44.919981 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" Mar 13 01:28:44.920754 master-0 kubenswrapper[19170]: I0313 01:28:44.920076 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6c969fc7db-l2cgv" Mar 13 01:28:44.921172 master-0 kubenswrapper[19170]: I0313 01:28:44.921109 19170 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="console" containerStatusID={"Type":"cri-o","ID":"4b70241fd7a26ba09f8c0188412cee31344d4af634ea5235fa3b30080a5f8a36"} pod="openshift-console/console-6c969fc7db-l2cgv" containerMessage="Container console failed startup probe, will be restarted" Mar 13 01:28:45.702915 master-0 kubenswrapper[19170]: E0313 01:28:45.702818 19170 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 13 01:28:46.483445 master-0 kubenswrapper[19170]: I0313 01:28:46.483338 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:28:46.484347 master-0 
kubenswrapper[19170]: I0313 01:28:46.484037 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:28:46.491020 master-0 kubenswrapper[19170]: I0313 01:28:46.490934 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:28:47.338795 master-0 kubenswrapper[19170]: E0313 01:28:47.338674 19170 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command 'sleep 25' exited with 137: " execCommand=["sleep","25"] containerName="console" pod="openshift-console/console-6c969fc7db-l2cgv" message="" Mar 13 01:28:47.339592 master-0 kubenswrapper[19170]: E0313 01:28:47.338836 19170 kuberuntime_container.go:691] "PreStop hook failed" err="command 'sleep 25' exited with 137: " pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" containerID="cri-o://4b70241fd7a26ba09f8c0188412cee31344d4af634ea5235fa3b30080a5f8a36" Mar 13 01:28:47.339592 master-0 kubenswrapper[19170]: I0313 01:28:47.338914 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" containerID="cri-o://4b70241fd7a26ba09f8c0188412cee31344d4af634ea5235fa3b30080a5f8a36" gracePeriod=38 Mar 13 01:28:48.060901 master-0 kubenswrapper[19170]: I0313 01:28:48.060809 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c969fc7db-l2cgv_5517e790-8931-405e-a113-b4d76156775c/console/0.log" Mar 13 01:28:48.060901 master-0 kubenswrapper[19170]: I0313 01:28:48.060890 19170 generic.go:334] "Generic (PLEG): container finished" podID="5517e790-8931-405e-a113-b4d76156775c" containerID="4b70241fd7a26ba09f8c0188412cee31344d4af634ea5235fa3b30080a5f8a36" exitCode=255 Mar 13 
01:28:48.062005 master-0 kubenswrapper[19170]: I0313 01:28:48.060988 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c969fc7db-l2cgv" event={"ID":"5517e790-8931-405e-a113-b4d76156775c","Type":"ContainerDied","Data":"4b70241fd7a26ba09f8c0188412cee31344d4af634ea5235fa3b30080a5f8a36"} Mar 13 01:28:48.062005 master-0 kubenswrapper[19170]: I0313 01:28:48.061030 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c969fc7db-l2cgv" event={"ID":"5517e790-8931-405e-a113-b4d76156775c","Type":"ContainerStarted","Data":"41d1bd37ad97da541b70877816e85ac012c2da5f02e8d22110de3185141c4de4"} Mar 13 01:28:48.063586 master-0 kubenswrapper[19170]: I0313 01:28:48.063527 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-47sjr_8c6bf2d5-1881-4b63-b247-7e7426707fa1/cluster-baremetal-operator/2.log" Mar 13 01:28:48.064498 master-0 kubenswrapper[19170]: I0313 01:28:48.064448 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-47sjr_8c6bf2d5-1881-4b63-b247-7e7426707fa1/cluster-baremetal-operator/1.log" Mar 13 01:28:48.065089 master-0 kubenswrapper[19170]: I0313 01:28:48.065036 19170 generic.go:334] "Generic (PLEG): container finished" podID="8c6bf2d5-1881-4b63-b247-7e7426707fa1" containerID="82755ddd8d0a56daa9b99df9ea0695a83707b2a72c5516360ce0e95bb8437a37" exitCode=1 Mar 13 01:28:48.065175 master-0 kubenswrapper[19170]: I0313 01:28:48.065087 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" event={"ID":"8c6bf2d5-1881-4b63-b247-7e7426707fa1","Type":"ContainerDied","Data":"82755ddd8d0a56daa9b99df9ea0695a83707b2a72c5516360ce0e95bb8437a37"} Mar 13 01:28:48.065252 master-0 kubenswrapper[19170]: I0313 01:28:48.065188 19170 scope.go:117] "RemoveContainer" 
containerID="3ae103f8a6f884755c35a6a16f2094f5be91f7ff8edc4b19322c844eaa733963" Mar 13 01:28:48.066221 master-0 kubenswrapper[19170]: I0313 01:28:48.066172 19170 scope.go:117] "RemoveContainer" containerID="82755ddd8d0a56daa9b99df9ea0695a83707b2a72c5516360ce0e95bb8437a37" Mar 13 01:28:48.066715 master-0 kubenswrapper[19170]: E0313 01:28:48.066662 19170 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-5cdb4c5598-47sjr_openshift-machine-api(8c6bf2d5-1881-4b63-b247-7e7426707fa1)\"" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" podUID="8c6bf2d5-1881-4b63-b247-7e7426707fa1" Mar 13 01:28:49.088425 master-0 kubenswrapper[19170]: I0313 01:28:49.088307 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-47sjr_8c6bf2d5-1881-4b63-b247-7e7426707fa1/cluster-baremetal-operator/2.log" Mar 13 01:28:49.093690 master-0 kubenswrapper[19170]: I0313 01:28:49.093605 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-2slj5_3d2e7338-a6d6-4872-ab72-a4e631075ab3/snapshot-controller/5.log" Mar 13 01:28:49.095318 master-0 kubenswrapper[19170]: I0313 01:28:49.095287 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-2slj5_3d2e7338-a6d6-4872-ab72-a4e631075ab3/snapshot-controller/4.log" Mar 13 01:28:49.095612 master-0 kubenswrapper[19170]: I0313 01:28:49.095576 19170 generic.go:334] "Generic (PLEG): container finished" podID="3d2e7338-a6d6-4872-ab72-a4e631075ab3" containerID="70ca2dba74fde0b9437ddaecc14438da64a5e920ac39a8c37312cd2708424d13" exitCode=1 Mar 13 01:28:49.095808 master-0 kubenswrapper[19170]: I0313 01:28:49.095696 19170 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-2slj5" event={"ID":"3d2e7338-a6d6-4872-ab72-a4e631075ab3","Type":"ContainerDied","Data":"70ca2dba74fde0b9437ddaecc14438da64a5e920ac39a8c37312cd2708424d13"} Mar 13 01:28:49.095992 master-0 kubenswrapper[19170]: I0313 01:28:49.095969 19170 scope.go:117] "RemoveContainer" containerID="883ee0eea52d5ba0db2b2d3a5f7ce59b38891a24c758260d3e8baf54e7aa0f30" Mar 13 01:28:49.096976 master-0 kubenswrapper[19170]: I0313 01:28:49.096932 19170 scope.go:117] "RemoveContainer" containerID="70ca2dba74fde0b9437ddaecc14438da64a5e920ac39a8c37312cd2708424d13" Mar 13 01:28:49.097355 master-0 kubenswrapper[19170]: E0313 01:28:49.097311 19170 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller pod=csi-snapshot-controller-7577d6f48-2slj5_openshift-cluster-storage-operator(3d2e7338-a6d6-4872-ab72-a4e631075ab3)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-2slj5" podUID="3d2e7338-a6d6-4872-ab72-a4e631075ab3" Mar 13 01:28:50.108053 master-0 kubenswrapper[19170]: I0313 01:28:50.107971 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-2slj5_3d2e7338-a6d6-4872-ab72-a4e631075ab3/snapshot-controller/5.log" Mar 13 01:28:51.828980 master-0 kubenswrapper[19170]: I0313 01:28:51.828885 19170 patch_prober.go:28] interesting pod/console-864f84b8db-z7bgh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Mar 13 01:28:51.829915 master-0 kubenswrapper[19170]: I0313 01:28:51.828990 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-864f84b8db-z7bgh" 
podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused"
Mar 13 01:28:54.918461 master-0 kubenswrapper[19170]: I0313 01:28:54.918251 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6c969fc7db-l2cgv"
Mar 13 01:28:54.918461 master-0 kubenswrapper[19170]: I0313 01:28:54.918429 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6c969fc7db-l2cgv"
Mar 13 01:28:54.920259 master-0 kubenswrapper[19170]: I0313 01:28:54.919091 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body=
Mar 13 01:28:54.920259 master-0 kubenswrapper[19170]: I0313 01:28:54.919197 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused"
Mar 13 01:28:56.489291 master-0 kubenswrapper[19170]: I0313 01:28:56.489244 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 01:28:59.421358 master-0 kubenswrapper[19170]: I0313 01:28:59.421243 19170 scope.go:117] "RemoveContainer" containerID="82755ddd8d0a56daa9b99df9ea0695a83707b2a72c5516360ce0e95bb8437a37"
Mar 13 01:29:00.227520 master-0 kubenswrapper[19170]: I0313 01:29:00.227435 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-47sjr_8c6bf2d5-1881-4b63-b247-7e7426707fa1/cluster-baremetal-operator/2.log"
Mar 13 01:29:00.228676 master-0 kubenswrapper[19170]: I0313 01:29:00.228026 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-47sjr" event={"ID":"8c6bf2d5-1881-4b63-b247-7e7426707fa1","Type":"ContainerStarted","Data":"4f3b4926f304e032d512ab13d9fde5de7f90fe564f9c73a16cc3b9753b479631"}
Mar 13 01:29:00.420025 master-0 kubenswrapper[19170]: I0313 01:29:00.419934 19170 scope.go:117] "RemoveContainer" containerID="70ca2dba74fde0b9437ddaecc14438da64a5e920ac39a8c37312cd2708424d13"
Mar 13 01:29:00.420385 master-0 kubenswrapper[19170]: E0313 01:29:00.420345 19170 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller pod=csi-snapshot-controller-7577d6f48-2slj5_openshift-cluster-storage-operator(3d2e7338-a6d6-4872-ab72-a4e631075ab3)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-2slj5" podUID="3d2e7338-a6d6-4872-ab72-a4e631075ab3"
Mar 13 01:29:01.829851 master-0 kubenswrapper[19170]: I0313 01:29:01.829605 19170 patch_prober.go:28] interesting pod/console-864f84b8db-z7bgh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body=
Mar 13 01:29:01.830750 master-0 kubenswrapper[19170]: I0313 01:29:01.829869 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused"
Mar 13 01:29:04.919697 master-0 kubenswrapper[19170]: I0313 01:29:04.919552 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body=
Mar 13 01:29:04.920510 master-0 kubenswrapper[19170]: I0313 01:29:04.919708 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused"
Mar 13 01:29:11.419374 master-0 kubenswrapper[19170]: I0313 01:29:11.419285 19170 scope.go:117] "RemoveContainer" containerID="70ca2dba74fde0b9437ddaecc14438da64a5e920ac39a8c37312cd2708424d13"
Mar 13 01:29:11.420284 master-0 kubenswrapper[19170]: E0313 01:29:11.419672 19170 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller pod=csi-snapshot-controller-7577d6f48-2slj5_openshift-cluster-storage-operator(3d2e7338-a6d6-4872-ab72-a4e631075ab3)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-2slj5" podUID="3d2e7338-a6d6-4872-ab72-a4e631075ab3"
Mar 13 01:29:11.828763 master-0 kubenswrapper[19170]: I0313 01:29:11.828563 19170 patch_prober.go:28] interesting pod/console-864f84b8db-z7bgh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body=
Mar 13 01:29:11.829181 master-0 kubenswrapper[19170]: I0313 01:29:11.829118 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused"
Mar 13 01:29:14.919825 master-0 kubenswrapper[19170]: I0313 01:29:14.919765 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body=
Mar 13 01:29:14.920973 master-0 kubenswrapper[19170]: I0313 01:29:14.920871 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused"
Mar 13 01:29:21.828648 master-0 kubenswrapper[19170]: I0313 01:29:21.828579 19170 patch_prober.go:28] interesting pod/console-864f84b8db-z7bgh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body=
Mar 13 01:29:21.829240 master-0 kubenswrapper[19170]: I0313 01:29:21.828745 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused"
Mar 13 01:29:24.419355 master-0 kubenswrapper[19170]: I0313 01:29:24.419278 19170 scope.go:117] "RemoveContainer" containerID="70ca2dba74fde0b9437ddaecc14438da64a5e920ac39a8c37312cd2708424d13"
Mar 13 01:29:24.419906 master-0 kubenswrapper[19170]: E0313 01:29:24.419542 19170 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller pod=csi-snapshot-controller-7577d6f48-2slj5_openshift-cluster-storage-operator(3d2e7338-a6d6-4872-ab72-a4e631075ab3)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-2slj5" podUID="3d2e7338-a6d6-4872-ab72-a4e631075ab3"
Mar 13 01:29:24.919080 master-0 kubenswrapper[19170]: I0313 01:29:24.919001 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body=
Mar 13 01:29:24.919382 master-0 kubenswrapper[19170]: I0313 01:29:24.919116 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused"
Mar 13 01:29:28.412099 master-0 kubenswrapper[19170]: I0313 01:29:28.412054 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"]
Mar 13 01:29:28.412746 master-0 kubenswrapper[19170]: E0313 01:29:28.412355 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9db888f0-51b6-43cf-8337-69d2d5cc2b0a" containerName="metrics-server"
Mar 13 01:29:28.412746 master-0 kubenswrapper[19170]: I0313 01:29:28.412367 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="9db888f0-51b6-43cf-8337-69d2d5cc2b0a" containerName="metrics-server"
Mar 13 01:29:28.412746 master-0 kubenswrapper[19170]: E0313 01:29:28.412381 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbbab052-3cba-476d-a74f-edb7f738a73d" containerName="installer"
Mar 13 01:29:28.412746 master-0 kubenswrapper[19170]: I0313 01:29:28.412387 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbbab052-3cba-476d-a74f-edb7f738a73d" containerName="installer"
Mar 13 01:29:28.412746 master-0 kubenswrapper[19170]: E0313 01:29:28.412397 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc96262c-7c20-490b-b90e-d1fba7a26a46" containerName="oauth-openshift"
Mar 13 01:29:28.412746 master-0 kubenswrapper[19170]: I0313 01:29:28.412404 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc96262c-7c20-490b-b90e-d1fba7a26a46" containerName="oauth-openshift"
Mar 13 01:29:28.412746 master-0 kubenswrapper[19170]: E0313 01:29:28.412418 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8c9ec23-a5af-44a1-859a-86629153ae8c" containerName="installer"
Mar 13 01:29:28.412746 master-0 kubenswrapper[19170]: I0313 01:29:28.412423 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8c9ec23-a5af-44a1-859a-86629153ae8c" containerName="installer"
Mar 13 01:29:28.412746 master-0 kubenswrapper[19170]: E0313 01:29:28.412435 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="349df320-5872-4352-8495-d7a5f9c4fc51" containerName="installer"
Mar 13 01:29:28.412746 master-0 kubenswrapper[19170]: I0313 01:29:28.412440 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="349df320-5872-4352-8495-d7a5f9c4fc51" containerName="installer"
Mar 13 01:29:28.412746 master-0 kubenswrapper[19170]: I0313 01:29:28.412559 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbbab052-3cba-476d-a74f-edb7f738a73d" containerName="installer"
Mar 13 01:29:28.412746 master-0 kubenswrapper[19170]: I0313 01:29:28.412577 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8c9ec23-a5af-44a1-859a-86629153ae8c" containerName="installer"
Mar 13 01:29:28.412746 master-0 kubenswrapper[19170]: I0313 01:29:28.412600 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc96262c-7c20-490b-b90e-d1fba7a26a46" containerName="oauth-openshift"
Mar 13 01:29:28.412746 master-0 kubenswrapper[19170]: I0313 01:29:28.412613 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="349df320-5872-4352-8495-d7a5f9c4fc51" containerName="installer"
Mar 13 01:29:28.412746 master-0 kubenswrapper[19170]: I0313 01:29:28.412625 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="9db888f0-51b6-43cf-8337-69d2d5cc2b0a" containerName="metrics-server"
Mar 13 01:29:28.413484 master-0 kubenswrapper[19170]: I0313 01:29:28.413257 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0"
Mar 13 01:29:28.418053 master-0 kubenswrapper[19170]: I0313 01:29:28.417999 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-k6d2j"
Mar 13 01:29:28.418555 master-0 kubenswrapper[19170]: I0313 01:29:28.418529 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 13 01:29:28.438352 master-0 kubenswrapper[19170]: I0313 01:29:28.438293 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"]
Mar 13 01:29:28.555659 master-0 kubenswrapper[19170]: I0313 01:29:28.542290 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/75fcfe7a-43bd-4fc6-98f2-c04bd2db4378-kube-api-access\") pod \"installer-5-master-0\" (UID: \"75fcfe7a-43bd-4fc6-98f2-c04bd2db4378\") " pod="openshift-kube-apiserver/installer-5-master-0"
Mar 13 01:29:28.555659 master-0 kubenswrapper[19170]: I0313 01:29:28.542416 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/75fcfe7a-43bd-4fc6-98f2-c04bd2db4378-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"75fcfe7a-43bd-4fc6-98f2-c04bd2db4378\") " pod="openshift-kube-apiserver/installer-5-master-0"
Mar 13 01:29:28.555659 master-0 kubenswrapper[19170]: I0313 01:29:28.542500 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/75fcfe7a-43bd-4fc6-98f2-c04bd2db4378-var-lock\") pod \"installer-5-master-0\" (UID: \"75fcfe7a-43bd-4fc6-98f2-c04bd2db4378\") " pod="openshift-kube-apiserver/installer-5-master-0"
Mar 13 01:29:28.614218 master-0 kubenswrapper[19170]: I0313 01:29:28.614173 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/metrics-server-5575f756f4-hqr5q"]
Mar 13 01:29:28.617002 master-0 kubenswrapper[19170]: I0313 01:29:28.616955 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/metrics-server-5575f756f4-hqr5q"]
Mar 13 01:29:28.643849 master-0 kubenswrapper[19170]: I0313 01:29:28.643777 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/75fcfe7a-43bd-4fc6-98f2-c04bd2db4378-var-lock\") pod \"installer-5-master-0\" (UID: \"75fcfe7a-43bd-4fc6-98f2-c04bd2db4378\") " pod="openshift-kube-apiserver/installer-5-master-0"
Mar 13 01:29:28.643849 master-0 kubenswrapper[19170]: I0313 01:29:28.643857 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/75fcfe7a-43bd-4fc6-98f2-c04bd2db4378-kube-api-access\") pod \"installer-5-master-0\" (UID: \"75fcfe7a-43bd-4fc6-98f2-c04bd2db4378\") " pod="openshift-kube-apiserver/installer-5-master-0"
Mar 13 01:29:28.644126 master-0 kubenswrapper[19170]: I0313 01:29:28.643899 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/75fcfe7a-43bd-4fc6-98f2-c04bd2db4378-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"75fcfe7a-43bd-4fc6-98f2-c04bd2db4378\") " pod="openshift-kube-apiserver/installer-5-master-0"
Mar 13 01:29:28.644126 master-0 kubenswrapper[19170]: I0313 01:29:28.643998 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/75fcfe7a-43bd-4fc6-98f2-c04bd2db4378-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"75fcfe7a-43bd-4fc6-98f2-c04bd2db4378\") " pod="openshift-kube-apiserver/installer-5-master-0"
Mar 13 01:29:28.644126 master-0 kubenswrapper[19170]: I0313 01:29:28.644033 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/75fcfe7a-43bd-4fc6-98f2-c04bd2db4378-var-lock\") pod \"installer-5-master-0\" (UID: \"75fcfe7a-43bd-4fc6-98f2-c04bd2db4378\") " pod="openshift-kube-apiserver/installer-5-master-0"
Mar 13 01:29:28.655837 master-0 kubenswrapper[19170]: I0313 01:29:28.652930 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-db987b46b-l4pxc"]
Mar 13 01:29:28.661389 master-0 kubenswrapper[19170]: I0313 01:29:28.661333 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-db987b46b-l4pxc"]
Mar 13 01:29:28.670816 master-0 kubenswrapper[19170]: I0313 01:29:28.670738 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/75fcfe7a-43bd-4fc6-98f2-c04bd2db4378-kube-api-access\") pod \"installer-5-master-0\" (UID: \"75fcfe7a-43bd-4fc6-98f2-c04bd2db4378\") " pod="openshift-kube-apiserver/installer-5-master-0"
Mar 13 01:29:28.734975 master-0 kubenswrapper[19170]: I0313 01:29:28.734112 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0"
Mar 13 01:29:28.827600 master-0 kubenswrapper[19170]: I0313 01:29:28.827531 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"]
Mar 13 01:29:28.833705 master-0 kubenswrapper[19170]: I0313 01:29:28.833288 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"]
Mar 13 01:29:29.172908 master-0 kubenswrapper[19170]: I0313 01:29:29.172848 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"]
Mar 13 01:29:29.427856 master-0 kubenswrapper[19170]: I0313 01:29:29.427700 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9db888f0-51b6-43cf-8337-69d2d5cc2b0a" path="/var/lib/kubelet/pods/9db888f0-51b6-43cf-8337-69d2d5cc2b0a/volumes"
Mar 13 01:29:29.428530 master-0 kubenswrapper[19170]: I0313 01:29:29.428187 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbbab052-3cba-476d-a74f-edb7f738a73d" path="/var/lib/kubelet/pods/cbbab052-3cba-476d-a74f-edb7f738a73d/volumes"
Mar 13 01:29:29.428787 master-0 kubenswrapper[19170]: I0313 01:29:29.428741 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc96262c-7c20-490b-b90e-d1fba7a26a46" path="/var/lib/kubelet/pods/dc96262c-7c20-490b-b90e-d1fba7a26a46/volumes"
Mar 13 01:29:29.510544 master-0 kubenswrapper[19170]: I0313 01:29:29.510462 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"75fcfe7a-43bd-4fc6-98f2-c04bd2db4378","Type":"ContainerStarted","Data":"c4ff401e86392f18f9ed5dd6cd96e243c42fffb06ee381209c667c1e46be1107"}
Mar 13 01:29:30.526243 master-0 kubenswrapper[19170]: I0313 01:29:30.526104 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"75fcfe7a-43bd-4fc6-98f2-c04bd2db4378","Type":"ContainerStarted","Data":"4eff1c8e57bf0b7df6931dbb987129e526c39f422436961695f248f714210b27"}
Mar 13 01:29:30.584302 master-0 kubenswrapper[19170]: I0313 01:29:30.584185 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-5-master-0" podStartSLOduration=2.584167145 podStartE2EDuration="2.584167145s" podCreationTimestamp="2026-03-13 01:29:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:29:30.549668271 +0000 UTC m=+631.357789271" watchObservedRunningTime="2026-03-13 01:29:30.584167145 +0000 UTC m=+631.392288115"
Mar 13 01:29:31.828593 master-0 kubenswrapper[19170]: I0313 01:29:31.828532 19170 patch_prober.go:28] interesting pod/console-864f84b8db-z7bgh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body=
Mar 13 01:29:31.829220 master-0 kubenswrapper[19170]: I0313 01:29:31.828613 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused"
Mar 13 01:29:34.920001 master-0 kubenswrapper[19170]: I0313 01:29:34.919267 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body=
Mar 13 01:29:34.920001 master-0 kubenswrapper[19170]: I0313 01:29:34.919321 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused"
Mar 13 01:29:38.419962 master-0 kubenswrapper[19170]: I0313 01:29:38.419921 19170 scope.go:117] "RemoveContainer" containerID="70ca2dba74fde0b9437ddaecc14438da64a5e920ac39a8c37312cd2708424d13"
Mar 13 01:29:39.611403 master-0 kubenswrapper[19170]: I0313 01:29:39.611350 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-2slj5_3d2e7338-a6d6-4872-ab72-a4e631075ab3/snapshot-controller/5.log"
Mar 13 01:29:39.611403 master-0 kubenswrapper[19170]: I0313 01:29:39.611413 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-2slj5" event={"ID":"3d2e7338-a6d6-4872-ab72-a4e631075ab3","Type":"ContainerStarted","Data":"6980e69eee6b7705bff3466218d577a00e85f83919ae9b2bf5fadfb75ac7e869"}
Mar 13 01:29:41.829481 master-0 kubenswrapper[19170]: I0313 01:29:41.829333 19170 patch_prober.go:28] interesting pod/console-864f84b8db-z7bgh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body=
Mar 13 01:29:41.829481 master-0 kubenswrapper[19170]: I0313 01:29:41.829433 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused"
Mar 13 01:29:43.926470 master-0 kubenswrapper[19170]: I0313 01:29:43.926411 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-5-retry-1-master-0"]
Mar 13 01:29:43.927333 master-0 kubenswrapper[19170]: I0313 01:29:43.927303 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-5-retry-1-master-0"
Mar 13 01:29:43.929086 master-0 kubenswrapper[19170]: I0313 01:29:43.929039 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Mar 13 01:29:43.931113 master-0 kubenswrapper[19170]: I0313 01:29:43.931082 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-ctg9b"
Mar 13 01:29:43.941078 master-0 kubenswrapper[19170]: I0313 01:29:43.941031 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-5-retry-1-master-0"]
Mar 13 01:29:44.020737 master-0 kubenswrapper[19170]: I0313 01:29:44.019926 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c4182669-e77a-4513-b19f-3b4bb618162e-kubelet-dir\") pod \"installer-5-retry-1-master-0\" (UID: \"c4182669-e77a-4513-b19f-3b4bb618162e\") " pod="openshift-kube-controller-manager/installer-5-retry-1-master-0"
Mar 13 01:29:44.020737 master-0 kubenswrapper[19170]: I0313 01:29:44.019988 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c4182669-e77a-4513-b19f-3b4bb618162e-kube-api-access\") pod \"installer-5-retry-1-master-0\" (UID: \"c4182669-e77a-4513-b19f-3b4bb618162e\") " pod="openshift-kube-controller-manager/installer-5-retry-1-master-0"
Mar 13 01:29:44.020737 master-0 kubenswrapper[19170]: I0313 01:29:44.020029 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c4182669-e77a-4513-b19f-3b4bb618162e-var-lock\") pod \"installer-5-retry-1-master-0\" (UID: \"c4182669-e77a-4513-b19f-3b4bb618162e\") " pod="openshift-kube-controller-manager/installer-5-retry-1-master-0"
Mar 13 01:29:44.122131 master-0 kubenswrapper[19170]: I0313 01:29:44.122040 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c4182669-e77a-4513-b19f-3b4bb618162e-kube-api-access\") pod \"installer-5-retry-1-master-0\" (UID: \"c4182669-e77a-4513-b19f-3b4bb618162e\") " pod="openshift-kube-controller-manager/installer-5-retry-1-master-0"
Mar 13 01:29:44.122328 master-0 kubenswrapper[19170]: I0313 01:29:44.122183 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c4182669-e77a-4513-b19f-3b4bb618162e-var-lock\") pod \"installer-5-retry-1-master-0\" (UID: \"c4182669-e77a-4513-b19f-3b4bb618162e\") " pod="openshift-kube-controller-manager/installer-5-retry-1-master-0"
Mar 13 01:29:44.122328 master-0 kubenswrapper[19170]: I0313 01:29:44.122292 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c4182669-e77a-4513-b19f-3b4bb618162e-var-lock\") pod \"installer-5-retry-1-master-0\" (UID: \"c4182669-e77a-4513-b19f-3b4bb618162e\") " pod="openshift-kube-controller-manager/installer-5-retry-1-master-0"
Mar 13 01:29:44.122465 master-0 kubenswrapper[19170]: I0313 01:29:44.122428 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c4182669-e77a-4513-b19f-3b4bb618162e-kubelet-dir\") pod \"installer-5-retry-1-master-0\" (UID: \"c4182669-e77a-4513-b19f-3b4bb618162e\") " pod="openshift-kube-controller-manager/installer-5-retry-1-master-0"
Mar 13 01:29:44.122615 master-0 kubenswrapper[19170]: I0313 01:29:44.122578 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c4182669-e77a-4513-b19f-3b4bb618162e-kubelet-dir\") pod \"installer-5-retry-1-master-0\" (UID: \"c4182669-e77a-4513-b19f-3b4bb618162e\") " pod="openshift-kube-controller-manager/installer-5-retry-1-master-0"
Mar 13 01:29:44.137654 master-0 kubenswrapper[19170]: I0313 01:29:44.137572 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c4182669-e77a-4513-b19f-3b4bb618162e-kube-api-access\") pod \"installer-5-retry-1-master-0\" (UID: \"c4182669-e77a-4513-b19f-3b4bb618162e\") " pod="openshift-kube-controller-manager/installer-5-retry-1-master-0"
Mar 13 01:29:44.279307 master-0 kubenswrapper[19170]: I0313 01:29:44.279101 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-5-retry-1-master-0"
Mar 13 01:29:44.806457 master-0 kubenswrapper[19170]: I0313 01:29:44.806392 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-5-retry-1-master-0"]
Mar 13 01:29:44.810025 master-0 kubenswrapper[19170]: W0313 01:29:44.809952 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc4182669_e77a_4513_b19f_3b4bb618162e.slice/crio-a80ddc3f35c7bd562980963a51a8a644b63ee996db08d7cd59c2282d21b50f9b WatchSource:0}: Error finding container a80ddc3f35c7bd562980963a51a8a644b63ee996db08d7cd59c2282d21b50f9b: Status 404 returned error can't find the container with id a80ddc3f35c7bd562980963a51a8a644b63ee996db08d7cd59c2282d21b50f9b
Mar 13 01:29:44.919768 master-0 kubenswrapper[19170]: I0313 01:29:44.919616 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body=
Mar 13 01:29:44.919768 master-0 kubenswrapper[19170]: I0313 01:29:44.919754 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused"
Mar 13 01:29:45.672939 master-0 kubenswrapper[19170]: I0313 01:29:45.672881 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-5-retry-1-master-0" event={"ID":"c4182669-e77a-4513-b19f-3b4bb618162e","Type":"ContainerStarted","Data":"3b061f3f1e787eb1b0fb324ebe42a03b31a399e62cb1927e7607589175a485b1"}
Mar 13 01:29:45.673596 master-0 kubenswrapper[19170]: I0313 01:29:45.673570 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-5-retry-1-master-0" event={"ID":"c4182669-e77a-4513-b19f-3b4bb618162e","Type":"ContainerStarted","Data":"a80ddc3f35c7bd562980963a51a8a644b63ee996db08d7cd59c2282d21b50f9b"}
Mar 13 01:29:45.702098 master-0 kubenswrapper[19170]: I0313 01:29:45.701981 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-5-retry-1-master-0" podStartSLOduration=2.701951799 podStartE2EDuration="2.701951799s" podCreationTimestamp="2026-03-13 01:29:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:29:45.693156163 +0000 UTC m=+646.501277163" watchObservedRunningTime="2026-03-13 01:29:45.701951799 +0000 UTC m=+646.510072799"
Mar 13 01:29:49.423964 master-0 kubenswrapper[19170]: I0313 01:29:49.423732 19170 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="b7991d30-9ff3-425a-86fa-693c9a353b29"
Mar 13 01:29:49.423964 master-0 kubenswrapper[19170]: I0313 01:29:49.423956 19170 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="b7991d30-9ff3-425a-86fa-693c9a353b29"
Mar 13 01:29:49.453650 master-0 kubenswrapper[19170]: I0313 01:29:49.450034 19170 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-etcd/etcd-master-0"
Mar 13 01:29:49.475878 master-0 kubenswrapper[19170]: I0313 01:29:49.475777 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/etcd-master-0"]
Mar 13 01:29:49.485712 master-0 kubenswrapper[19170]: I0313 01:29:49.482017 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-etcd/etcd-master-0"]
Mar 13 01:29:49.486591 master-0 kubenswrapper[19170]: I0313 01:29:49.486538 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0"]
Mar 13 01:29:49.709245 master-0 kubenswrapper[19170]: I0313 01:29:49.709096 19170 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="b7991d30-9ff3-425a-86fa-693c9a353b29"
Mar 13 01:29:49.709245 master-0 kubenswrapper[19170]: I0313 01:29:49.709131 19170 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="b7991d30-9ff3-425a-86fa-693c9a353b29"
Mar 13 01:29:51.829192 master-0 kubenswrapper[19170]: I0313 01:29:51.829119 19170 patch_prober.go:28] interesting pod/console-864f84b8db-z7bgh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body=
Mar 13 01:29:51.829810 master-0 kubenswrapper[19170]: I0313 01:29:51.829200 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused"
Mar 13 01:29:54.919482 master-0 kubenswrapper[19170]: I0313 01:29:54.919395 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body=
Mar 13 01:29:54.920491 master-0 kubenswrapper[19170]: I0313 01:29:54.919508 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused"
Mar 13 01:29:55.135733 master-0 kubenswrapper[19170]: I0313 01:29:55.135579 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-864f84b8db-z7bgh"]
Mar 13 01:29:55.197699 master-0 kubenswrapper[19170]: I0313 01:29:55.197523 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0" podStartSLOduration=6.197504699 podStartE2EDuration="6.197504699s" podCreationTimestamp="2026-03-13 01:29:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:29:55.189882396 +0000 UTC m=+655.998003376" watchObservedRunningTime="2026-03-13 01:29:55.197504699 +0000 UTC m=+656.005625669"
Mar 13 01:30:04.919938 master-0 kubenswrapper[19170]: I0313 01:30:04.919756 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body=
Mar 13 01:30:04.919938 master-0 kubenswrapper[19170]: I0313 01:30:04.919896 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused"
Mar 13 01:30:07.539255 master-0 kubenswrapper[19170]: I0313 01:30:07.539180 19170 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 13 01:30:07.540668 master-0 kubenswrapper[19170]: I0313 01:30:07.540605 19170 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 13 01:30:07.541102 master-0 kubenswrapper[19170]: I0313 01:30:07.541039 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver" containerID="cri-o://0d8e29f7be01ebf5d7b5d718b370b03910c6c8753aa2db240e0eb704aec61199" gracePeriod=15
Mar 13 01:30:07.541270 master-0 kubenswrapper[19170]: I0313 01:30:07.541229 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 01:30:07.542920 master-0 kubenswrapper[19170]: I0313 01:30:07.541693 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-check-endpoints" containerID="cri-o://6a4d890dbbfc910000a615b5523b09e007462310e9f635c66ef6a4133c865e3c" gracePeriod=15
Mar 13 01:30:07.542920 master-0 kubenswrapper[19170]: I0313 01:30:07.541862 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://efdeab99b2245e9d6b33e3c3a1ab63b3ecef2af6627c0f17296a93c127af552a" gracePeriod=15
Mar 13 01:30:07.542920 master-0 kubenswrapper[19170]: I0313 01:30:07.541954 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://ffb771f0d26cb88c120c142068a509a53a6792018a1af856aac41ccbd68ce39b" gracePeriod=15
Mar 13 01:30:07.542920 master-0 kubenswrapper[19170]: I0313 01:30:07.542024 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-cert-syncer" containerID="cri-o://afea8e1223cab11ef744a0add0ec1f18a6c8de0e98e1a3b36b5f58d62c9b92f7" gracePeriod=15
Mar 13 01:30:07.542920 master-0 kubenswrapper[19170]: I0313 01:30:07.542342 19170 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 13 01:30:07.544494 master-0 kubenswrapper[19170]: E0313 01:30:07.542962 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-cert-syncer"
Mar 13 01:30:07.544494 master-0 kubenswrapper[19170]: I0313 01:30:07.542997 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-cert-syncer"
Mar 13 01:30:07.544494 master-0 kubenswrapper[19170]: E0313 01:30:07.543029 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-insecure-readyz"
Mar 13 01:30:07.544494 master-0 kubenswrapper[19170]: I0313 01:30:07.543048 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-insecure-readyz"
Mar 13 01:30:07.544494 master-0 kubenswrapper[19170]: E0313 01:30:07.543087 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077dd10388b9e3e48a07382126e86621" containerName="setup"
Mar 13 01:30:07.544494 master-0 kubenswrapper[19170]: I0313 01:30:07.543105 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="077dd10388b9e3e48a07382126e86621" containerName="setup"
Mar 13 01:30:07.544494 master-0 kubenswrapper[19170]: E0313 01:30:07.543149 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-check-endpoints"
Mar 13 01:30:07.544494 master-0 kubenswrapper[19170]: I0313 01:30:07.543169 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-check-endpoints"
Mar 13 01:30:07.544494 master-0 kubenswrapper[19170]: E0313 01:30:07.543192 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver"
Mar 13 01:30:07.544494 master-0 kubenswrapper[19170]: I0313 01:30:07.543211 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver"
Mar 13 01:30:07.544494 master-0 kubenswrapper[19170]: E0313 01:30:07.543244 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-cert-regeneration-controller"
Mar 13 01:30:07.544494 master-0 kubenswrapper[19170]: I0313 01:30:07.543263 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-cert-regeneration-controller"
Mar 13 01:30:07.544494 master-0 kubenswrapper[19170]: I0313 01:30:07.543677 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-cert-regeneration-controller"
Mar 13 01:30:07.544494 master-0 kubenswrapper[19170]: I0313 01:30:07.543723 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-insecure-readyz"
Mar 13 01:30:07.544494
master-0 kubenswrapper[19170]: I0313 01:30:07.543749 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver" Mar 13 01:30:07.544494 master-0 kubenswrapper[19170]: I0313 01:30:07.543809 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-check-endpoints" Mar 13 01:30:07.544494 master-0 kubenswrapper[19170]: I0313 01:30:07.543842 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-cert-syncer" Mar 13 01:30:07.658373 master-0 kubenswrapper[19170]: I0313 01:30:07.658288 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:30:07.670336 master-0 kubenswrapper[19170]: I0313 01:30:07.669140 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:30:07.670336 master-0 kubenswrapper[19170]: I0313 01:30:07.669227 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:30:07.670336 master-0 kubenswrapper[19170]: 
I0313 01:30:07.669273 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:30:07.670336 master-0 kubenswrapper[19170]: I0313 01:30:07.669346 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:30:07.670336 master-0 kubenswrapper[19170]: I0313 01:30:07.669429 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:30:07.670336 master-0 kubenswrapper[19170]: I0313 01:30:07.669493 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:30:07.670336 master-0 kubenswrapper[19170]: I0313 01:30:07.669584 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-resource-dir\") pod \"kube-apiserver-master-0\" (UID: 
\"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:30:07.773168 master-0 kubenswrapper[19170]: I0313 01:30:07.773116 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:30:07.773309 master-0 kubenswrapper[19170]: I0313 01:30:07.773177 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:30:07.773309 master-0 kubenswrapper[19170]: I0313 01:30:07.773205 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:30:07.773377 master-0 kubenswrapper[19170]: I0313 01:30:07.773305 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:30:07.773479 master-0 kubenswrapper[19170]: I0313 01:30:07.773449 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:30:07.773518 master-0 kubenswrapper[19170]: I0313 01:30:07.773451 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:30:07.773518 master-0 kubenswrapper[19170]: I0313 01:30:07.773505 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:30:07.773590 master-0 kubenswrapper[19170]: I0313 01:30:07.773544 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:30:07.773590 master-0 kubenswrapper[19170]: I0313 01:30:07.773563 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:30:07.773692 master-0 kubenswrapper[19170]: I0313 01:30:07.773602 19170 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:30:07.773692 master-0 kubenswrapper[19170]: I0313 01:30:07.773666 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:30:07.773765 master-0 kubenswrapper[19170]: I0313 01:30:07.773683 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:30:07.773765 master-0 kubenswrapper[19170]: I0313 01:30:07.773736 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:30:07.773834 master-0 kubenswrapper[19170]: I0313 01:30:07.773780 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:30:07.773881 master-0 kubenswrapper[19170]: I0313 01:30:07.773850 
19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:30:07.774088 master-0 kubenswrapper[19170]: I0313 01:30:07.774041 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:30:07.791435 master-0 kubenswrapper[19170]: E0313 01:30:07.791310 19170 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:30:07.791984 master-0 kubenswrapper[19170]: I0313 01:30:07.791907 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:30:07.821561 master-0 kubenswrapper[19170]: W0313 01:30:07.821517 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda814bd60de133d95cf99630a978c017e.slice/crio-8f76f351eca32de6058dd49a4e75e8b7047f6d816f6d45d41717ab074610151b WatchSource:0}: Error finding container 8f76f351eca32de6058dd49a4e75e8b7047f6d816f6d45d41717ab074610151b: Status 404 returned error can't find the container with id 8f76f351eca32de6058dd49a4e75e8b7047f6d816f6d45d41717ab074610151b Mar 13 01:30:07.824538 master-0 kubenswrapper[19170]: E0313 01:30:07.824389 19170 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.189c42715649ac3d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:a814bd60de133d95cf99630a978c017e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:30:07.823539261 +0000 UTC m=+668.631660241,LastTimestamp:2026-03-13 01:30:07.823539261 +0000 UTC m=+668.631660241,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:30:07.878801 master-0 kubenswrapper[19170]: I0313 01:30:07.878759 19170 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_077dd10388b9e3e48a07382126e86621/kube-apiserver-cert-syncer/0.log" Mar 13 01:30:07.879538 master-0 kubenswrapper[19170]: I0313 01:30:07.879505 19170 generic.go:334] "Generic (PLEG): container finished" podID="077dd10388b9e3e48a07382126e86621" containerID="6a4d890dbbfc910000a615b5523b09e007462310e9f635c66ef6a4133c865e3c" exitCode=0 Mar 13 01:30:07.879538 master-0 kubenswrapper[19170]: I0313 01:30:07.879535 19170 generic.go:334] "Generic (PLEG): container finished" podID="077dd10388b9e3e48a07382126e86621" containerID="efdeab99b2245e9d6b33e3c3a1ab63b3ecef2af6627c0f17296a93c127af552a" exitCode=0 Mar 13 01:30:07.879645 master-0 kubenswrapper[19170]: I0313 01:30:07.879546 19170 generic.go:334] "Generic (PLEG): container finished" podID="077dd10388b9e3e48a07382126e86621" containerID="ffb771f0d26cb88c120c142068a509a53a6792018a1af856aac41ccbd68ce39b" exitCode=0 Mar 13 01:30:07.879645 master-0 kubenswrapper[19170]: I0313 01:30:07.879556 19170 generic.go:334] "Generic (PLEG): container finished" podID="077dd10388b9e3e48a07382126e86621" containerID="afea8e1223cab11ef744a0add0ec1f18a6c8de0e98e1a3b36b5f58d62c9b92f7" exitCode=2 Mar 13 01:30:07.881273 master-0 kubenswrapper[19170]: I0313 01:30:07.881209 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"a814bd60de133d95cf99630a978c017e","Type":"ContainerStarted","Data":"8f76f351eca32de6058dd49a4e75e8b7047f6d816f6d45d41717ab074610151b"} Mar 13 01:30:07.883199 master-0 kubenswrapper[19170]: I0313 01:30:07.883146 19170 generic.go:334] "Generic (PLEG): container finished" podID="75fcfe7a-43bd-4fc6-98f2-c04bd2db4378" containerID="4eff1c8e57bf0b7df6931dbb987129e526c39f422436961695f248f714210b27" exitCode=0 Mar 13 01:30:07.883259 master-0 kubenswrapper[19170]: I0313 01:30:07.883199 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"75fcfe7a-43bd-4fc6-98f2-c04bd2db4378","Type":"ContainerDied","Data":"4eff1c8e57bf0b7df6931dbb987129e526c39f422436961695f248f714210b27"} Mar 13 01:30:07.884999 master-0 kubenswrapper[19170]: I0313 01:30:07.884958 19170 status_manager.go:851] "Failed to get status for pod" podUID="077dd10388b9e3e48a07382126e86621" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:30:07.886010 master-0 kubenswrapper[19170]: I0313 01:30:07.885944 19170 status_manager.go:851] "Failed to get status for pod" podUID="75fcfe7a-43bd-4fc6-98f2-c04bd2db4378" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:30:08.896799 master-0 kubenswrapper[19170]: I0313 01:30:08.896693 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"a814bd60de133d95cf99630a978c017e","Type":"ContainerStarted","Data":"c0fe29457932959c68b194aa61b73366ddd4f141247e68871dd5b8341005a8a7"} Mar 13 01:30:08.898389 master-0 kubenswrapper[19170]: E0313 01:30:08.898260 19170 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:30:08.898389 master-0 kubenswrapper[19170]: I0313 01:30:08.898356 19170 status_manager.go:851] "Failed to get status for pod" podUID="75fcfe7a-43bd-4fc6-98f2-c04bd2db4378" pod="openshift-kube-apiserver/installer-5-master-0" err="Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:30:08.899438 master-0 kubenswrapper[19170]: I0313 01:30:08.899366 19170 status_manager.go:851] "Failed to get status for pod" podUID="077dd10388b9e3e48a07382126e86621" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:30:09.423872 master-0 kubenswrapper[19170]: I0313 01:30:09.423816 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Mar 13 01:30:09.428717 master-0 kubenswrapper[19170]: I0313 01:30:09.428463 19170 status_manager.go:851] "Failed to get status for pod" podUID="75fcfe7a-43bd-4fc6-98f2-c04bd2db4378" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:30:09.429788 master-0 kubenswrapper[19170]: I0313 01:30:09.429202 19170 status_manager.go:851] "Failed to get status for pod" podUID="077dd10388b9e3e48a07382126e86621" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:30:09.430084 master-0 kubenswrapper[19170]: I0313 01:30:09.429885 19170 status_manager.go:851] "Failed to get status for pod" podUID="077dd10388b9e3e48a07382126e86621" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 
192.168.32.10:6443: connect: connection refused" Mar 13 01:30:09.430313 master-0 kubenswrapper[19170]: I0313 01:30:09.430284 19170 status_manager.go:851] "Failed to get status for pod" podUID="75fcfe7a-43bd-4fc6-98f2-c04bd2db4378" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:30:09.502499 master-0 kubenswrapper[19170]: I0313 01:30:09.502444 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/75fcfe7a-43bd-4fc6-98f2-c04bd2db4378-kubelet-dir\") pod \"75fcfe7a-43bd-4fc6-98f2-c04bd2db4378\" (UID: \"75fcfe7a-43bd-4fc6-98f2-c04bd2db4378\") " Mar 13 01:30:09.502960 master-0 kubenswrapper[19170]: I0313 01:30:09.502875 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/75fcfe7a-43bd-4fc6-98f2-c04bd2db4378-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "75fcfe7a-43bd-4fc6-98f2-c04bd2db4378" (UID: "75fcfe7a-43bd-4fc6-98f2-c04bd2db4378"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:30:09.502960 master-0 kubenswrapper[19170]: I0313 01:30:09.502910 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/75fcfe7a-43bd-4fc6-98f2-c04bd2db4378-kube-api-access\") pod \"75fcfe7a-43bd-4fc6-98f2-c04bd2db4378\" (UID: \"75fcfe7a-43bd-4fc6-98f2-c04bd2db4378\") " Mar 13 01:30:09.503139 master-0 kubenswrapper[19170]: I0313 01:30:09.503097 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/75fcfe7a-43bd-4fc6-98f2-c04bd2db4378-var-lock\") pod \"75fcfe7a-43bd-4fc6-98f2-c04bd2db4378\" (UID: \"75fcfe7a-43bd-4fc6-98f2-c04bd2db4378\") " Mar 13 01:30:09.503394 master-0 kubenswrapper[19170]: I0313 01:30:09.503338 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/75fcfe7a-43bd-4fc6-98f2-c04bd2db4378-var-lock" (OuterVolumeSpecName: "var-lock") pod "75fcfe7a-43bd-4fc6-98f2-c04bd2db4378" (UID: "75fcfe7a-43bd-4fc6-98f2-c04bd2db4378"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:30:09.504012 master-0 kubenswrapper[19170]: I0313 01:30:09.503971 19170 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/75fcfe7a-43bd-4fc6-98f2-c04bd2db4378-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:30:09.504012 master-0 kubenswrapper[19170]: I0313 01:30:09.504003 19170 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/75fcfe7a-43bd-4fc6-98f2-c04bd2db4378-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 01:30:09.509092 master-0 kubenswrapper[19170]: I0313 01:30:09.509036 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75fcfe7a-43bd-4fc6-98f2-c04bd2db4378-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "75fcfe7a-43bd-4fc6-98f2-c04bd2db4378" (UID: "75fcfe7a-43bd-4fc6-98f2-c04bd2db4378"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:30:09.606303 master-0 kubenswrapper[19170]: I0313 01:30:09.606132 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/75fcfe7a-43bd-4fc6-98f2-c04bd2db4378-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 13 01:30:09.905751 master-0 kubenswrapper[19170]: I0313 01:30:09.905702 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"75fcfe7a-43bd-4fc6-98f2-c04bd2db4378","Type":"ContainerDied","Data":"c4ff401e86392f18f9ed5dd6cd96e243c42fffb06ee381209c667c1e46be1107"} Mar 13 01:30:09.905751 master-0 kubenswrapper[19170]: I0313 01:30:09.905742 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4ff401e86392f18f9ed5dd6cd96e243c42fffb06ee381209c667c1e46be1107" Mar 13 01:30:09.906278 master-0 kubenswrapper[19170]: I0313 01:30:09.905795 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Mar 13 01:30:09.906587 master-0 kubenswrapper[19170]: E0313 01:30:09.906545 19170 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:30:09.961349 master-0 kubenswrapper[19170]: I0313 01:30:09.961295 19170 status_manager.go:851] "Failed to get status for pod" podUID="75fcfe7a-43bd-4fc6-98f2-c04bd2db4378" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:30:10.059575 master-0 kubenswrapper[19170]: I0313 01:30:10.059477 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_077dd10388b9e3e48a07382126e86621/kube-apiserver-cert-syncer/0.log" Mar 13 01:30:10.060931 master-0 kubenswrapper[19170]: I0313 01:30:10.060891 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:30:10.062107 master-0 kubenswrapper[19170]: I0313 01:30:10.062054 19170 status_manager.go:851] "Failed to get status for pod" podUID="75fcfe7a-43bd-4fc6-98f2-c04bd2db4378" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:30:10.062939 master-0 kubenswrapper[19170]: I0313 01:30:10.062864 19170 status_manager.go:851] "Failed to get status for pod" podUID="077dd10388b9e3e48a07382126e86621" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:30:10.113891 master-0 kubenswrapper[19170]: I0313 01:30:10.113841 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-cert-dir\") pod \"077dd10388b9e3e48a07382126e86621\" (UID: \"077dd10388b9e3e48a07382126e86621\") " Mar 13 01:30:10.114041 master-0 kubenswrapper[19170]: I0313 01:30:10.113895 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-audit-dir\") pod \"077dd10388b9e3e48a07382126e86621\" (UID: \"077dd10388b9e3e48a07382126e86621\") " Mar 13 01:30:10.114041 master-0 kubenswrapper[19170]: I0313 01:30:10.114007 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-resource-dir\") pod \"077dd10388b9e3e48a07382126e86621\" (UID: \"077dd10388b9e3e48a07382126e86621\") " Mar 13 01:30:10.114169 master-0 
kubenswrapper[19170]: I0313 01:30:10.114079 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "077dd10388b9e3e48a07382126e86621" (UID: "077dd10388b9e3e48a07382126e86621"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:30:10.114169 master-0 kubenswrapper[19170]: I0313 01:30:10.114116 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "077dd10388b9e3e48a07382126e86621" (UID: "077dd10388b9e3e48a07382126e86621"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:30:10.114433 master-0 kubenswrapper[19170]: I0313 01:30:10.114172 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "077dd10388b9e3e48a07382126e86621" (UID: "077dd10388b9e3e48a07382126e86621"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:30:10.114880 master-0 kubenswrapper[19170]: I0313 01:30:10.114824 19170 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:30:10.114970 master-0 kubenswrapper[19170]: I0313 01:30:10.114881 19170 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:30:10.114970 master-0 kubenswrapper[19170]: I0313 01:30:10.114912 19170 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:30:10.930300 master-0 kubenswrapper[19170]: I0313 01:30:10.930238 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_077dd10388b9e3e48a07382126e86621/kube-apiserver-cert-syncer/0.log" Mar 13 01:30:10.932265 master-0 kubenswrapper[19170]: I0313 01:30:10.932182 19170 generic.go:334] "Generic (PLEG): container finished" podID="077dd10388b9e3e48a07382126e86621" containerID="0d8e29f7be01ebf5d7b5d718b370b03910c6c8753aa2db240e0eb704aec61199" exitCode=0 Mar 13 01:30:10.932422 master-0 kubenswrapper[19170]: I0313 01:30:10.932299 19170 scope.go:117] "RemoveContainer" containerID="6a4d890dbbfc910000a615b5523b09e007462310e9f635c66ef6a4133c865e3c" Mar 13 01:30:10.932422 master-0 kubenswrapper[19170]: I0313 01:30:10.932316 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:30:10.966342 master-0 kubenswrapper[19170]: I0313 01:30:10.965963 19170 scope.go:117] "RemoveContainer" containerID="efdeab99b2245e9d6b33e3c3a1ab63b3ecef2af6627c0f17296a93c127af552a" Mar 13 01:30:10.970343 master-0 kubenswrapper[19170]: I0313 01:30:10.970276 19170 status_manager.go:851] "Failed to get status for pod" podUID="077dd10388b9e3e48a07382126e86621" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:30:10.971468 master-0 kubenswrapper[19170]: I0313 01:30:10.971424 19170 status_manager.go:851] "Failed to get status for pod" podUID="75fcfe7a-43bd-4fc6-98f2-c04bd2db4378" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:30:10.999246 master-0 kubenswrapper[19170]: I0313 01:30:10.999183 19170 scope.go:117] "RemoveContainer" containerID="ffb771f0d26cb88c120c142068a509a53a6792018a1af856aac41ccbd68ce39b" Mar 13 01:30:11.028020 master-0 kubenswrapper[19170]: I0313 01:30:11.027954 19170 scope.go:117] "RemoveContainer" containerID="afea8e1223cab11ef744a0add0ec1f18a6c8de0e98e1a3b36b5f58d62c9b92f7" Mar 13 01:30:11.048124 master-0 kubenswrapper[19170]: I0313 01:30:11.048057 19170 scope.go:117] "RemoveContainer" containerID="0d8e29f7be01ebf5d7b5d718b370b03910c6c8753aa2db240e0eb704aec61199" Mar 13 01:30:11.070276 master-0 kubenswrapper[19170]: I0313 01:30:11.070153 19170 scope.go:117] "RemoveContainer" containerID="309294b76ccf30a6ad7117625866b5cda39484ec8cec95dbb511ad0d3d49a3fa" Mar 13 01:30:11.092902 master-0 kubenswrapper[19170]: I0313 01:30:11.092506 19170 scope.go:117] "RemoveContainer" 
containerID="6a4d890dbbfc910000a615b5523b09e007462310e9f635c66ef6a4133c865e3c" Mar 13 01:30:11.093203 master-0 kubenswrapper[19170]: E0313 01:30:11.093163 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a4d890dbbfc910000a615b5523b09e007462310e9f635c66ef6a4133c865e3c\": container with ID starting with 6a4d890dbbfc910000a615b5523b09e007462310e9f635c66ef6a4133c865e3c not found: ID does not exist" containerID="6a4d890dbbfc910000a615b5523b09e007462310e9f635c66ef6a4133c865e3c" Mar 13 01:30:11.093272 master-0 kubenswrapper[19170]: I0313 01:30:11.093202 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a4d890dbbfc910000a615b5523b09e007462310e9f635c66ef6a4133c865e3c"} err="failed to get container status \"6a4d890dbbfc910000a615b5523b09e007462310e9f635c66ef6a4133c865e3c\": rpc error: code = NotFound desc = could not find container \"6a4d890dbbfc910000a615b5523b09e007462310e9f635c66ef6a4133c865e3c\": container with ID starting with 6a4d890dbbfc910000a615b5523b09e007462310e9f635c66ef6a4133c865e3c not found: ID does not exist" Mar 13 01:30:11.093272 master-0 kubenswrapper[19170]: I0313 01:30:11.093232 19170 scope.go:117] "RemoveContainer" containerID="efdeab99b2245e9d6b33e3c3a1ab63b3ecef2af6627c0f17296a93c127af552a" Mar 13 01:30:11.093667 master-0 kubenswrapper[19170]: E0313 01:30:11.093613 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efdeab99b2245e9d6b33e3c3a1ab63b3ecef2af6627c0f17296a93c127af552a\": container with ID starting with efdeab99b2245e9d6b33e3c3a1ab63b3ecef2af6627c0f17296a93c127af552a not found: ID does not exist" containerID="efdeab99b2245e9d6b33e3c3a1ab63b3ecef2af6627c0f17296a93c127af552a" Mar 13 01:30:11.093737 master-0 kubenswrapper[19170]: I0313 01:30:11.093662 19170 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"efdeab99b2245e9d6b33e3c3a1ab63b3ecef2af6627c0f17296a93c127af552a"} err="failed to get container status \"efdeab99b2245e9d6b33e3c3a1ab63b3ecef2af6627c0f17296a93c127af552a\": rpc error: code = NotFound desc = could not find container \"efdeab99b2245e9d6b33e3c3a1ab63b3ecef2af6627c0f17296a93c127af552a\": container with ID starting with efdeab99b2245e9d6b33e3c3a1ab63b3ecef2af6627c0f17296a93c127af552a not found: ID does not exist" Mar 13 01:30:11.093737 master-0 kubenswrapper[19170]: I0313 01:30:11.093688 19170 scope.go:117] "RemoveContainer" containerID="ffb771f0d26cb88c120c142068a509a53a6792018a1af856aac41ccbd68ce39b" Mar 13 01:30:11.094257 master-0 kubenswrapper[19170]: E0313 01:30:11.094228 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffb771f0d26cb88c120c142068a509a53a6792018a1af856aac41ccbd68ce39b\": container with ID starting with ffb771f0d26cb88c120c142068a509a53a6792018a1af856aac41ccbd68ce39b not found: ID does not exist" containerID="ffb771f0d26cb88c120c142068a509a53a6792018a1af856aac41ccbd68ce39b" Mar 13 01:30:11.094335 master-0 kubenswrapper[19170]: I0313 01:30:11.094261 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffb771f0d26cb88c120c142068a509a53a6792018a1af856aac41ccbd68ce39b"} err="failed to get container status \"ffb771f0d26cb88c120c142068a509a53a6792018a1af856aac41ccbd68ce39b\": rpc error: code = NotFound desc = could not find container \"ffb771f0d26cb88c120c142068a509a53a6792018a1af856aac41ccbd68ce39b\": container with ID starting with ffb771f0d26cb88c120c142068a509a53a6792018a1af856aac41ccbd68ce39b not found: ID does not exist" Mar 13 01:30:11.094335 master-0 kubenswrapper[19170]: I0313 01:30:11.094281 19170 scope.go:117] "RemoveContainer" containerID="afea8e1223cab11ef744a0add0ec1f18a6c8de0e98e1a3b36b5f58d62c9b92f7" Mar 13 01:30:11.094686 master-0 kubenswrapper[19170]: E0313 
01:30:11.094616 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afea8e1223cab11ef744a0add0ec1f18a6c8de0e98e1a3b36b5f58d62c9b92f7\": container with ID starting with afea8e1223cab11ef744a0add0ec1f18a6c8de0e98e1a3b36b5f58d62c9b92f7 not found: ID does not exist" containerID="afea8e1223cab11ef744a0add0ec1f18a6c8de0e98e1a3b36b5f58d62c9b92f7" Mar 13 01:30:11.094754 master-0 kubenswrapper[19170]: I0313 01:30:11.094688 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afea8e1223cab11ef744a0add0ec1f18a6c8de0e98e1a3b36b5f58d62c9b92f7"} err="failed to get container status \"afea8e1223cab11ef744a0add0ec1f18a6c8de0e98e1a3b36b5f58d62c9b92f7\": rpc error: code = NotFound desc = could not find container \"afea8e1223cab11ef744a0add0ec1f18a6c8de0e98e1a3b36b5f58d62c9b92f7\": container with ID starting with afea8e1223cab11ef744a0add0ec1f18a6c8de0e98e1a3b36b5f58d62c9b92f7 not found: ID does not exist" Mar 13 01:30:11.094754 master-0 kubenswrapper[19170]: I0313 01:30:11.094735 19170 scope.go:117] "RemoveContainer" containerID="0d8e29f7be01ebf5d7b5d718b370b03910c6c8753aa2db240e0eb704aec61199" Mar 13 01:30:11.095177 master-0 kubenswrapper[19170]: E0313 01:30:11.095147 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d8e29f7be01ebf5d7b5d718b370b03910c6c8753aa2db240e0eb704aec61199\": container with ID starting with 0d8e29f7be01ebf5d7b5d718b370b03910c6c8753aa2db240e0eb704aec61199 not found: ID does not exist" containerID="0d8e29f7be01ebf5d7b5d718b370b03910c6c8753aa2db240e0eb704aec61199" Mar 13 01:30:11.095236 master-0 kubenswrapper[19170]: I0313 01:30:11.095176 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d8e29f7be01ebf5d7b5d718b370b03910c6c8753aa2db240e0eb704aec61199"} err="failed to get container status 
\"0d8e29f7be01ebf5d7b5d718b370b03910c6c8753aa2db240e0eb704aec61199\": rpc error: code = NotFound desc = could not find container \"0d8e29f7be01ebf5d7b5d718b370b03910c6c8753aa2db240e0eb704aec61199\": container with ID starting with 0d8e29f7be01ebf5d7b5d718b370b03910c6c8753aa2db240e0eb704aec61199 not found: ID does not exist" Mar 13 01:30:11.095236 master-0 kubenswrapper[19170]: I0313 01:30:11.095194 19170 scope.go:117] "RemoveContainer" containerID="309294b76ccf30a6ad7117625866b5cda39484ec8cec95dbb511ad0d3d49a3fa" Mar 13 01:30:11.095496 master-0 kubenswrapper[19170]: E0313 01:30:11.095436 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"309294b76ccf30a6ad7117625866b5cda39484ec8cec95dbb511ad0d3d49a3fa\": container with ID starting with 309294b76ccf30a6ad7117625866b5cda39484ec8cec95dbb511ad0d3d49a3fa not found: ID does not exist" containerID="309294b76ccf30a6ad7117625866b5cda39484ec8cec95dbb511ad0d3d49a3fa" Mar 13 01:30:11.095569 master-0 kubenswrapper[19170]: I0313 01:30:11.095494 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"309294b76ccf30a6ad7117625866b5cda39484ec8cec95dbb511ad0d3d49a3fa"} err="failed to get container status \"309294b76ccf30a6ad7117625866b5cda39484ec8cec95dbb511ad0d3d49a3fa\": rpc error: code = NotFound desc = could not find container \"309294b76ccf30a6ad7117625866b5cda39484ec8cec95dbb511ad0d3d49a3fa\": container with ID starting with 309294b76ccf30a6ad7117625866b5cda39484ec8cec95dbb511ad0d3d49a3fa not found: ID does not exist" Mar 13 01:30:11.432964 master-0 kubenswrapper[19170]: I0313 01:30:11.432911 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="077dd10388b9e3e48a07382126e86621" path="/var/lib/kubelet/pods/077dd10388b9e3e48a07382126e86621/volumes" Mar 13 01:30:11.703556 master-0 kubenswrapper[19170]: E0313 01:30:11.703379 19170 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:30:11.704702 master-0 kubenswrapper[19170]: E0313 01:30:11.704579 19170 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:30:11.705743 master-0 kubenswrapper[19170]: E0313 01:30:11.705670 19170 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:30:11.706494 master-0 kubenswrapper[19170]: E0313 01:30:11.706425 19170 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:30:11.707316 master-0 kubenswrapper[19170]: E0313 01:30:11.707252 19170 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:30:11.707316 master-0 kubenswrapper[19170]: I0313 01:30:11.707301 19170 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 13 01:30:11.708318 master-0 kubenswrapper[19170]: E0313 01:30:11.708246 19170 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: 
connection refused" interval="200ms" Mar 13 01:30:11.910163 master-0 kubenswrapper[19170]: E0313 01:30:11.910045 19170 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms" Mar 13 01:30:12.311535 master-0 kubenswrapper[19170]: E0313 01:30:12.311436 19170 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms" Mar 13 01:30:13.113231 master-0 kubenswrapper[19170]: E0313 01:30:13.113144 19170 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s" Mar 13 01:30:13.215201 master-0 kubenswrapper[19170]: E0313 01:30:13.214974 19170 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.189c42715649ac3d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:a814bd60de133d95cf99630a978c017e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\" already present on 
machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 01:30:07.823539261 +0000 UTC m=+668.631660241,LastTimestamp:2026-03-13 01:30:07.823539261 +0000 UTC m=+668.631660241,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 01:30:14.715598 master-0 kubenswrapper[19170]: E0313 01:30:14.715493 19170 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s" Mar 13 01:30:14.919871 master-0 kubenswrapper[19170]: I0313 01:30:14.919790 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body= Mar 13 01:30:14.919871 master-0 kubenswrapper[19170]: I0313 01:30:14.919870 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" Mar 13 01:30:16.999573 master-0 kubenswrapper[19170]: I0313 01:30:16.999536 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-5-retry-1-master-0_c4182669-e77a-4513-b19f-3b4bb618162e/installer/0.log" Mar 13 01:30:17.000269 master-0 kubenswrapper[19170]: I0313 01:30:17.000240 19170 generic.go:334] "Generic (PLEG): container finished" podID="c4182669-e77a-4513-b19f-3b4bb618162e" containerID="3b061f3f1e787eb1b0fb324ebe42a03b31a399e62cb1927e7607589175a485b1" exitCode=1 Mar 13 01:30:17.000367 master-0 
kubenswrapper[19170]: I0313 01:30:17.000302 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-5-retry-1-master-0" event={"ID":"c4182669-e77a-4513-b19f-3b4bb618162e","Type":"ContainerDied","Data":"3b061f3f1e787eb1b0fb324ebe42a03b31a399e62cb1927e7607589175a485b1"} Mar 13 01:30:17.001750 master-0 kubenswrapper[19170]: I0313 01:30:17.001569 19170 status_manager.go:851] "Failed to get status for pod" podUID="c4182669-e77a-4513-b19f-3b4bb618162e" pod="openshift-kube-controller-manager/installer-5-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-5-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:30:17.003061 master-0 kubenswrapper[19170]: I0313 01:30:17.002986 19170 status_manager.go:851] "Failed to get status for pod" podUID="75fcfe7a-43bd-4fc6-98f2-c04bd2db4378" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:30:17.917374 master-0 kubenswrapper[19170]: E0313 01:30:17.917208 19170 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s" Mar 13 01:30:18.425547 master-0 kubenswrapper[19170]: I0313 01:30:18.425497 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-5-retry-1-master-0_c4182669-e77a-4513-b19f-3b4bb618162e/installer/0.log" Mar 13 01:30:18.426063 master-0 kubenswrapper[19170]: I0313 01:30:18.425582 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-5-retry-1-master-0" Mar 13 01:30:18.426437 master-0 kubenswrapper[19170]: I0313 01:30:18.426386 19170 status_manager.go:851] "Failed to get status for pod" podUID="75fcfe7a-43bd-4fc6-98f2-c04bd2db4378" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:30:18.427147 master-0 kubenswrapper[19170]: I0313 01:30:18.427102 19170 status_manager.go:851] "Failed to get status for pod" podUID="c4182669-e77a-4513-b19f-3b4bb618162e" pod="openshift-kube-controller-manager/installer-5-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-5-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:30:18.456097 master-0 kubenswrapper[19170]: I0313 01:30:18.456032 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c4182669-e77a-4513-b19f-3b4bb618162e-kubelet-dir\") pod \"c4182669-e77a-4513-b19f-3b4bb618162e\" (UID: \"c4182669-e77a-4513-b19f-3b4bb618162e\") " Mar 13 01:30:18.456097 master-0 kubenswrapper[19170]: I0313 01:30:18.456097 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c4182669-e77a-4513-b19f-3b4bb618162e-kube-api-access\") pod \"c4182669-e77a-4513-b19f-3b4bb618162e\" (UID: \"c4182669-e77a-4513-b19f-3b4bb618162e\") " Mar 13 01:30:18.456245 master-0 kubenswrapper[19170]: I0313 01:30:18.456138 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c4182669-e77a-4513-b19f-3b4bb618162e-var-lock\") pod \"c4182669-e77a-4513-b19f-3b4bb618162e\" (UID: 
\"c4182669-e77a-4513-b19f-3b4bb618162e\") " Mar 13 01:30:18.456573 master-0 kubenswrapper[19170]: I0313 01:30:18.456492 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4182669-e77a-4513-b19f-3b4bb618162e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c4182669-e77a-4513-b19f-3b4bb618162e" (UID: "c4182669-e77a-4513-b19f-3b4bb618162e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:30:18.456573 master-0 kubenswrapper[19170]: I0313 01:30:18.456506 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4182669-e77a-4513-b19f-3b4bb618162e-var-lock" (OuterVolumeSpecName: "var-lock") pod "c4182669-e77a-4513-b19f-3b4bb618162e" (UID: "c4182669-e77a-4513-b19f-3b4bb618162e"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:30:18.460406 master-0 kubenswrapper[19170]: I0313 01:30:18.460362 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4182669-e77a-4513-b19f-3b4bb618162e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c4182669-e77a-4513-b19f-3b4bb618162e" (UID: "c4182669-e77a-4513-b19f-3b4bb618162e"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:30:18.558528 master-0 kubenswrapper[19170]: I0313 01:30:18.558392 19170 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c4182669-e77a-4513-b19f-3b4bb618162e-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:30:18.558528 master-0 kubenswrapper[19170]: I0313 01:30:18.558484 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c4182669-e77a-4513-b19f-3b4bb618162e-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 13 01:30:18.558937 master-0 kubenswrapper[19170]: I0313 01:30:18.558572 19170 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c4182669-e77a-4513-b19f-3b4bb618162e-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 01:30:19.038119 master-0 kubenswrapper[19170]: I0313 01:30:19.038075 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-5-retry-1-master-0_c4182669-e77a-4513-b19f-3b4bb618162e/installer/0.log" Mar 13 01:30:19.038579 master-0 kubenswrapper[19170]: I0313 01:30:19.038198 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-5-retry-1-master-0" event={"ID":"c4182669-e77a-4513-b19f-3b4bb618162e","Type":"ContainerDied","Data":"a80ddc3f35c7bd562980963a51a8a644b63ee996db08d7cd59c2282d21b50f9b"} Mar 13 01:30:19.038579 master-0 kubenswrapper[19170]: I0313 01:30:19.038263 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a80ddc3f35c7bd562980963a51a8a644b63ee996db08d7cd59c2282d21b50f9b" Mar 13 01:30:19.038579 master-0 kubenswrapper[19170]: I0313 01:30:19.038331 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-5-retry-1-master-0" Mar 13 01:30:19.057302 master-0 kubenswrapper[19170]: I0313 01:30:19.057218 19170 status_manager.go:851] "Failed to get status for pod" podUID="75fcfe7a-43bd-4fc6-98f2-c04bd2db4378" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:30:19.057965 master-0 kubenswrapper[19170]: I0313 01:30:19.057911 19170 status_manager.go:851] "Failed to get status for pod" podUID="c4182669-e77a-4513-b19f-3b4bb618162e" pod="openshift-kube-controller-manager/installer-5-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-5-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:30:19.418890 master-0 kubenswrapper[19170]: I0313 01:30:19.418825 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:30:19.427162 master-0 kubenswrapper[19170]: I0313 01:30:19.427057 19170 status_manager.go:851] "Failed to get status for pod" podUID="75fcfe7a-43bd-4fc6-98f2-c04bd2db4378" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:30:19.428909 master-0 kubenswrapper[19170]: I0313 01:30:19.428853 19170 status_manager.go:851] "Failed to get status for pod" podUID="c4182669-e77a-4513-b19f-3b4bb618162e" pod="openshift-kube-controller-manager/installer-5-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-5-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:30:19.431300 master-0 kubenswrapper[19170]: I0313 01:30:19.431234 19170 status_manager.go:851] "Failed to get status for pod" podUID="c4182669-e77a-4513-b19f-3b4bb618162e" pod="openshift-kube-controller-manager/installer-5-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-5-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:30:19.432700 master-0 kubenswrapper[19170]: I0313 01:30:19.432554 19170 status_manager.go:851] "Failed to get status for pod" podUID="75fcfe7a-43bd-4fc6-98f2-c04bd2db4378" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:30:19.460035 master-0 kubenswrapper[19170]: I0313 01:30:19.459964 19170 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" 
podUID="9273af3e-d978-486d-97ac-a7eef867cef1" Mar 13 01:30:19.460035 master-0 kubenswrapper[19170]: I0313 01:30:19.460030 19170 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="9273af3e-d978-486d-97ac-a7eef867cef1" Mar 13 01:30:19.461351 master-0 kubenswrapper[19170]: E0313 01:30:19.461273 19170 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:30:19.462090 master-0 kubenswrapper[19170]: I0313 01:30:19.462041 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:30:19.496953 master-0 kubenswrapper[19170]: W0313 01:30:19.496890 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36d4251d3504cdc0ec85144c1379056c.slice/crio-9c7a5760016be50e4d0559b59ca3eaa6f9fab1bd16b86aa8119c31a59e4b46a5 WatchSource:0}: Error finding container 9c7a5760016be50e4d0559b59ca3eaa6f9fab1bd16b86aa8119c31a59e4b46a5: Status 404 returned error can't find the container with id 9c7a5760016be50e4d0559b59ca3eaa6f9fab1bd16b86aa8119c31a59e4b46a5 Mar 13 01:30:20.050349 master-0 kubenswrapper[19170]: I0313 01:30:20.050189 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"36d4251d3504cdc0ec85144c1379056c","Type":"ContainerDied","Data":"abcc00d8af9a9743dcb7e6e0a3fc41a3ae7a5637d3a39f7c796d2c02690cb018"} Mar 13 01:30:20.050727 master-0 kubenswrapper[19170]: I0313 01:30:20.050686 19170 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="9273af3e-d978-486d-97ac-a7eef867cef1" Mar 13 01:30:20.050727 master-0 
kubenswrapper[19170]: I0313 01:30:20.050721 19170 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="9273af3e-d978-486d-97ac-a7eef867cef1" Mar 13 01:30:20.050993 master-0 kubenswrapper[19170]: I0313 01:30:20.050796 19170 generic.go:334] "Generic (PLEG): container finished" podID="36d4251d3504cdc0ec85144c1379056c" containerID="abcc00d8af9a9743dcb7e6e0a3fc41a3ae7a5637d3a39f7c796d2c02690cb018" exitCode=0 Mar 13 01:30:20.051577 master-0 kubenswrapper[19170]: E0313 01:30:20.051514 19170 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:30:20.051577 master-0 kubenswrapper[19170]: I0313 01:30:20.051534 19170 status_manager.go:851] "Failed to get status for pod" podUID="75fcfe7a-43bd-4fc6-98f2-c04bd2db4378" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 01:30:20.051971 master-0 kubenswrapper[19170]: I0313 01:30:20.051120 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"36d4251d3504cdc0ec85144c1379056c","Type":"ContainerStarted","Data":"9c7a5760016be50e4d0559b59ca3eaa6f9fab1bd16b86aa8119c31a59e4b46a5"} Mar 13 01:30:20.052470 master-0 kubenswrapper[19170]: I0313 01:30:20.052391 19170 status_manager.go:851] "Failed to get status for pod" podUID="c4182669-e77a-4513-b19f-3b4bb618162e" pod="openshift-kube-controller-manager/installer-5-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-5-retry-1-master-0\": dial tcp 
192.168.32.10:6443: connect: connection refused"
Mar 13 01:30:20.197187 master-0 kubenswrapper[19170]: I0313 01:30:20.197070 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-864f84b8db-z7bgh" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" containerID="cri-o://0c0f7d8cdd19d7855d3ce85ebbc190290edf109342080e100230975fea55438f" gracePeriod=15
Mar 13 01:30:20.816183 master-0 kubenswrapper[19170]: I0313 01:30:20.816125 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-864f84b8db-z7bgh_cdc3c693-5b70-44ef-b53d-7a546edd268c/console/1.log"
Mar 13 01:30:20.817514 master-0 kubenswrapper[19170]: I0313 01:30:20.817471 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-864f84b8db-z7bgh_cdc3c693-5b70-44ef-b53d-7a546edd268c/console/0.log"
Mar 13 01:30:20.817595 master-0 kubenswrapper[19170]: I0313 01:30:20.817561 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-864f84b8db-z7bgh"
Mar 13 01:30:21.001830 master-0 kubenswrapper[19170]: I0313 01:30:21.001765 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdc3c693-5b70-44ef-b53d-7a546edd268c-trusted-ca-bundle\") pod \"cdc3c693-5b70-44ef-b53d-7a546edd268c\" (UID: \"cdc3c693-5b70-44ef-b53d-7a546edd268c\") "
Mar 13 01:30:21.002116 master-0 kubenswrapper[19170]: I0313 01:30:21.001869 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cdc3c693-5b70-44ef-b53d-7a546edd268c-console-oauth-config\") pod \"cdc3c693-5b70-44ef-b53d-7a546edd268c\" (UID: \"cdc3c693-5b70-44ef-b53d-7a546edd268c\") "
Mar 13 01:30:21.002116 master-0 kubenswrapper[19170]: I0313 01:30:21.002013 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cdc3c693-5b70-44ef-b53d-7a546edd268c-console-serving-cert\") pod \"cdc3c693-5b70-44ef-b53d-7a546edd268c\" (UID: \"cdc3c693-5b70-44ef-b53d-7a546edd268c\") "
Mar 13 01:30:21.002116 master-0 kubenswrapper[19170]: I0313 01:30:21.002085 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cdc3c693-5b70-44ef-b53d-7a546edd268c-oauth-serving-cert\") pod \"cdc3c693-5b70-44ef-b53d-7a546edd268c\" (UID: \"cdc3c693-5b70-44ef-b53d-7a546edd268c\") "
Mar 13 01:30:21.002375 master-0 kubenswrapper[19170]: I0313 01:30:21.002137 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cdc3c693-5b70-44ef-b53d-7a546edd268c-service-ca\") pod \"cdc3c693-5b70-44ef-b53d-7a546edd268c\" (UID: \"cdc3c693-5b70-44ef-b53d-7a546edd268c\") "
Mar 13 01:30:21.002375 master-0 kubenswrapper[19170]: I0313 01:30:21.002205 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6mxx\" (UniqueName: \"kubernetes.io/projected/cdc3c693-5b70-44ef-b53d-7a546edd268c-kube-api-access-r6mxx\") pod \"cdc3c693-5b70-44ef-b53d-7a546edd268c\" (UID: \"cdc3c693-5b70-44ef-b53d-7a546edd268c\") "
Mar 13 01:30:21.002375 master-0 kubenswrapper[19170]: I0313 01:30:21.002273 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cdc3c693-5b70-44ef-b53d-7a546edd268c-console-config\") pod \"cdc3c693-5b70-44ef-b53d-7a546edd268c\" (UID: \"cdc3c693-5b70-44ef-b53d-7a546edd268c\") "
Mar 13 01:30:21.002605 master-0 kubenswrapper[19170]: I0313 01:30:21.002418 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdc3c693-5b70-44ef-b53d-7a546edd268c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "cdc3c693-5b70-44ef-b53d-7a546edd268c" (UID: "cdc3c693-5b70-44ef-b53d-7a546edd268c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 01:30:21.002808 master-0 kubenswrapper[19170]: I0313 01:30:21.002738 19170 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdc3c693-5b70-44ef-b53d-7a546edd268c-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 13 01:30:21.003024 master-0 kubenswrapper[19170]: I0313 01:30:21.002985 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdc3c693-5b70-44ef-b53d-7a546edd268c-service-ca" (OuterVolumeSpecName: "service-ca") pod "cdc3c693-5b70-44ef-b53d-7a546edd268c" (UID: "cdc3c693-5b70-44ef-b53d-7a546edd268c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 01:30:21.003469 master-0 kubenswrapper[19170]: I0313 01:30:21.003405 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdc3c693-5b70-44ef-b53d-7a546edd268c-console-config" (OuterVolumeSpecName: "console-config") pod "cdc3c693-5b70-44ef-b53d-7a546edd268c" (UID: "cdc3c693-5b70-44ef-b53d-7a546edd268c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 01:30:21.003469 master-0 kubenswrapper[19170]: I0313 01:30:21.003408 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdc3c693-5b70-44ef-b53d-7a546edd268c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "cdc3c693-5b70-44ef-b53d-7a546edd268c" (UID: "cdc3c693-5b70-44ef-b53d-7a546edd268c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 01:30:21.007055 master-0 kubenswrapper[19170]: I0313 01:30:21.006995 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdc3c693-5b70-44ef-b53d-7a546edd268c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "cdc3c693-5b70-44ef-b53d-7a546edd268c" (UID: "cdc3c693-5b70-44ef-b53d-7a546edd268c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 01:30:21.007354 master-0 kubenswrapper[19170]: I0313 01:30:21.007070 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdc3c693-5b70-44ef-b53d-7a546edd268c-kube-api-access-r6mxx" (OuterVolumeSpecName: "kube-api-access-r6mxx") pod "cdc3c693-5b70-44ef-b53d-7a546edd268c" (UID: "cdc3c693-5b70-44ef-b53d-7a546edd268c"). InnerVolumeSpecName "kube-api-access-r6mxx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 01:30:21.010855 master-0 kubenswrapper[19170]: I0313 01:30:21.010794 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdc3c693-5b70-44ef-b53d-7a546edd268c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "cdc3c693-5b70-44ef-b53d-7a546edd268c" (UID: "cdc3c693-5b70-44ef-b53d-7a546edd268c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 01:30:21.066209 master-0 kubenswrapper[19170]: I0313 01:30:21.066136 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"36d4251d3504cdc0ec85144c1379056c","Type":"ContainerStarted","Data":"e1a2292a7a10d553b9504c6b508b1fb8000b7701e833fe25479cbac27ffe7518"}
Mar 13 01:30:21.066392 master-0 kubenswrapper[19170]: I0313 01:30:21.066209 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"36d4251d3504cdc0ec85144c1379056c","Type":"ContainerStarted","Data":"f635b84fd2e0a9b51e24757243203650650cd2fc59802d5c75ea53e555663fe3"}
Mar 13 01:30:21.072073 master-0 kubenswrapper[19170]: I0313 01:30:21.072023 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-864f84b8db-z7bgh_cdc3c693-5b70-44ef-b53d-7a546edd268c/console/1.log"
Mar 13 01:30:21.073710 master-0 kubenswrapper[19170]: I0313 01:30:21.073675 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-864f84b8db-z7bgh_cdc3c693-5b70-44ef-b53d-7a546edd268c/console/0.log"
Mar 13 01:30:21.073952 master-0 kubenswrapper[19170]: I0313 01:30:21.073917 19170 generic.go:334] "Generic (PLEG): container finished" podID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerID="0c0f7d8cdd19d7855d3ce85ebbc190290edf109342080e100230975fea55438f" exitCode=2
Mar 13 01:30:21.074105 master-0 kubenswrapper[19170]: I0313 01:30:21.074002 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-864f84b8db-z7bgh"
Mar 13 01:30:21.074394 master-0 kubenswrapper[19170]: I0313 01:30:21.074013 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-864f84b8db-z7bgh" event={"ID":"cdc3c693-5b70-44ef-b53d-7a546edd268c","Type":"ContainerDied","Data":"0c0f7d8cdd19d7855d3ce85ebbc190290edf109342080e100230975fea55438f"}
Mar 13 01:30:21.074518 master-0 kubenswrapper[19170]: I0313 01:30:21.074432 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-864f84b8db-z7bgh" event={"ID":"cdc3c693-5b70-44ef-b53d-7a546edd268c","Type":"ContainerDied","Data":"c02664bd4b8cec5a0f55c19a86a682b2128142272590fc43e989472c25afa98d"}
Mar 13 01:30:21.074518 master-0 kubenswrapper[19170]: I0313 01:30:21.074478 19170 scope.go:117] "RemoveContainer" containerID="0c0f7d8cdd19d7855d3ce85ebbc190290edf109342080e100230975fea55438f"
Mar 13 01:30:21.102795 master-0 kubenswrapper[19170]: I0313 01:30:21.102728 19170 scope.go:117] "RemoveContainer" containerID="a9b418f438d2c9a6f0a7da4115cd11e67669dbf3f642959fccdba4b61048c8b2"
Mar 13 01:30:21.103979 master-0 kubenswrapper[19170]: I0313 01:30:21.103917 19170 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cdc3c693-5b70-44ef-b53d-7a546edd268c-console-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 13 01:30:21.103979 master-0 kubenswrapper[19170]: I0313 01:30:21.103971 19170 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cdc3c693-5b70-44ef-b53d-7a546edd268c-oauth-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 13 01:30:21.104250 master-0 kubenswrapper[19170]: I0313 01:30:21.103993 19170 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cdc3c693-5b70-44ef-b53d-7a546edd268c-service-ca\") on node \"master-0\" DevicePath \"\""
Mar 13 01:30:21.104250 master-0 kubenswrapper[19170]: I0313 01:30:21.104013 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6mxx\" (UniqueName: \"kubernetes.io/projected/cdc3c693-5b70-44ef-b53d-7a546edd268c-kube-api-access-r6mxx\") on node \"master-0\" DevicePath \"\""
Mar 13 01:30:21.104250 master-0 kubenswrapper[19170]: I0313 01:30:21.104030 19170 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cdc3c693-5b70-44ef-b53d-7a546edd268c-console-config\") on node \"master-0\" DevicePath \"\""
Mar 13 01:30:21.104250 master-0 kubenswrapper[19170]: I0313 01:30:21.104048 19170 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cdc3c693-5b70-44ef-b53d-7a546edd268c-console-oauth-config\") on node \"master-0\" DevicePath \"\""
Mar 13 01:30:21.148139 master-0 kubenswrapper[19170]: I0313 01:30:21.148083 19170 scope.go:117] "RemoveContainer" containerID="0c0f7d8cdd19d7855d3ce85ebbc190290edf109342080e100230975fea55438f"
Mar 13 01:30:21.148853 master-0 kubenswrapper[19170]: E0313 01:30:21.148812 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c0f7d8cdd19d7855d3ce85ebbc190290edf109342080e100230975fea55438f\": container with ID starting with 0c0f7d8cdd19d7855d3ce85ebbc190290edf109342080e100230975fea55438f not found: ID does not exist" containerID="0c0f7d8cdd19d7855d3ce85ebbc190290edf109342080e100230975fea55438f"
Mar 13 01:30:21.148934 master-0 kubenswrapper[19170]: I0313 01:30:21.148860 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c0f7d8cdd19d7855d3ce85ebbc190290edf109342080e100230975fea55438f"} err="failed to get container status \"0c0f7d8cdd19d7855d3ce85ebbc190290edf109342080e100230975fea55438f\": rpc error: code = NotFound desc = could not find container \"0c0f7d8cdd19d7855d3ce85ebbc190290edf109342080e100230975fea55438f\": container with ID starting with 0c0f7d8cdd19d7855d3ce85ebbc190290edf109342080e100230975fea55438f not found: ID does not exist"
Mar 13 01:30:21.148934 master-0 kubenswrapper[19170]: I0313 01:30:21.148895 19170 scope.go:117] "RemoveContainer" containerID="a9b418f438d2c9a6f0a7da4115cd11e67669dbf3f642959fccdba4b61048c8b2"
Mar 13 01:30:21.150552 master-0 kubenswrapper[19170]: E0313 01:30:21.150499 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9b418f438d2c9a6f0a7da4115cd11e67669dbf3f642959fccdba4b61048c8b2\": container with ID starting with a9b418f438d2c9a6f0a7da4115cd11e67669dbf3f642959fccdba4b61048c8b2 not found: ID does not exist" containerID="a9b418f438d2c9a6f0a7da4115cd11e67669dbf3f642959fccdba4b61048c8b2"
Mar 13 01:30:21.150675 master-0 kubenswrapper[19170]: I0313 01:30:21.150548 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9b418f438d2c9a6f0a7da4115cd11e67669dbf3f642959fccdba4b61048c8b2"} err="failed to get container status \"a9b418f438d2c9a6f0a7da4115cd11e67669dbf3f642959fccdba4b61048c8b2\": rpc error: code = NotFound desc = could not find container \"a9b418f438d2c9a6f0a7da4115cd11e67669dbf3f642959fccdba4b61048c8b2\": container with ID starting with a9b418f438d2c9a6f0a7da4115cd11e67669dbf3f642959fccdba4b61048c8b2 not found: ID does not exist"
Mar 13 01:30:22.104766 master-0 kubenswrapper[19170]: I0313 01:30:22.104691 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"36d4251d3504cdc0ec85144c1379056c","Type":"ContainerStarted","Data":"b1fbfa30256733804f73852bbc2a900999b44a7ab8239a2de7b9bd6e12a5e22c"}
Mar 13 01:30:22.104766 master-0 kubenswrapper[19170]: I0313 01:30:22.104742 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"36d4251d3504cdc0ec85144c1379056c","Type":"ContainerStarted","Data":"dbfb750629281428073e8ae8a558011b13087f8e21a8a99f4986c7c00c3ba60f"}
Mar 13 01:30:22.104766 master-0 kubenswrapper[19170]: I0313 01:30:22.104757 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"36d4251d3504cdc0ec85144c1379056c","Type":"ContainerStarted","Data":"5a403c1cb2bf9030bae4e9f54037a191aa0bf29866fea2c70b396a808423c31e"}
Mar 13 01:30:22.105728 master-0 kubenswrapper[19170]: I0313 01:30:22.105097 19170 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="9273af3e-d978-486d-97ac-a7eef867cef1"
Mar 13 01:30:22.105728 master-0 kubenswrapper[19170]: I0313 01:30:22.105125 19170 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="9273af3e-d978-486d-97ac-a7eef867cef1"
Mar 13 01:30:22.105728 master-0 kubenswrapper[19170]: I0313 01:30:22.105269 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 01:30:23.125747 master-0 kubenswrapper[19170]: I0313 01:30:23.124967 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_43028f0e2cfc9ffb600b4d08ad84e12d/kube-controller-manager/3.log"
Mar 13 01:30:23.126568 master-0 kubenswrapper[19170]: I0313 01:30:23.126236 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_43028f0e2cfc9ffb600b4d08ad84e12d/cluster-policy-controller/1.log"
Mar 13 01:30:23.129354 master-0 kubenswrapper[19170]: I0313 01:30:23.127016 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_43028f0e2cfc9ffb600b4d08ad84e12d/kube-controller-manager/2.log"
Mar 13 01:30:23.135615 master-0 kubenswrapper[19170]: I0313 01:30:23.134419 19170 generic.go:334] "Generic (PLEG): container finished" podID="43028f0e2cfc9ffb600b4d08ad84e12d" containerID="755598de3b9faacaa21c1f7d1498ac6b7d167334023e5ad7e7cf8e4e0c71e46e" exitCode=1
Mar 13 01:30:23.135615 master-0 kubenswrapper[19170]: I0313 01:30:23.134460 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"43028f0e2cfc9ffb600b4d08ad84e12d","Type":"ContainerDied","Data":"755598de3b9faacaa21c1f7d1498ac6b7d167334023e5ad7e7cf8e4e0c71e46e"}
Mar 13 01:30:23.135615 master-0 kubenswrapper[19170]: I0313 01:30:23.134501 19170 scope.go:117] "RemoveContainer" containerID="7ce47737a8d2d9676279ddb2a4db2557a0fbf6f6c9fa25284625c89fdd87d08d"
Mar 13 01:30:23.135615 master-0 kubenswrapper[19170]: I0313 01:30:23.135119 19170 scope.go:117] "RemoveContainer" containerID="755598de3b9faacaa21c1f7d1498ac6b7d167334023e5ad7e7cf8e4e0c71e46e"
Mar 13 01:30:23.135615 master-0 kubenswrapper[19170]: E0313 01:30:23.135434 19170 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-controller-manager pod=kube-controller-manager-master-0_openshift-kube-controller-manager(43028f0e2cfc9ffb600b4d08ad84e12d)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="43028f0e2cfc9ffb600b4d08ad84e12d"
Mar 13 01:30:24.148912 master-0 kubenswrapper[19170]: I0313 01:30:24.148845 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_43028f0e2cfc9ffb600b4d08ad84e12d/kube-controller-manager/3.log"
Mar 13 01:30:24.150324 master-0 kubenswrapper[19170]: I0313 01:30:24.150273 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_43028f0e2cfc9ffb600b4d08ad84e12d/cluster-policy-controller/1.log"
Mar 13 01:30:24.462731 master-0 kubenswrapper[19170]: I0313 01:30:24.462663 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 01:30:24.463052 master-0 kubenswrapper[19170]: I0313 01:30:24.462864 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 01:30:24.471763 master-0 kubenswrapper[19170]: I0313 01:30:24.471659 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 01:30:24.919844 master-0 kubenswrapper[19170]: I0313 01:30:24.919767 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body=
Mar 13 01:30:24.920307 master-0 kubenswrapper[19170]: I0313 01:30:24.919873 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused"
Mar 13 01:30:26.483716 master-0 kubenswrapper[19170]: I0313 01:30:26.483661 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 01:30:26.484711 master-0 kubenswrapper[19170]: I0313 01:30:26.484672 19170 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 01:30:26.485214 master-0 kubenswrapper[19170]: I0313 01:30:26.485181 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 01:30:26.485527 master-0 kubenswrapper[19170]: I0313 01:30:26.485448 19170 scope.go:117] "RemoveContainer" containerID="755598de3b9faacaa21c1f7d1498ac6b7d167334023e5ad7e7cf8e4e0c71e46e"
Mar 13 01:30:26.486135 master-0 kubenswrapper[19170]: E0313 01:30:26.486068 19170 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-controller-manager pod=kube-controller-manager-master-0_openshift-kube-controller-manager(43028f0e2cfc9ffb600b4d08ad84e12d)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="43028f0e2cfc9ffb600b4d08ad84e12d"
Mar 13 01:30:27.158669 master-0 kubenswrapper[19170]: I0313 01:30:27.158196 19170 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 01:30:27.188434 master-0 kubenswrapper[19170]: I0313 01:30:27.188373 19170 scope.go:117] "RemoveContainer" containerID="755598de3b9faacaa21c1f7d1498ac6b7d167334023e5ad7e7cf8e4e0c71e46e"
Mar 13 01:30:27.189086 master-0 kubenswrapper[19170]: E0313 01:30:27.189014 19170 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-controller-manager pod=kube-controller-manager-master-0_openshift-kube-controller-manager(43028f0e2cfc9ffb600b4d08ad84e12d)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="43028f0e2cfc9ffb600b4d08ad84e12d"
Mar 13 01:30:27.242480 master-0 kubenswrapper[19170]: I0313 01:30:27.242367 19170 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="36d4251d3504cdc0ec85144c1379056c" podUID="ffe11483-024e-4c22-a052-5e890a263840"
Mar 13 01:30:28.198149 master-0 kubenswrapper[19170]: I0313 01:30:28.198057 19170 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="9273af3e-d978-486d-97ac-a7eef867cef1"
Mar 13 01:30:28.198149 master-0 kubenswrapper[19170]: I0313 01:30:28.198110 19170 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="9273af3e-d978-486d-97ac-a7eef867cef1"
Mar 13 01:30:29.445251 master-0 kubenswrapper[19170]: I0313 01:30:29.445124 19170 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="36d4251d3504cdc0ec85144c1379056c" podUID="ffe11483-024e-4c22-a052-5e890a263840"
Mar 13 01:30:34.919564 master-0 kubenswrapper[19170]: I0313 01:30:34.919449 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body=
Mar 13 01:30:34.920721 master-0 kubenswrapper[19170]: I0313 01:30:34.919561 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused"
Mar 13 01:30:37.159573 master-0 kubenswrapper[19170]: I0313 01:30:37.159501 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 13 01:30:37.770028 master-0 kubenswrapper[19170]: I0313 01:30:37.769975 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt"
Mar 13 01:30:37.907582 master-0 kubenswrapper[19170]: I0313 01:30:37.907522 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 13 01:30:38.416457 master-0 kubenswrapper[19170]: I0313 01:30:38.416391 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 13 01:30:38.522979 master-0 kubenswrapper[19170]: I0313 01:30:38.522884 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls"
Mar 13 01:30:38.535041 master-0 kubenswrapper[19170]: I0313 01:30:38.534983 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 13 01:30:38.654298 master-0 kubenswrapper[19170]: I0313 01:30:38.654217 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-6h5r7"
Mar 13 01:30:38.861279 master-0 kubenswrapper[19170]: I0313 01:30:38.861161 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 13 01:30:38.871190 master-0 kubenswrapper[19170]: I0313 01:30:38.871168 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls"
Mar 13 01:30:39.056050 master-0 kubenswrapper[19170]: I0313 01:30:39.055982 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-7wmj8"
Mar 13 01:30:39.140923 master-0 kubenswrapper[19170]: I0313 01:30:39.140782 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0"
Mar 13 01:30:39.182014 master-0 kubenswrapper[19170]: I0313 01:30:39.181949 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config"
Mar 13 01:30:39.246890 master-0 kubenswrapper[19170]: I0313 01:30:39.246617 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 13 01:30:39.323581 master-0 kubenswrapper[19170]: I0313 01:30:39.323526 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 13 01:30:39.438019 master-0 kubenswrapper[19170]: I0313 01:30:39.437930 19170 scope.go:117] "RemoveContainer" containerID="755598de3b9faacaa21c1f7d1498ac6b7d167334023e5ad7e7cf8e4e0c71e46e"
Mar 13 01:30:39.438828 master-0 kubenswrapper[19170]: E0313 01:30:39.438467 19170 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-controller-manager pod=kube-controller-manager-master-0_openshift-kube-controller-manager(43028f0e2cfc9ffb600b4d08ad84e12d)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="43028f0e2cfc9ffb600b4d08ad84e12d"
Mar 13 01:30:39.542021 master-0 kubenswrapper[19170]: I0313 01:30:39.541916 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 13 01:30:39.577043 master-0 kubenswrapper[19170]: I0313 01:30:39.576967 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 13 01:30:39.634102 master-0 kubenswrapper[19170]: I0313 01:30:39.632911 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 13 01:30:39.967972 master-0 kubenswrapper[19170]: I0313 01:30:39.967851 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 13 01:30:40.290770 master-0 kubenswrapper[19170]: I0313 01:30:40.287202 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 13 01:30:40.310581 master-0 kubenswrapper[19170]: I0313 01:30:40.310500 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics"
Mar 13 01:30:40.313753 master-0 kubenswrapper[19170]: I0313 01:30:40.313685 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config"
Mar 13 01:30:40.344853 master-0 kubenswrapper[19170]: I0313 01:30:40.344807 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config"
Mar 13 01:30:40.434762 master-0 kubenswrapper[19170]: I0313 01:30:40.434104 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file"
Mar 13 01:30:40.466180 master-0 kubenswrapper[19170]: I0313 01:30:40.466094 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 13 01:30:40.525686 master-0 kubenswrapper[19170]: I0313 01:30:40.525597 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 13 01:30:40.584533 master-0 kubenswrapper[19170]: I0313 01:30:40.584334 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 13 01:30:40.655163 master-0 kubenswrapper[19170]: I0313 01:30:40.655086 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-tls"
Mar 13 01:30:40.790563 master-0 kubenswrapper[19170]: I0313 01:30:40.790479 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles"
Mar 13 01:30:40.809433 master-0 kubenswrapper[19170]: I0313 01:30:40.809374 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 13 01:30:40.929693 master-0 kubenswrapper[19170]: I0313 01:30:40.928486 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 13 01:30:41.080511 master-0 kubenswrapper[19170]: I0313 01:30:41.080459 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images"
Mar 13 01:30:41.097876 master-0 kubenswrapper[19170]: I0313 01:30:41.097816 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 13 01:30:41.207695 master-0 kubenswrapper[19170]: I0313 01:30:41.207527 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web"
Mar 13 01:30:41.231854 master-0 kubenswrapper[19170]: I0313 01:30:41.231790 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 13 01:30:41.302600 master-0 kubenswrapper[19170]: I0313 01:30:41.302537 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 13 01:30:41.373874 master-0 kubenswrapper[19170]: I0313 01:30:41.373819 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 13 01:30:41.451930 master-0 kubenswrapper[19170]: I0313 01:30:41.451856 19170 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 13 01:30:41.479446 master-0 kubenswrapper[19170]: I0313 01:30:41.479293 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 13 01:30:41.555737 master-0 kubenswrapper[19170]: I0313 01:30:41.555471 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 13 01:30:41.607105 master-0 kubenswrapper[19170]: I0313 01:30:41.607049 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 13 01:30:41.766703 master-0 kubenswrapper[19170]: I0313 01:30:41.766560 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 13 01:30:41.822981 master-0 kubenswrapper[19170]: I0313 01:30:41.822912 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls"
Mar 13 01:30:41.897675 master-0 kubenswrapper[19170]: I0313 01:30:41.897507 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 13 01:30:41.953447 master-0 kubenswrapper[19170]: I0313 01:30:41.953342 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 13 01:30:41.993481 master-0 kubenswrapper[19170]: I0313 01:30:41.993380 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls"
Mar 13 01:30:42.048840 master-0 kubenswrapper[19170]: I0313 01:30:42.048705 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt"
Mar 13 01:30:42.065778 master-0 kubenswrapper[19170]: I0313 01:30:42.065720 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls"
Mar 13 01:30:42.153662 master-0 kubenswrapper[19170]: I0313 01:30:42.153001 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 13 01:30:42.187325 master-0 kubenswrapper[19170]: I0313 01:30:42.187265 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-s8vsn"
Mar 13 01:30:42.208515 master-0 kubenswrapper[19170]: I0313 01:30:42.208461 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator"
Mar 13 01:30:42.294090 master-0 kubenswrapper[19170]: I0313 01:30:42.294019 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 13 01:30:42.321459 master-0 kubenswrapper[19170]: I0313 01:30:42.321313 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 13 01:30:42.326242 master-0 kubenswrapper[19170]: I0313 01:30:42.326200 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 13 01:30:42.392163 master-0 kubenswrapper[19170]: I0313 01:30:42.392116 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 13 01:30:42.392789 master-0 kubenswrapper[19170]: I0313 01:30:42.392733 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 13 01:30:42.425058 master-0 kubenswrapper[19170]: I0313 01:30:42.425010 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 13 01:30:42.687665 master-0 kubenswrapper[19170]: I0313 01:30:42.687571 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 13 01:30:42.688526 master-0 kubenswrapper[19170]: I0313 01:30:42.688230 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 13 01:30:42.725298 master-0 kubenswrapper[19170]: I0313 01:30:42.725167 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-kube-rbac-proxy-config"
Mar 13 01:30:42.725830 master-0 kubenswrapper[19170]: I0313 01:30:42.725718 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy"
Mar 13 01:30:42.747461 master-0 kubenswrapper[19170]: I0313 01:30:42.747366 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 13 01:30:42.835529 master-0 kubenswrapper[19170]: I0313 01:30:42.835441 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 13 01:30:42.992058 master-0 kubenswrapper[19170]: I0313 01:30:42.991918 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 13 01:30:43.004444 master-0 kubenswrapper[19170]: I0313 01:30:43.004398 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 13 01:30:43.006327 master-0 kubenswrapper[19170]: I0313 01:30:43.006271 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert"
Mar 13 01:30:43.182163 master-0 kubenswrapper[19170]: I0313 01:30:43.182107 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-cg3teed5h7t4o"
Mar 13 01:30:43.190553 master-0 kubenswrapper[19170]: I0313 01:30:43.190502 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 13 01:30:43.201677 master-0 kubenswrapper[19170]: I0313 01:30:43.201594 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 13 01:30:43.230404 master-0 kubenswrapper[19170]: I0313 01:30:43.230054 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 13 01:30:43.253944 master-0 kubenswrapper[19170]: I0313 01:30:43.252840 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 13 01:30:43.390800 master-0 kubenswrapper[19170]: I0313 01:30:43.390680 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 13 01:30:43.394371 master-0 kubenswrapper[19170]: I0313 01:30:43.394322 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt"
Mar 13 01:30:43.420616 master-0 kubenswrapper[19170]: I0313 01:30:43.420560 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 13 01:30:43.660175 master-0 kubenswrapper[19170]: I0313 01:30:43.660021 19170 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 13 01:30:43.727540 master-0 kubenswrapper[19170]: I0313 01:30:43.727453 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 13 01:30:43.731123 master-0 kubenswrapper[19170]: I0313 01:30:43.731073 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt"
Mar 13 01:30:43.768174 master-0 kubenswrapper[19170]: I0313 01:30:43.768071 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-6b6t5"
Mar 13 01:30:43.772885 master-0 kubenswrapper[19170]: I0313 01:30:43.772828 19170 reflector.go:368] Caches populated for *v1.Secret from
object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Mar 13 01:30:43.778984 master-0 kubenswrapper[19170]: I0313 01:30:43.778922 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Mar 13 01:30:43.838156 master-0 kubenswrapper[19170]: I0313 01:30:43.838073 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert" Mar 13 01:30:43.852541 master-0 kubenswrapper[19170]: I0313 01:30:43.852442 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 13 01:30:43.860316 master-0 kubenswrapper[19170]: I0313 01:30:43.858799 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 13 01:30:43.903676 master-0 kubenswrapper[19170]: I0313 01:30:43.903520 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-z62g7" Mar 13 01:30:43.968777 master-0 kubenswrapper[19170]: I0313 01:30:43.968712 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-qcwkf" Mar 13 01:30:44.083925 master-0 kubenswrapper[19170]: I0313 01:30:44.083850 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-2pmf7" Mar 13 01:30:44.183489 master-0 kubenswrapper[19170]: I0313 01:30:44.183411 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-tkjm8" Mar 13 01:30:44.189108 master-0 kubenswrapper[19170]: I0313 01:30:44.189037 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Mar 13 01:30:44.192326 master-0 kubenswrapper[19170]: I0313 01:30:44.192283 19170 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 13 01:30:44.545966 master-0 kubenswrapper[19170]: I0313 01:30:44.275349 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 13 01:30:44.545966 master-0 kubenswrapper[19170]: I0313 01:30:44.297345 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 13 01:30:44.545966 master-0 kubenswrapper[19170]: I0313 01:30:44.361924 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 13 01:30:44.545966 master-0 kubenswrapper[19170]: I0313 01:30:44.413964 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 13 01:30:44.545966 master-0 kubenswrapper[19170]: I0313 01:30:44.433747 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 13 01:30:44.545966 master-0 kubenswrapper[19170]: I0313 01:30:44.479833 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Mar 13 01:30:44.545966 master-0 kubenswrapper[19170]: I0313 01:30:44.510547 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 13 01:30:44.545966 master-0 kubenswrapper[19170]: I0313 01:30:44.512496 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 13 01:30:44.545966 master-0 kubenswrapper[19170]: I0313 01:30:44.522832 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 13 01:30:44.553997 master-0 kubenswrapper[19170]: I0313 01:30:44.553920 19170 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 13 01:30:44.560813 master-0 kubenswrapper[19170]: I0313 01:30:44.559106 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 13 01:30:44.561627 master-0 kubenswrapper[19170]: I0313 01:30:44.561553 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Mar 13 01:30:44.592869 master-0 kubenswrapper[19170]: I0313 01:30:44.591941 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 13 01:30:44.639267 master-0 kubenswrapper[19170]: I0313 01:30:44.639197 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Mar 13 01:30:44.658806 master-0 kubenswrapper[19170]: I0313 01:30:44.658728 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Mar 13 01:30:44.716743 master-0 kubenswrapper[19170]: I0313 01:30:44.716679 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 13 01:30:44.754929 master-0 kubenswrapper[19170]: I0313 01:30:44.754860 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 13 01:30:44.815560 master-0 kubenswrapper[19170]: I0313 01:30:44.815397 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Mar 13 01:30:44.887409 master-0 kubenswrapper[19170]: I0313 01:30:44.887338 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 13 01:30:44.893174 master-0 kubenswrapper[19170]: I0313 01:30:44.893111 19170 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Mar 13 01:30:44.895105 master-0 kubenswrapper[19170]: I0313 01:30:44.895070 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 13 01:30:44.896618 master-0 kubenswrapper[19170]: I0313 01:30:44.896588 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Mar 13 01:30:44.911419 master-0 kubenswrapper[19170]: I0313 01:30:44.911357 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 13 01:30:44.921370 master-0 kubenswrapper[19170]: I0313 01:30:44.921317 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body= Mar 13 01:30:44.921792 master-0 kubenswrapper[19170]: I0313 01:30:44.921384 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" Mar 13 01:30:44.992455 master-0 kubenswrapper[19170]: I0313 01:30:44.992393 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Mar 13 01:30:45.002485 master-0 kubenswrapper[19170]: I0313 01:30:45.002402 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-3ahq1q95btnqo" Mar 13 01:30:45.028733 master-0 kubenswrapper[19170]: I0313 01:30:45.028661 19170 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-insights"/"service-ca-bundle" Mar 13 01:30:45.073536 master-0 kubenswrapper[19170]: I0313 01:30:45.073353 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-dockercfg-ntqww" Mar 13 01:30:45.175471 master-0 kubenswrapper[19170]: I0313 01:30:45.175392 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 13 01:30:45.191974 master-0 kubenswrapper[19170]: I0313 01:30:45.191911 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 13 01:30:45.270180 master-0 kubenswrapper[19170]: I0313 01:30:45.270110 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Mar 13 01:30:45.404978 master-0 kubenswrapper[19170]: I0313 01:30:45.404777 19170 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 13 01:30:45.416630 master-0 kubenswrapper[19170]: I0313 01:30:45.416566 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0","openshift-console/console-864f84b8db-z7bgh"] Mar 13 01:30:45.416851 master-0 kubenswrapper[19170]: I0313 01:30:45.416682 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5f9c97b86b-w5fxw","openshift-authentication/oauth-openshift-765798599f-r6mnk","openshift-kube-apiserver/kube-apiserver-master-0"] Mar 13 01:30:45.417089 master-0 kubenswrapper[19170]: E0313 01:30:45.417036 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4182669-e77a-4513-b19f-3b4bb618162e" containerName="installer" Mar 13 01:30:45.417089 master-0 kubenswrapper[19170]: I0313 01:30:45.417067 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4182669-e77a-4513-b19f-3b4bb618162e" 
containerName="installer" Mar 13 01:30:45.417281 master-0 kubenswrapper[19170]: E0313 01:30:45.417103 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75fcfe7a-43bd-4fc6-98f2-c04bd2db4378" containerName="installer" Mar 13 01:30:45.417281 master-0 kubenswrapper[19170]: I0313 01:30:45.417119 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="75fcfe7a-43bd-4fc6-98f2-c04bd2db4378" containerName="installer" Mar 13 01:30:45.417281 master-0 kubenswrapper[19170]: E0313 01:30:45.417141 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" Mar 13 01:30:45.417281 master-0 kubenswrapper[19170]: I0313 01:30:45.417154 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" Mar 13 01:30:45.417281 master-0 kubenswrapper[19170]: E0313 01:30:45.417191 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" Mar 13 01:30:45.417281 master-0 kubenswrapper[19170]: I0313 01:30:45.417204 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" Mar 13 01:30:45.417881 master-0 kubenswrapper[19170]: I0313 01:30:45.417329 19170 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="9273af3e-d978-486d-97ac-a7eef867cef1" Mar 13 01:30:45.417881 master-0 kubenswrapper[19170]: I0313 01:30:45.417369 19170 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="9273af3e-d978-486d-97ac-a7eef867cef1" Mar 13 01:30:45.417881 master-0 kubenswrapper[19170]: I0313 01:30:45.417460 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="75fcfe7a-43bd-4fc6-98f2-c04bd2db4378" containerName="installer" Mar 13 01:30:45.417881 master-0 kubenswrapper[19170]: I0313 01:30:45.417520 
19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" Mar 13 01:30:45.417881 master-0 kubenswrapper[19170]: I0313 01:30:45.417560 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" containerName="console" Mar 13 01:30:45.417881 master-0 kubenswrapper[19170]: I0313 01:30:45.417594 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4182669-e77a-4513-b19f-3b4bb618162e" containerName="installer" Mar 13 01:30:45.427097 master-0 kubenswrapper[19170]: I0313 01:30:45.419058 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f9c97b86b-w5fxw" Mar 13 01:30:45.427097 master-0 kubenswrapper[19170]: I0313 01:30:45.419902 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.428449 master-0 kubenswrapper[19170]: I0313 01:30:45.428391 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 13 01:30:45.428830 master-0 kubenswrapper[19170]: I0313 01:30:45.428449 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-lkt5b" Mar 13 01:30:45.429437 master-0 kubenswrapper[19170]: I0313 01:30:45.429366 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 13 01:30:45.429999 master-0 kubenswrapper[19170]: I0313 01:30:45.429929 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 13 01:30:45.430159 master-0 kubenswrapper[19170]: I0313 01:30:45.430126 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 13 01:30:45.431402 
master-0 kubenswrapper[19170]: I0313 01:30:45.430866 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 13 01:30:45.431402 master-0 kubenswrapper[19170]: I0313 01:30:45.431021 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 13 01:30:45.431402 master-0 kubenswrapper[19170]: I0313 01:30:45.431043 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 13 01:30:45.432186 master-0 kubenswrapper[19170]: I0313 01:30:45.432124 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 13 01:30:45.433953 master-0 kubenswrapper[19170]: I0313 01:30:45.433892 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 13 01:30:45.440524 master-0 kubenswrapper[19170]: I0313 01:30:45.439926 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 13 01:30:45.440524 master-0 kubenswrapper[19170]: I0313 01:30:45.440273 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 13 01:30:45.447975 master-0 kubenswrapper[19170]: I0313 01:30:45.444839 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-6f5hw" Mar 13 01:30:45.454020 master-0 kubenswrapper[19170]: I0313 01:30:45.453494 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 13 01:30:45.478891 master-0 kubenswrapper[19170]: I0313 01:30:45.478833 19170 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 13 01:30:45.480150 master-0 kubenswrapper[19170]: I0313 01:30:45.480100 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdc3c693-5b70-44ef-b53d-7a546edd268c" path="/var/lib/kubelet/pods/cdc3c693-5b70-44ef-b53d-7a546edd268c/volumes" Mar 13 01:30:45.481055 master-0 kubenswrapper[19170]: I0313 01:30:45.480832 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:30:45.496432 master-0 kubenswrapper[19170]: I0313 01:30:45.496322 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=18.496286637 podStartE2EDuration="18.496286637s" podCreationTimestamp="2026-03-13 01:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:30:45.486605067 +0000 UTC m=+706.294726067" watchObservedRunningTime="2026-03-13 01:30:45.496286637 +0000 UTC m=+706.304407637" Mar 13 01:30:45.501074 master-0 kubenswrapper[19170]: I0313 01:30:45.500989 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a97dcee4-852f-4db3-8bac-1a813c162ce8-v4-0-config-system-serving-cert\") pod \"oauth-openshift-765798599f-r6mnk\" (UID: \"a97dcee4-852f-4db3-8bac-1a813c162ce8\") " pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.501284 master-0 kubenswrapper[19170]: I0313 01:30:45.501125 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc107bad-0393-441c-9815-09f27f25888c-trusted-ca-bundle\") pod \"console-5f9c97b86b-w5fxw\" (UID: \"bc107bad-0393-441c-9815-09f27f25888c\") " 
pod="openshift-console/console-5f9c97b86b-w5fxw" Mar 13 01:30:45.501284 master-0 kubenswrapper[19170]: I0313 01:30:45.501212 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a97dcee4-852f-4db3-8bac-1a813c162ce8-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-765798599f-r6mnk\" (UID: \"a97dcee4-852f-4db3-8bac-1a813c162ce8\") " pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.501424 master-0 kubenswrapper[19170]: I0313 01:30:45.501308 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a97dcee4-852f-4db3-8bac-1a813c162ce8-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-765798599f-r6mnk\" (UID: \"a97dcee4-852f-4db3-8bac-1a813c162ce8\") " pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.501424 master-0 kubenswrapper[19170]: I0313 01:30:45.501361 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcsxq\" (UniqueName: \"kubernetes.io/projected/bc107bad-0393-441c-9815-09f27f25888c-kube-api-access-tcsxq\") pod \"console-5f9c97b86b-w5fxw\" (UID: \"bc107bad-0393-441c-9815-09f27f25888c\") " pod="openshift-console/console-5f9c97b86b-w5fxw" Mar 13 01:30:45.501559 master-0 kubenswrapper[19170]: I0313 01:30:45.501421 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a97dcee4-852f-4db3-8bac-1a813c162ce8-v4-0-config-user-template-login\") pod \"oauth-openshift-765798599f-r6mnk\" (UID: \"a97dcee4-852f-4db3-8bac-1a813c162ce8\") " pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.501559 master-0 
kubenswrapper[19170]: I0313 01:30:45.501454 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a97dcee4-852f-4db3-8bac-1a813c162ce8-v4-0-config-system-service-ca\") pod \"oauth-openshift-765798599f-r6mnk\" (UID: \"a97dcee4-852f-4db3-8bac-1a813c162ce8\") " pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.501559 master-0 kubenswrapper[19170]: I0313 01:30:45.501487 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a97dcee4-852f-4db3-8bac-1a813c162ce8-v4-0-config-system-router-certs\") pod \"oauth-openshift-765798599f-r6mnk\" (UID: \"a97dcee4-852f-4db3-8bac-1a813c162ce8\") " pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.501559 master-0 kubenswrapper[19170]: I0313 01:30:45.501552 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bc107bad-0393-441c-9815-09f27f25888c-console-config\") pod \"console-5f9c97b86b-w5fxw\" (UID: \"bc107bad-0393-441c-9815-09f27f25888c\") " pod="openshift-console/console-5f9c97b86b-w5fxw" Mar 13 01:30:45.501878 master-0 kubenswrapper[19170]: I0313 01:30:45.501613 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a97dcee4-852f-4db3-8bac-1a813c162ce8-audit-policies\") pod \"oauth-openshift-765798599f-r6mnk\" (UID: \"a97dcee4-852f-4db3-8bac-1a813c162ce8\") " pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.501878 master-0 kubenswrapper[19170]: I0313 01:30:45.501674 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/a97dcee4-852f-4db3-8bac-1a813c162ce8-audit-dir\") pod \"oauth-openshift-765798599f-r6mnk\" (UID: \"a97dcee4-852f-4db3-8bac-1a813c162ce8\") " pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.501878 master-0 kubenswrapper[19170]: I0313 01:30:45.501732 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzt49\" (UniqueName: \"kubernetes.io/projected/a97dcee4-852f-4db3-8bac-1a813c162ce8-kube-api-access-xzt49\") pod \"oauth-openshift-765798599f-r6mnk\" (UID: \"a97dcee4-852f-4db3-8bac-1a813c162ce8\") " pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.501878 master-0 kubenswrapper[19170]: I0313 01:30:45.501773 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a97dcee4-852f-4db3-8bac-1a813c162ce8-v4-0-config-system-session\") pod \"oauth-openshift-765798599f-r6mnk\" (UID: \"a97dcee4-852f-4db3-8bac-1a813c162ce8\") " pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.501878 master-0 kubenswrapper[19170]: I0313 01:30:45.501837 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a97dcee4-852f-4db3-8bac-1a813c162ce8-v4-0-config-user-template-error\") pod \"oauth-openshift-765798599f-r6mnk\" (UID: \"a97dcee4-852f-4db3-8bac-1a813c162ce8\") " pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.501878 master-0 kubenswrapper[19170]: I0313 01:30:45.501870 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc107bad-0393-441c-9815-09f27f25888c-console-serving-cert\") pod \"console-5f9c97b86b-w5fxw\" (UID: 
\"bc107bad-0393-441c-9815-09f27f25888c\") " pod="openshift-console/console-5f9c97b86b-w5fxw" Mar 13 01:30:45.502215 master-0 kubenswrapper[19170]: I0313 01:30:45.501903 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bc107bad-0393-441c-9815-09f27f25888c-oauth-serving-cert\") pod \"console-5f9c97b86b-w5fxw\" (UID: \"bc107bad-0393-441c-9815-09f27f25888c\") " pod="openshift-console/console-5f9c97b86b-w5fxw" Mar 13 01:30:45.502215 master-0 kubenswrapper[19170]: I0313 01:30:45.501952 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a97dcee4-852f-4db3-8bac-1a813c162ce8-v4-0-config-system-cliconfig\") pod \"oauth-openshift-765798599f-r6mnk\" (UID: \"a97dcee4-852f-4db3-8bac-1a813c162ce8\") " pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.502215 master-0 kubenswrapper[19170]: I0313 01:30:45.501989 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bc107bad-0393-441c-9815-09f27f25888c-console-oauth-config\") pod \"console-5f9c97b86b-w5fxw\" (UID: \"bc107bad-0393-441c-9815-09f27f25888c\") " pod="openshift-console/console-5f9c97b86b-w5fxw" Mar 13 01:30:45.502215 master-0 kubenswrapper[19170]: I0313 01:30:45.502022 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bc107bad-0393-441c-9815-09f27f25888c-service-ca\") pod \"console-5f9c97b86b-w5fxw\" (UID: \"bc107bad-0393-441c-9815-09f27f25888c\") " pod="openshift-console/console-5f9c97b86b-w5fxw" Mar 13 01:30:45.502215 master-0 kubenswrapper[19170]: I0313 01:30:45.502058 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a97dcee4-852f-4db3-8bac-1a813c162ce8-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-765798599f-r6mnk\" (UID: \"a97dcee4-852f-4db3-8bac-1a813c162ce8\") " pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.504040 master-0 kubenswrapper[19170]: I0313 01:30:45.503984 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Mar 13 01:30:45.573045 master-0 kubenswrapper[19170]: I0313 01:30:45.572981 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 01:30:45.586354 master-0 kubenswrapper[19170]: I0313 01:30:45.586298 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 13 01:30:45.586576 master-0 kubenswrapper[19170]: I0313 01:30:45.586391 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Mar 13 01:30:45.615524 master-0 kubenswrapper[19170]: I0313 01:30:45.608426 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bc107bad-0393-441c-9815-09f27f25888c-console-config\") pod \"console-5f9c97b86b-w5fxw\" (UID: \"bc107bad-0393-441c-9815-09f27f25888c\") " pod="openshift-console/console-5f9c97b86b-w5fxw" Mar 13 01:30:45.615524 master-0 kubenswrapper[19170]: I0313 01:30:45.608515 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a97dcee4-852f-4db3-8bac-1a813c162ce8-audit-policies\") pod \"oauth-openshift-765798599f-r6mnk\" (UID: \"a97dcee4-852f-4db3-8bac-1a813c162ce8\") " pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.615524 master-0 
kubenswrapper[19170]: I0313 01:30:45.608566 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a97dcee4-852f-4db3-8bac-1a813c162ce8-audit-dir\") pod \"oauth-openshift-765798599f-r6mnk\" (UID: \"a97dcee4-852f-4db3-8bac-1a813c162ce8\") " pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.615524 master-0 kubenswrapper[19170]: I0313 01:30:45.608618 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzt49\" (UniqueName: \"kubernetes.io/projected/a97dcee4-852f-4db3-8bac-1a813c162ce8-kube-api-access-xzt49\") pod \"oauth-openshift-765798599f-r6mnk\" (UID: \"a97dcee4-852f-4db3-8bac-1a813c162ce8\") " pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.615524 master-0 kubenswrapper[19170]: I0313 01:30:45.608693 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a97dcee4-852f-4db3-8bac-1a813c162ce8-v4-0-config-system-session\") pod \"oauth-openshift-765798599f-r6mnk\" (UID: \"a97dcee4-852f-4db3-8bac-1a813c162ce8\") " pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.615524 master-0 kubenswrapper[19170]: I0313 01:30:45.608757 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a97dcee4-852f-4db3-8bac-1a813c162ce8-v4-0-config-user-template-error\") pod \"oauth-openshift-765798599f-r6mnk\" (UID: \"a97dcee4-852f-4db3-8bac-1a813c162ce8\") " pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.615524 master-0 kubenswrapper[19170]: I0313 01:30:45.608804 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bc107bad-0393-441c-9815-09f27f25888c-console-serving-cert\") pod \"console-5f9c97b86b-w5fxw\" (UID: \"bc107bad-0393-441c-9815-09f27f25888c\") " pod="openshift-console/console-5f9c97b86b-w5fxw" Mar 13 01:30:45.615524 master-0 kubenswrapper[19170]: I0313 01:30:45.608836 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bc107bad-0393-441c-9815-09f27f25888c-oauth-serving-cert\") pod \"console-5f9c97b86b-w5fxw\" (UID: \"bc107bad-0393-441c-9815-09f27f25888c\") " pod="openshift-console/console-5f9c97b86b-w5fxw" Mar 13 01:30:45.615524 master-0 kubenswrapper[19170]: I0313 01:30:45.609015 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a97dcee4-852f-4db3-8bac-1a813c162ce8-v4-0-config-system-cliconfig\") pod \"oauth-openshift-765798599f-r6mnk\" (UID: \"a97dcee4-852f-4db3-8bac-1a813c162ce8\") " pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.615524 master-0 kubenswrapper[19170]: I0313 01:30:45.609077 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bc107bad-0393-441c-9815-09f27f25888c-console-oauth-config\") pod \"console-5f9c97b86b-w5fxw\" (UID: \"bc107bad-0393-441c-9815-09f27f25888c\") " pod="openshift-console/console-5f9c97b86b-w5fxw" Mar 13 01:30:45.615524 master-0 kubenswrapper[19170]: I0313 01:30:45.609130 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bc107bad-0393-441c-9815-09f27f25888c-service-ca\") pod \"console-5f9c97b86b-w5fxw\" (UID: \"bc107bad-0393-441c-9815-09f27f25888c\") " pod="openshift-console/console-5f9c97b86b-w5fxw" Mar 13 01:30:45.615524 master-0 kubenswrapper[19170]: I0313 01:30:45.609162 19170 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a97dcee4-852f-4db3-8bac-1a813c162ce8-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-765798599f-r6mnk\" (UID: \"a97dcee4-852f-4db3-8bac-1a813c162ce8\") " pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.615524 master-0 kubenswrapper[19170]: I0313 01:30:45.609226 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a97dcee4-852f-4db3-8bac-1a813c162ce8-v4-0-config-system-serving-cert\") pod \"oauth-openshift-765798599f-r6mnk\" (UID: \"a97dcee4-852f-4db3-8bac-1a813c162ce8\") " pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.615524 master-0 kubenswrapper[19170]: I0313 01:30:45.609262 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc107bad-0393-441c-9815-09f27f25888c-trusted-ca-bundle\") pod \"console-5f9c97b86b-w5fxw\" (UID: \"bc107bad-0393-441c-9815-09f27f25888c\") " pod="openshift-console/console-5f9c97b86b-w5fxw" Mar 13 01:30:45.615524 master-0 kubenswrapper[19170]: I0313 01:30:45.609307 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a97dcee4-852f-4db3-8bac-1a813c162ce8-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-765798599f-r6mnk\" (UID: \"a97dcee4-852f-4db3-8bac-1a813c162ce8\") " pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.615524 master-0 kubenswrapper[19170]: I0313 01:30:45.609362 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/a97dcee4-852f-4db3-8bac-1a813c162ce8-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-765798599f-r6mnk\" (UID: \"a97dcee4-852f-4db3-8bac-1a813c162ce8\") " pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.615524 master-0 kubenswrapper[19170]: I0313 01:30:45.609397 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcsxq\" (UniqueName: \"kubernetes.io/projected/bc107bad-0393-441c-9815-09f27f25888c-kube-api-access-tcsxq\") pod \"console-5f9c97b86b-w5fxw\" (UID: \"bc107bad-0393-441c-9815-09f27f25888c\") " pod="openshift-console/console-5f9c97b86b-w5fxw" Mar 13 01:30:45.615524 master-0 kubenswrapper[19170]: I0313 01:30:45.609454 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a97dcee4-852f-4db3-8bac-1a813c162ce8-v4-0-config-user-template-login\") pod \"oauth-openshift-765798599f-r6mnk\" (UID: \"a97dcee4-852f-4db3-8bac-1a813c162ce8\") " pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.615524 master-0 kubenswrapper[19170]: I0313 01:30:45.609482 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a97dcee4-852f-4db3-8bac-1a813c162ce8-v4-0-config-system-service-ca\") pod \"oauth-openshift-765798599f-r6mnk\" (UID: \"a97dcee4-852f-4db3-8bac-1a813c162ce8\") " pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.615524 master-0 kubenswrapper[19170]: I0313 01:30:45.609509 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a97dcee4-852f-4db3-8bac-1a813c162ce8-v4-0-config-system-router-certs\") pod \"oauth-openshift-765798599f-r6mnk\" (UID: \"a97dcee4-852f-4db3-8bac-1a813c162ce8\") " 
pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.615524 master-0 kubenswrapper[19170]: I0313 01:30:45.610187 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bc107bad-0393-441c-9815-09f27f25888c-console-config\") pod \"console-5f9c97b86b-w5fxw\" (UID: \"bc107bad-0393-441c-9815-09f27f25888c\") " pod="openshift-console/console-5f9c97b86b-w5fxw" Mar 13 01:30:45.615524 master-0 kubenswrapper[19170]: I0313 01:30:45.610201 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bc107bad-0393-441c-9815-09f27f25888c-oauth-serving-cert\") pod \"console-5f9c97b86b-w5fxw\" (UID: \"bc107bad-0393-441c-9815-09f27f25888c\") " pod="openshift-console/console-5f9c97b86b-w5fxw" Mar 13 01:30:45.615524 master-0 kubenswrapper[19170]: I0313 01:30:45.610298 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a97dcee4-852f-4db3-8bac-1a813c162ce8-audit-dir\") pod \"oauth-openshift-765798599f-r6mnk\" (UID: \"a97dcee4-852f-4db3-8bac-1a813c162ce8\") " pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.615524 master-0 kubenswrapper[19170]: I0313 01:30:45.611506 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc107bad-0393-441c-9815-09f27f25888c-trusted-ca-bundle\") pod \"console-5f9c97b86b-w5fxw\" (UID: \"bc107bad-0393-441c-9815-09f27f25888c\") " pod="openshift-console/console-5f9c97b86b-w5fxw" Mar 13 01:30:45.615524 master-0 kubenswrapper[19170]: I0313 01:30:45.612317 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a97dcee4-852f-4db3-8bac-1a813c162ce8-v4-0-config-system-cliconfig\") pod 
\"oauth-openshift-765798599f-r6mnk\" (UID: \"a97dcee4-852f-4db3-8bac-1a813c162ce8\") " pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.615524 master-0 kubenswrapper[19170]: I0313 01:30:45.614587 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a97dcee4-852f-4db3-8bac-1a813c162ce8-v4-0-config-system-router-certs\") pod \"oauth-openshift-765798599f-r6mnk\" (UID: \"a97dcee4-852f-4db3-8bac-1a813c162ce8\") " pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.615524 master-0 kubenswrapper[19170]: I0313 01:30:45.615466 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a97dcee4-852f-4db3-8bac-1a813c162ce8-v4-0-config-system-session\") pod \"oauth-openshift-765798599f-r6mnk\" (UID: \"a97dcee4-852f-4db3-8bac-1a813c162ce8\") " pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.615524 master-0 kubenswrapper[19170]: I0313 01:30:45.615556 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Mar 13 01:30:45.619525 master-0 kubenswrapper[19170]: I0313 01:30:45.617466 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a97dcee4-852f-4db3-8bac-1a813c162ce8-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-765798599f-r6mnk\" (UID: \"a97dcee4-852f-4db3-8bac-1a813c162ce8\") " pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.619525 master-0 kubenswrapper[19170]: I0313 01:30:45.617935 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a97dcee4-852f-4db3-8bac-1a813c162ce8-audit-policies\") pod \"oauth-openshift-765798599f-r6mnk\" (UID: 
\"a97dcee4-852f-4db3-8bac-1a813c162ce8\") " pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.619525 master-0 kubenswrapper[19170]: I0313 01:30:45.618343 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bc107bad-0393-441c-9815-09f27f25888c-service-ca\") pod \"console-5f9c97b86b-w5fxw\" (UID: \"bc107bad-0393-441c-9815-09f27f25888c\") " pod="openshift-console/console-5f9c97b86b-w5fxw" Mar 13 01:30:45.619525 master-0 kubenswrapper[19170]: I0313 01:30:45.618669 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a97dcee4-852f-4db3-8bac-1a813c162ce8-v4-0-config-system-service-ca\") pod \"oauth-openshift-765798599f-r6mnk\" (UID: \"a97dcee4-852f-4db3-8bac-1a813c162ce8\") " pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.619525 master-0 kubenswrapper[19170]: I0313 01:30:45.619094 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a97dcee4-852f-4db3-8bac-1a813c162ce8-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-765798599f-r6mnk\" (UID: \"a97dcee4-852f-4db3-8bac-1a813c162ce8\") " pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.621041 master-0 kubenswrapper[19170]: I0313 01:30:45.620935 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a97dcee4-852f-4db3-8bac-1a813c162ce8-v4-0-config-user-template-error\") pod \"oauth-openshift-765798599f-r6mnk\" (UID: \"a97dcee4-852f-4db3-8bac-1a813c162ce8\") " pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.621134 master-0 kubenswrapper[19170]: I0313 01:30:45.621080 19170 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 13 01:30:45.622012 master-0 kubenswrapper[19170]: I0313 01:30:45.621961 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc107bad-0393-441c-9815-09f27f25888c-console-serving-cert\") pod \"console-5f9c97b86b-w5fxw\" (UID: \"bc107bad-0393-441c-9815-09f27f25888c\") " pod="openshift-console/console-5f9c97b86b-w5fxw" Mar 13 01:30:45.622107 master-0 kubenswrapper[19170]: I0313 01:30:45.621995 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bc107bad-0393-441c-9815-09f27f25888c-console-oauth-config\") pod \"console-5f9c97b86b-w5fxw\" (UID: \"bc107bad-0393-441c-9815-09f27f25888c\") " pod="openshift-console/console-5f9c97b86b-w5fxw" Mar 13 01:30:45.628848 master-0 kubenswrapper[19170]: I0313 01:30:45.622315 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a97dcee4-852f-4db3-8bac-1a813c162ce8-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-765798599f-r6mnk\" (UID: \"a97dcee4-852f-4db3-8bac-1a813c162ce8\") " pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.628848 master-0 kubenswrapper[19170]: I0313 01:30:45.624888 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a97dcee4-852f-4db3-8bac-1a813c162ce8-v4-0-config-system-serving-cert\") pod \"oauth-openshift-765798599f-r6mnk\" (UID: \"a97dcee4-852f-4db3-8bac-1a813c162ce8\") " pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.628848 master-0 kubenswrapper[19170]: I0313 01:30:45.628103 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" 
(UniqueName: \"kubernetes.io/secret/a97dcee4-852f-4db3-8bac-1a813c162ce8-v4-0-config-user-template-login\") pod \"oauth-openshift-765798599f-r6mnk\" (UID: \"a97dcee4-852f-4db3-8bac-1a813c162ce8\") " pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.634753 master-0 kubenswrapper[19170]: I0313 01:30:45.634701 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzt49\" (UniqueName: \"kubernetes.io/projected/a97dcee4-852f-4db3-8bac-1a813c162ce8-kube-api-access-xzt49\") pod \"oauth-openshift-765798599f-r6mnk\" (UID: \"a97dcee4-852f-4db3-8bac-1a813c162ce8\") " pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.646604 master-0 kubenswrapper[19170]: I0313 01:30:45.646543 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcsxq\" (UniqueName: \"kubernetes.io/projected/bc107bad-0393-441c-9815-09f27f25888c-kube-api-access-tcsxq\") pod \"console-5f9c97b86b-w5fxw\" (UID: \"bc107bad-0393-441c-9815-09f27f25888c\") " pod="openshift-console/console-5f9c97b86b-w5fxw" Mar 13 01:30:45.692203 master-0 kubenswrapper[19170]: I0313 01:30:45.692129 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-client-serving-certs-ca-bundle" Mar 13 01:30:45.776477 master-0 kubenswrapper[19170]: I0313 01:30:45.776036 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5f9c97b86b-w5fxw" Mar 13 01:30:45.780803 master-0 kubenswrapper[19170]: I0313 01:30:45.780348 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-sjhm5" Mar 13 01:30:45.797816 master-0 kubenswrapper[19170]: I0313 01:30:45.797760 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-trusted-ca-bundle-8i12ta5c71j38" Mar 13 01:30:45.805469 master-0 kubenswrapper[19170]: I0313 01:30:45.805438 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:45.818494 master-0 kubenswrapper[19170]: I0313 01:30:45.818354 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 01:30:45.846158 master-0 kubenswrapper[19170]: I0313 01:30:45.846131 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 01:30:45.957428 master-0 kubenswrapper[19170]: I0313 01:30:45.957254 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 13 01:30:46.016525 master-0 kubenswrapper[19170]: I0313 01:30:46.016473 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 13 01:30:46.083357 master-0 kubenswrapper[19170]: I0313 01:30:46.083294 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 13 01:30:46.138251 master-0 kubenswrapper[19170]: I0313 01:30:46.134391 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 13 01:30:46.170473 master-0 
kubenswrapper[19170]: I0313 01:30:46.170411 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Mar 13 01:30:46.184615 master-0 kubenswrapper[19170]: I0313 01:30:46.184555 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Mar 13 01:30:46.184800 master-0 kubenswrapper[19170]: I0313 01:30:46.184618 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 13 01:30:46.360920 master-0 kubenswrapper[19170]: I0313 01:30:46.360762 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 13 01:30:46.367552 master-0 kubenswrapper[19170]: I0313 01:30:46.367336 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Mar 13 01:30:46.389695 master-0 kubenswrapper[19170]: I0313 01:30:46.389431 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Mar 13 01:30:46.411790 master-0 kubenswrapper[19170]: I0313 01:30:46.411722 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-4lcm6" Mar 13 01:30:46.556850 master-0 kubenswrapper[19170]: I0313 01:30:46.556795 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 13 01:30:46.626867 master-0 kubenswrapper[19170]: I0313 01:30:46.626717 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 13 01:30:46.666847 master-0 kubenswrapper[19170]: I0313 01:30:46.666314 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 13 01:30:46.677453 master-0 kubenswrapper[19170]: I0313 
01:30:46.677388 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-b6b87" Mar 13 01:30:46.728452 master-0 kubenswrapper[19170]: I0313 01:30:46.728376 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Mar 13 01:30:46.778859 master-0 kubenswrapper[19170]: I0313 01:30:46.778778 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 13 01:30:46.790453 master-0 kubenswrapper[19170]: I0313 01:30:46.790379 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 13 01:30:46.822765 master-0 kubenswrapper[19170]: I0313 01:30:46.822721 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Mar 13 01:30:46.880683 master-0 kubenswrapper[19170]: I0313 01:30:46.880540 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Mar 13 01:30:46.936440 master-0 kubenswrapper[19170]: I0313 01:30:46.936380 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 13 01:30:47.017325 master-0 kubenswrapper[19170]: I0313 01:30:47.017272 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 13 01:30:47.164801 master-0 kubenswrapper[19170]: I0313 01:30:47.164767 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 13 01:30:47.173647 master-0 kubenswrapper[19170]: I0313 01:30:47.173372 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Mar 13 01:30:47.179655 master-0 kubenswrapper[19170]: I0313 
01:30:47.178715 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Mar 13 01:30:47.213275 master-0 kubenswrapper[19170]: I0313 01:30:47.213193 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 13 01:30:47.266309 master-0 kubenswrapper[19170]: I0313 01:30:47.266254 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Mar 13 01:30:47.365684 master-0 kubenswrapper[19170]: I0313 01:30:47.365594 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 13 01:30:47.430781 master-0 kubenswrapper[19170]: I0313 01:30:47.430619 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Mar 13 01:30:47.467246 master-0 kubenswrapper[19170]: I0313 01:30:47.467195 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config" Mar 13 01:30:47.478579 master-0 kubenswrapper[19170]: I0313 01:30:47.478530 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Mar 13 01:30:47.489842 master-0 kubenswrapper[19170]: I0313 01:30:47.489758 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 01:30:47.584500 master-0 kubenswrapper[19170]: I0313 01:30:47.584435 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Mar 13 01:30:47.587146 master-0 kubenswrapper[19170]: I0313 01:30:47.587116 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 13 01:30:47.617856 master-0 kubenswrapper[19170]: 
I0313 01:30:47.617819 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt" Mar 13 01:30:47.639695 master-0 kubenswrapper[19170]: I0313 01:30:47.639662 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 13 01:30:47.652677 master-0 kubenswrapper[19170]: I0313 01:30:47.652654 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Mar 13 01:30:47.785358 master-0 kubenswrapper[19170]: I0313 01:30:47.785209 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qdr99" Mar 13 01:30:47.840801 master-0 kubenswrapper[19170]: I0313 01:30:47.840732 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Mar 13 01:30:47.849257 master-0 kubenswrapper[19170]: I0313 01:30:47.849167 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 13 01:30:47.866992 master-0 kubenswrapper[19170]: I0313 01:30:47.866907 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Mar 13 01:30:47.876306 master-0 kubenswrapper[19170]: I0313 01:30:47.876248 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Mar 13 01:30:47.920751 master-0 kubenswrapper[19170]: I0313 01:30:47.920660 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 13 01:30:48.012371 master-0 kubenswrapper[19170]: I0313 01:30:48.012256 19170 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication-operator"/"serving-cert" Mar 13 01:30:48.024827 master-0 kubenswrapper[19170]: I0313 01:30:48.024721 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-vdkmw" Mar 13 01:30:48.024827 master-0 kubenswrapper[19170]: I0313 01:30:48.024830 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 13 01:30:48.026958 master-0 kubenswrapper[19170]: I0313 01:30:48.026888 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-rq7qf" Mar 13 01:30:48.058922 master-0 kubenswrapper[19170]: I0313 01:30:48.058807 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 13 01:30:48.061237 master-0 kubenswrapper[19170]: I0313 01:30:48.061192 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 13 01:30:48.066358 master-0 kubenswrapper[19170]: I0313 01:30:48.066300 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 01:30:48.090887 master-0 kubenswrapper[19170]: I0313 01:30:48.090835 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-7q6zv" Mar 13 01:30:48.108166 master-0 kubenswrapper[19170]: I0313 01:30:48.107878 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-m57n6" Mar 13 01:30:48.176442 master-0 kubenswrapper[19170]: I0313 01:30:48.175693 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 13 01:30:48.205684 master-0 kubenswrapper[19170]: I0313 01:30:48.205622 19170 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 13 01:30:48.369274 master-0 kubenswrapper[19170]: I0313 01:30:48.369091 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 13 01:30:48.421955 master-0 kubenswrapper[19170]: I0313 01:30:48.421902 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Mar 13 01:30:48.431729 master-0 kubenswrapper[19170]: I0313 01:30:48.431605 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 13 01:30:48.458082 master-0 kubenswrapper[19170]: I0313 01:30:48.458030 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 13 01:30:48.482528 master-0 kubenswrapper[19170]: I0313 01:30:48.482314 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Mar 13 01:30:48.489576 master-0 kubenswrapper[19170]: I0313 01:30:48.489534 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 13 01:30:48.582032 master-0 kubenswrapper[19170]: I0313 01:30:48.581974 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 13 01:30:48.591712 master-0 kubenswrapper[19170]: I0313 01:30:48.588972 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Mar 13 01:30:48.648394 master-0 kubenswrapper[19170]: I0313 01:30:48.648237 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 13 01:30:48.659616 master-0 kubenswrapper[19170]: I0313 01:30:48.659574 19170 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 13 01:30:48.688516 master-0 kubenswrapper[19170]: I0313 01:30:48.688442 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle" Mar 13 01:30:48.711887 master-0 kubenswrapper[19170]: I0313 01:30:48.711796 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 13 01:30:48.726367 master-0 kubenswrapper[19170]: I0313 01:30:48.726233 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Mar 13 01:30:48.763646 master-0 kubenswrapper[19170]: I0313 01:30:48.763561 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-5qbrz" Mar 13 01:30:48.972266 master-0 kubenswrapper[19170]: I0313 01:30:48.972185 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-bjp2n" Mar 13 01:30:48.999708 master-0 kubenswrapper[19170]: I0313 01:30:48.997709 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 13 01:30:49.003862 master-0 kubenswrapper[19170]: I0313 01:30:49.003233 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 13 01:30:49.056544 master-0 kubenswrapper[19170]: I0313 01:30:49.056435 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 13 01:30:49.101895 master-0 kubenswrapper[19170]: I0313 01:30:49.101805 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 13 01:30:49.102546 master-0 kubenswrapper[19170]: I0313 01:30:49.102512 19170 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 13 01:30:49.135015 master-0 kubenswrapper[19170]: E0313 01:30:49.134951 19170 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 13 01:30:49.135015 master-0 kubenswrapper[19170]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-765798599f-r6mnk_openshift-authentication_a97dcee4-852f-4db3-8bac-1a813c162ce8_0(bfa56c60bdf87611b2bd954fc1c5a6d3024f08927d73555dce8094c9dff557f7): error adding pod openshift-authentication_oauth-openshift-765798599f-r6mnk to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"bfa56c60bdf87611b2bd954fc1c5a6d3024f08927d73555dce8094c9dff557f7" Netns:"/var/run/netns/ce45e1c1-5a35-49b8-a9dd-241cbb240a43" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-765798599f-r6mnk;K8S_POD_INFRA_CONTAINER_ID=bfa56c60bdf87611b2bd954fc1c5a6d3024f08927d73555dce8094c9dff557f7;K8S_POD_UID=a97dcee4-852f-4db3-8bac-1a813c162ce8" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-765798599f-r6mnk] networking: Multus: [openshift-authentication/oauth-openshift-765798599f-r6mnk/a97dcee4-852f-4db3-8bac-1a813c162ce8]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-765798599f-r6mnk in out of cluster comm: pod "oauth-openshift-765798599f-r6mnk" not found Mar 13 01:30:49.135015 master-0 kubenswrapper[19170]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 01:30:49.135015 master-0 kubenswrapper[19170]: > Mar 13 01:30:49.135303 master-0 kubenswrapper[19170]: E0313 01:30:49.135053 19170 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 13 01:30:49.135303 master-0 kubenswrapper[19170]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-765798599f-r6mnk_openshift-authentication_a97dcee4-852f-4db3-8bac-1a813c162ce8_0(bfa56c60bdf87611b2bd954fc1c5a6d3024f08927d73555dce8094c9dff557f7): error adding pod openshift-authentication_oauth-openshift-765798599f-r6mnk to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"bfa56c60bdf87611b2bd954fc1c5a6d3024f08927d73555dce8094c9dff557f7" Netns:"/var/run/netns/ce45e1c1-5a35-49b8-a9dd-241cbb240a43" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-765798599f-r6mnk;K8S_POD_INFRA_CONTAINER_ID=bfa56c60bdf87611b2bd954fc1c5a6d3024f08927d73555dce8094c9dff557f7;K8S_POD_UID=a97dcee4-852f-4db3-8bac-1a813c162ce8" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-765798599f-r6mnk] networking: Multus: [openshift-authentication/oauth-openshift-765798599f-r6mnk/a97dcee4-852f-4db3-8bac-1a813c162ce8]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-765798599f-r6mnk in out of cluster comm: pod "oauth-openshift-765798599f-r6mnk" not found Mar 13 01:30:49.135303 master-0 kubenswrapper[19170]: ': 
StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 01:30:49.135303 master-0 kubenswrapper[19170]: > pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:49.135303 master-0 kubenswrapper[19170]: E0313 01:30:49.135079 19170 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 13 01:30:49.135303 master-0 kubenswrapper[19170]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-765798599f-r6mnk_openshift-authentication_a97dcee4-852f-4db3-8bac-1a813c162ce8_0(bfa56c60bdf87611b2bd954fc1c5a6d3024f08927d73555dce8094c9dff557f7): error adding pod openshift-authentication_oauth-openshift-765798599f-r6mnk to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"bfa56c60bdf87611b2bd954fc1c5a6d3024f08927d73555dce8094c9dff557f7" Netns:"/var/run/netns/ce45e1c1-5a35-49b8-a9dd-241cbb240a43" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-765798599f-r6mnk;K8S_POD_INFRA_CONTAINER_ID=bfa56c60bdf87611b2bd954fc1c5a6d3024f08927d73555dce8094c9dff557f7;K8S_POD_UID=a97dcee4-852f-4db3-8bac-1a813c162ce8" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-765798599f-r6mnk] networking: Multus: [openshift-authentication/oauth-openshift-765798599f-r6mnk/a97dcee4-852f-4db3-8bac-1a813c162ce8]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-765798599f-r6mnk in out of cluster comm: pod 
"oauth-openshift-765798599f-r6mnk" not found Mar 13 01:30:49.135303 master-0 kubenswrapper[19170]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 01:30:49.135303 master-0 kubenswrapper[19170]: > pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:30:49.135303 master-0 kubenswrapper[19170]: E0313 01:30:49.135143 19170 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"oauth-openshift-765798599f-r6mnk_openshift-authentication(a97dcee4-852f-4db3-8bac-1a813c162ce8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"oauth-openshift-765798599f-r6mnk_openshift-authentication(a97dcee4-852f-4db3-8bac-1a813c162ce8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-765798599f-r6mnk_openshift-authentication_a97dcee4-852f-4db3-8bac-1a813c162ce8_0(bfa56c60bdf87611b2bd954fc1c5a6d3024f08927d73555dce8094c9dff557f7): error adding pod openshift-authentication_oauth-openshift-765798599f-r6mnk to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"bfa56c60bdf87611b2bd954fc1c5a6d3024f08927d73555dce8094c9dff557f7\\\" Netns:\\\"/var/run/netns/ce45e1c1-5a35-49b8-a9dd-241cbb240a43\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-765798599f-r6mnk;K8S_POD_INFRA_CONTAINER_ID=bfa56c60bdf87611b2bd954fc1c5a6d3024f08927d73555dce8094c9dff557f7;K8S_POD_UID=a97dcee4-852f-4db3-8bac-1a813c162ce8\\\" Path:\\\"\\\" ERRORED: error configuring 
pod [openshift-authentication/oauth-openshift-765798599f-r6mnk] networking: Multus: [openshift-authentication/oauth-openshift-765798599f-r6mnk/a97dcee4-852f-4db3-8bac-1a813c162ce8]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-765798599f-r6mnk in out of cluster comm: pod \\\"oauth-openshift-765798599f-r6mnk\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" podUID="a97dcee4-852f-4db3-8bac-1a813c162ce8" Mar 13 01:30:49.143343 master-0 kubenswrapper[19170]: E0313 01:30:49.143278 19170 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 13 01:30:49.143343 master-0 kubenswrapper[19170]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_console-5f9c97b86b-w5fxw_openshift-console_bc107bad-0393-441c-9815-09f27f25888c_0(c8ae4442d0a8e5475b08755899141f61e8c6799c58dc4f0441035cb1ed63f5eb): error adding pod openshift-console_console-5f9c97b86b-w5fxw to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"c8ae4442d0a8e5475b08755899141f61e8c6799c58dc4f0441035cb1ed63f5eb" Netns:"/var/run/netns/e0e73022-fc85-4e3a-9c1d-8eef46aedc45" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-console;K8S_POD_NAME=console-5f9c97b86b-w5fxw;K8S_POD_INFRA_CONTAINER_ID=c8ae4442d0a8e5475b08755899141f61e8c6799c58dc4f0441035cb1ed63f5eb;K8S_POD_UID=bc107bad-0393-441c-9815-09f27f25888c" Path:"" ERRORED: error configuring pod [openshift-console/console-5f9c97b86b-w5fxw] networking: Multus: [openshift-console/console-5f9c97b86b-w5fxw/bc107bad-0393-441c-9815-09f27f25888c]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod console-5f9c97b86b-w5fxw in out of cluster comm: pod "console-5f9c97b86b-w5fxw" not found Mar 13 01:30:49.143343 master-0 kubenswrapper[19170]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 01:30:49.143343 master-0 kubenswrapper[19170]: > Mar 13 01:30:49.143551 master-0 kubenswrapper[19170]: E0313 01:30:49.143346 19170 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 13 01:30:49.143551 master-0 kubenswrapper[19170]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_console-5f9c97b86b-w5fxw_openshift-console_bc107bad-0393-441c-9815-09f27f25888c_0(c8ae4442d0a8e5475b08755899141f61e8c6799c58dc4f0441035cb1ed63f5eb): error adding pod openshift-console_console-5f9c97b86b-w5fxw to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"c8ae4442d0a8e5475b08755899141f61e8c6799c58dc4f0441035cb1ed63f5eb" Netns:"/var/run/netns/e0e73022-fc85-4e3a-9c1d-8eef46aedc45" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-console;K8S_POD_NAME=console-5f9c97b86b-w5fxw;K8S_POD_INFRA_CONTAINER_ID=c8ae4442d0a8e5475b08755899141f61e8c6799c58dc4f0441035cb1ed63f5eb;K8S_POD_UID=bc107bad-0393-441c-9815-09f27f25888c" Path:"" ERRORED: error configuring pod [openshift-console/console-5f9c97b86b-w5fxw] networking: Multus: [openshift-console/console-5f9c97b86b-w5fxw/bc107bad-0393-441c-9815-09f27f25888c]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod console-5f9c97b86b-w5fxw in out of cluster comm: pod "console-5f9c97b86b-w5fxw" not found Mar 13 01:30:49.143551 master-0 kubenswrapper[19170]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 01:30:49.143551 master-0 kubenswrapper[19170]: > pod="openshift-console/console-5f9c97b86b-w5fxw" Mar 13 01:30:49.143551 master-0 kubenswrapper[19170]: E0313 01:30:49.143369 19170 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 13 01:30:49.143551 master-0 kubenswrapper[19170]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_console-5f9c97b86b-w5fxw_openshift-console_bc107bad-0393-441c-9815-09f27f25888c_0(c8ae4442d0a8e5475b08755899141f61e8c6799c58dc4f0441035cb1ed63f5eb): error adding pod openshift-console_console-5f9c97b86b-w5fxw to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"c8ae4442d0a8e5475b08755899141f61e8c6799c58dc4f0441035cb1ed63f5eb" Netns:"/var/run/netns/e0e73022-fc85-4e3a-9c1d-8eef46aedc45" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-console;K8S_POD_NAME=console-5f9c97b86b-w5fxw;K8S_POD_INFRA_CONTAINER_ID=c8ae4442d0a8e5475b08755899141f61e8c6799c58dc4f0441035cb1ed63f5eb;K8S_POD_UID=bc107bad-0393-441c-9815-09f27f25888c" Path:"" ERRORED: error configuring pod [openshift-console/console-5f9c97b86b-w5fxw] networking: Multus: [openshift-console/console-5f9c97b86b-w5fxw/bc107bad-0393-441c-9815-09f27f25888c]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod console-5f9c97b86b-w5fxw in out of cluster comm: pod "console-5f9c97b86b-w5fxw" not found Mar 13 01:30:49.143551 master-0 kubenswrapper[19170]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 01:30:49.143551 master-0 kubenswrapper[19170]: > pod="openshift-console/console-5f9c97b86b-w5fxw" Mar 13 01:30:49.143551 master-0 kubenswrapper[19170]: E0313 01:30:49.143426 19170 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"console-5f9c97b86b-w5fxw_openshift-console(bc107bad-0393-441c-9815-09f27f25888c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"console-5f9c97b86b-w5fxw_openshift-console(bc107bad-0393-441c-9815-09f27f25888c)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_console-5f9c97b86b-w5fxw_openshift-console_bc107bad-0393-441c-9815-09f27f25888c_0(c8ae4442d0a8e5475b08755899141f61e8c6799c58dc4f0441035cb1ed63f5eb): error adding pod openshift-console_console-5f9c97b86b-w5fxw to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): 
CNI request failed with status 400: 'ContainerID:\\\"c8ae4442d0a8e5475b08755899141f61e8c6799c58dc4f0441035cb1ed63f5eb\\\" Netns:\\\"/var/run/netns/e0e73022-fc85-4e3a-9c1d-8eef46aedc45\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-console;K8S_POD_NAME=console-5f9c97b86b-w5fxw;K8S_POD_INFRA_CONTAINER_ID=c8ae4442d0a8e5475b08755899141f61e8c6799c58dc4f0441035cb1ed63f5eb;K8S_POD_UID=bc107bad-0393-441c-9815-09f27f25888c\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-console/console-5f9c97b86b-w5fxw] networking: Multus: [openshift-console/console-5f9c97b86b-w5fxw/bc107bad-0393-441c-9815-09f27f25888c]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod console-5f9c97b86b-w5fxw in out of cluster comm: pod \\\"console-5f9c97b86b-w5fxw\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-console/console-5f9c97b86b-w5fxw" podUID="bc107bad-0393-441c-9815-09f27f25888c" Mar 13 01:30:49.161674 master-0 kubenswrapper[19170]: I0313 01:30:49.161575 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 13 01:30:49.202159 master-0 kubenswrapper[19170]: I0313 01:30:49.202087 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 13 01:30:49.244372 master-0 kubenswrapper[19170]: I0313 01:30:49.238415 19170 reflector.go:368] Caches populated for *v1.Service from 
k8s.io/client-go/informers/factory.go:160 Mar 13 01:30:49.251515 master-0 kubenswrapper[19170]: I0313 01:30:49.251464 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 13 01:30:49.273024 master-0 kubenswrapper[19170]: I0313 01:30:49.272964 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-lgs6j" Mar 13 01:30:49.310166 master-0 kubenswrapper[19170]: I0313 01:30:49.310101 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 13 01:30:49.315440 master-0 kubenswrapper[19170]: I0313 01:30:49.315382 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"federate-client-certs" Mar 13 01:30:49.334937 master-0 kubenswrapper[19170]: I0313 01:30:49.334873 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 13 01:30:49.340808 master-0 kubenswrapper[19170]: I0313 01:30:49.340765 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 13 01:30:49.382001 master-0 kubenswrapper[19170]: I0313 01:30:49.381911 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 13 01:30:49.403037 master-0 kubenswrapper[19170]: I0313 01:30:49.402967 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 13 01:30:49.430422 master-0 kubenswrapper[19170]: I0313 01:30:49.430362 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-ps7fb" Mar 13 01:30:49.469480 master-0 kubenswrapper[19170]: I0313 01:30:49.469390 19170 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns"/"kube-root-ca.crt" Mar 13 01:30:49.606350 master-0 kubenswrapper[19170]: I0313 01:30:49.606175 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Mar 13 01:30:49.632218 master-0 kubenswrapper[19170]: I0313 01:30:49.631010 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 13 01:30:49.656328 master-0 kubenswrapper[19170]: I0313 01:30:49.656286 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 13 01:30:49.743624 master-0 kubenswrapper[19170]: I0313 01:30:49.743535 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Mar 13 01:30:49.783410 master-0 kubenswrapper[19170]: I0313 01:30:49.783336 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 13 01:30:49.791411 master-0 kubenswrapper[19170]: I0313 01:30:49.791362 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-bclr4" Mar 13 01:30:49.845653 master-0 kubenswrapper[19170]: I0313 01:30:49.843544 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 13 01:30:49.848724 master-0 kubenswrapper[19170]: I0313 01:30:49.848107 19170 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 13 01:30:49.848724 master-0 kubenswrapper[19170]: I0313 01:30:49.848304 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="a814bd60de133d95cf99630a978c017e" containerName="startup-monitor" 
containerID="cri-o://c0fe29457932959c68b194aa61b73366ddd4f141247e68871dd5b8341005a8a7" gracePeriod=5 Mar 13 01:30:49.860567 master-0 kubenswrapper[19170]: I0313 01:30:49.860474 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 13 01:30:49.864019 master-0 kubenswrapper[19170]: I0313 01:30:49.863964 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-5fw2q" Mar 13 01:30:49.882200 master-0 kubenswrapper[19170]: I0313 01:30:49.882161 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 13 01:30:49.964833 master-0 kubenswrapper[19170]: I0313 01:30:49.964749 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 13 01:30:50.058566 master-0 kubenswrapper[19170]: I0313 01:30:50.058505 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 13 01:30:50.115388 master-0 kubenswrapper[19170]: I0313 01:30:50.115289 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 13 01:30:50.123696 master-0 kubenswrapper[19170]: I0313 01:30:50.123668 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt" Mar 13 01:30:50.188063 master-0 kubenswrapper[19170]: I0313 01:30:50.188012 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Mar 13 01:30:50.227431 master-0 kubenswrapper[19170]: I0313 01:30:50.227381 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 13 01:30:50.228998 master-0 kubenswrapper[19170]: I0313 
01:30:50.228905 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert" Mar 13 01:30:50.252430 master-0 kubenswrapper[19170]: I0313 01:30:50.252389 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 13 01:30:50.315823 master-0 kubenswrapper[19170]: I0313 01:30:50.315772 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 13 01:30:50.319865 master-0 kubenswrapper[19170]: I0313 01:30:50.319815 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 13 01:30:50.398817 master-0 kubenswrapper[19170]: I0313 01:30:50.398690 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-zdrdj" Mar 13 01:30:50.429440 master-0 kubenswrapper[19170]: I0313 01:30:50.429395 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Mar 13 01:30:50.431205 master-0 kubenswrapper[19170]: I0313 01:30:50.431129 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Mar 13 01:30:50.459797 master-0 kubenswrapper[19170]: I0313 01:30:50.459726 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 13 01:30:50.494499 master-0 kubenswrapper[19170]: I0313 01:30:50.494456 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 13 01:30:50.524837 master-0 kubenswrapper[19170]: I0313 01:30:50.521855 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 13 01:30:50.540765 master-0 kubenswrapper[19170]: I0313 
01:30:50.539874 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 13 01:30:50.672921 master-0 kubenswrapper[19170]: I0313 01:30:50.672855 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-c7thh" Mar 13 01:30:50.685743 master-0 kubenswrapper[19170]: I0313 01:30:50.685712 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Mar 13 01:30:50.709651 master-0 kubenswrapper[19170]: I0313 01:30:50.709586 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 13 01:30:50.802820 master-0 kubenswrapper[19170]: I0313 01:30:50.802754 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Mar 13 01:30:50.898515 master-0 kubenswrapper[19170]: I0313 01:30:50.898449 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 13 01:30:50.901138 master-0 kubenswrapper[19170]: I0313 01:30:50.901085 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-n4b44" Mar 13 01:30:50.912483 master-0 kubenswrapper[19170]: I0313 01:30:50.912420 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 13 01:30:50.978335 master-0 kubenswrapper[19170]: I0313 01:30:50.978187 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-1fft5pqda64sn" Mar 13 01:30:50.995136 master-0 kubenswrapper[19170]: I0313 01:30:50.995050 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Mar 13 01:30:51.047301 master-0 kubenswrapper[19170]: I0313 01:30:51.047224 19170 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 13 01:30:51.052355 master-0 kubenswrapper[19170]: I0313 01:30:51.052310 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Mar 13 01:30:51.239629 master-0 kubenswrapper[19170]: I0313 01:30:51.239470 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 01:30:51.336787 master-0 kubenswrapper[19170]: I0313 01:30:51.336714 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Mar 13 01:30:51.348341 master-0 kubenswrapper[19170]: I0313 01:30:51.348281 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 13 01:30:51.360079 master-0 kubenswrapper[19170]: I0313 01:30:51.360004 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 13 01:30:51.564170 master-0 kubenswrapper[19170]: I0313 01:30:51.564018 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-56ljs" Mar 13 01:30:51.656888 master-0 kubenswrapper[19170]: I0313 01:30:51.656798 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 01:30:51.765156 master-0 kubenswrapper[19170]: I0313 01:30:51.765071 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 13 01:30:51.800204 master-0 kubenswrapper[19170]: I0313 01:30:51.800143 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 13 01:30:51.885747 master-0 kubenswrapper[19170]: I0313 01:30:51.885558 19170 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 13 01:30:51.914118 master-0 kubenswrapper[19170]: I0313 01:30:51.914032 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 13 01:30:51.951354 master-0 kubenswrapper[19170]: I0313 01:30:51.951268 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 13 01:30:52.045622 master-0 kubenswrapper[19170]: I0313 01:30:52.045551 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 13 01:30:52.055471 master-0 kubenswrapper[19170]: I0313 01:30:52.055425 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-gtb4f" Mar 13 01:30:52.160897 master-0 kubenswrapper[19170]: I0313 01:30:52.160842 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle" Mar 13 01:30:52.172808 master-0 kubenswrapper[19170]: I0313 01:30:52.172770 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 13 01:30:52.185613 master-0 kubenswrapper[19170]: I0313 01:30:52.185567 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 13 01:30:52.388667 master-0 kubenswrapper[19170]: I0313 01:30:52.388569 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 13 01:30:52.399786 master-0 kubenswrapper[19170]: I0313 01:30:52.399722 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt" Mar 13 01:30:52.409614 master-0 kubenswrapper[19170]: I0313 01:30:52.409567 19170 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 13 01:30:52.439052 master-0 kubenswrapper[19170]: I0313 01:30:52.438869 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 13 01:30:52.509269 master-0 kubenswrapper[19170]: I0313 01:30:52.509181 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 13 01:30:52.585772 master-0 kubenswrapper[19170]: I0313 01:30:52.585703 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 13 01:30:52.752297 master-0 kubenswrapper[19170]: I0313 01:30:52.752146 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-jf79b" Mar 13 01:30:52.821990 master-0 kubenswrapper[19170]: I0313 01:30:52.821737 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 13 01:30:52.860099 master-0 kubenswrapper[19170]: I0313 01:30:52.860002 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 13 01:30:52.883817 master-0 kubenswrapper[19170]: I0313 01:30:52.883760 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 13 01:30:52.886142 master-0 kubenswrapper[19170]: I0313 01:30:52.886091 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 13 01:30:53.065368 master-0 kubenswrapper[19170]: I0313 01:30:53.065208 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Mar 13 01:30:53.079831 master-0 kubenswrapper[19170]: I0313 01:30:53.079784 19170 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 13 01:30:53.133232 master-0 kubenswrapper[19170]: I0313 01:30:53.133185 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 13 01:30:53.183366 master-0 kubenswrapper[19170]: I0313 01:30:53.183327 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 13 01:30:53.191051 master-0 kubenswrapper[19170]: I0313 01:30:53.190979 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-sb45h" Mar 13 01:30:53.193078 master-0 kubenswrapper[19170]: I0313 01:30:53.193021 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 13 01:30:53.336762 master-0 kubenswrapper[19170]: I0313 01:30:53.336674 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client" Mar 13 01:30:53.351946 master-0 kubenswrapper[19170]: I0313 01:30:53.351923 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 13 01:30:53.534557 master-0 kubenswrapper[19170]: I0313 01:30:53.534492 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 01:30:53.541227 master-0 kubenswrapper[19170]: I0313 01:30:53.541176 19170 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 13 01:30:53.691337 master-0 kubenswrapper[19170]: I0313 01:30:53.691253 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" 
Mar 13 01:30:53.854941 master-0 kubenswrapper[19170]: I0313 01:30:53.854878 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 13 01:30:53.900672 master-0 kubenswrapper[19170]: I0313 01:30:53.900574 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Mar 13 01:30:54.338472 master-0 kubenswrapper[19170]: I0313 01:30:54.338412 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle" Mar 13 01:30:54.420055 master-0 kubenswrapper[19170]: I0313 01:30:54.419475 19170 scope.go:117] "RemoveContainer" containerID="755598de3b9faacaa21c1f7d1498ac6b7d167334023e5ad7e7cf8e4e0c71e46e" Mar 13 01:30:54.435175 master-0 kubenswrapper[19170]: I0313 01:30:54.435118 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Mar 13 01:30:54.453947 master-0 kubenswrapper[19170]: I0313 01:30:54.453907 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-h6wj2" Mar 13 01:30:54.467837 master-0 kubenswrapper[19170]: I0313 01:30:54.467802 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 13 01:30:54.516406 master-0 kubenswrapper[19170]: I0313 01:30:54.516360 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images" Mar 13 01:30:54.628739 master-0 kubenswrapper[19170]: I0313 01:30:54.628626 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Mar 13 01:30:54.916533 master-0 kubenswrapper[19170]: I0313 01:30:54.916463 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Mar 13 01:30:54.919153 master-0 
kubenswrapper[19170]: I0313 01:30:54.919072 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body= Mar 13 01:30:54.919323 master-0 kubenswrapper[19170]: I0313 01:30:54.919148 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" Mar 13 01:30:55.019359 master-0 kubenswrapper[19170]: I0313 01:30:55.019306 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_a814bd60de133d95cf99630a978c017e/startup-monitor/0.log" Mar 13 01:30:55.019928 master-0 kubenswrapper[19170]: I0313 01:30:55.019892 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:30:55.086436 master-0 kubenswrapper[19170]: I0313 01:30:55.086362 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-pod-resource-dir\") pod \"a814bd60de133d95cf99630a978c017e\" (UID: \"a814bd60de133d95cf99630a978c017e\") " Mar 13 01:30:55.086436 master-0 kubenswrapper[19170]: I0313 01:30:55.086438 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-manifests\") pod \"a814bd60de133d95cf99630a978c017e\" (UID: \"a814bd60de133d95cf99630a978c017e\") " Mar 13 01:30:55.086857 master-0 kubenswrapper[19170]: I0313 01:30:55.086518 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-log\") pod \"a814bd60de133d95cf99630a978c017e\" (UID: \"a814bd60de133d95cf99630a978c017e\") " Mar 13 01:30:55.086857 master-0 kubenswrapper[19170]: I0313 01:30:55.086555 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-lock\") pod \"a814bd60de133d95cf99630a978c017e\" (UID: \"a814bd60de133d95cf99630a978c017e\") " Mar 13 01:30:55.086857 master-0 kubenswrapper[19170]: I0313 01:30:55.086613 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-manifests" (OuterVolumeSpecName: "manifests") pod "a814bd60de133d95cf99630a978c017e" (UID: "a814bd60de133d95cf99630a978c017e"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:30:55.086857 master-0 kubenswrapper[19170]: I0313 01:30:55.086699 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-log" (OuterVolumeSpecName: "var-log") pod "a814bd60de133d95cf99630a978c017e" (UID: "a814bd60de133d95cf99630a978c017e"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:30:55.086857 master-0 kubenswrapper[19170]: I0313 01:30:55.086741 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-resource-dir\") pod \"a814bd60de133d95cf99630a978c017e\" (UID: \"a814bd60de133d95cf99630a978c017e\") " Mar 13 01:30:55.086857 master-0 kubenswrapper[19170]: I0313 01:30:55.086767 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "a814bd60de133d95cf99630a978c017e" (UID: "a814bd60de133d95cf99630a978c017e"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:30:55.086857 master-0 kubenswrapper[19170]: I0313 01:30:55.086826 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-lock" (OuterVolumeSpecName: "var-lock") pod "a814bd60de133d95cf99630a978c017e" (UID: "a814bd60de133d95cf99630a978c017e"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:30:55.087305 master-0 kubenswrapper[19170]: I0313 01:30:55.087238 19170 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-log\") on node \"master-0\" DevicePath \"\"" Mar 13 01:30:55.087305 master-0 kubenswrapper[19170]: I0313 01:30:55.087263 19170 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 01:30:55.087305 master-0 kubenswrapper[19170]: I0313 01:30:55.087282 19170 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:30:55.087305 master-0 kubenswrapper[19170]: I0313 01:30:55.087299 19170 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-manifests\") on node \"master-0\" DevicePath \"\"" Mar 13 01:30:55.088401 master-0 kubenswrapper[19170]: I0313 01:30:55.088359 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 13 01:30:55.094389 master-0 kubenswrapper[19170]: I0313 01:30:55.094311 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "a814bd60de133d95cf99630a978c017e" (UID: "a814bd60de133d95cf99630a978c017e"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:30:55.189649 master-0 kubenswrapper[19170]: I0313 01:30:55.189547 19170 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-pod-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:30:55.329625 master-0 kubenswrapper[19170]: I0313 01:30:55.329532 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 13 01:30:55.435300 master-0 kubenswrapper[19170]: I0313 01:30:55.435209 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a814bd60de133d95cf99630a978c017e" path="/var/lib/kubelet/pods/a814bd60de133d95cf99630a978c017e/volumes" Mar 13 01:30:55.496455 master-0 kubenswrapper[19170]: I0313 01:30:55.496357 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Mar 13 01:30:55.686281 master-0 kubenswrapper[19170]: I0313 01:30:55.686133 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_43028f0e2cfc9ffb600b4d08ad84e12d/kube-controller-manager/3.log" Mar 13 01:30:55.687252 master-0 kubenswrapper[19170]: I0313 01:30:55.687210 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_43028f0e2cfc9ffb600b4d08ad84e12d/cluster-policy-controller/1.log" Mar 13 01:30:55.688353 master-0 kubenswrapper[19170]: I0313 01:30:55.688311 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"43028f0e2cfc9ffb600b4d08ad84e12d","Type":"ContainerStarted","Data":"1c6a2b5cf446f4c677619876139e96d76bcf0a96b3e0347a7104ea8e63c6e987"} Mar 13 01:30:55.693357 master-0 kubenswrapper[19170]: I0313 01:30:55.693312 19170 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_a814bd60de133d95cf99630a978c017e/startup-monitor/0.log" Mar 13 01:30:55.693501 master-0 kubenswrapper[19170]: I0313 01:30:55.693364 19170 generic.go:334] "Generic (PLEG): container finished" podID="a814bd60de133d95cf99630a978c017e" containerID="c0fe29457932959c68b194aa61b73366ddd4f141247e68871dd5b8341005a8a7" exitCode=137 Mar 13 01:30:55.693501 master-0 kubenswrapper[19170]: I0313 01:30:55.693407 19170 scope.go:117] "RemoveContainer" containerID="c0fe29457932959c68b194aa61b73366ddd4f141247e68871dd5b8341005a8a7" Mar 13 01:30:55.693666 master-0 kubenswrapper[19170]: I0313 01:30:55.693511 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 01:30:55.762067 master-0 kubenswrapper[19170]: I0313 01:30:55.761955 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 13 01:30:55.762377 master-0 kubenswrapper[19170]: I0313 01:30:55.762286 19170 scope.go:117] "RemoveContainer" containerID="c0fe29457932959c68b194aa61b73366ddd4f141247e68871dd5b8341005a8a7" Mar 13 01:30:55.763006 master-0 kubenswrapper[19170]: E0313 01:30:55.762955 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0fe29457932959c68b194aa61b73366ddd4f141247e68871dd5b8341005a8a7\": container with ID starting with c0fe29457932959c68b194aa61b73366ddd4f141247e68871dd5b8341005a8a7 not found: ID does not exist" containerID="c0fe29457932959c68b194aa61b73366ddd4f141247e68871dd5b8341005a8a7" Mar 13 01:30:55.763066 master-0 kubenswrapper[19170]: I0313 01:30:55.763020 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0fe29457932959c68b194aa61b73366ddd4f141247e68871dd5b8341005a8a7"} err="failed to get 
container status \"c0fe29457932959c68b194aa61b73366ddd4f141247e68871dd5b8341005a8a7\": rpc error: code = NotFound desc = could not find container \"c0fe29457932959c68b194aa61b73366ddd4f141247e68871dd5b8341005a8a7\": container with ID starting with c0fe29457932959c68b194aa61b73366ddd4f141247e68871dd5b8341005a8a7 not found: ID does not exist" Mar 13 01:30:56.482446 master-0 kubenswrapper[19170]: I0313 01:30:56.482404 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:30:56.482785 master-0 kubenswrapper[19170]: I0313 01:30:56.482768 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:30:56.487229 master-0 kubenswrapper[19170]: I0313 01:30:56.487179 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:30:56.502366 master-0 kubenswrapper[19170]: I0313 01:30:56.502309 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 13 01:31:01.419506 master-0 kubenswrapper[19170]: I0313 01:31:01.419368 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:31:01.420768 master-0 kubenswrapper[19170]: I0313 01:31:01.420702 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:31:01.958872 master-0 kubenswrapper[19170]: I0313 01:31:01.958799 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-765798599f-r6mnk"] Mar 13 01:31:02.761181 master-0 kubenswrapper[19170]: I0313 01:31:02.761086 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" event={"ID":"a97dcee4-852f-4db3-8bac-1a813c162ce8","Type":"ContainerStarted","Data":"57173d19305342231a3e8e419c72ee49912c8b6fdf4a095c23b221b03f341a53"} Mar 13 01:31:02.761181 master-0 kubenswrapper[19170]: I0313 01:31:02.761150 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" event={"ID":"a97dcee4-852f-4db3-8bac-1a813c162ce8","Type":"ContainerStarted","Data":"a6a1c05c7b30de6cdbfe772027d924743cff661e495540f3e4a2cead788155f4"} Mar 13 01:31:02.761859 master-0 kubenswrapper[19170]: I0313 01:31:02.761502 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:31:02.768921 master-0 kubenswrapper[19170]: I0313 01:31:02.768843 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" Mar 13 01:31:02.800771 master-0 kubenswrapper[19170]: I0313 01:31:02.800661 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-765798599f-r6mnk" podStartSLOduration=66.800612349 podStartE2EDuration="1m6.800612349s" podCreationTimestamp="2026-03-13 01:29:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:31:02.792535994 +0000 UTC m=+723.600657004" watchObservedRunningTime="2026-03-13 01:31:02.800612349 +0000 UTC 
m=+723.608733339" Mar 13 01:31:03.418626 master-0 kubenswrapper[19170]: I0313 01:31:03.418561 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f9c97b86b-w5fxw" Mar 13 01:31:03.419754 master-0 kubenswrapper[19170]: I0313 01:31:03.419721 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f9c97b86b-w5fxw" Mar 13 01:31:03.981334 master-0 kubenswrapper[19170]: I0313 01:31:03.981281 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f9c97b86b-w5fxw"] Mar 13 01:31:03.987032 master-0 kubenswrapper[19170]: W0313 01:31:03.985713 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc107bad_0393_441c_9815_09f27f25888c.slice/crio-7828912a4b410f6e3e56c4797442ca59575620dabdb98ea0cf4e79ec90797d9e WatchSource:0}: Error finding container 7828912a4b410f6e3e56c4797442ca59575620dabdb98ea0cf4e79ec90797d9e: Status 404 returned error can't find the container with id 7828912a4b410f6e3e56c4797442ca59575620dabdb98ea0cf4e79ec90797d9e Mar 13 01:31:04.783878 master-0 kubenswrapper[19170]: I0313 01:31:04.783820 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f9c97b86b-w5fxw" event={"ID":"bc107bad-0393-441c-9815-09f27f25888c","Type":"ContainerStarted","Data":"6fb5687cf3d8c7503507bd917060ca214ad6ad948c7e54ae535b31c656876566"} Mar 13 01:31:04.783878 master-0 kubenswrapper[19170]: I0313 01:31:04.783868 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f9c97b86b-w5fxw" event={"ID":"bc107bad-0393-441c-9815-09f27f25888c","Type":"ContainerStarted","Data":"7828912a4b410f6e3e56c4797442ca59575620dabdb98ea0cf4e79ec90797d9e"} Mar 13 01:31:04.820062 master-0 kubenswrapper[19170]: I0313 01:31:04.819953 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/console-5f9c97b86b-w5fxw" podStartSLOduration=69.819928913 podStartE2EDuration="1m9.819928913s" podCreationTimestamp="2026-03-13 01:29:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:31:04.812260529 +0000 UTC m=+725.620381519" watchObservedRunningTime="2026-03-13 01:31:04.819928913 +0000 UTC m=+725.628049913" Mar 13 01:31:04.919223 master-0 kubenswrapper[19170]: I0313 01:31:04.919149 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body= Mar 13 01:31:04.919497 master-0 kubenswrapper[19170]: I0313 01:31:04.919227 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" Mar 13 01:31:05.777252 master-0 kubenswrapper[19170]: I0313 01:31:05.777191 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5f9c97b86b-w5fxw" Mar 13 01:31:05.777252 master-0 kubenswrapper[19170]: I0313 01:31:05.777251 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5f9c97b86b-w5fxw" Mar 13 01:31:05.780778 master-0 kubenswrapper[19170]: I0313 01:31:05.780159 19170 patch_prober.go:28] interesting pod/console-5f9c97b86b-w5fxw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.113:8443/health\": dial tcp 10.128.0.113:8443: connect: connection refused" start-of-body= Mar 13 01:31:05.780778 master-0 kubenswrapper[19170]: I0313 01:31:05.780256 19170 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-console/console-5f9c97b86b-w5fxw" podUID="bc107bad-0393-441c-9815-09f27f25888c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.113:8443/health\": dial tcp 10.128.0.113:8443: connect: connection refused" Mar 13 01:31:06.492911 master-0 kubenswrapper[19170]: I0313 01:31:06.492809 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:31:14.920047 master-0 kubenswrapper[19170]: I0313 01:31:14.919946 19170 patch_prober.go:28] interesting pod/console-6c969fc7db-l2cgv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" start-of-body= Mar 13 01:31:14.920047 master-0 kubenswrapper[19170]: I0313 01:31:14.920038 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.105:8443/health\": dial tcp 10.128.0.105:8443: connect: connection refused" Mar 13 01:31:15.778163 master-0 kubenswrapper[19170]: I0313 01:31:15.778054 19170 patch_prober.go:28] interesting pod/console-5f9c97b86b-w5fxw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.113:8443/health\": dial tcp 10.128.0.113:8443: connect: connection refused" start-of-body= Mar 13 01:31:15.778492 master-0 kubenswrapper[19170]: I0313 01:31:15.778183 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-5f9c97b86b-w5fxw" podUID="bc107bad-0393-441c-9815-09f27f25888c" containerName="console" probeResult="failure" output="Get \"https://10.128.0.113:8443/health\": dial tcp 10.128.0.113:8443: connect: connection refused" Mar 13 01:31:24.926517 master-0 
kubenswrapper[19170]: I0313 01:31:24.926431 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6c969fc7db-l2cgv" Mar 13 01:31:24.933730 master-0 kubenswrapper[19170]: I0313 01:31:24.933625 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6c969fc7db-l2cgv" Mar 13 01:31:25.803817 master-0 kubenswrapper[19170]: I0313 01:31:25.803752 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5f9c97b86b-w5fxw" Mar 13 01:31:25.814807 master-0 kubenswrapper[19170]: I0313 01:31:25.814746 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5f9c97b86b-w5fxw" Mar 13 01:31:25.959221 master-0 kubenswrapper[19170]: I0313 01:31:25.959132 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c969fc7db-l2cgv"] Mar 13 01:31:51.012055 master-0 kubenswrapper[19170]: I0313 01:31:51.011955 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-6c969fc7db-l2cgv" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console" containerID="cri-o://41d1bd37ad97da541b70877816e85ac012c2da5f02e8d22110de3185141c4de4" gracePeriod=15 Mar 13 01:31:51.199855 master-0 kubenswrapper[19170]: I0313 01:31:51.199784 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c969fc7db-l2cgv_5517e790-8931-405e-a113-b4d76156775c/console/1.log" Mar 13 01:31:51.202736 master-0 kubenswrapper[19170]: I0313 01:31:51.201194 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c969fc7db-l2cgv_5517e790-8931-405e-a113-b4d76156775c/console/0.log" Mar 13 01:31:51.202736 master-0 kubenswrapper[19170]: I0313 01:31:51.201278 19170 generic.go:334] "Generic (PLEG): container finished" podID="5517e790-8931-405e-a113-b4d76156775c" 
containerID="41d1bd37ad97da541b70877816e85ac012c2da5f02e8d22110de3185141c4de4" exitCode=2 Mar 13 01:31:51.202736 master-0 kubenswrapper[19170]: I0313 01:31:51.201327 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c969fc7db-l2cgv" event={"ID":"5517e790-8931-405e-a113-b4d76156775c","Type":"ContainerDied","Data":"41d1bd37ad97da541b70877816e85ac012c2da5f02e8d22110de3185141c4de4"} Mar 13 01:31:51.202736 master-0 kubenswrapper[19170]: I0313 01:31:51.201391 19170 scope.go:117] "RemoveContainer" containerID="4b70241fd7a26ba09f8c0188412cee31344d4af634ea5235fa3b30080a5f8a36" Mar 13 01:31:51.514862 master-0 kubenswrapper[19170]: I0313 01:31:51.514814 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c969fc7db-l2cgv_5517e790-8931-405e-a113-b4d76156775c/console/1.log" Mar 13 01:31:51.515062 master-0 kubenswrapper[19170]: I0313 01:31:51.514885 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c969fc7db-l2cgv" Mar 13 01:31:51.658466 master-0 kubenswrapper[19170]: I0313 01:31:51.658369 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5517e790-8931-405e-a113-b4d76156775c-service-ca\") pod \"5517e790-8931-405e-a113-b4d76156775c\" (UID: \"5517e790-8931-405e-a113-b4d76156775c\") " Mar 13 01:31:51.658815 master-0 kubenswrapper[19170]: I0313 01:31:51.658520 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5517e790-8931-405e-a113-b4d76156775c-oauth-serving-cert\") pod \"5517e790-8931-405e-a113-b4d76156775c\" (UID: \"5517e790-8931-405e-a113-b4d76156775c\") " Mar 13 01:31:51.658815 master-0 kubenswrapper[19170]: I0313 01:31:51.658573 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/5517e790-8931-405e-a113-b4d76156775c-console-oauth-config\") pod \"5517e790-8931-405e-a113-b4d76156775c\" (UID: \"5517e790-8931-405e-a113-b4d76156775c\") " Mar 13 01:31:51.658815 master-0 kubenswrapper[19170]: I0313 01:31:51.658726 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5517e790-8931-405e-a113-b4d76156775c-trusted-ca-bundle\") pod \"5517e790-8931-405e-a113-b4d76156775c\" (UID: \"5517e790-8931-405e-a113-b4d76156775c\") " Mar 13 01:31:51.659025 master-0 kubenswrapper[19170]: I0313 01:31:51.658887 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5517e790-8931-405e-a113-b4d76156775c-console-config\") pod \"5517e790-8931-405e-a113-b4d76156775c\" (UID: \"5517e790-8931-405e-a113-b4d76156775c\") " Mar 13 01:31:51.659088 master-0 kubenswrapper[19170]: I0313 01:31:51.659012 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8q25\" (UniqueName: \"kubernetes.io/projected/5517e790-8931-405e-a113-b4d76156775c-kube-api-access-l8q25\") pod \"5517e790-8931-405e-a113-b4d76156775c\" (UID: \"5517e790-8931-405e-a113-b4d76156775c\") " Mar 13 01:31:51.659226 master-0 kubenswrapper[19170]: I0313 01:31:51.659175 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5517e790-8931-405e-a113-b4d76156775c-console-serving-cert\") pod \"5517e790-8931-405e-a113-b4d76156775c\" (UID: \"5517e790-8931-405e-a113-b4d76156775c\") " Mar 13 01:31:51.659891 master-0 kubenswrapper[19170]: I0313 01:31:51.659810 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5517e790-8931-405e-a113-b4d76156775c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod 
"5517e790-8931-405e-a113-b4d76156775c" (UID: "5517e790-8931-405e-a113-b4d76156775c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:31:51.660821 master-0 kubenswrapper[19170]: I0313 01:31:51.660768 19170 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5517e790-8931-405e-a113-b4d76156775c-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 13 01:31:51.661211 master-0 kubenswrapper[19170]: I0313 01:31:51.661145 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5517e790-8931-405e-a113-b4d76156775c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5517e790-8931-405e-a113-b4d76156775c" (UID: "5517e790-8931-405e-a113-b4d76156775c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:31:51.662686 master-0 kubenswrapper[19170]: I0313 01:31:51.662592 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5517e790-8931-405e-a113-b4d76156775c-service-ca" (OuterVolumeSpecName: "service-ca") pod "5517e790-8931-405e-a113-b4d76156775c" (UID: "5517e790-8931-405e-a113-b4d76156775c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:31:51.662686 master-0 kubenswrapper[19170]: I0313 01:31:51.662660 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5517e790-8931-405e-a113-b4d76156775c-console-config" (OuterVolumeSpecName: "console-config") pod "5517e790-8931-405e-a113-b4d76156775c" (UID: "5517e790-8931-405e-a113-b4d76156775c"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 01:31:51.666771 master-0 kubenswrapper[19170]: I0313 01:31:51.666721 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5517e790-8931-405e-a113-b4d76156775c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5517e790-8931-405e-a113-b4d76156775c" (UID: "5517e790-8931-405e-a113-b4d76156775c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 01:31:51.667505 master-0 kubenswrapper[19170]: I0313 01:31:51.667434 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5517e790-8931-405e-a113-b4d76156775c-kube-api-access-l8q25" (OuterVolumeSpecName: "kube-api-access-l8q25") pod "5517e790-8931-405e-a113-b4d76156775c" (UID: "5517e790-8931-405e-a113-b4d76156775c"). InnerVolumeSpecName "kube-api-access-l8q25". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 01:31:51.675931 master-0 kubenswrapper[19170]: I0313 01:31:51.675829 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5517e790-8931-405e-a113-b4d76156775c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5517e790-8931-405e-a113-b4d76156775c" (UID: "5517e790-8931-405e-a113-b4d76156775c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 01:31:51.762188 master-0 kubenswrapper[19170]: I0313 01:31:51.762087 19170 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5517e790-8931-405e-a113-b4d76156775c-console-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 13 01:31:51.762188 master-0 kubenswrapper[19170]: I0313 01:31:51.762148 19170 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5517e790-8931-405e-a113-b4d76156775c-service-ca\") on node \"master-0\" DevicePath \"\""
Mar 13 01:31:51.762188 master-0 kubenswrapper[19170]: I0313 01:31:51.762166 19170 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5517e790-8931-405e-a113-b4d76156775c-console-oauth-config\") on node \"master-0\" DevicePath \"\""
Mar 13 01:31:51.762188 master-0 kubenswrapper[19170]: I0313 01:31:51.762181 19170 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5517e790-8931-405e-a113-b4d76156775c-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 13 01:31:51.762188 master-0 kubenswrapper[19170]: I0313 01:31:51.762196 19170 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5517e790-8931-405e-a113-b4d76156775c-console-config\") on node \"master-0\" DevicePath \"\""
Mar 13 01:31:51.762738 master-0 kubenswrapper[19170]: I0313 01:31:51.762220 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8q25\" (UniqueName: \"kubernetes.io/projected/5517e790-8931-405e-a113-b4d76156775c-kube-api-access-l8q25\") on node \"master-0\" DevicePath \"\""
Mar 13 01:31:52.213973 master-0 kubenswrapper[19170]: I0313 01:31:52.213887 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c969fc7db-l2cgv_5517e790-8931-405e-a113-b4d76156775c/console/1.log"
Mar 13 01:31:52.214962 master-0 kubenswrapper[19170]: I0313 01:31:52.213992 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c969fc7db-l2cgv" event={"ID":"5517e790-8931-405e-a113-b4d76156775c","Type":"ContainerDied","Data":"5774d7ee0c2fc3d6f3bef6c5ff141866de5419716d8c9089b9bad9faae4abb6f"}
Mar 13 01:31:52.214962 master-0 kubenswrapper[19170]: I0313 01:31:52.214049 19170 scope.go:117] "RemoveContainer" containerID="41d1bd37ad97da541b70877816e85ac012c2da5f02e8d22110de3185141c4de4"
Mar 13 01:31:52.214962 master-0 kubenswrapper[19170]: I0313 01:31:52.214095 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c969fc7db-l2cgv"
Mar 13 01:31:52.271194 master-0 kubenswrapper[19170]: I0313 01:31:52.271087 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c969fc7db-l2cgv"]
Mar 13 01:31:52.281028 master-0 kubenswrapper[19170]: I0313 01:31:52.280927 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6c969fc7db-l2cgv"]
Mar 13 01:31:53.431411 master-0 kubenswrapper[19170]: I0313 01:31:53.431338 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5517e790-8931-405e-a113-b4d76156775c" path="/var/lib/kubelet/pods/5517e790-8931-405e-a113-b4d76156775c/volumes"
Mar 13 01:31:53.519237 master-0 kubenswrapper[19170]: I0313 01:31:53.519200 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-5-retry-2-master-0"]
Mar 13 01:31:53.519674 master-0 kubenswrapper[19170]: E0313 01:31:53.519658 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console"
Mar 13 01:31:53.519759 master-0 kubenswrapper[19170]: I0313 01:31:53.519747 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console"
Mar 13 01:31:53.519820 master-0 kubenswrapper[19170]: E0313 01:31:53.519810 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a814bd60de133d95cf99630a978c017e" containerName="startup-monitor"
Mar 13 01:31:53.519876 master-0 kubenswrapper[19170]: I0313 01:31:53.519867 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="a814bd60de133d95cf99630a978c017e" containerName="startup-monitor"
Mar 13 01:31:53.519949 master-0 kubenswrapper[19170]: E0313 01:31:53.519940 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console"
Mar 13 01:31:53.520001 master-0 kubenswrapper[19170]: I0313 01:31:53.519991 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console"
Mar 13 01:31:53.520200 master-0 kubenswrapper[19170]: I0313 01:31:53.520187 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console"
Mar 13 01:31:53.520279 master-0 kubenswrapper[19170]: I0313 01:31:53.520269 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="a814bd60de133d95cf99630a978c017e" containerName="startup-monitor"
Mar 13 01:31:53.520348 master-0 kubenswrapper[19170]: I0313 01:31:53.520339 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="5517e790-8931-405e-a113-b4d76156775c" containerName="console"
Mar 13 01:31:53.520834 master-0 kubenswrapper[19170]: I0313 01:31:53.520818 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-5-retry-2-master-0"
Mar 13 01:31:53.523647 master-0 kubenswrapper[19170]: I0313 01:31:53.523603 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-ctg9b"
Mar 13 01:31:53.523797 master-0 kubenswrapper[19170]: I0313 01:31:53.523782 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Mar 13 01:31:53.537699 master-0 kubenswrapper[19170]: I0313 01:31:53.537621 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-5-retry-2-master-0"]
Mar 13 01:31:53.696401 master-0 kubenswrapper[19170]: I0313 01:31:53.696293 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/eedaadd9-adde-40e7-adbb-9bde5930d325-var-lock\") pod \"installer-5-retry-2-master-0\" (UID: \"eedaadd9-adde-40e7-adbb-9bde5930d325\") " pod="openshift-kube-controller-manager/installer-5-retry-2-master-0"
Mar 13 01:31:53.696401 master-0 kubenswrapper[19170]: I0313 01:31:53.696396 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eedaadd9-adde-40e7-adbb-9bde5930d325-kube-api-access\") pod \"installer-5-retry-2-master-0\" (UID: \"eedaadd9-adde-40e7-adbb-9bde5930d325\") " pod="openshift-kube-controller-manager/installer-5-retry-2-master-0"
Mar 13 01:31:53.696610 master-0 kubenswrapper[19170]: I0313 01:31:53.696544 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eedaadd9-adde-40e7-adbb-9bde5930d325-kubelet-dir\") pod \"installer-5-retry-2-master-0\" (UID: \"eedaadd9-adde-40e7-adbb-9bde5930d325\") " pod="openshift-kube-controller-manager/installer-5-retry-2-master-0"
Mar 13 01:31:53.799182 master-0 kubenswrapper[19170]: I0313 01:31:53.799106 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/eedaadd9-adde-40e7-adbb-9bde5930d325-var-lock\") pod \"installer-5-retry-2-master-0\" (UID: \"eedaadd9-adde-40e7-adbb-9bde5930d325\") " pod="openshift-kube-controller-manager/installer-5-retry-2-master-0"
Mar 13 01:31:53.799429 master-0 kubenswrapper[19170]: I0313 01:31:53.799205 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eedaadd9-adde-40e7-adbb-9bde5930d325-kube-api-access\") pod \"installer-5-retry-2-master-0\" (UID: \"eedaadd9-adde-40e7-adbb-9bde5930d325\") " pod="openshift-kube-controller-manager/installer-5-retry-2-master-0"
Mar 13 01:31:53.799429 master-0 kubenswrapper[19170]: I0313 01:31:53.799234 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eedaadd9-adde-40e7-adbb-9bde5930d325-kubelet-dir\") pod \"installer-5-retry-2-master-0\" (UID: \"eedaadd9-adde-40e7-adbb-9bde5930d325\") " pod="openshift-kube-controller-manager/installer-5-retry-2-master-0"
Mar 13 01:31:53.799429 master-0 kubenswrapper[19170]: I0313 01:31:53.799266 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/eedaadd9-adde-40e7-adbb-9bde5930d325-var-lock\") pod \"installer-5-retry-2-master-0\" (UID: \"eedaadd9-adde-40e7-adbb-9bde5930d325\") " pod="openshift-kube-controller-manager/installer-5-retry-2-master-0"
Mar 13 01:31:53.799591 master-0 kubenswrapper[19170]: I0313 01:31:53.799528 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eedaadd9-adde-40e7-adbb-9bde5930d325-kubelet-dir\") pod \"installer-5-retry-2-master-0\" (UID: \"eedaadd9-adde-40e7-adbb-9bde5930d325\") " pod="openshift-kube-controller-manager/installer-5-retry-2-master-0"
Mar 13 01:31:53.817399 master-0 kubenswrapper[19170]: I0313 01:31:53.817328 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eedaadd9-adde-40e7-adbb-9bde5930d325-kube-api-access\") pod \"installer-5-retry-2-master-0\" (UID: \"eedaadd9-adde-40e7-adbb-9bde5930d325\") " pod="openshift-kube-controller-manager/installer-5-retry-2-master-0"
Mar 13 01:31:53.857687 master-0 kubenswrapper[19170]: I0313 01:31:53.857619 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-5-retry-2-master-0"
Mar 13 01:31:54.339578 master-0 kubenswrapper[19170]: W0313 01:31:54.339496 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podeedaadd9_adde_40e7_adbb_9bde5930d325.slice/crio-afade0bef58c40ea179adababfa42e69d24db44220c48f9afdd333711b66b493 WatchSource:0}: Error finding container afade0bef58c40ea179adababfa42e69d24db44220c48f9afdd333711b66b493: Status 404 returned error can't find the container with id afade0bef58c40ea179adababfa42e69d24db44220c48f9afdd333711b66b493
Mar 13 01:31:54.341036 master-0 kubenswrapper[19170]: I0313 01:31:54.340991 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-5-retry-2-master-0"]
Mar 13 01:31:55.238022 master-0 kubenswrapper[19170]: I0313 01:31:55.237954 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-5-retry-2-master-0" event={"ID":"eedaadd9-adde-40e7-adbb-9bde5930d325","Type":"ContainerStarted","Data":"f90d457bd81cdee5c0dc64211552661301605b8210f04153cf52c5ec1dd728a1"}
Mar 13 01:31:55.238022 master-0 kubenswrapper[19170]: I0313 01:31:55.238032 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-5-retry-2-master-0" event={"ID":"eedaadd9-adde-40e7-adbb-9bde5930d325","Type":"ContainerStarted","Data":"afade0bef58c40ea179adababfa42e69d24db44220c48f9afdd333711b66b493"}
Mar 13 01:31:55.267479 master-0 kubenswrapper[19170]: I0313 01:31:55.267372 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-5-retry-2-master-0" podStartSLOduration=2.267350115 podStartE2EDuration="2.267350115s" podCreationTimestamp="2026-03-13 01:31:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:31:55.258501748 +0000 UTC m=+776.066622748" watchObservedRunningTime="2026-03-13 01:31:55.267350115 +0000 UTC m=+776.075471085"
Mar 13 01:32:27.657489 master-0 kubenswrapper[19170]: I0313 01:32:27.657424 19170 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"]
Mar 13 01:32:27.658393 master-0 kubenswrapper[19170]: I0313 01:32:27.657845 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="kube-controller-manager-cert-syncer" containerID="cri-o://ad501dba73c8e02ebebbb7bcaac88dd3b49aa2dc75d4621c004e56223548c569" gracePeriod=30
Mar 13 01:32:27.658393 master-0 kubenswrapper[19170]: I0313 01:32:27.657948 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="cluster-policy-controller" containerID="cri-o://c2b7c2cb0043a7acf4d6058427bbcbbdcf69fdb50f1988ca8bbd9592ceaee922" gracePeriod=30
Mar 13 01:32:27.658393 master-0 kubenswrapper[19170]: I0313 01:32:27.657965 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="kube-controller-manager" containerID="cri-o://1c6a2b5cf446f4c677619876139e96d76bcf0a96b3e0347a7104ea8e63c6e987" gracePeriod=30
Mar 13 01:32:27.658393 master-0 kubenswrapper[19170]: I0313 01:32:27.657984 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="kube-controller-manager-recovery-controller" containerID="cri-o://2aa87006447655fcce0c7dda89b0736fa8fe5d6d6b3d7992f1e605fe121770e9" gracePeriod=30
Mar 13 01:32:27.660176 master-0 kubenswrapper[19170]: I0313 01:32:27.660127 19170 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"]
Mar 13 01:32:27.660566 master-0 kubenswrapper[19170]: E0313 01:32:27.660518 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="kube-controller-manager"
Mar 13 01:32:27.660566 master-0 kubenswrapper[19170]: I0313 01:32:27.660562 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="kube-controller-manager"
Mar 13 01:32:27.660699 master-0 kubenswrapper[19170]: E0313 01:32:27.660582 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="cluster-policy-controller"
Mar 13 01:32:27.660699 master-0 kubenswrapper[19170]: I0313 01:32:27.660594 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="cluster-policy-controller"
Mar 13 01:32:27.660699 master-0 kubenswrapper[19170]: E0313 01:32:27.660612 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="kube-controller-manager-cert-syncer"
Mar 13 01:32:27.660699 master-0 kubenswrapper[19170]: I0313 01:32:27.660623 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="kube-controller-manager-cert-syncer"
Mar 13 01:32:27.660699 master-0 kubenswrapper[19170]: E0313 01:32:27.660663 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="kube-controller-manager"
Mar 13 01:32:27.660699 master-0 kubenswrapper[19170]: I0313 01:32:27.660674 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="kube-controller-manager"
Mar 13 01:32:27.660699 master-0 kubenswrapper[19170]: E0313 01:32:27.660694 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="cluster-policy-controller"
Mar 13 01:32:27.660959 master-0 kubenswrapper[19170]: I0313 01:32:27.660703 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="cluster-policy-controller"
Mar 13 01:32:27.660959 master-0 kubenswrapper[19170]: E0313 01:32:27.660737 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="kube-controller-manager-recovery-controller"
Mar 13 01:32:27.660959 master-0 kubenswrapper[19170]: I0313 01:32:27.660748 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="kube-controller-manager-recovery-controller"
Mar 13 01:32:27.660959 master-0 kubenswrapper[19170]: E0313 01:32:27.660761 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="cluster-policy-controller"
Mar 13 01:32:27.660959 master-0 kubenswrapper[19170]: I0313 01:32:27.660770 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="cluster-policy-controller"
Mar 13 01:32:27.660959 master-0 kubenswrapper[19170]: E0313 01:32:27.660791 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="kube-controller-manager"
Mar 13 01:32:27.660959 master-0 kubenswrapper[19170]: I0313 01:32:27.660801 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="kube-controller-manager"
Mar 13 01:32:27.661208 master-0 kubenswrapper[19170]: I0313 01:32:27.661033 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="kube-controller-manager-recovery-controller"
Mar 13 01:32:27.661208 master-0 kubenswrapper[19170]: I0313 01:32:27.661059 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="cluster-policy-controller"
Mar 13 01:32:27.661208 master-0 kubenswrapper[19170]: I0313 01:32:27.661075 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="kube-controller-manager"
Mar 13 01:32:27.661208 master-0 kubenswrapper[19170]: I0313 01:32:27.661091 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="kube-controller-manager"
Mar 13 01:32:27.661208 master-0 kubenswrapper[19170]: I0313 01:32:27.661105 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="kube-controller-manager"
Mar 13 01:32:27.661208 master-0 kubenswrapper[19170]: I0313 01:32:27.661117 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="kube-controller-manager"
Mar 13 01:32:27.661208 master-0 kubenswrapper[19170]: I0313 01:32:27.661138 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="kube-controller-manager-cert-syncer"
Mar 13 01:32:27.661208 master-0 kubenswrapper[19170]: I0313 01:32:27.661164 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="cluster-policy-controller"
Mar 13 01:32:27.661498 master-0 kubenswrapper[19170]: E0313 01:32:27.661361 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="kube-controller-manager"
Mar 13 01:32:27.661498 master-0 kubenswrapper[19170]: I0313 01:32:27.661375 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="kube-controller-manager"
Mar 13 01:32:27.661498 master-0 kubenswrapper[19170]: E0313 01:32:27.661399 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="kube-controller-manager"
Mar 13 01:32:27.661498 master-0 kubenswrapper[19170]: I0313 01:32:27.661409 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="kube-controller-manager"
Mar 13 01:32:27.661659 master-0 kubenswrapper[19170]: I0313 01:32:27.661607 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="kube-controller-manager"
Mar 13 01:32:27.661711 master-0 kubenswrapper[19170]: I0313 01:32:27.661660 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" containerName="cluster-policy-controller"
Mar 13 01:32:27.784263 master-0 kubenswrapper[19170]: I0313 01:32:27.783950 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/eb4d3da469837c83afa0bab0e0bd1451-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"eb4d3da469837c83afa0bab0e0bd1451\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 01:32:27.784263 master-0 kubenswrapper[19170]: I0313 01:32:27.784189 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/eb4d3da469837c83afa0bab0e0bd1451-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"eb4d3da469837c83afa0bab0e0bd1451\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 01:32:27.839576 master-0 kubenswrapper[19170]: I0313 01:32:27.839508 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_43028f0e2cfc9ffb600b4d08ad84e12d/kube-controller-manager/3.log"
Mar 13 01:32:27.840685 master-0 kubenswrapper[19170]: I0313 01:32:27.840607 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_43028f0e2cfc9ffb600b4d08ad84e12d/cluster-policy-controller/1.log"
Mar 13 01:32:27.843426 master-0 kubenswrapper[19170]: I0313 01:32:27.841920 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_43028f0e2cfc9ffb600b4d08ad84e12d/kube-controller-manager-cert-syncer/0.log"
Mar 13 01:32:27.843426 master-0 kubenswrapper[19170]: I0313 01:32:27.842044 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 01:32:27.845541 master-0 kubenswrapper[19170]: I0313 01:32:27.845463 19170 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="43028f0e2cfc9ffb600b4d08ad84e12d" podUID="eb4d3da469837c83afa0bab0e0bd1451"
Mar 13 01:32:27.885894 master-0 kubenswrapper[19170]: I0313 01:32:27.885837 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/eb4d3da469837c83afa0bab0e0bd1451-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"eb4d3da469837c83afa0bab0e0bd1451\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 01:32:27.885894 master-0 kubenswrapper[19170]: I0313 01:32:27.885859 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/eb4d3da469837c83afa0bab0e0bd1451-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"eb4d3da469837c83afa0bab0e0bd1451\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 01:32:27.886161 master-0 kubenswrapper[19170]: I0313 01:32:27.885956 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/eb4d3da469837c83afa0bab0e0bd1451-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"eb4d3da469837c83afa0bab0e0bd1451\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 01:32:27.886161 master-0 kubenswrapper[19170]: I0313 01:32:27.886127 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/eb4d3da469837c83afa0bab0e0bd1451-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"eb4d3da469837c83afa0bab0e0bd1451\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 01:32:27.988171 master-0 kubenswrapper[19170]: I0313 01:32:27.988087 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/43028f0e2cfc9ffb600b4d08ad84e12d-cert-dir\") pod \"43028f0e2cfc9ffb600b4d08ad84e12d\" (UID: \"43028f0e2cfc9ffb600b4d08ad84e12d\") "
Mar 13 01:32:27.988429 master-0 kubenswrapper[19170]: I0313 01:32:27.988204 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/43028f0e2cfc9ffb600b4d08ad84e12d-resource-dir\") pod \"43028f0e2cfc9ffb600b4d08ad84e12d\" (UID: \"43028f0e2cfc9ffb600b4d08ad84e12d\") "
Mar 13 01:32:27.988429 master-0 kubenswrapper[19170]: I0313 01:32:27.988216 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43028f0e2cfc9ffb600b4d08ad84e12d-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "43028f0e2cfc9ffb600b4d08ad84e12d" (UID: "43028f0e2cfc9ffb600b4d08ad84e12d"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 01:32:27.988429 master-0 kubenswrapper[19170]: I0313 01:32:27.988383 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43028f0e2cfc9ffb600b4d08ad84e12d-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "43028f0e2cfc9ffb600b4d08ad84e12d" (UID: "43028f0e2cfc9ffb600b4d08ad84e12d"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 01:32:27.988815 master-0 kubenswrapper[19170]: I0313 01:32:27.988759 19170 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/43028f0e2cfc9ffb600b4d08ad84e12d-cert-dir\") on node \"master-0\" DevicePath \"\""
Mar 13 01:32:27.988815 master-0 kubenswrapper[19170]: I0313 01:32:27.988801 19170 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/43028f0e2cfc9ffb600b4d08ad84e12d-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 13 01:32:28.549961 master-0 kubenswrapper[19170]: I0313 01:32:28.549893 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_43028f0e2cfc9ffb600b4d08ad84e12d/kube-controller-manager/3.log"
Mar 13 01:32:28.551321 master-0 kubenswrapper[19170]: I0313 01:32:28.551264 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_43028f0e2cfc9ffb600b4d08ad84e12d/cluster-policy-controller/1.log"
Mar 13 01:32:28.553708 master-0 kubenswrapper[19170]: I0313 01:32:28.553614 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_43028f0e2cfc9ffb600b4d08ad84e12d/kube-controller-manager-cert-syncer/0.log"
Mar 13 01:32:28.553949 master-0 kubenswrapper[19170]: I0313 01:32:28.553721 19170 generic.go:334] "Generic (PLEG): container finished" podID="43028f0e2cfc9ffb600b4d08ad84e12d" containerID="1c6a2b5cf446f4c677619876139e96d76bcf0a96b3e0347a7104ea8e63c6e987" exitCode=0
Mar 13 01:32:28.553949 master-0 kubenswrapper[19170]: I0313 01:32:28.553752 19170 generic.go:334] "Generic (PLEG): container finished" podID="43028f0e2cfc9ffb600b4d08ad84e12d" containerID="c2b7c2cb0043a7acf4d6058427bbcbbdcf69fdb50f1988ca8bbd9592ceaee922" exitCode=0
Mar 13 01:32:28.553949 master-0 kubenswrapper[19170]: I0313 01:32:28.553768 19170 generic.go:334] "Generic (PLEG): container finished" podID="43028f0e2cfc9ffb600b4d08ad84e12d" containerID="2aa87006447655fcce0c7dda89b0736fa8fe5d6d6b3d7992f1e605fe121770e9" exitCode=0
Mar 13 01:32:28.553949 master-0 kubenswrapper[19170]: I0313 01:32:28.553784 19170 generic.go:334] "Generic (PLEG): container finished" podID="43028f0e2cfc9ffb600b4d08ad84e12d" containerID="ad501dba73c8e02ebebbb7bcaac88dd3b49aa2dc75d4621c004e56223548c569" exitCode=2
Mar 13 01:32:28.554357 master-0 kubenswrapper[19170]: I0313 01:32:28.553962 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 01:32:28.554357 master-0 kubenswrapper[19170]: I0313 01:32:28.554059 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61f8bb415a20f5d2771ed5dc744f9982ea8e0a80ebb3d2449dda00410833c158"
Mar 13 01:32:28.554357 master-0 kubenswrapper[19170]: I0313 01:32:28.554095 19170 scope.go:117] "RemoveContainer" containerID="755598de3b9faacaa21c1f7d1498ac6b7d167334023e5ad7e7cf8e4e0c71e46e"
Mar 13 01:32:28.557562 master-0 kubenswrapper[19170]: I0313 01:32:28.557468 19170 generic.go:334] "Generic (PLEG): container finished" podID="eedaadd9-adde-40e7-adbb-9bde5930d325" containerID="f90d457bd81cdee5c0dc64211552661301605b8210f04153cf52c5ec1dd728a1" exitCode=0
Mar 13 01:32:28.557562 master-0 kubenswrapper[19170]: I0313 01:32:28.557543 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-5-retry-2-master-0" event={"ID":"eedaadd9-adde-40e7-adbb-9bde5930d325","Type":"ContainerDied","Data":"f90d457bd81cdee5c0dc64211552661301605b8210f04153cf52c5ec1dd728a1"}
Mar 13 01:32:28.558951 master-0 kubenswrapper[19170]: I0313 01:32:28.558868 19170 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="43028f0e2cfc9ffb600b4d08ad84e12d" podUID="eb4d3da469837c83afa0bab0e0bd1451"
Mar 13 01:32:28.600600 master-0 kubenswrapper[19170]: I0313 01:32:28.600540 19170 scope.go:117] "RemoveContainer" containerID="b13d7faceb5783d6bf9752256a4876b43fc891516b2801fcb45d996d22ee86f4"
Mar 13 01:32:28.607006 master-0 kubenswrapper[19170]: I0313 01:32:28.606942 19170 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="43028f0e2cfc9ffb600b4d08ad84e12d" podUID="eb4d3da469837c83afa0bab0e0bd1451"
Mar 13 01:32:29.435471 master-0 kubenswrapper[19170]: I0313 01:32:29.435417 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43028f0e2cfc9ffb600b4d08ad84e12d" path="/var/lib/kubelet/pods/43028f0e2cfc9ffb600b4d08ad84e12d/volumes"
Mar 13 01:32:29.572550 master-0 kubenswrapper[19170]: I0313 01:32:29.571575 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_43028f0e2cfc9ffb600b4d08ad84e12d/kube-controller-manager-cert-syncer/0.log"
Mar 13 01:32:30.017348 master-0 kubenswrapper[19170]: I0313 01:32:30.017214 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-5-retry-2-master-0"
Mar 13 01:32:30.135888 master-0 kubenswrapper[19170]: I0313 01:32:30.135814 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eedaadd9-adde-40e7-adbb-9bde5930d325-kubelet-dir\") pod \"eedaadd9-adde-40e7-adbb-9bde5930d325\" (UID: \"eedaadd9-adde-40e7-adbb-9bde5930d325\") "
Mar 13 01:32:30.136175 master-0 kubenswrapper[19170]: I0313 01:32:30.136050 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/eedaadd9-adde-40e7-adbb-9bde5930d325-var-lock\") pod \"eedaadd9-adde-40e7-adbb-9bde5930d325\" (UID: \"eedaadd9-adde-40e7-adbb-9bde5930d325\") "
Mar 13 01:32:30.136175 master-0 kubenswrapper[19170]: I0313 01:32:30.136158 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eedaadd9-adde-40e7-adbb-9bde5930d325-var-lock" (OuterVolumeSpecName: "var-lock") pod "eedaadd9-adde-40e7-adbb-9bde5930d325" (UID: "eedaadd9-adde-40e7-adbb-9bde5930d325"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 01:32:30.136331 master-0 kubenswrapper[19170]: I0313 01:32:30.136150 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eedaadd9-adde-40e7-adbb-9bde5930d325-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "eedaadd9-adde-40e7-adbb-9bde5930d325" (UID: "eedaadd9-adde-40e7-adbb-9bde5930d325"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 01:32:30.136331 master-0 kubenswrapper[19170]: I0313 01:32:30.136274 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eedaadd9-adde-40e7-adbb-9bde5930d325-kube-api-access\") pod \"eedaadd9-adde-40e7-adbb-9bde5930d325\" (UID: \"eedaadd9-adde-40e7-adbb-9bde5930d325\") "
Mar 13 01:32:30.137097 master-0 kubenswrapper[19170]: I0313 01:32:30.137056 19170 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/eedaadd9-adde-40e7-adbb-9bde5930d325-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 13 01:32:30.137097 master-0 kubenswrapper[19170]: I0313 01:32:30.137085 19170 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eedaadd9-adde-40e7-adbb-9bde5930d325-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 13 01:32:30.143039 master-0 kubenswrapper[19170]: I0313 01:32:30.142937 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eedaadd9-adde-40e7-adbb-9bde5930d325-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "eedaadd9-adde-40e7-adbb-9bde5930d325" (UID: "eedaadd9-adde-40e7-adbb-9bde5930d325"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 01:32:30.238827 master-0 kubenswrapper[19170]: I0313 01:32:30.238753 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eedaadd9-adde-40e7-adbb-9bde5930d325-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 13 01:32:30.586224 master-0 kubenswrapper[19170]: I0313 01:32:30.586141 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-5-retry-2-master-0" event={"ID":"eedaadd9-adde-40e7-adbb-9bde5930d325","Type":"ContainerDied","Data":"afade0bef58c40ea179adababfa42e69d24db44220c48f9afdd333711b66b493"}
Mar 13 01:32:30.587077 master-0 kubenswrapper[19170]: I0313 01:32:30.586228 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afade0bef58c40ea179adababfa42e69d24db44220c48f9afdd333711b66b493"
Mar 13 01:32:30.587077 master-0 kubenswrapper[19170]: I0313 01:32:30.586344 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-5-retry-2-master-0"
Mar 13 01:32:39.419387 master-0 kubenswrapper[19170]: I0313 01:32:39.419292 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 01:32:39.471703 master-0 kubenswrapper[19170]: I0313 01:32:39.470428 19170 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="5993d636-54e0-4f1c-ae8d-e2fc842e89c8"
Mar 13 01:32:39.471703 master-0 kubenswrapper[19170]: I0313 01:32:39.470476 19170 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="5993d636-54e0-4f1c-ae8d-e2fc842e89c8"
Mar 13 01:32:39.501962 master-0 kubenswrapper[19170]: I0313 01:32:39.501888 19170 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 01:32:39.502981 master-0 kubenswrapper[19170]: I0313 01:32:39.502914 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"]
Mar 13 01:32:39.513331 master-0 kubenswrapper[19170]: I0313 01:32:39.513101 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"]
Mar 13 01:32:39.521496 master-0 kubenswrapper[19170]: I0313 01:32:39.521420 19170 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:32:39.533590 master-0 kubenswrapper[19170]: I0313 01:32:39.533502 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 13 01:32:39.559399 master-0 kubenswrapper[19170]: W0313 01:32:39.559341 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb4d3da469837c83afa0bab0e0bd1451.slice/crio-27d986fad4e4d0978cdf4e5ad520be0dbd835f83380e96146defe3b14420701d WatchSource:0}: Error finding container 27d986fad4e4d0978cdf4e5ad520be0dbd835f83380e96146defe3b14420701d: Status 404 returned error can't find the container with id 27d986fad4e4d0978cdf4e5ad520be0dbd835f83380e96146defe3b14420701d Mar 13 01:32:39.677886 master-0 kubenswrapper[19170]: I0313 01:32:39.677747 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"eb4d3da469837c83afa0bab0e0bd1451","Type":"ContainerStarted","Data":"27d986fad4e4d0978cdf4e5ad520be0dbd835f83380e96146defe3b14420701d"} Mar 13 01:32:40.698420 master-0 kubenswrapper[19170]: I0313 01:32:40.698346 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"eb4d3da469837c83afa0bab0e0bd1451","Type":"ContainerStarted","Data":"4d0724af4e6bf4e8b922b6e6f4ee429f014632bd9b23fbe983487902b8328b10"} Mar 13 01:32:40.698420 master-0 kubenswrapper[19170]: I0313 01:32:40.698421 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"eb4d3da469837c83afa0bab0e0bd1451","Type":"ContainerStarted","Data":"c5ac1e44763e6f2e1aae6a92078dd8422a391e7d1c0f8f8e8c4aea0c5d1d44d6"} Mar 13 01:32:40.698977 master-0 kubenswrapper[19170]: I0313 01:32:40.698435 19170 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"eb4d3da469837c83afa0bab0e0bd1451","Type":"ContainerStarted","Data":"0b36be1db2bb327471e4b195dac6ab72e71cb41b1304c49b33cc428ff861c591"} Mar 13 01:32:41.708563 master-0 kubenswrapper[19170]: I0313 01:32:41.708515 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"eb4d3da469837c83afa0bab0e0bd1451","Type":"ContainerStarted","Data":"967f6d39ac186f8eaadc52c23278ddb836c3565c8693b6a73069e09f2e658429"} Mar 13 01:32:41.733231 master-0 kubenswrapper[19170]: I0313 01:32:41.733148 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=2.733123285 podStartE2EDuration="2.733123285s" podCreationTimestamp="2026-03-13 01:32:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:32:41.728142187 +0000 UTC m=+822.536263137" watchObservedRunningTime="2026-03-13 01:32:41.733123285 +0000 UTC m=+822.541244265" Mar 13 01:32:49.521946 master-0 kubenswrapper[19170]: I0313 01:32:49.521848 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:32:49.521946 master-0 kubenswrapper[19170]: I0313 01:32:49.521947 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:32:49.521946 master-0 kubenswrapper[19170]: I0313 01:32:49.521966 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:32:49.521946 master-0 kubenswrapper[19170]: I0313 01:32:49.521984 19170 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:32:49.527066 master-0 kubenswrapper[19170]: I0313 01:32:49.526996 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:32:49.528898 master-0 kubenswrapper[19170]: I0313 01:32:49.528148 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:32:49.795474 master-0 kubenswrapper[19170]: I0313 01:32:49.795285 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:32:50.805712 master-0 kubenswrapper[19170]: I0313 01:32:50.805618 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 01:33:01.981628 master-0 kubenswrapper[19170]: I0313 01:33:01.981454 19170 scope.go:117] "RemoveContainer" containerID="2aa87006447655fcce0c7dda89b0736fa8fe5d6d6b3d7992f1e605fe121770e9" Mar 13 01:33:02.007990 master-0 kubenswrapper[19170]: I0313 01:33:02.007928 19170 scope.go:117] "RemoveContainer" containerID="ad501dba73c8e02ebebbb7bcaac88dd3b49aa2dc75d4621c004e56223548c569" Mar 13 01:34:02.050033 master-0 kubenswrapper[19170]: I0313 01:34:02.049951 19170 scope.go:117] "RemoveContainer" containerID="c2b7c2cb0043a7acf4d6058427bbcbbdcf69fdb50f1988ca8bbd9592ceaee922" Mar 13 01:36:55.645031 master-0 kubenswrapper[19170]: I0313 01:36:55.644948 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["sushy-emulator/sushy-emulator-6dd6777c94-qlhhb"] Mar 13 01:36:55.646105 master-0 kubenswrapper[19170]: E0313 01:36:55.645413 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eedaadd9-adde-40e7-adbb-9bde5930d325" 
containerName="installer" Mar 13 01:36:55.646105 master-0 kubenswrapper[19170]: I0313 01:36:55.645433 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="eedaadd9-adde-40e7-adbb-9bde5930d325" containerName="installer" Mar 13 01:36:55.646105 master-0 kubenswrapper[19170]: I0313 01:36:55.645613 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="eedaadd9-adde-40e7-adbb-9bde5930d325" containerName="installer" Mar 13 01:36:55.646322 master-0 kubenswrapper[19170]: I0313 01:36:55.646293 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/sushy-emulator-6dd6777c94-qlhhb" Mar 13 01:36:55.648989 master-0 kubenswrapper[19170]: I0313 01:36:55.648949 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"sushy-emulator"/"openshift-service-ca.crt" Mar 13 01:36:55.649138 master-0 kubenswrapper[19170]: I0313 01:36:55.648991 19170 reflector.go:368] Caches populated for *v1.Secret from object-"sushy-emulator"/"os-client-config" Mar 13 01:36:55.649138 master-0 kubenswrapper[19170]: I0313 01:36:55.649100 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"sushy-emulator"/"kube-root-ca.crt" Mar 13 01:36:55.649269 master-0 kubenswrapper[19170]: I0313 01:36:55.649218 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"sushy-emulator"/"sushy-emulator-config" Mar 13 01:36:55.666947 master-0 kubenswrapper[19170]: I0313 01:36:55.666856 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-6dd6777c94-qlhhb"] Mar 13 01:36:55.743195 master-0 kubenswrapper[19170]: I0313 01:36:55.743097 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/b3d69657-4bb5-4150-9376-e37c53ec5bf2-sushy-emulator-config\") pod \"sushy-emulator-6dd6777c94-qlhhb\" (UID: \"b3d69657-4bb5-4150-9376-e37c53ec5bf2\") " 
pod="sushy-emulator/sushy-emulator-6dd6777c94-qlhhb" Mar 13 01:36:55.744047 master-0 kubenswrapper[19170]: I0313 01:36:55.743525 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/b3d69657-4bb5-4150-9376-e37c53ec5bf2-os-client-config\") pod \"sushy-emulator-6dd6777c94-qlhhb\" (UID: \"b3d69657-4bb5-4150-9376-e37c53ec5bf2\") " pod="sushy-emulator/sushy-emulator-6dd6777c94-qlhhb" Mar 13 01:36:55.744540 master-0 kubenswrapper[19170]: I0313 01:36:55.744123 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkw9d\" (UniqueName: \"kubernetes.io/projected/b3d69657-4bb5-4150-9376-e37c53ec5bf2-kube-api-access-lkw9d\") pod \"sushy-emulator-6dd6777c94-qlhhb\" (UID: \"b3d69657-4bb5-4150-9376-e37c53ec5bf2\") " pod="sushy-emulator/sushy-emulator-6dd6777c94-qlhhb" Mar 13 01:36:55.847647 master-0 kubenswrapper[19170]: I0313 01:36:55.847537 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/b3d69657-4bb5-4150-9376-e37c53ec5bf2-os-client-config\") pod \"sushy-emulator-6dd6777c94-qlhhb\" (UID: \"b3d69657-4bb5-4150-9376-e37c53ec5bf2\") " pod="sushy-emulator/sushy-emulator-6dd6777c94-qlhhb" Mar 13 01:36:55.847872 master-0 kubenswrapper[19170]: I0313 01:36:55.847726 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkw9d\" (UniqueName: \"kubernetes.io/projected/b3d69657-4bb5-4150-9376-e37c53ec5bf2-kube-api-access-lkw9d\") pod \"sushy-emulator-6dd6777c94-qlhhb\" (UID: \"b3d69657-4bb5-4150-9376-e37c53ec5bf2\") " pod="sushy-emulator/sushy-emulator-6dd6777c94-qlhhb" Mar 13 01:36:55.847872 master-0 kubenswrapper[19170]: I0313 01:36:55.847788 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sushy-emulator-config\" (UniqueName: 
\"kubernetes.io/configmap/b3d69657-4bb5-4150-9376-e37c53ec5bf2-sushy-emulator-config\") pod \"sushy-emulator-6dd6777c94-qlhhb\" (UID: \"b3d69657-4bb5-4150-9376-e37c53ec5bf2\") " pod="sushy-emulator/sushy-emulator-6dd6777c94-qlhhb" Mar 13 01:36:55.849710 master-0 kubenswrapper[19170]: I0313 01:36:55.849670 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/b3d69657-4bb5-4150-9376-e37c53ec5bf2-sushy-emulator-config\") pod \"sushy-emulator-6dd6777c94-qlhhb\" (UID: \"b3d69657-4bb5-4150-9376-e37c53ec5bf2\") " pod="sushy-emulator/sushy-emulator-6dd6777c94-qlhhb" Mar 13 01:36:55.851889 master-0 kubenswrapper[19170]: I0313 01:36:55.851839 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/b3d69657-4bb5-4150-9376-e37c53ec5bf2-os-client-config\") pod \"sushy-emulator-6dd6777c94-qlhhb\" (UID: \"b3d69657-4bb5-4150-9376-e37c53ec5bf2\") " pod="sushy-emulator/sushy-emulator-6dd6777c94-qlhhb" Mar 13 01:36:55.875598 master-0 kubenswrapper[19170]: I0313 01:36:55.875536 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkw9d\" (UniqueName: \"kubernetes.io/projected/b3d69657-4bb5-4150-9376-e37c53ec5bf2-kube-api-access-lkw9d\") pod \"sushy-emulator-6dd6777c94-qlhhb\" (UID: \"b3d69657-4bb5-4150-9376-e37c53ec5bf2\") " pod="sushy-emulator/sushy-emulator-6dd6777c94-qlhhb" Mar 13 01:36:55.988556 master-0 kubenswrapper[19170]: I0313 01:36:55.988463 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/sushy-emulator-6dd6777c94-qlhhb" Mar 13 01:36:56.478116 master-0 kubenswrapper[19170]: I0313 01:36:56.478019 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-6dd6777c94-qlhhb"] Mar 13 01:36:56.479601 master-0 kubenswrapper[19170]: W0313 01:36:56.479523 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3d69657_4bb5_4150_9376_e37c53ec5bf2.slice/crio-628343cacd9c5dca97375e171dd357059766e14203e16e04468cbfa4fa53e281 WatchSource:0}: Error finding container 628343cacd9c5dca97375e171dd357059766e14203e16e04468cbfa4fa53e281: Status 404 returned error can't find the container with id 628343cacd9c5dca97375e171dd357059766e14203e16e04468cbfa4fa53e281 Mar 13 01:36:56.484232 master-0 kubenswrapper[19170]: I0313 01:36:56.484121 19170 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 01:36:56.542240 master-0 kubenswrapper[19170]: I0313 01:36:56.542144 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-6dd6777c94-qlhhb" event={"ID":"b3d69657-4bb5-4150-9376-e37c53ec5bf2","Type":"ContainerStarted","Data":"628343cacd9c5dca97375e171dd357059766e14203e16e04468cbfa4fa53e281"} Mar 13 01:37:02.127849 master-0 kubenswrapper[19170]: I0313 01:37:02.127744 19170 scope.go:117] "RemoveContainer" containerID="1c6a2b5cf446f4c677619876139e96d76bcf0a96b3e0347a7104ea8e63c6e987" Mar 13 01:37:03.610354 master-0 kubenswrapper[19170]: I0313 01:37:03.610245 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-6dd6777c94-qlhhb" event={"ID":"b3d69657-4bb5-4150-9376-e37c53ec5bf2","Type":"ContainerStarted","Data":"2f7ea09545ce9bd992ab81bb3bf154ae230cc3b93405fbd6c9d47f0a87db6c9d"} Mar 13 01:37:03.643222 master-0 kubenswrapper[19170]: I0313 01:37:03.643050 19170 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="sushy-emulator/sushy-emulator-6dd6777c94-qlhhb" podStartSLOduration=2.128601184 podStartE2EDuration="8.643027981s" podCreationTimestamp="2026-03-13 01:36:55 +0000 UTC" firstStartedPulling="2026-03-13 01:36:56.483917923 +0000 UTC m=+1077.292038923" lastFinishedPulling="2026-03-13 01:37:02.99834472 +0000 UTC m=+1083.806465720" observedRunningTime="2026-03-13 01:37:03.637496325 +0000 UTC m=+1084.445617325" watchObservedRunningTime="2026-03-13 01:37:03.643027981 +0000 UTC m=+1084.451148951" Mar 13 01:37:05.989148 master-0 kubenswrapper[19170]: I0313 01:37:05.988953 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="sushy-emulator/sushy-emulator-6dd6777c94-qlhhb" Mar 13 01:37:05.990066 master-0 kubenswrapper[19170]: I0313 01:37:05.989207 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="sushy-emulator/sushy-emulator-6dd6777c94-qlhhb" Mar 13 01:37:06.003098 master-0 kubenswrapper[19170]: I0313 01:37:06.003041 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="sushy-emulator/sushy-emulator-6dd6777c94-qlhhb" Mar 13 01:37:06.644670 master-0 kubenswrapper[19170]: I0313 01:37:06.644557 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="sushy-emulator/sushy-emulator-6dd6777c94-qlhhb" Mar 13 01:37:26.329873 master-0 kubenswrapper[19170]: I0313 01:37:26.329779 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["sushy-emulator/nova-console-poller-5c79d7cfd7-cc7qr"] Mar 13 01:37:26.331698 master-0 kubenswrapper[19170]: I0313 01:37:26.331624 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/nova-console-poller-5c79d7cfd7-cc7qr" Mar 13 01:37:26.386625 master-0 kubenswrapper[19170]: I0313 01:37:26.386563 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/nova-console-poller-5c79d7cfd7-cc7qr"] Mar 13 01:37:26.427006 master-0 kubenswrapper[19170]: I0313 01:37:26.426929 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/f19e77a0-dfb9-4a4c-a16b-6dbd2896acd0-os-client-config\") pod \"nova-console-poller-5c79d7cfd7-cc7qr\" (UID: \"f19e77a0-dfb9-4a4c-a16b-6dbd2896acd0\") " pod="sushy-emulator/nova-console-poller-5c79d7cfd7-cc7qr" Mar 13 01:37:26.427257 master-0 kubenswrapper[19170]: I0313 01:37:26.427032 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8gpk\" (UniqueName: \"kubernetes.io/projected/f19e77a0-dfb9-4a4c-a16b-6dbd2896acd0-kube-api-access-d8gpk\") pod \"nova-console-poller-5c79d7cfd7-cc7qr\" (UID: \"f19e77a0-dfb9-4a4c-a16b-6dbd2896acd0\") " pod="sushy-emulator/nova-console-poller-5c79d7cfd7-cc7qr" Mar 13 01:37:26.528268 master-0 kubenswrapper[19170]: I0313 01:37:26.528146 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/f19e77a0-dfb9-4a4c-a16b-6dbd2896acd0-os-client-config\") pod \"nova-console-poller-5c79d7cfd7-cc7qr\" (UID: \"f19e77a0-dfb9-4a4c-a16b-6dbd2896acd0\") " pod="sushy-emulator/nova-console-poller-5c79d7cfd7-cc7qr" Mar 13 01:37:26.528550 master-0 kubenswrapper[19170]: I0313 01:37:26.528266 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8gpk\" (UniqueName: \"kubernetes.io/projected/f19e77a0-dfb9-4a4c-a16b-6dbd2896acd0-kube-api-access-d8gpk\") pod \"nova-console-poller-5c79d7cfd7-cc7qr\" (UID: \"f19e77a0-dfb9-4a4c-a16b-6dbd2896acd0\") " 
pod="sushy-emulator/nova-console-poller-5c79d7cfd7-cc7qr" Mar 13 01:37:26.537452 master-0 kubenswrapper[19170]: I0313 01:37:26.537421 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/f19e77a0-dfb9-4a4c-a16b-6dbd2896acd0-os-client-config\") pod \"nova-console-poller-5c79d7cfd7-cc7qr\" (UID: \"f19e77a0-dfb9-4a4c-a16b-6dbd2896acd0\") " pod="sushy-emulator/nova-console-poller-5c79d7cfd7-cc7qr" Mar 13 01:37:26.549366 master-0 kubenswrapper[19170]: I0313 01:37:26.549305 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8gpk\" (UniqueName: \"kubernetes.io/projected/f19e77a0-dfb9-4a4c-a16b-6dbd2896acd0-kube-api-access-d8gpk\") pod \"nova-console-poller-5c79d7cfd7-cc7qr\" (UID: \"f19e77a0-dfb9-4a4c-a16b-6dbd2896acd0\") " pod="sushy-emulator/nova-console-poller-5c79d7cfd7-cc7qr" Mar 13 01:37:26.689493 master-0 kubenswrapper[19170]: I0313 01:37:26.689341 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/nova-console-poller-5c79d7cfd7-cc7qr" Mar 13 01:37:27.188029 master-0 kubenswrapper[19170]: I0313 01:37:27.187965 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/nova-console-poller-5c79d7cfd7-cc7qr"] Mar 13 01:37:27.868579 master-0 kubenswrapper[19170]: I0313 01:37:27.868471 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-poller-5c79d7cfd7-cc7qr" event={"ID":"f19e77a0-dfb9-4a4c-a16b-6dbd2896acd0","Type":"ContainerStarted","Data":"9c1d89850bf3a21255b13f9efe2f66163b0fd22b35dd56bf73cffe6a474e1089"} Mar 13 01:37:32.921579 master-0 kubenswrapper[19170]: I0313 01:37:32.921512 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-poller-5c79d7cfd7-cc7qr" event={"ID":"f19e77a0-dfb9-4a4c-a16b-6dbd2896acd0","Type":"ContainerStarted","Data":"87aafb58203958d4ea012cada09a6440c600ab71a27cedbf53dbc010bf379caf"} Mar 13 01:37:32.921579 master-0 kubenswrapper[19170]: I0313 01:37:32.921584 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-poller-5c79d7cfd7-cc7qr" event={"ID":"f19e77a0-dfb9-4a4c-a16b-6dbd2896acd0","Type":"ContainerStarted","Data":"c0667d3b34f75c62520180a3818dc2559ad4a94683ea509c5f3601a38de8ca0d"} Mar 13 01:37:32.955692 master-0 kubenswrapper[19170]: I0313 01:37:32.955562 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="sushy-emulator/nova-console-poller-5c79d7cfd7-cc7qr" podStartSLOduration=1.777745911 podStartE2EDuration="6.955539259s" podCreationTimestamp="2026-03-13 01:37:26 +0000 UTC" firstStartedPulling="2026-03-13 01:37:27.193834034 +0000 UTC m=+1108.001955044" lastFinishedPulling="2026-03-13 01:37:32.371627392 +0000 UTC m=+1113.179748392" observedRunningTime="2026-03-13 01:37:32.945169727 +0000 UTC m=+1113.753290737" watchObservedRunningTime="2026-03-13 01:37:32.955539259 +0000 UTC m=+1113.763660239" Mar 13 01:37:57.698075 master-0 
kubenswrapper[19170]: I0313 01:37:57.697988 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["sushy-emulator/nova-console-recorder-c576fb5c5-bjcz6"] Mar 13 01:37:57.701294 master-0 kubenswrapper[19170]: I0313 01:37:57.701193 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/nova-console-recorder-c576fb5c5-bjcz6" Mar 13 01:37:57.727128 master-0 kubenswrapper[19170]: I0313 01:37:57.725816 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/c90f0ca5-8c7a-4d97-9e7a-a7497e398cc6-os-client-config\") pod \"nova-console-recorder-c576fb5c5-bjcz6\" (UID: \"c90f0ca5-8c7a-4d97-9e7a-a7497e398cc6\") " pod="sushy-emulator/nova-console-recorder-c576fb5c5-bjcz6" Mar 13 01:37:57.730940 master-0 kubenswrapper[19170]: I0313 01:37:57.730893 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/nova-console-recorder-c576fb5c5-bjcz6"] Mar 13 01:37:57.827164 master-0 kubenswrapper[19170]: I0313 01:37:57.827084 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/c90f0ca5-8c7a-4d97-9e7a-a7497e398cc6-os-client-config\") pod \"nova-console-recorder-c576fb5c5-bjcz6\" (UID: \"c90f0ca5-8c7a-4d97-9e7a-a7497e398cc6\") " pod="sushy-emulator/nova-console-recorder-c576fb5c5-bjcz6" Mar 13 01:37:57.827164 master-0 kubenswrapper[19170]: I0313 01:37:57.827149 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbhgz\" (UniqueName: \"kubernetes.io/projected/c90f0ca5-8c7a-4d97-9e7a-a7497e398cc6-kube-api-access-jbhgz\") pod \"nova-console-recorder-c576fb5c5-bjcz6\" (UID: \"c90f0ca5-8c7a-4d97-9e7a-a7497e398cc6\") " pod="sushy-emulator/nova-console-recorder-c576fb5c5-bjcz6" Mar 13 01:37:57.827498 master-0 kubenswrapper[19170]: I0313 01:37:57.827198 19170 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-console-recordings-pv\" (UniqueName: \"kubernetes.io/nfs/c90f0ca5-8c7a-4d97-9e7a-a7497e398cc6-nova-console-recordings-pv\") pod \"nova-console-recorder-c576fb5c5-bjcz6\" (UID: \"c90f0ca5-8c7a-4d97-9e7a-a7497e398cc6\") " pod="sushy-emulator/nova-console-recorder-c576fb5c5-bjcz6" Mar 13 01:37:57.833215 master-0 kubenswrapper[19170]: I0313 01:37:57.833104 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/c90f0ca5-8c7a-4d97-9e7a-a7497e398cc6-os-client-config\") pod \"nova-console-recorder-c576fb5c5-bjcz6\" (UID: \"c90f0ca5-8c7a-4d97-9e7a-a7497e398cc6\") " pod="sushy-emulator/nova-console-recorder-c576fb5c5-bjcz6" Mar 13 01:37:57.928791 master-0 kubenswrapper[19170]: I0313 01:37:57.928730 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbhgz\" (UniqueName: \"kubernetes.io/projected/c90f0ca5-8c7a-4d97-9e7a-a7497e398cc6-kube-api-access-jbhgz\") pod \"nova-console-recorder-c576fb5c5-bjcz6\" (UID: \"c90f0ca5-8c7a-4d97-9e7a-a7497e398cc6\") " pod="sushy-emulator/nova-console-recorder-c576fb5c5-bjcz6" Mar 13 01:37:57.929025 master-0 kubenswrapper[19170]: I0313 01:37:57.928808 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-console-recordings-pv\" (UniqueName: \"kubernetes.io/nfs/c90f0ca5-8c7a-4d97-9e7a-a7497e398cc6-nova-console-recordings-pv\") pod \"nova-console-recorder-c576fb5c5-bjcz6\" (UID: \"c90f0ca5-8c7a-4d97-9e7a-a7497e398cc6\") " pod="sushy-emulator/nova-console-recorder-c576fb5c5-bjcz6" Mar 13 01:37:57.945296 master-0 kubenswrapper[19170]: I0313 01:37:57.945236 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbhgz\" (UniqueName: \"kubernetes.io/projected/c90f0ca5-8c7a-4d97-9e7a-a7497e398cc6-kube-api-access-jbhgz\") pod 
\"nova-console-recorder-c576fb5c5-bjcz6\" (UID: \"c90f0ca5-8c7a-4d97-9e7a-a7497e398cc6\") " pod="sushy-emulator/nova-console-recorder-c576fb5c5-bjcz6" Mar 13 01:37:58.592066 master-0 kubenswrapper[19170]: I0313 01:37:58.591987 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-console-recordings-pv\" (UniqueName: \"kubernetes.io/nfs/c90f0ca5-8c7a-4d97-9e7a-a7497e398cc6-nova-console-recordings-pv\") pod \"nova-console-recorder-c576fb5c5-bjcz6\" (UID: \"c90f0ca5-8c7a-4d97-9e7a-a7497e398cc6\") " pod="sushy-emulator/nova-console-recorder-c576fb5c5-bjcz6" Mar 13 01:37:58.654074 master-0 kubenswrapper[19170]: I0313 01:37:58.653984 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/nova-console-recorder-c576fb5c5-bjcz6" Mar 13 01:37:59.168007 master-0 kubenswrapper[19170]: I0313 01:37:59.167904 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/nova-console-recorder-c576fb5c5-bjcz6"] Mar 13 01:38:00.173996 master-0 kubenswrapper[19170]: I0313 01:38:00.173898 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-recorder-c576fb5c5-bjcz6" event={"ID":"c90f0ca5-8c7a-4d97-9e7a-a7497e398cc6","Type":"ContainerStarted","Data":"d7ea53aa1794588b4a491d8a49a3d3887cef19b18e25b4dc77f9439d1dd68ae9"} Mar 13 01:38:08.258122 master-0 kubenswrapper[19170]: I0313 01:38:08.256240 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-recorder-c576fb5c5-bjcz6" event={"ID":"c90f0ca5-8c7a-4d97-9e7a-a7497e398cc6","Type":"ContainerStarted","Data":"4d1c2f6a73229285599ef717555ef05c6997f6a157720e00af3ba29c2c2e54fd"} Mar 13 01:38:08.258122 master-0 kubenswrapper[19170]: I0313 01:38:08.256311 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-recorder-c576fb5c5-bjcz6" 
event={"ID":"c90f0ca5-8c7a-4d97-9e7a-a7497e398cc6","Type":"ContainerStarted","Data":"d8613fa1f366c40950c8d3a1257d0719492c9085be950fa7c8b7eec1e09d82e2"} Mar 13 01:38:08.286774 master-0 kubenswrapper[19170]: I0313 01:38:08.286617 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="sushy-emulator/nova-console-recorder-c576fb5c5-bjcz6" podStartSLOduration=2.53657176 podStartE2EDuration="11.286599162s" podCreationTimestamp="2026-03-13 01:37:57 +0000 UTC" firstStartedPulling="2026-03-13 01:37:59.178117563 +0000 UTC m=+1139.986238523" lastFinishedPulling="2026-03-13 01:38:07.928144935 +0000 UTC m=+1148.736265925" observedRunningTime="2026-03-13 01:38:08.284912637 +0000 UTC m=+1149.093033657" watchObservedRunningTime="2026-03-13 01:38:08.286599162 +0000 UTC m=+1149.094720132" Mar 13 01:38:36.716881 master-0 kubenswrapper[19170]: I0313 01:38:36.716799 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4m8jl9"] Mar 13 01:38:36.720051 master-0 kubenswrapper[19170]: I0313 01:38:36.719991 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4m8jl9" Mar 13 01:38:36.736304 master-0 kubenswrapper[19170]: I0313 01:38:36.736256 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4m8jl9"] Mar 13 01:38:36.832167 master-0 kubenswrapper[19170]: I0313 01:38:36.831828 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4215005b-54d9-41cc-a2d2-303f4ae363a6-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4m8jl9\" (UID: \"4215005b-54d9-41cc-a2d2-303f4ae363a6\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4m8jl9" Mar 13 01:38:36.832167 master-0 kubenswrapper[19170]: I0313 01:38:36.831931 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4215005b-54d9-41cc-a2d2-303f4ae363a6-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4m8jl9\" (UID: \"4215005b-54d9-41cc-a2d2-303f4ae363a6\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4m8jl9" Mar 13 01:38:36.832167 master-0 kubenswrapper[19170]: I0313 01:38:36.832022 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbjnj\" (UniqueName: \"kubernetes.io/projected/4215005b-54d9-41cc-a2d2-303f4ae363a6-kube-api-access-lbjnj\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4m8jl9\" (UID: \"4215005b-54d9-41cc-a2d2-303f4ae363a6\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4m8jl9" Mar 13 01:38:36.933750 master-0 kubenswrapper[19170]: I0313 01:38:36.933678 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/4215005b-54d9-41cc-a2d2-303f4ae363a6-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4m8jl9\" (UID: \"4215005b-54d9-41cc-a2d2-303f4ae363a6\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4m8jl9" Mar 13 01:38:36.933982 master-0 kubenswrapper[19170]: I0313 01:38:36.933870 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbjnj\" (UniqueName: \"kubernetes.io/projected/4215005b-54d9-41cc-a2d2-303f4ae363a6-kube-api-access-lbjnj\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4m8jl9\" (UID: \"4215005b-54d9-41cc-a2d2-303f4ae363a6\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4m8jl9" Mar 13 01:38:36.934575 master-0 kubenswrapper[19170]: I0313 01:38:36.934507 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4215005b-54d9-41cc-a2d2-303f4ae363a6-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4m8jl9\" (UID: \"4215005b-54d9-41cc-a2d2-303f4ae363a6\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4m8jl9" Mar 13 01:38:36.934877 master-0 kubenswrapper[19170]: I0313 01:38:36.934799 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4215005b-54d9-41cc-a2d2-303f4ae363a6-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4m8jl9\" (UID: \"4215005b-54d9-41cc-a2d2-303f4ae363a6\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4m8jl9" Mar 13 01:38:36.935424 master-0 kubenswrapper[19170]: I0313 01:38:36.935374 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4215005b-54d9-41cc-a2d2-303f4ae363a6-bundle\") pod 
\"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4m8jl9\" (UID: \"4215005b-54d9-41cc-a2d2-303f4ae363a6\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4m8jl9" Mar 13 01:38:36.967686 master-0 kubenswrapper[19170]: I0313 01:38:36.967524 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbjnj\" (UniqueName: \"kubernetes.io/projected/4215005b-54d9-41cc-a2d2-303f4ae363a6-kube-api-access-lbjnj\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4m8jl9\" (UID: \"4215005b-54d9-41cc-a2d2-303f4ae363a6\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4m8jl9" Mar 13 01:38:37.047287 master-0 kubenswrapper[19170]: I0313 01:38:37.047224 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4m8jl9" Mar 13 01:38:37.519890 master-0 kubenswrapper[19170]: I0313 01:38:37.519819 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4m8jl9"] Mar 13 01:38:37.521178 master-0 kubenswrapper[19170]: W0313 01:38:37.521072 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4215005b_54d9_41cc_a2d2_303f4ae363a6.slice/crio-ea9bb84f19f8f3ef1488c7658f2efc3bdc0089a86aa9d9a5056387aca9511112 WatchSource:0}: Error finding container ea9bb84f19f8f3ef1488c7658f2efc3bdc0089a86aa9d9a5056387aca9511112: Status 404 returned error can't find the container with id ea9bb84f19f8f3ef1488c7658f2efc3bdc0089a86aa9d9a5056387aca9511112 Mar 13 01:38:37.531835 master-0 kubenswrapper[19170]: I0313 01:38:37.531780 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4m8jl9" 
event={"ID":"4215005b-54d9-41cc-a2d2-303f4ae363a6","Type":"ContainerStarted","Data":"ea9bb84f19f8f3ef1488c7658f2efc3bdc0089a86aa9d9a5056387aca9511112"} Mar 13 01:38:38.540694 master-0 kubenswrapper[19170]: I0313 01:38:38.540616 19170 generic.go:334] "Generic (PLEG): container finished" podID="4215005b-54d9-41cc-a2d2-303f4ae363a6" containerID="e9574858086534927a4449335ed0e836592ef8b52e891609a55742cb3dc9fa7d" exitCode=0 Mar 13 01:38:38.540694 master-0 kubenswrapper[19170]: I0313 01:38:38.540695 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4m8jl9" event={"ID":"4215005b-54d9-41cc-a2d2-303f4ae363a6","Type":"ContainerDied","Data":"e9574858086534927a4449335ed0e836592ef8b52e891609a55742cb3dc9fa7d"} Mar 13 01:38:40.577875 master-0 kubenswrapper[19170]: I0313 01:38:40.577780 19170 generic.go:334] "Generic (PLEG): container finished" podID="4215005b-54d9-41cc-a2d2-303f4ae363a6" containerID="eaa788d4b15fa5e159a3f5facee4c89167aa9c1c0267fcf706a61bf87b08bd70" exitCode=0 Mar 13 01:38:40.577875 master-0 kubenswrapper[19170]: I0313 01:38:40.577848 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4m8jl9" event={"ID":"4215005b-54d9-41cc-a2d2-303f4ae363a6","Type":"ContainerDied","Data":"eaa788d4b15fa5e159a3f5facee4c89167aa9c1c0267fcf706a61bf87b08bd70"} Mar 13 01:38:41.600940 master-0 kubenswrapper[19170]: I0313 01:38:41.600808 19170 generic.go:334] "Generic (PLEG): container finished" podID="4215005b-54d9-41cc-a2d2-303f4ae363a6" containerID="553dde91ef08017f79c065d9f1ec4c8d8f796640268316bd92feeb34b656c41c" exitCode=0 Mar 13 01:38:41.602089 master-0 kubenswrapper[19170]: I0313 01:38:41.600940 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4m8jl9" 
event={"ID":"4215005b-54d9-41cc-a2d2-303f4ae363a6","Type":"ContainerDied","Data":"553dde91ef08017f79c065d9f1ec4c8d8f796640268316bd92feeb34b656c41c"} Mar 13 01:38:43.005915 master-0 kubenswrapper[19170]: I0313 01:38:43.005840 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4m8jl9" Mar 13 01:38:43.177349 master-0 kubenswrapper[19170]: I0313 01:38:43.177278 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbjnj\" (UniqueName: \"kubernetes.io/projected/4215005b-54d9-41cc-a2d2-303f4ae363a6-kube-api-access-lbjnj\") pod \"4215005b-54d9-41cc-a2d2-303f4ae363a6\" (UID: \"4215005b-54d9-41cc-a2d2-303f4ae363a6\") " Mar 13 01:38:43.177624 master-0 kubenswrapper[19170]: I0313 01:38:43.177581 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4215005b-54d9-41cc-a2d2-303f4ae363a6-bundle\") pod \"4215005b-54d9-41cc-a2d2-303f4ae363a6\" (UID: \"4215005b-54d9-41cc-a2d2-303f4ae363a6\") " Mar 13 01:38:43.180777 master-0 kubenswrapper[19170]: I0313 01:38:43.177704 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4215005b-54d9-41cc-a2d2-303f4ae363a6-util\") pod \"4215005b-54d9-41cc-a2d2-303f4ae363a6\" (UID: \"4215005b-54d9-41cc-a2d2-303f4ae363a6\") " Mar 13 01:38:43.180777 master-0 kubenswrapper[19170]: I0313 01:38:43.178816 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4215005b-54d9-41cc-a2d2-303f4ae363a6-bundle" (OuterVolumeSpecName: "bundle") pod "4215005b-54d9-41cc-a2d2-303f4ae363a6" (UID: "4215005b-54d9-41cc-a2d2-303f4ae363a6"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 01:38:43.186890 master-0 kubenswrapper[19170]: I0313 01:38:43.186821 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4215005b-54d9-41cc-a2d2-303f4ae363a6-kube-api-access-lbjnj" (OuterVolumeSpecName: "kube-api-access-lbjnj") pod "4215005b-54d9-41cc-a2d2-303f4ae363a6" (UID: "4215005b-54d9-41cc-a2d2-303f4ae363a6"). InnerVolumeSpecName "kube-api-access-lbjnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:38:43.198482 master-0 kubenswrapper[19170]: I0313 01:38:43.198319 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4215005b-54d9-41cc-a2d2-303f4ae363a6-util" (OuterVolumeSpecName: "util") pod "4215005b-54d9-41cc-a2d2-303f4ae363a6" (UID: "4215005b-54d9-41cc-a2d2-303f4ae363a6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 01:38:43.280302 master-0 kubenswrapper[19170]: I0313 01:38:43.280215 19170 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4215005b-54d9-41cc-a2d2-303f4ae363a6-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:38:43.280302 master-0 kubenswrapper[19170]: I0313 01:38:43.280268 19170 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4215005b-54d9-41cc-a2d2-303f4ae363a6-util\") on node \"master-0\" DevicePath \"\"" Mar 13 01:38:43.280302 master-0 kubenswrapper[19170]: I0313 01:38:43.280283 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbjnj\" (UniqueName: \"kubernetes.io/projected/4215005b-54d9-41cc-a2d2-303f4ae363a6-kube-api-access-lbjnj\") on node \"master-0\" DevicePath \"\"" Mar 13 01:38:43.624702 master-0 kubenswrapper[19170]: I0313 01:38:43.624558 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4m8jl9" event={"ID":"4215005b-54d9-41cc-a2d2-303f4ae363a6","Type":"ContainerDied","Data":"ea9bb84f19f8f3ef1488c7658f2efc3bdc0089a86aa9d9a5056387aca9511112"} Mar 13 01:38:43.624702 master-0 kubenswrapper[19170]: I0313 01:38:43.624675 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4m8jl9" Mar 13 01:38:43.625083 master-0 kubenswrapper[19170]: I0313 01:38:43.624693 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea9bb84f19f8f3ef1488c7658f2efc3bdc0089a86aa9d9a5056387aca9511112" Mar 13 01:38:50.187788 master-0 kubenswrapper[19170]: I0313 01:38:50.187473 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-storage/lvms-operator-565567cb8b-9th62"] Mar 13 01:38:50.187788 master-0 kubenswrapper[19170]: E0313 01:38:50.187743 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4215005b-54d9-41cc-a2d2-303f4ae363a6" containerName="extract" Mar 13 01:38:50.187788 master-0 kubenswrapper[19170]: I0313 01:38:50.187756 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="4215005b-54d9-41cc-a2d2-303f4ae363a6" containerName="extract" Mar 13 01:38:50.187788 master-0 kubenswrapper[19170]: E0313 01:38:50.187768 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4215005b-54d9-41cc-a2d2-303f4ae363a6" containerName="pull" Mar 13 01:38:50.187788 master-0 kubenswrapper[19170]: I0313 01:38:50.187774 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="4215005b-54d9-41cc-a2d2-303f4ae363a6" containerName="pull" Mar 13 01:38:50.187788 master-0 kubenswrapper[19170]: E0313 01:38:50.187803 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4215005b-54d9-41cc-a2d2-303f4ae363a6" containerName="util" Mar 13 01:38:50.187788 master-0 kubenswrapper[19170]: I0313 
01:38:50.187809 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="4215005b-54d9-41cc-a2d2-303f4ae363a6" containerName="util" Mar 13 01:38:50.188706 master-0 kubenswrapper[19170]: I0313 01:38:50.187920 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="4215005b-54d9-41cc-a2d2-303f4ae363a6" containerName="extract" Mar 13 01:38:50.188706 master-0 kubenswrapper[19170]: I0313 01:38:50.188350 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-storage/lvms-operator-565567cb8b-9th62" Mar 13 01:38:50.191169 master-0 kubenswrapper[19170]: I0313 01:38:50.191137 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-service-cert" Mar 13 01:38:50.191369 master-0 kubenswrapper[19170]: I0313 01:38:50.191342 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-webhook-server-cert" Mar 13 01:38:50.191547 master-0 kubenswrapper[19170]: I0313 01:38:50.191521 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-metrics-cert" Mar 13 01:38:50.191701 master-0 kubenswrapper[19170]: I0313 01:38:50.191674 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-storage"/"openshift-service-ca.crt" Mar 13 01:38:50.191795 master-0 kubenswrapper[19170]: I0313 01:38:50.191776 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-storage"/"kube-root-ca.crt" Mar 13 01:38:50.197784 master-0 kubenswrapper[19170]: I0313 01:38:50.196549 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr59b\" (UniqueName: \"kubernetes.io/projected/2ee7bba9-b98a-40dc-81bf-b49c223bb04d-kube-api-access-rr59b\") pod \"lvms-operator-565567cb8b-9th62\" (UID: \"2ee7bba9-b98a-40dc-81bf-b49c223bb04d\") " pod="openshift-storage/lvms-operator-565567cb8b-9th62" Mar 13 01:38:50.197784 master-0 
kubenswrapper[19170]: I0313 01:38:50.196602 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/2ee7bba9-b98a-40dc-81bf-b49c223bb04d-socket-dir\") pod \"lvms-operator-565567cb8b-9th62\" (UID: \"2ee7bba9-b98a-40dc-81bf-b49c223bb04d\") " pod="openshift-storage/lvms-operator-565567cb8b-9th62" Mar 13 01:38:50.197784 master-0 kubenswrapper[19170]: I0313 01:38:50.196623 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2ee7bba9-b98a-40dc-81bf-b49c223bb04d-apiservice-cert\") pod \"lvms-operator-565567cb8b-9th62\" (UID: \"2ee7bba9-b98a-40dc-81bf-b49c223bb04d\") " pod="openshift-storage/lvms-operator-565567cb8b-9th62" Mar 13 01:38:50.197784 master-0 kubenswrapper[19170]: I0313 01:38:50.196681 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2ee7bba9-b98a-40dc-81bf-b49c223bb04d-webhook-cert\") pod \"lvms-operator-565567cb8b-9th62\" (UID: \"2ee7bba9-b98a-40dc-81bf-b49c223bb04d\") " pod="openshift-storage/lvms-operator-565567cb8b-9th62" Mar 13 01:38:50.197784 master-0 kubenswrapper[19170]: I0313 01:38:50.196703 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/2ee7bba9-b98a-40dc-81bf-b49c223bb04d-metrics-cert\") pod \"lvms-operator-565567cb8b-9th62\" (UID: \"2ee7bba9-b98a-40dc-81bf-b49c223bb04d\") " pod="openshift-storage/lvms-operator-565567cb8b-9th62" Mar 13 01:38:50.218521 master-0 kubenswrapper[19170]: I0313 01:38:50.218471 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/lvms-operator-565567cb8b-9th62"] Mar 13 01:38:50.298452 master-0 kubenswrapper[19170]: I0313 01:38:50.298396 19170 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/2ee7bba9-b98a-40dc-81bf-b49c223bb04d-socket-dir\") pod \"lvms-operator-565567cb8b-9th62\" (UID: \"2ee7bba9-b98a-40dc-81bf-b49c223bb04d\") " pod="openshift-storage/lvms-operator-565567cb8b-9th62" Mar 13 01:38:50.298452 master-0 kubenswrapper[19170]: I0313 01:38:50.298445 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2ee7bba9-b98a-40dc-81bf-b49c223bb04d-apiservice-cert\") pod \"lvms-operator-565567cb8b-9th62\" (UID: \"2ee7bba9-b98a-40dc-81bf-b49c223bb04d\") " pod="openshift-storage/lvms-operator-565567cb8b-9th62" Mar 13 01:38:50.298699 master-0 kubenswrapper[19170]: I0313 01:38:50.298497 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2ee7bba9-b98a-40dc-81bf-b49c223bb04d-webhook-cert\") pod \"lvms-operator-565567cb8b-9th62\" (UID: \"2ee7bba9-b98a-40dc-81bf-b49c223bb04d\") " pod="openshift-storage/lvms-operator-565567cb8b-9th62" Mar 13 01:38:50.298699 master-0 kubenswrapper[19170]: I0313 01:38:50.298523 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/2ee7bba9-b98a-40dc-81bf-b49c223bb04d-metrics-cert\") pod \"lvms-operator-565567cb8b-9th62\" (UID: \"2ee7bba9-b98a-40dc-81bf-b49c223bb04d\") " pod="openshift-storage/lvms-operator-565567cb8b-9th62" Mar 13 01:38:50.298977 master-0 kubenswrapper[19170]: I0313 01:38:50.298764 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr59b\" (UniqueName: \"kubernetes.io/projected/2ee7bba9-b98a-40dc-81bf-b49c223bb04d-kube-api-access-rr59b\") pod \"lvms-operator-565567cb8b-9th62\" (UID: \"2ee7bba9-b98a-40dc-81bf-b49c223bb04d\") " pod="openshift-storage/lvms-operator-565567cb8b-9th62" Mar 13 01:38:50.300246 master-0 
kubenswrapper[19170]: I0313 01:38:50.299793 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/2ee7bba9-b98a-40dc-81bf-b49c223bb04d-socket-dir\") pod \"lvms-operator-565567cb8b-9th62\" (UID: \"2ee7bba9-b98a-40dc-81bf-b49c223bb04d\") " pod="openshift-storage/lvms-operator-565567cb8b-9th62" Mar 13 01:38:50.302448 master-0 kubenswrapper[19170]: I0313 01:38:50.302413 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2ee7bba9-b98a-40dc-81bf-b49c223bb04d-webhook-cert\") pod \"lvms-operator-565567cb8b-9th62\" (UID: \"2ee7bba9-b98a-40dc-81bf-b49c223bb04d\") " pod="openshift-storage/lvms-operator-565567cb8b-9th62" Mar 13 01:38:50.305167 master-0 kubenswrapper[19170]: I0313 01:38:50.303821 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/2ee7bba9-b98a-40dc-81bf-b49c223bb04d-metrics-cert\") pod \"lvms-operator-565567cb8b-9th62\" (UID: \"2ee7bba9-b98a-40dc-81bf-b49c223bb04d\") " pod="openshift-storage/lvms-operator-565567cb8b-9th62" Mar 13 01:38:50.305167 master-0 kubenswrapper[19170]: I0313 01:38:50.304033 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2ee7bba9-b98a-40dc-81bf-b49c223bb04d-apiservice-cert\") pod \"lvms-operator-565567cb8b-9th62\" (UID: \"2ee7bba9-b98a-40dc-81bf-b49c223bb04d\") " pod="openshift-storage/lvms-operator-565567cb8b-9th62" Mar 13 01:38:50.321131 master-0 kubenswrapper[19170]: I0313 01:38:50.321091 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr59b\" (UniqueName: \"kubernetes.io/projected/2ee7bba9-b98a-40dc-81bf-b49c223bb04d-kube-api-access-rr59b\") pod \"lvms-operator-565567cb8b-9th62\" (UID: \"2ee7bba9-b98a-40dc-81bf-b49c223bb04d\") " pod="openshift-storage/lvms-operator-565567cb8b-9th62" Mar 
13 01:38:50.517471 master-0 kubenswrapper[19170]: I0313 01:38:50.517341 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-storage/lvms-operator-565567cb8b-9th62" Mar 13 01:38:50.994693 master-0 kubenswrapper[19170]: I0313 01:38:50.988740 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/lvms-operator-565567cb8b-9th62"] Mar 13 01:38:50.994693 master-0 kubenswrapper[19170]: W0313 01:38:50.990119 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ee7bba9_b98a_40dc_81bf_b49c223bb04d.slice/crio-2859bb62fd7141df7c7a4b2b3d6a54fa54e054e3177fa3db8abafb12c2e16b27 WatchSource:0}: Error finding container 2859bb62fd7141df7c7a4b2b3d6a54fa54e054e3177fa3db8abafb12c2e16b27: Status 404 returned error can't find the container with id 2859bb62fd7141df7c7a4b2b3d6a54fa54e054e3177fa3db8abafb12c2e16b27 Mar 13 01:38:51.687764 master-0 kubenswrapper[19170]: I0313 01:38:51.687701 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/lvms-operator-565567cb8b-9th62" event={"ID":"2ee7bba9-b98a-40dc-81bf-b49c223bb04d","Type":"ContainerStarted","Data":"2859bb62fd7141df7c7a4b2b3d6a54fa54e054e3177fa3db8abafb12c2e16b27"} Mar 13 01:38:56.730660 master-0 kubenswrapper[19170]: I0313 01:38:56.730578 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/lvms-operator-565567cb8b-9th62" event={"ID":"2ee7bba9-b98a-40dc-81bf-b49c223bb04d","Type":"ContainerStarted","Data":"cda1f7122eaad1c2d73b0b5cb024333b2f047767840f42ec86f39b6231ef28a2"} Mar 13 01:38:56.731173 master-0 kubenswrapper[19170]: I0313 01:38:56.730951 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-storage/lvms-operator-565567cb8b-9th62" Mar 13 01:38:56.737098 master-0 kubenswrapper[19170]: I0313 01:38:56.737047 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-storage/lvms-operator-565567cb8b-9th62" Mar 13 01:38:56.787915 master-0 kubenswrapper[19170]: I0313 01:38:56.787558 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-storage/lvms-operator-565567cb8b-9th62" podStartSLOduration=2.176783802 podStartE2EDuration="6.787525828s" podCreationTimestamp="2026-03-13 01:38:50 +0000 UTC" firstStartedPulling="2026-03-13 01:38:50.993228174 +0000 UTC m=+1191.801349174" lastFinishedPulling="2026-03-13 01:38:55.60397021 +0000 UTC m=+1196.412091200" observedRunningTime="2026-03-13 01:38:56.763881334 +0000 UTC m=+1197.572002324" watchObservedRunningTime="2026-03-13 01:38:56.787525828 +0000 UTC m=+1197.595646808" Mar 13 01:39:00.727401 master-0 kubenswrapper[19170]: I0313 01:39:00.727259 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5z78gl"] Mar 13 01:39:00.730147 master-0 kubenswrapper[19170]: I0313 01:39:00.730102 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5z78gl" Mar 13 01:39:00.752382 master-0 kubenswrapper[19170]: I0313 01:39:00.752324 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5z78gl"] Mar 13 01:39:00.823471 master-0 kubenswrapper[19170]: I0313 01:39:00.823371 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c0ac249-cb25-4ef4-bcf5-2f885b8526d8-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5z78gl\" (UID: \"5c0ac249-cb25-4ef4-bcf5-2f885b8526d8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5z78gl" Mar 13 01:39:00.823697 master-0 kubenswrapper[19170]: I0313 01:39:00.823590 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c0ac249-cb25-4ef4-bcf5-2f885b8526d8-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5z78gl\" (UID: \"5c0ac249-cb25-4ef4-bcf5-2f885b8526d8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5z78gl" Mar 13 01:39:00.823858 master-0 kubenswrapper[19170]: I0313 01:39:00.823707 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5zzf\" (UniqueName: \"kubernetes.io/projected/5c0ac249-cb25-4ef4-bcf5-2f885b8526d8-kube-api-access-r5zzf\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5z78gl\" (UID: \"5c0ac249-cb25-4ef4-bcf5-2f885b8526d8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5z78gl" Mar 13 01:39:00.929481 master-0 kubenswrapper[19170]: I0313 01:39:00.928044 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/5c0ac249-cb25-4ef4-bcf5-2f885b8526d8-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5z78gl\" (UID: \"5c0ac249-cb25-4ef4-bcf5-2f885b8526d8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5z78gl" Mar 13 01:39:00.929481 master-0 kubenswrapper[19170]: I0313 01:39:00.928380 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5zzf\" (UniqueName: \"kubernetes.io/projected/5c0ac249-cb25-4ef4-bcf5-2f885b8526d8-kube-api-access-r5zzf\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5z78gl\" (UID: \"5c0ac249-cb25-4ef4-bcf5-2f885b8526d8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5z78gl" Mar 13 01:39:00.929481 master-0 kubenswrapper[19170]: I0313 01:39:00.928617 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c0ac249-cb25-4ef4-bcf5-2f885b8526d8-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5z78gl\" (UID: \"5c0ac249-cb25-4ef4-bcf5-2f885b8526d8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5z78gl" Mar 13 01:39:00.929481 master-0 kubenswrapper[19170]: I0313 01:39:00.928940 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c0ac249-cb25-4ef4-bcf5-2f885b8526d8-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5z78gl\" (UID: \"5c0ac249-cb25-4ef4-bcf5-2f885b8526d8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5z78gl" Mar 13 01:39:00.929481 master-0 kubenswrapper[19170]: I0313 01:39:00.929312 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c0ac249-cb25-4ef4-bcf5-2f885b8526d8-bundle\") pod 
\"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5z78gl\" (UID: \"5c0ac249-cb25-4ef4-bcf5-2f885b8526d8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5z78gl" Mar 13 01:39:00.951674 master-0 kubenswrapper[19170]: I0313 01:39:00.951616 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5zzf\" (UniqueName: \"kubernetes.io/projected/5c0ac249-cb25-4ef4-bcf5-2f885b8526d8-kube-api-access-r5zzf\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5z78gl\" (UID: \"5c0ac249-cb25-4ef4-bcf5-2f885b8526d8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5z78gl" Mar 13 01:39:01.065310 master-0 kubenswrapper[19170]: I0313 01:39:01.065201 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5z78gl" Mar 13 01:39:01.072612 master-0 kubenswrapper[19170]: I0313 01:39:01.072555 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1cfrqd"] Mar 13 01:39:01.074950 master-0 kubenswrapper[19170]: I0313 01:39:01.074913 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1cfrqd" Mar 13 01:39:01.087218 master-0 kubenswrapper[19170]: I0313 01:39:01.087108 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1cfrqd"] Mar 13 01:39:01.134828 master-0 kubenswrapper[19170]: I0313 01:39:01.134660 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c4wr\" (UniqueName: \"kubernetes.io/projected/234c6c99-6fdd-4d9e-ab08-b510a0ec4097-kube-api-access-5c4wr\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1cfrqd\" (UID: \"234c6c99-6fdd-4d9e-ab08-b510a0ec4097\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1cfrqd" Mar 13 01:39:01.135033 master-0 kubenswrapper[19170]: I0313 01:39:01.134848 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/234c6c99-6fdd-4d9e-ab08-b510a0ec4097-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1cfrqd\" (UID: \"234c6c99-6fdd-4d9e-ab08-b510a0ec4097\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1cfrqd" Mar 13 01:39:01.135096 master-0 kubenswrapper[19170]: I0313 01:39:01.135070 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/234c6c99-6fdd-4d9e-ab08-b510a0ec4097-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1cfrqd\" (UID: \"234c6c99-6fdd-4d9e-ab08-b510a0ec4097\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1cfrqd" Mar 13 01:39:01.236522 master-0 kubenswrapper[19170]: I0313 01:39:01.236456 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-5c4wr\" (UniqueName: \"kubernetes.io/projected/234c6c99-6fdd-4d9e-ab08-b510a0ec4097-kube-api-access-5c4wr\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1cfrqd\" (UID: \"234c6c99-6fdd-4d9e-ab08-b510a0ec4097\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1cfrqd" Mar 13 01:39:01.236910 master-0 kubenswrapper[19170]: I0313 01:39:01.236740 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/234c6c99-6fdd-4d9e-ab08-b510a0ec4097-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1cfrqd\" (UID: \"234c6c99-6fdd-4d9e-ab08-b510a0ec4097\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1cfrqd" Mar 13 01:39:01.236910 master-0 kubenswrapper[19170]: I0313 01:39:01.236838 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/234c6c99-6fdd-4d9e-ab08-b510a0ec4097-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1cfrqd\" (UID: \"234c6c99-6fdd-4d9e-ab08-b510a0ec4097\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1cfrqd" Mar 13 01:39:01.237397 master-0 kubenswrapper[19170]: I0313 01:39:01.237351 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/234c6c99-6fdd-4d9e-ab08-b510a0ec4097-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1cfrqd\" (UID: \"234c6c99-6fdd-4d9e-ab08-b510a0ec4097\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1cfrqd" Mar 13 01:39:01.237671 master-0 kubenswrapper[19170]: I0313 01:39:01.237605 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/234c6c99-6fdd-4d9e-ab08-b510a0ec4097-bundle\") pod 
\"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1cfrqd\" (UID: \"234c6c99-6fdd-4d9e-ab08-b510a0ec4097\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1cfrqd"
Mar 13 01:39:01.256370 master-0 kubenswrapper[19170]: I0313 01:39:01.255204 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c4wr\" (UniqueName: \"kubernetes.io/projected/234c6c99-6fdd-4d9e-ab08-b510a0ec4097-kube-api-access-5c4wr\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1cfrqd\" (UID: \"234c6c99-6fdd-4d9e-ab08-b510a0ec4097\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1cfrqd"
Mar 13 01:39:01.511609 master-0 kubenswrapper[19170]: I0313 01:39:01.511537 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1cfrqd"
Mar 13 01:39:01.544226 master-0 kubenswrapper[19170]: I0313 01:39:01.544120 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5z78gl"]
Mar 13 01:39:01.553012 master-0 kubenswrapper[19170]: W0313 01:39:01.552953 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c0ac249_cb25_4ef4_bcf5_2f885b8526d8.slice/crio-423dba269d66680cdbbedbd2c1b974af2a4722f9dd8962976b7cf2628d361936 WatchSource:0}: Error finding container 423dba269d66680cdbbedbd2c1b974af2a4722f9dd8962976b7cf2628d361936: Status 404 returned error can't find the container with id 423dba269d66680cdbbedbd2c1b974af2a4722f9dd8962976b7cf2628d361936
Mar 13 01:39:01.782700 master-0 kubenswrapper[19170]: I0313 01:39:01.782647 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5z78gl" event={"ID":"5c0ac249-cb25-4ef4-bcf5-2f885b8526d8","Type":"ContainerStarted","Data":"7e1a1f520c24e2054cf87e1e42b9db02b3547e6a503a74cb42e5a447021eb417"}
Mar 13 01:39:01.782700 master-0 kubenswrapper[19170]: I0313 01:39:01.782696 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5z78gl" event={"ID":"5c0ac249-cb25-4ef4-bcf5-2f885b8526d8","Type":"ContainerStarted","Data":"423dba269d66680cdbbedbd2c1b974af2a4722f9dd8962976b7cf2628d361936"}
Mar 13 01:39:01.981248 master-0 kubenswrapper[19170]: I0313 01:39:01.981183 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1cfrqd"]
Mar 13 01:39:01.985328 master-0 kubenswrapper[19170]: W0313 01:39:01.985248 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod234c6c99_6fdd_4d9e_ab08_b510a0ec4097.slice/crio-f513df0658ebc6a404bc2156da91748c2c8612919b2d49150fecf48300ef2cfa WatchSource:0}: Error finding container f513df0658ebc6a404bc2156da91748c2c8612919b2d49150fecf48300ef2cfa: Status 404 returned error can't find the container with id f513df0658ebc6a404bc2156da91748c2c8612919b2d49150fecf48300ef2cfa
Mar 13 01:39:02.803286 master-0 kubenswrapper[19170]: I0313 01:39:02.803189 19170 generic.go:334] "Generic (PLEG): container finished" podID="234c6c99-6fdd-4d9e-ab08-b510a0ec4097" containerID="e30dfd39a861e747992dcf0683f81c2403f2f1271a08dac8a0cdff29338f8fa4" exitCode=0
Mar 13 01:39:02.804102 master-0 kubenswrapper[19170]: I0313 01:39:02.803273 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1cfrqd" event={"ID":"234c6c99-6fdd-4d9e-ab08-b510a0ec4097","Type":"ContainerDied","Data":"e30dfd39a861e747992dcf0683f81c2403f2f1271a08dac8a0cdff29338f8fa4"}
Mar 13 01:39:02.804102 master-0 kubenswrapper[19170]: I0313 01:39:02.803384 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1cfrqd" event={"ID":"234c6c99-6fdd-4d9e-ab08-b510a0ec4097","Type":"ContainerStarted","Data":"f513df0658ebc6a404bc2156da91748c2c8612919b2d49150fecf48300ef2cfa"}
Mar 13 01:39:02.810178 master-0 kubenswrapper[19170]: I0313 01:39:02.810110 19170 generic.go:334] "Generic (PLEG): container finished" podID="5c0ac249-cb25-4ef4-bcf5-2f885b8526d8" containerID="7e1a1f520c24e2054cf87e1e42b9db02b3547e6a503a74cb42e5a447021eb417" exitCode=0
Mar 13 01:39:02.810363 master-0 kubenswrapper[19170]: I0313 01:39:02.810179 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5z78gl" event={"ID":"5c0ac249-cb25-4ef4-bcf5-2f885b8526d8","Type":"ContainerDied","Data":"7e1a1f520c24e2054cf87e1e42b9db02b3547e6a503a74cb42e5a447021eb417"}
Mar 13 01:39:02.919972 master-0 kubenswrapper[19170]: I0313 01:39:02.919888 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874lskjr"]
Mar 13 01:39:02.922891 master-0 kubenswrapper[19170]: I0313 01:39:02.922621 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874lskjr"
Mar 13 01:39:02.935474 master-0 kubenswrapper[19170]: I0313 01:39:02.935354 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874lskjr"]
Mar 13 01:39:02.973582 master-0 kubenswrapper[19170]: I0313 01:39:02.973525 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kqdk\" (UniqueName: \"kubernetes.io/projected/7f2121dd-1b58-47d4-b301-1a3456776032-kube-api-access-5kqdk\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874lskjr\" (UID: \"7f2121dd-1b58-47d4-b301-1a3456776032\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874lskjr"
Mar 13 01:39:02.973992 master-0 kubenswrapper[19170]: I0313 01:39:02.973958 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7f2121dd-1b58-47d4-b301-1a3456776032-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874lskjr\" (UID: \"7f2121dd-1b58-47d4-b301-1a3456776032\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874lskjr"
Mar 13 01:39:02.974392 master-0 kubenswrapper[19170]: I0313 01:39:02.974363 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7f2121dd-1b58-47d4-b301-1a3456776032-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874lskjr\" (UID: \"7f2121dd-1b58-47d4-b301-1a3456776032\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874lskjr"
Mar 13 01:39:03.076484 master-0 kubenswrapper[19170]: I0313 01:39:03.076347 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kqdk\" (UniqueName: \"kubernetes.io/projected/7f2121dd-1b58-47d4-b301-1a3456776032-kube-api-access-5kqdk\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874lskjr\" (UID: \"7f2121dd-1b58-47d4-b301-1a3456776032\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874lskjr"
Mar 13 01:39:03.076925 master-0 kubenswrapper[19170]: I0313 01:39:03.076888 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7f2121dd-1b58-47d4-b301-1a3456776032-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874lskjr\" (UID: \"7f2121dd-1b58-47d4-b301-1a3456776032\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874lskjr"
Mar 13 01:39:03.077246 master-0 kubenswrapper[19170]: I0313 01:39:03.077217 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7f2121dd-1b58-47d4-b301-1a3456776032-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874lskjr\" (UID: \"7f2121dd-1b58-47d4-b301-1a3456776032\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874lskjr"
Mar 13 01:39:03.077891 master-0 kubenswrapper[19170]: I0313 01:39:03.077824 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7f2121dd-1b58-47d4-b301-1a3456776032-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874lskjr\" (UID: \"7f2121dd-1b58-47d4-b301-1a3456776032\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874lskjr"
Mar 13 01:39:03.077891 master-0 kubenswrapper[19170]: I0313 01:39:03.077880 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7f2121dd-1b58-47d4-b301-1a3456776032-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874lskjr\" (UID: \"7f2121dd-1b58-47d4-b301-1a3456776032\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874lskjr"
Mar 13 01:39:03.104616 master-0 kubenswrapper[19170]: I0313 01:39:03.104534 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kqdk\" (UniqueName: \"kubernetes.io/projected/7f2121dd-1b58-47d4-b301-1a3456776032-kube-api-access-5kqdk\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874lskjr\" (UID: \"7f2121dd-1b58-47d4-b301-1a3456776032\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874lskjr"
Mar 13 01:39:03.251339 master-0 kubenswrapper[19170]: I0313 01:39:03.251304 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874lskjr"
Mar 13 01:39:03.749009 master-0 kubenswrapper[19170]: I0313 01:39:03.748953 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874lskjr"]
Mar 13 01:39:03.751923 master-0 kubenswrapper[19170]: W0313 01:39:03.751861 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f2121dd_1b58_47d4_b301_1a3456776032.slice/crio-aa94af9db497f26b84b465237d517da92dc669ec4bce497f75057f3bb0a375f1 WatchSource:0}: Error finding container aa94af9db497f26b84b465237d517da92dc669ec4bce497f75057f3bb0a375f1: Status 404 returned error can't find the container with id aa94af9db497f26b84b465237d517da92dc669ec4bce497f75057f3bb0a375f1
Mar 13 01:39:03.828168 master-0 kubenswrapper[19170]: I0313 01:39:03.828029 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874lskjr" event={"ID":"7f2121dd-1b58-47d4-b301-1a3456776032","Type":"ContainerStarted","Data":"aa94af9db497f26b84b465237d517da92dc669ec4bce497f75057f3bb0a375f1"}
Mar 13 01:39:04.836762 master-0 kubenswrapper[19170]: I0313 01:39:04.836408 19170 generic.go:334] "Generic (PLEG): container finished" podID="7f2121dd-1b58-47d4-b301-1a3456776032" containerID="fc9ad94d94d6602c49fac5011f2ade0ec83c9451f187253911c8b6f301e333c8" exitCode=0
Mar 13 01:39:04.836762 master-0 kubenswrapper[19170]: I0313 01:39:04.836514 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874lskjr" event={"ID":"7f2121dd-1b58-47d4-b301-1a3456776032","Type":"ContainerDied","Data":"fc9ad94d94d6602c49fac5011f2ade0ec83c9451f187253911c8b6f301e333c8"}
Mar 13 01:39:04.839163 master-0 kubenswrapper[19170]: I0313 01:39:04.839102 19170 generic.go:334] "Generic (PLEG): container finished" podID="234c6c99-6fdd-4d9e-ab08-b510a0ec4097" containerID="8e5851841116af78b1502fd30163a5b823bce66cbe6d97bca9936318fe27520f" exitCode=0
Mar 13 01:39:04.839163 master-0 kubenswrapper[19170]: I0313 01:39:04.839151 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1cfrqd" event={"ID":"234c6c99-6fdd-4d9e-ab08-b510a0ec4097","Type":"ContainerDied","Data":"8e5851841116af78b1502fd30163a5b823bce66cbe6d97bca9936318fe27520f"}
Mar 13 01:39:07.876398 master-0 kubenswrapper[19170]: I0313 01:39:07.876294 19170 generic.go:334] "Generic (PLEG): container finished" podID="5c0ac249-cb25-4ef4-bcf5-2f885b8526d8" containerID="44aa0012d753ec84356de3edc7ce92dd67743323ecc82def9bb391ed7a0038ae" exitCode=0
Mar 13 01:39:07.876398 master-0 kubenswrapper[19170]: I0313 01:39:07.876353 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5z78gl" event={"ID":"5c0ac249-cb25-4ef4-bcf5-2f885b8526d8","Type":"ContainerDied","Data":"44aa0012d753ec84356de3edc7ce92dd67743323ecc82def9bb391ed7a0038ae"}
Mar 13 01:39:07.881960 master-0 kubenswrapper[19170]: I0313 01:39:07.881887 19170 generic.go:334] "Generic (PLEG): container finished" podID="234c6c99-6fdd-4d9e-ab08-b510a0ec4097" containerID="52d04a0fcd9bef60fa1a6da0feb00e12ef788ffd138b87a99aa1e5a886b54cf1" exitCode=0
Mar 13 01:39:07.882194 master-0 kubenswrapper[19170]: I0313 01:39:07.882096 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1cfrqd" event={"ID":"234c6c99-6fdd-4d9e-ab08-b510a0ec4097","Type":"ContainerDied","Data":"52d04a0fcd9bef60fa1a6da0feb00e12ef788ffd138b87a99aa1e5a886b54cf1"}
Mar 13 01:39:07.886270 master-0 kubenswrapper[19170]: I0313 01:39:07.886210 19170 generic.go:334] "Generic (PLEG): container finished" podID="7f2121dd-1b58-47d4-b301-1a3456776032" containerID="102d30225a52a347126421ce5878d851a01e1593fca4cf495bda37befc798ee1" exitCode=0
Mar 13 01:39:07.886876 master-0 kubenswrapper[19170]: I0313 01:39:07.886273 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874lskjr" event={"ID":"7f2121dd-1b58-47d4-b301-1a3456776032","Type":"ContainerDied","Data":"102d30225a52a347126421ce5878d851a01e1593fca4cf495bda37befc798ee1"}
Mar 13 01:39:08.899333 master-0 kubenswrapper[19170]: I0313 01:39:08.899265 19170 generic.go:334] "Generic (PLEG): container finished" podID="7f2121dd-1b58-47d4-b301-1a3456776032" containerID="91313feea6ae5d1cfd3dd933dfc950cfe370faf2b3b41b122d22a4d1df8e27f7" exitCode=0
Mar 13 01:39:08.899333 master-0 kubenswrapper[19170]: I0313 01:39:08.899340 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874lskjr" event={"ID":"7f2121dd-1b58-47d4-b301-1a3456776032","Type":"ContainerDied","Data":"91313feea6ae5d1cfd3dd933dfc950cfe370faf2b3b41b122d22a4d1df8e27f7"}
Mar 13 01:39:08.903332 master-0 kubenswrapper[19170]: I0313 01:39:08.903264 19170 generic.go:334] "Generic (PLEG): container finished" podID="5c0ac249-cb25-4ef4-bcf5-2f885b8526d8" containerID="16ebc787d710675b2a540745025ce3902abcfbe4ef1799e39c36d127eb7c74ce" exitCode=0
Mar 13 01:39:08.903415 master-0 kubenswrapper[19170]: I0313 01:39:08.903324 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5z78gl" event={"ID":"5c0ac249-cb25-4ef4-bcf5-2f885b8526d8","Type":"ContainerDied","Data":"16ebc787d710675b2a540745025ce3902abcfbe4ef1799e39c36d127eb7c74ce"}
Mar 13 01:39:09.305039 master-0 kubenswrapper[19170]: I0313 01:39:09.304677 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1cfrqd"
Mar 13 01:39:09.406707 master-0 kubenswrapper[19170]: I0313 01:39:09.394187 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/234c6c99-6fdd-4d9e-ab08-b510a0ec4097-util\") pod \"234c6c99-6fdd-4d9e-ab08-b510a0ec4097\" (UID: \"234c6c99-6fdd-4d9e-ab08-b510a0ec4097\") "
Mar 13 01:39:09.406707 master-0 kubenswrapper[19170]: I0313 01:39:09.394261 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/234c6c99-6fdd-4d9e-ab08-b510a0ec4097-bundle\") pod \"234c6c99-6fdd-4d9e-ab08-b510a0ec4097\" (UID: \"234c6c99-6fdd-4d9e-ab08-b510a0ec4097\") "
Mar 13 01:39:09.406707 master-0 kubenswrapper[19170]: I0313 01:39:09.394441 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5c4wr\" (UniqueName: \"kubernetes.io/projected/234c6c99-6fdd-4d9e-ab08-b510a0ec4097-kube-api-access-5c4wr\") pod \"234c6c99-6fdd-4d9e-ab08-b510a0ec4097\" (UID: \"234c6c99-6fdd-4d9e-ab08-b510a0ec4097\") "
Mar 13 01:39:09.406707 master-0 kubenswrapper[19170]: I0313 01:39:09.398329 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/234c6c99-6fdd-4d9e-ab08-b510a0ec4097-kube-api-access-5c4wr" (OuterVolumeSpecName: "kube-api-access-5c4wr") pod "234c6c99-6fdd-4d9e-ab08-b510a0ec4097" (UID: "234c6c99-6fdd-4d9e-ab08-b510a0ec4097"). InnerVolumeSpecName "kube-api-access-5c4wr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 01:39:09.406707 master-0 kubenswrapper[19170]: I0313 01:39:09.399169 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/234c6c99-6fdd-4d9e-ab08-b510a0ec4097-bundle" (OuterVolumeSpecName: "bundle") pod "234c6c99-6fdd-4d9e-ab08-b510a0ec4097" (UID: "234c6c99-6fdd-4d9e-ab08-b510a0ec4097"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 01:39:09.420922 master-0 kubenswrapper[19170]: I0313 01:39:09.415719 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/234c6c99-6fdd-4d9e-ab08-b510a0ec4097-util" (OuterVolumeSpecName: "util") pod "234c6c99-6fdd-4d9e-ab08-b510a0ec4097" (UID: "234c6c99-6fdd-4d9e-ab08-b510a0ec4097"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 01:39:09.481131 master-0 kubenswrapper[19170]: I0313 01:39:09.480754 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0877l2b"]
Mar 13 01:39:09.481609 master-0 kubenswrapper[19170]: E0313 01:39:09.481563 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="234c6c99-6fdd-4d9e-ab08-b510a0ec4097" containerName="util"
Mar 13 01:39:09.481609 master-0 kubenswrapper[19170]: I0313 01:39:09.481589 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="234c6c99-6fdd-4d9e-ab08-b510a0ec4097" containerName="util"
Mar 13 01:39:09.481609 master-0 kubenswrapper[19170]: E0313 01:39:09.481623 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="234c6c99-6fdd-4d9e-ab08-b510a0ec4097" containerName="extract"
Mar 13 01:39:09.481609 master-0 kubenswrapper[19170]: I0313 01:39:09.481647 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="234c6c99-6fdd-4d9e-ab08-b510a0ec4097" containerName="extract"
Mar 13 01:39:09.482084 master-0 kubenswrapper[19170]: E0313 01:39:09.481668 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="234c6c99-6fdd-4d9e-ab08-b510a0ec4097" containerName="pull"
Mar 13 01:39:09.482084 master-0 kubenswrapper[19170]: I0313 01:39:09.481678 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="234c6c99-6fdd-4d9e-ab08-b510a0ec4097" containerName="pull"
Mar 13 01:39:09.482332 master-0 kubenswrapper[19170]: I0313 01:39:09.482168 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="234c6c99-6fdd-4d9e-ab08-b510a0ec4097" containerName="extract"
Mar 13 01:39:09.487452 master-0 kubenswrapper[19170]: I0313 01:39:09.487404 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0877l2b"
Mar 13 01:39:09.490053 master-0 kubenswrapper[19170]: I0313 01:39:09.490019 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0877l2b"]
Mar 13 01:39:09.498131 master-0 kubenswrapper[19170]: I0313 01:39:09.498087 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5c4wr\" (UniqueName: \"kubernetes.io/projected/234c6c99-6fdd-4d9e-ab08-b510a0ec4097-kube-api-access-5c4wr\") on node \"master-0\" DevicePath \"\""
Mar 13 01:39:09.498131 master-0 kubenswrapper[19170]: I0313 01:39:09.498134 19170 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/234c6c99-6fdd-4d9e-ab08-b510a0ec4097-util\") on node \"master-0\" DevicePath \"\""
Mar 13 01:39:09.498298 master-0 kubenswrapper[19170]: I0313 01:39:09.498149 19170 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/234c6c99-6fdd-4d9e-ab08-b510a0ec4097-bundle\") on node \"master-0\" DevicePath \"\""
Mar 13 01:39:09.599940 master-0 kubenswrapper[19170]: I0313 01:39:09.599811 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/64ad250c-8ccc-42e7-935e-203a944b07b1-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0877l2b\" (UID: \"64ad250c-8ccc-42e7-935e-203a944b07b1\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0877l2b"
Mar 13 01:39:09.600228 master-0 kubenswrapper[19170]: I0313 01:39:09.600208 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/64ad250c-8ccc-42e7-935e-203a944b07b1-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0877l2b\" (UID: \"64ad250c-8ccc-42e7-935e-203a944b07b1\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0877l2b"
Mar 13 01:39:09.600419 master-0 kubenswrapper[19170]: I0313 01:39:09.600400 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxbmq\" (UniqueName: \"kubernetes.io/projected/64ad250c-8ccc-42e7-935e-203a944b07b1-kube-api-access-lxbmq\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0877l2b\" (UID: \"64ad250c-8ccc-42e7-935e-203a944b07b1\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0877l2b"
Mar 13 01:39:09.702366 master-0 kubenswrapper[19170]: I0313 01:39:09.702311 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/64ad250c-8ccc-42e7-935e-203a944b07b1-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0877l2b\" (UID: \"64ad250c-8ccc-42e7-935e-203a944b07b1\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0877l2b"
Mar 13 01:39:09.702684 master-0 kubenswrapper[19170]: I0313 01:39:09.702660 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/64ad250c-8ccc-42e7-935e-203a944b07b1-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0877l2b\" (UID: \"64ad250c-8ccc-42e7-935e-203a944b07b1\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0877l2b"
Mar 13 01:39:09.702816 master-0 kubenswrapper[19170]: I0313 01:39:09.702801 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxbmq\" (UniqueName: \"kubernetes.io/projected/64ad250c-8ccc-42e7-935e-203a944b07b1-kube-api-access-lxbmq\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0877l2b\" (UID: \"64ad250c-8ccc-42e7-935e-203a944b07b1\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0877l2b"
Mar 13 01:39:09.703554 master-0 kubenswrapper[19170]: I0313 01:39:09.703475 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/64ad250c-8ccc-42e7-935e-203a944b07b1-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0877l2b\" (UID: \"64ad250c-8ccc-42e7-935e-203a944b07b1\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0877l2b"
Mar 13 01:39:09.703885 master-0 kubenswrapper[19170]: I0313 01:39:09.703819 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/64ad250c-8ccc-42e7-935e-203a944b07b1-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0877l2b\" (UID: \"64ad250c-8ccc-42e7-935e-203a944b07b1\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0877l2b"
Mar 13 01:39:09.733465 master-0 kubenswrapper[19170]: I0313 01:39:09.733397 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxbmq\" (UniqueName: \"kubernetes.io/projected/64ad250c-8ccc-42e7-935e-203a944b07b1-kube-api-access-lxbmq\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0877l2b\" (UID: \"64ad250c-8ccc-42e7-935e-203a944b07b1\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0877l2b"
Mar 13 01:39:09.815671 master-0 kubenswrapper[19170]: I0313 01:39:09.815543 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0877l2b"
Mar 13 01:39:09.919399 master-0 kubenswrapper[19170]: I0313 01:39:09.919349 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1cfrqd"
Mar 13 01:39:09.920017 master-0 kubenswrapper[19170]: I0313 01:39:09.919985 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1cfrqd" event={"ID":"234c6c99-6fdd-4d9e-ab08-b510a0ec4097","Type":"ContainerDied","Data":"f513df0658ebc6a404bc2156da91748c2c8612919b2d49150fecf48300ef2cfa"}
Mar 13 01:39:09.920017 master-0 kubenswrapper[19170]: I0313 01:39:09.920010 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f513df0658ebc6a404bc2156da91748c2c8612919b2d49150fecf48300ef2cfa"
Mar 13 01:39:10.341117 master-0 kubenswrapper[19170]: I0313 01:39:10.340367 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0877l2b"]
Mar 13 01:39:10.483444 master-0 kubenswrapper[19170]: I0313 01:39:10.483391 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874lskjr"
Mar 13 01:39:10.490859 master-0 kubenswrapper[19170]: I0313 01:39:10.490828 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5z78gl"
Mar 13 01:39:10.516440 master-0 kubenswrapper[19170]: I0313 01:39:10.516390 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5zzf\" (UniqueName: \"kubernetes.io/projected/5c0ac249-cb25-4ef4-bcf5-2f885b8526d8-kube-api-access-r5zzf\") pod \"5c0ac249-cb25-4ef4-bcf5-2f885b8526d8\" (UID: \"5c0ac249-cb25-4ef4-bcf5-2f885b8526d8\") "
Mar 13 01:39:10.516643 master-0 kubenswrapper[19170]: I0313 01:39:10.516486 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kqdk\" (UniqueName: \"kubernetes.io/projected/7f2121dd-1b58-47d4-b301-1a3456776032-kube-api-access-5kqdk\") pod \"7f2121dd-1b58-47d4-b301-1a3456776032\" (UID: \"7f2121dd-1b58-47d4-b301-1a3456776032\") "
Mar 13 01:39:10.516643 master-0 kubenswrapper[19170]: I0313 01:39:10.516598 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7f2121dd-1b58-47d4-b301-1a3456776032-util\") pod \"7f2121dd-1b58-47d4-b301-1a3456776032\" (UID: \"7f2121dd-1b58-47d4-b301-1a3456776032\") "
Mar 13 01:39:10.516762 master-0 kubenswrapper[19170]: I0313 01:39:10.516648 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c0ac249-cb25-4ef4-bcf5-2f885b8526d8-bundle\") pod \"5c0ac249-cb25-4ef4-bcf5-2f885b8526d8\" (UID: \"5c0ac249-cb25-4ef4-bcf5-2f885b8526d8\") "
Mar 13 01:39:10.516836 master-0 kubenswrapper[19170]: I0313 01:39:10.516761 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c0ac249-cb25-4ef4-bcf5-2f885b8526d8-util\") pod \"5c0ac249-cb25-4ef4-bcf5-2f885b8526d8\" (UID: \"5c0ac249-cb25-4ef4-bcf5-2f885b8526d8\") "
Mar 13 01:39:10.516836 master-0 kubenswrapper[19170]: I0313 01:39:10.516784 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7f2121dd-1b58-47d4-b301-1a3456776032-bundle\") pod \"7f2121dd-1b58-47d4-b301-1a3456776032\" (UID: \"7f2121dd-1b58-47d4-b301-1a3456776032\") "
Mar 13 01:39:10.517869 master-0 kubenswrapper[19170]: I0313 01:39:10.517828 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c0ac249-cb25-4ef4-bcf5-2f885b8526d8-bundle" (OuterVolumeSpecName: "bundle") pod "5c0ac249-cb25-4ef4-bcf5-2f885b8526d8" (UID: "5c0ac249-cb25-4ef4-bcf5-2f885b8526d8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 01:39:10.519928 master-0 kubenswrapper[19170]: I0313 01:39:10.519872 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f2121dd-1b58-47d4-b301-1a3456776032-bundle" (OuterVolumeSpecName: "bundle") pod "7f2121dd-1b58-47d4-b301-1a3456776032" (UID: "7f2121dd-1b58-47d4-b301-1a3456776032"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 01:39:10.532368 master-0 kubenswrapper[19170]: I0313 01:39:10.532291 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c0ac249-cb25-4ef4-bcf5-2f885b8526d8-util" (OuterVolumeSpecName: "util") pod "5c0ac249-cb25-4ef4-bcf5-2f885b8526d8" (UID: "5c0ac249-cb25-4ef4-bcf5-2f885b8526d8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 01:39:10.533763 master-0 kubenswrapper[19170]: I0313 01:39:10.533725 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f2121dd-1b58-47d4-b301-1a3456776032-util" (OuterVolumeSpecName: "util") pod "7f2121dd-1b58-47d4-b301-1a3456776032" (UID: "7f2121dd-1b58-47d4-b301-1a3456776032"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 01:39:10.575859 master-0 kubenswrapper[19170]: I0313 01:39:10.570828 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f2121dd-1b58-47d4-b301-1a3456776032-kube-api-access-5kqdk" (OuterVolumeSpecName: "kube-api-access-5kqdk") pod "7f2121dd-1b58-47d4-b301-1a3456776032" (UID: "7f2121dd-1b58-47d4-b301-1a3456776032"). InnerVolumeSpecName "kube-api-access-5kqdk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 01:39:10.575859 master-0 kubenswrapper[19170]: I0313 01:39:10.570861 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c0ac249-cb25-4ef4-bcf5-2f885b8526d8-kube-api-access-r5zzf" (OuterVolumeSpecName: "kube-api-access-r5zzf") pod "5c0ac249-cb25-4ef4-bcf5-2f885b8526d8" (UID: "5c0ac249-cb25-4ef4-bcf5-2f885b8526d8"). InnerVolumeSpecName "kube-api-access-r5zzf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 01:39:10.620414 master-0 kubenswrapper[19170]: I0313 01:39:10.618547 19170 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c0ac249-cb25-4ef4-bcf5-2f885b8526d8-util\") on node \"master-0\" DevicePath \"\""
Mar 13 01:39:10.620414 master-0 kubenswrapper[19170]: I0313 01:39:10.618590 19170 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7f2121dd-1b58-47d4-b301-1a3456776032-bundle\") on node \"master-0\" DevicePath \"\""
Mar 13 01:39:10.620414 master-0 kubenswrapper[19170]: I0313 01:39:10.618660 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5zzf\" (UniqueName: \"kubernetes.io/projected/5c0ac249-cb25-4ef4-bcf5-2f885b8526d8-kube-api-access-r5zzf\") on node \"master-0\" DevicePath \"\""
Mar 13 01:39:10.620414 master-0 kubenswrapper[19170]: I0313 01:39:10.618682 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kqdk\" (UniqueName: \"kubernetes.io/projected/7f2121dd-1b58-47d4-b301-1a3456776032-kube-api-access-5kqdk\") on node \"master-0\" DevicePath \"\""
Mar 13 01:39:10.620414 master-0 kubenswrapper[19170]: I0313 01:39:10.618695 19170 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7f2121dd-1b58-47d4-b301-1a3456776032-util\") on node \"master-0\" DevicePath \"\""
Mar 13 01:39:10.620414 master-0 kubenswrapper[19170]: I0313 01:39:10.618709 19170 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c0ac249-cb25-4ef4-bcf5-2f885b8526d8-bundle\") on node \"master-0\" DevicePath \"\""
Mar 13 01:39:10.931230 master-0 kubenswrapper[19170]: I0313 01:39:10.931159 19170 generic.go:334] "Generic (PLEG): container finished" podID="64ad250c-8ccc-42e7-935e-203a944b07b1" containerID="27402947511fcd1871555601a56bb2d725747d59f9246315b043280235935c54" exitCode=0
Mar 13 01:39:10.932063 master-0 kubenswrapper[19170]: I0313 01:39:10.931272 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0877l2b" event={"ID":"64ad250c-8ccc-42e7-935e-203a944b07b1","Type":"ContainerDied","Data":"27402947511fcd1871555601a56bb2d725747d59f9246315b043280235935c54"}
Mar 13 01:39:10.932063 master-0 kubenswrapper[19170]: I0313 01:39:10.931438 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0877l2b" event={"ID":"64ad250c-8ccc-42e7-935e-203a944b07b1","Type":"ContainerStarted","Data":"edaddc638c84825407386d58e963c641c4eca1417a097c01b8f7e26066c084d1"}
Mar 13 01:39:10.940708 master-0 kubenswrapper[19170]: I0313 01:39:10.940623 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5z78gl"
Mar 13 01:39:10.940817 master-0 kubenswrapper[19170]: I0313 01:39:10.940618 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5z78gl" event={"ID":"5c0ac249-cb25-4ef4-bcf5-2f885b8526d8","Type":"ContainerDied","Data":"423dba269d66680cdbbedbd2c1b974af2a4722f9dd8962976b7cf2628d361936"}
Mar 13 01:39:10.940870 master-0 kubenswrapper[19170]: I0313 01:39:10.940818 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="423dba269d66680cdbbedbd2c1b974af2a4722f9dd8962976b7cf2628d361936"
Mar 13 01:39:10.943000 master-0 kubenswrapper[19170]: I0313 01:39:10.942957 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874lskjr" event={"ID":"7f2121dd-1b58-47d4-b301-1a3456776032","Type":"ContainerDied","Data":"aa94af9db497f26b84b465237d517da92dc669ec4bce497f75057f3bb0a375f1"}
Mar 13 01:39:10.943000 master-0 kubenswrapper[19170]: I0313 01:39:10.942989 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa94af9db497f26b84b465237d517da92dc669ec4bce497f75057f3bb0a375f1"
Mar 13 01:39:10.943153 master-0 kubenswrapper[19170]: I0313 01:39:10.943044 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874lskjr"
Mar 13 01:39:12.969733 master-0 kubenswrapper[19170]: I0313 01:39:12.969590 19170 generic.go:334] "Generic (PLEG): container finished" podID="64ad250c-8ccc-42e7-935e-203a944b07b1" containerID="24b4dcf1de67da7e89def8ebf9e8e42a90ba956695f4fd82f5d60cb3f6b0e82e" exitCode=0
Mar 13 01:39:12.969733 master-0 kubenswrapper[19170]: I0313 01:39:12.969717 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0877l2b" event={"ID":"64ad250c-8ccc-42e7-935e-203a944b07b1","Type":"ContainerDied","Data":"24b4dcf1de67da7e89def8ebf9e8e42a90ba956695f4fd82f5d60cb3f6b0e82e"}
Mar 13 01:39:13.981403 master-0 kubenswrapper[19170]: I0313 01:39:13.981319 19170 generic.go:334] "Generic (PLEG): container finished" podID="64ad250c-8ccc-42e7-935e-203a944b07b1" containerID="5246ce8f3b2dae74b46d250bc4a92a9e83fdc428481547964ffc79d6a456585e" exitCode=0
Mar 13 01:39:13.981403 master-0 kubenswrapper[19170]: I0313 01:39:13.981370 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0877l2b" event={"ID":"64ad250c-8ccc-42e7-935e-203a944b07b1","Type":"ContainerDied","Data":"5246ce8f3b2dae74b46d250bc4a92a9e83fdc428481547964ffc79d6a456585e"}
Mar 13 01:39:15.378217 master-0 kubenswrapper[19170]: I0313 01:39:15.378154 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0877l2b"
Mar 13 01:39:15.523559 master-0 kubenswrapper[19170]: I0313 01:39:15.522336 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxbmq\" (UniqueName: \"kubernetes.io/projected/64ad250c-8ccc-42e7-935e-203a944b07b1-kube-api-access-lxbmq\") pod \"64ad250c-8ccc-42e7-935e-203a944b07b1\" (UID: \"64ad250c-8ccc-42e7-935e-203a944b07b1\") "
Mar 13 01:39:15.523559 master-0 kubenswrapper[19170]: I0313 01:39:15.522434 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/64ad250c-8ccc-42e7-935e-203a944b07b1-util\") pod \"64ad250c-8ccc-42e7-935e-203a944b07b1\" (UID: \"64ad250c-8ccc-42e7-935e-203a944b07b1\") "
Mar 13 01:39:15.523559 master-0 kubenswrapper[19170]: I0313 01:39:15.522541 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/64ad250c-8ccc-42e7-935e-203a944b07b1-bundle\") pod \"64ad250c-8ccc-42e7-935e-203a944b07b1\" (UID: \"64ad250c-8ccc-42e7-935e-203a944b07b1\") "
Mar 13 01:39:15.526826 master-0 kubenswrapper[19170]: I0313 01:39:15.526554 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64ad250c-8ccc-42e7-935e-203a944b07b1-bundle" (OuterVolumeSpecName: "bundle") pod "64ad250c-8ccc-42e7-935e-203a944b07b1" (UID: "64ad250c-8ccc-42e7-935e-203a944b07b1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 01:39:15.533566 master-0 kubenswrapper[19170]: I0313 01:39:15.533522 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64ad250c-8ccc-42e7-935e-203a944b07b1-util" (OuterVolumeSpecName: "util") pod "64ad250c-8ccc-42e7-935e-203a944b07b1" (UID: "64ad250c-8ccc-42e7-935e-203a944b07b1").
InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 01:39:15.534417 master-0 kubenswrapper[19170]: I0313 01:39:15.534385 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64ad250c-8ccc-42e7-935e-203a944b07b1-kube-api-access-lxbmq" (OuterVolumeSpecName: "kube-api-access-lxbmq") pod "64ad250c-8ccc-42e7-935e-203a944b07b1" (UID: "64ad250c-8ccc-42e7-935e-203a944b07b1"). InnerVolumeSpecName "kube-api-access-lxbmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:39:15.624417 master-0 kubenswrapper[19170]: I0313 01:39:15.624286 19170 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/64ad250c-8ccc-42e7-935e-203a944b07b1-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:39:15.624417 master-0 kubenswrapper[19170]: I0313 01:39:15.624328 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxbmq\" (UniqueName: \"kubernetes.io/projected/64ad250c-8ccc-42e7-935e-203a944b07b1-kube-api-access-lxbmq\") on node \"master-0\" DevicePath \"\"" Mar 13 01:39:15.624417 master-0 kubenswrapper[19170]: I0313 01:39:15.624338 19170 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/64ad250c-8ccc-42e7-935e-203a944b07b1-util\") on node \"master-0\" DevicePath \"\"" Mar 13 01:39:16.000723 master-0 kubenswrapper[19170]: I0313 01:39:16.000598 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0877l2b" event={"ID":"64ad250c-8ccc-42e7-935e-203a944b07b1","Type":"ContainerDied","Data":"edaddc638c84825407386d58e963c641c4eca1417a097c01b8f7e26066c084d1"} Mar 13 01:39:16.000723 master-0 kubenswrapper[19170]: I0313 01:39:16.000654 19170 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="edaddc638c84825407386d58e963c641c4eca1417a097c01b8f7e26066c084d1" Mar 13 01:39:16.000723 master-0 kubenswrapper[19170]: I0313 01:39:16.000716 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0877l2b" Mar 13 01:39:21.505716 master-0 kubenswrapper[19170]: I0313 01:39:21.505648 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-25zp4"] Mar 13 01:39:21.506795 master-0 kubenswrapper[19170]: E0313 01:39:21.506766 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f2121dd-1b58-47d4-b301-1a3456776032" containerName="extract" Mar 13 01:39:21.506795 master-0 kubenswrapper[19170]: I0313 01:39:21.506790 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f2121dd-1b58-47d4-b301-1a3456776032" containerName="extract" Mar 13 01:39:21.506907 master-0 kubenswrapper[19170]: E0313 01:39:21.506820 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64ad250c-8ccc-42e7-935e-203a944b07b1" containerName="pull" Mar 13 01:39:21.506907 master-0 kubenswrapper[19170]: I0313 01:39:21.506829 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="64ad250c-8ccc-42e7-935e-203a944b07b1" containerName="pull" Mar 13 01:39:21.506907 master-0 kubenswrapper[19170]: E0313 01:39:21.506843 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c0ac249-cb25-4ef4-bcf5-2f885b8526d8" containerName="util" Mar 13 01:39:21.506907 master-0 kubenswrapper[19170]: I0313 01:39:21.506849 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c0ac249-cb25-4ef4-bcf5-2f885b8526d8" containerName="util" Mar 13 01:39:21.506907 master-0 kubenswrapper[19170]: E0313 01:39:21.506858 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c0ac249-cb25-4ef4-bcf5-2f885b8526d8" containerName="extract" Mar 13 01:39:21.506907 master-0 kubenswrapper[19170]: 
I0313 01:39:21.506864 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c0ac249-cb25-4ef4-bcf5-2f885b8526d8" containerName="extract" Mar 13 01:39:21.506907 master-0 kubenswrapper[19170]: E0313 01:39:21.506870 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64ad250c-8ccc-42e7-935e-203a944b07b1" containerName="util" Mar 13 01:39:21.506907 master-0 kubenswrapper[19170]: I0313 01:39:21.506876 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="64ad250c-8ccc-42e7-935e-203a944b07b1" containerName="util" Mar 13 01:39:21.506907 master-0 kubenswrapper[19170]: E0313 01:39:21.506892 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f2121dd-1b58-47d4-b301-1a3456776032" containerName="util" Mar 13 01:39:21.506907 master-0 kubenswrapper[19170]: I0313 01:39:21.506897 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f2121dd-1b58-47d4-b301-1a3456776032" containerName="util" Mar 13 01:39:21.506907 master-0 kubenswrapper[19170]: E0313 01:39:21.506909 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c0ac249-cb25-4ef4-bcf5-2f885b8526d8" containerName="pull" Mar 13 01:39:21.506907 master-0 kubenswrapper[19170]: I0313 01:39:21.506915 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c0ac249-cb25-4ef4-bcf5-2f885b8526d8" containerName="pull" Mar 13 01:39:21.507256 master-0 kubenswrapper[19170]: E0313 01:39:21.506927 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64ad250c-8ccc-42e7-935e-203a944b07b1" containerName="extract" Mar 13 01:39:21.507256 master-0 kubenswrapper[19170]: I0313 01:39:21.506933 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="64ad250c-8ccc-42e7-935e-203a944b07b1" containerName="extract" Mar 13 01:39:21.507256 master-0 kubenswrapper[19170]: E0313 01:39:21.506946 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f2121dd-1b58-47d4-b301-1a3456776032" containerName="pull" Mar 13 01:39:21.507256 
master-0 kubenswrapper[19170]: I0313 01:39:21.506952 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f2121dd-1b58-47d4-b301-1a3456776032" containerName="pull" Mar 13 01:39:21.507256 master-0 kubenswrapper[19170]: I0313 01:39:21.507097 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c0ac249-cb25-4ef4-bcf5-2f885b8526d8" containerName="extract" Mar 13 01:39:21.507256 master-0 kubenswrapper[19170]: I0313 01:39:21.507112 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f2121dd-1b58-47d4-b301-1a3456776032" containerName="extract" Mar 13 01:39:21.507256 master-0 kubenswrapper[19170]: I0313 01:39:21.507130 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="64ad250c-8ccc-42e7-935e-203a944b07b1" containerName="extract" Mar 13 01:39:21.507598 master-0 kubenswrapper[19170]: I0313 01:39:21.507574 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-25zp4" Mar 13 01:39:21.509487 master-0 kubenswrapper[19170]: I0313 01:39:21.509438 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 13 01:39:21.510041 master-0 kubenswrapper[19170]: I0313 01:39:21.509867 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 13 01:39:21.520525 master-0 kubenswrapper[19170]: I0313 01:39:21.520188 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-25zp4"] Mar 13 01:39:21.524234 master-0 kubenswrapper[19170]: I0313 01:39:21.523850 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzfmm\" (UniqueName: \"kubernetes.io/projected/1e6870fc-49d7-41a3-b757-2c0eb0afb67d-kube-api-access-hzfmm\") pod \"nmstate-operator-796d4cfff4-25zp4\" (UID: \"1e6870fc-49d7-41a3-b757-2c0eb0afb67d\") " 
pod="openshift-nmstate/nmstate-operator-796d4cfff4-25zp4" Mar 13 01:39:21.625116 master-0 kubenswrapper[19170]: I0313 01:39:21.625063 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzfmm\" (UniqueName: \"kubernetes.io/projected/1e6870fc-49d7-41a3-b757-2c0eb0afb67d-kube-api-access-hzfmm\") pod \"nmstate-operator-796d4cfff4-25zp4\" (UID: \"1e6870fc-49d7-41a3-b757-2c0eb0afb67d\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-25zp4" Mar 13 01:39:21.643466 master-0 kubenswrapper[19170]: I0313 01:39:21.643407 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzfmm\" (UniqueName: \"kubernetes.io/projected/1e6870fc-49d7-41a3-b757-2c0eb0afb67d-kube-api-access-hzfmm\") pod \"nmstate-operator-796d4cfff4-25zp4\" (UID: \"1e6870fc-49d7-41a3-b757-2c0eb0afb67d\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-25zp4" Mar 13 01:39:21.824536 master-0 kubenswrapper[19170]: I0313 01:39:21.824417 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-25zp4" Mar 13 01:39:22.231666 master-0 kubenswrapper[19170]: I0313 01:39:22.224750 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-25zp4"] Mar 13 01:39:23.066014 master-0 kubenswrapper[19170]: I0313 01:39:23.065944 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-25zp4" event={"ID":"1e6870fc-49d7-41a3-b757-2c0eb0afb67d","Type":"ContainerStarted","Data":"5a2d2b83435fb15642dea4219ec338709a19a36183b2ca9b258fb2f4c7bceac6"} Mar 13 01:39:24.106867 master-0 kubenswrapper[19170]: I0313 01:39:24.106815 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-57bc99bf8b-9v2vk"] Mar 13 01:39:24.121708 master-0 kubenswrapper[19170]: I0313 01:39:24.121670 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-57bc99bf8b-9v2vk" Mar 13 01:39:24.127026 master-0 kubenswrapper[19170]: I0313 01:39:24.126988 19170 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 13 01:39:24.127434 master-0 kubenswrapper[19170]: I0313 01:39:24.127420 19170 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 13 01:39:24.133651 master-0 kubenswrapper[19170]: I0313 01:39:24.128707 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 13 01:39:24.133651 master-0 kubenswrapper[19170]: I0313 01:39:24.128971 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 13 01:39:24.146663 master-0 kubenswrapper[19170]: I0313 01:39:24.140652 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-57bc99bf8b-9v2vk"] Mar 13 01:39:24.287177 master-0 kubenswrapper[19170]: I0313 01:39:24.287117 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d498deb9-5f9a-4606-8011-b88562ed0180-webhook-cert\") pod \"metallb-operator-controller-manager-57bc99bf8b-9v2vk\" (UID: \"d498deb9-5f9a-4606-8011-b88562ed0180\") " pod="metallb-system/metallb-operator-controller-manager-57bc99bf8b-9v2vk" Mar 13 01:39:24.287177 master-0 kubenswrapper[19170]: I0313 01:39:24.287177 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db64s\" (UniqueName: \"kubernetes.io/projected/d498deb9-5f9a-4606-8011-b88562ed0180-kube-api-access-db64s\") pod \"metallb-operator-controller-manager-57bc99bf8b-9v2vk\" (UID: \"d498deb9-5f9a-4606-8011-b88562ed0180\") " 
pod="metallb-system/metallb-operator-controller-manager-57bc99bf8b-9v2vk" Mar 13 01:39:24.287477 master-0 kubenswrapper[19170]: I0313 01:39:24.287202 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d498deb9-5f9a-4606-8011-b88562ed0180-apiservice-cert\") pod \"metallb-operator-controller-manager-57bc99bf8b-9v2vk\" (UID: \"d498deb9-5f9a-4606-8011-b88562ed0180\") " pod="metallb-system/metallb-operator-controller-manager-57bc99bf8b-9v2vk" Mar 13 01:39:24.390774 master-0 kubenswrapper[19170]: I0313 01:39:24.390649 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db64s\" (UniqueName: \"kubernetes.io/projected/d498deb9-5f9a-4606-8011-b88562ed0180-kube-api-access-db64s\") pod \"metallb-operator-controller-manager-57bc99bf8b-9v2vk\" (UID: \"d498deb9-5f9a-4606-8011-b88562ed0180\") " pod="metallb-system/metallb-operator-controller-manager-57bc99bf8b-9v2vk" Mar 13 01:39:24.390774 master-0 kubenswrapper[19170]: I0313 01:39:24.390724 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d498deb9-5f9a-4606-8011-b88562ed0180-apiservice-cert\") pod \"metallb-operator-controller-manager-57bc99bf8b-9v2vk\" (UID: \"d498deb9-5f9a-4606-8011-b88562ed0180\") " pod="metallb-system/metallb-operator-controller-manager-57bc99bf8b-9v2vk" Mar 13 01:39:24.391013 master-0 kubenswrapper[19170]: I0313 01:39:24.390868 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d498deb9-5f9a-4606-8011-b88562ed0180-webhook-cert\") pod \"metallb-operator-controller-manager-57bc99bf8b-9v2vk\" (UID: \"d498deb9-5f9a-4606-8011-b88562ed0180\") " pod="metallb-system/metallb-operator-controller-manager-57bc99bf8b-9v2vk" Mar 13 01:39:24.394331 master-0 kubenswrapper[19170]: I0313 01:39:24.394237 
19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d498deb9-5f9a-4606-8011-b88562ed0180-apiservice-cert\") pod \"metallb-operator-controller-manager-57bc99bf8b-9v2vk\" (UID: \"d498deb9-5f9a-4606-8011-b88562ed0180\") " pod="metallb-system/metallb-operator-controller-manager-57bc99bf8b-9v2vk" Mar 13 01:39:24.397251 master-0 kubenswrapper[19170]: I0313 01:39:24.397206 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d498deb9-5f9a-4606-8011-b88562ed0180-webhook-cert\") pod \"metallb-operator-controller-manager-57bc99bf8b-9v2vk\" (UID: \"d498deb9-5f9a-4606-8011-b88562ed0180\") " pod="metallb-system/metallb-operator-controller-manager-57bc99bf8b-9v2vk" Mar 13 01:39:24.432670 master-0 kubenswrapper[19170]: I0313 01:39:24.432617 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db64s\" (UniqueName: \"kubernetes.io/projected/d498deb9-5f9a-4606-8011-b88562ed0180-kube-api-access-db64s\") pod \"metallb-operator-controller-manager-57bc99bf8b-9v2vk\" (UID: \"d498deb9-5f9a-4606-8011-b88562ed0180\") " pod="metallb-system/metallb-operator-controller-manager-57bc99bf8b-9v2vk" Mar 13 01:39:24.473612 master-0 kubenswrapper[19170]: I0313 01:39:24.473541 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-c94846845-ll9w6"] Mar 13 01:39:24.474713 master-0 kubenswrapper[19170]: I0313 01:39:24.474679 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-c94846845-ll9w6" Mar 13 01:39:24.476701 master-0 kubenswrapper[19170]: I0313 01:39:24.476665 19170 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 13 01:39:24.476769 master-0 kubenswrapper[19170]: I0313 01:39:24.476671 19170 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 13 01:39:24.486623 master-0 kubenswrapper[19170]: I0313 01:39:24.486563 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-c94846845-ll9w6"] Mar 13 01:39:24.542620 master-0 kubenswrapper[19170]: I0313 01:39:24.542558 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-57bc99bf8b-9v2vk" Mar 13 01:39:24.606310 master-0 kubenswrapper[19170]: I0313 01:39:24.606253 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/43bf0c13-9389-431a-a5f8-375b1533de9b-webhook-cert\") pod \"metallb-operator-webhook-server-c94846845-ll9w6\" (UID: \"43bf0c13-9389-431a-a5f8-375b1533de9b\") " pod="metallb-system/metallb-operator-webhook-server-c94846845-ll9w6" Mar 13 01:39:24.606535 master-0 kubenswrapper[19170]: I0313 01:39:24.606336 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9srlq\" (UniqueName: \"kubernetes.io/projected/43bf0c13-9389-431a-a5f8-375b1533de9b-kube-api-access-9srlq\") pod \"metallb-operator-webhook-server-c94846845-ll9w6\" (UID: \"43bf0c13-9389-431a-a5f8-375b1533de9b\") " pod="metallb-system/metallb-operator-webhook-server-c94846845-ll9w6" Mar 13 01:39:24.606535 master-0 kubenswrapper[19170]: I0313 01:39:24.606360 19170 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/43bf0c13-9389-431a-a5f8-375b1533de9b-apiservice-cert\") pod \"metallb-operator-webhook-server-c94846845-ll9w6\" (UID: \"43bf0c13-9389-431a-a5f8-375b1533de9b\") " pod="metallb-system/metallb-operator-webhook-server-c94846845-ll9w6" Mar 13 01:39:24.710299 master-0 kubenswrapper[19170]: I0313 01:39:24.710218 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/43bf0c13-9389-431a-a5f8-375b1533de9b-webhook-cert\") pod \"metallb-operator-webhook-server-c94846845-ll9w6\" (UID: \"43bf0c13-9389-431a-a5f8-375b1533de9b\") " pod="metallb-system/metallb-operator-webhook-server-c94846845-ll9w6" Mar 13 01:39:24.710552 master-0 kubenswrapper[19170]: I0313 01:39:24.710340 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9srlq\" (UniqueName: \"kubernetes.io/projected/43bf0c13-9389-431a-a5f8-375b1533de9b-kube-api-access-9srlq\") pod \"metallb-operator-webhook-server-c94846845-ll9w6\" (UID: \"43bf0c13-9389-431a-a5f8-375b1533de9b\") " pod="metallb-system/metallb-operator-webhook-server-c94846845-ll9w6" Mar 13 01:39:24.710552 master-0 kubenswrapper[19170]: I0313 01:39:24.710370 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/43bf0c13-9389-431a-a5f8-375b1533de9b-apiservice-cert\") pod \"metallb-operator-webhook-server-c94846845-ll9w6\" (UID: \"43bf0c13-9389-431a-a5f8-375b1533de9b\") " pod="metallb-system/metallb-operator-webhook-server-c94846845-ll9w6" Mar 13 01:39:24.714831 master-0 kubenswrapper[19170]: I0313 01:39:24.714786 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/43bf0c13-9389-431a-a5f8-375b1533de9b-apiservice-cert\") pod 
\"metallb-operator-webhook-server-c94846845-ll9w6\" (UID: \"43bf0c13-9389-431a-a5f8-375b1533de9b\") " pod="metallb-system/metallb-operator-webhook-server-c94846845-ll9w6" Mar 13 01:39:24.718843 master-0 kubenswrapper[19170]: I0313 01:39:24.718814 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/43bf0c13-9389-431a-a5f8-375b1533de9b-webhook-cert\") pod \"metallb-operator-webhook-server-c94846845-ll9w6\" (UID: \"43bf0c13-9389-431a-a5f8-375b1533de9b\") " pod="metallb-system/metallb-operator-webhook-server-c94846845-ll9w6" Mar 13 01:39:24.729829 master-0 kubenswrapper[19170]: I0313 01:39:24.729784 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9srlq\" (UniqueName: \"kubernetes.io/projected/43bf0c13-9389-431a-a5f8-375b1533de9b-kube-api-access-9srlq\") pod \"metallb-operator-webhook-server-c94846845-ll9w6\" (UID: \"43bf0c13-9389-431a-a5f8-375b1533de9b\") " pod="metallb-system/metallb-operator-webhook-server-c94846845-ll9w6" Mar 13 01:39:24.794991 master-0 kubenswrapper[19170]: I0313 01:39:24.794930 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-c94846845-ll9w6" Mar 13 01:39:24.966615 master-0 kubenswrapper[19170]: I0313 01:39:24.966539 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-57bc99bf8b-9v2vk"] Mar 13 01:39:25.749762 master-0 kubenswrapper[19170]: W0313 01:39:25.748225 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd498deb9_5f9a_4606_8011_b88562ed0180.slice/crio-266128b153b4fcfe96601efb17d6d6166bcf2c0bf108fb8c1c3d3cb15d67ffd3 WatchSource:0}: Error finding container 266128b153b4fcfe96601efb17d6d6166bcf2c0bf108fb8c1c3d3cb15d67ffd3: Status 404 returned error can't find the container with id 266128b153b4fcfe96601efb17d6d6166bcf2c0bf108fb8c1c3d3cb15d67ffd3 Mar 13 01:39:26.105720 master-0 kubenswrapper[19170]: I0313 01:39:26.105623 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-25zp4" event={"ID":"1e6870fc-49d7-41a3-b757-2c0eb0afb67d","Type":"ContainerStarted","Data":"e594da8d59f59a3ff559bea5f93e279049eeff26cb7d3feef78053f374c8fa99"} Mar 13 01:39:26.107683 master-0 kubenswrapper[19170]: I0313 01:39:26.107625 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-57bc99bf8b-9v2vk" event={"ID":"d498deb9-5f9a-4606-8011-b88562ed0180","Type":"ContainerStarted","Data":"266128b153b4fcfe96601efb17d6d6166bcf2c0bf108fb8c1c3d3cb15d67ffd3"} Mar 13 01:39:26.125115 master-0 kubenswrapper[19170]: I0313 01:39:26.125025 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-25zp4" podStartSLOduration=1.5067846710000001 podStartE2EDuration="5.125013536s" podCreationTimestamp="2026-03-13 01:39:21 +0000 UTC" firstStartedPulling="2026-03-13 01:39:22.234479954 +0000 UTC m=+1223.042600934" 
lastFinishedPulling="2026-03-13 01:39:25.852708829 +0000 UTC m=+1226.660829799" observedRunningTime="2026-03-13 01:39:26.123536108 +0000 UTC m=+1226.931657068" watchObservedRunningTime="2026-03-13 01:39:26.125013536 +0000 UTC m=+1226.933134486" Mar 13 01:39:26.239764 master-0 kubenswrapper[19170]: I0313 01:39:26.239704 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-c94846845-ll9w6"] Mar 13 01:39:26.244664 master-0 kubenswrapper[19170]: W0313 01:39:26.241715 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43bf0c13_9389_431a_a5f8_375b1533de9b.slice/crio-19c522b11ce60acb3718fbe64dbcf370c5454d03876c317fbe0545f0d794127c WatchSource:0}: Error finding container 19c522b11ce60acb3718fbe64dbcf370c5454d03876c317fbe0545f0d794127c: Status 404 returned error can't find the container with id 19c522b11ce60acb3718fbe64dbcf370c5454d03876c317fbe0545f0d794127c Mar 13 01:39:27.121670 master-0 kubenswrapper[19170]: I0313 01:39:27.121358 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-c94846845-ll9w6" event={"ID":"43bf0c13-9389-431a-a5f8-375b1533de9b","Type":"ContainerStarted","Data":"19c522b11ce60acb3718fbe64dbcf370c5454d03876c317fbe0545f0d794127c"} Mar 13 01:39:30.148233 master-0 kubenswrapper[19170]: I0313 01:39:30.147882 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-57bc99bf8b-9v2vk" event={"ID":"d498deb9-5f9a-4606-8011-b88562ed0180","Type":"ContainerStarted","Data":"bc664bdada928d8affb5dd50a79b14e13c5e9fe752be4b5b7bf695b7326f8c7a"} Mar 13 01:39:30.149158 master-0 kubenswrapper[19170]: I0313 01:39:30.148451 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-57bc99bf8b-9v2vk" Mar 13 01:39:30.184501 master-0 kubenswrapper[19170]: I0313 
01:39:30.184426 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-57bc99bf8b-9v2vk" podStartSLOduration=2.487038202 podStartE2EDuration="6.184404059s" podCreationTimestamp="2026-03-13 01:39:24 +0000 UTC" firstStartedPulling="2026-03-13 01:39:25.762451524 +0000 UTC m=+1226.570572494" lastFinishedPulling="2026-03-13 01:39:29.459817381 +0000 UTC m=+1230.267938351" observedRunningTime="2026-03-13 01:39:30.174274567 +0000 UTC m=+1230.982395537" watchObservedRunningTime="2026-03-13 01:39:30.184404059 +0000 UTC m=+1230.992525019" Mar 13 01:39:34.175804 master-0 kubenswrapper[19170]: I0313 01:39:34.175744 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-c94846845-ll9w6" event={"ID":"43bf0c13-9389-431a-a5f8-375b1533de9b","Type":"ContainerStarted","Data":"fb5ff0314bd1457cb4d173e53255d13e44e0b72534591e3258d5cf24660e3f43"} Mar 13 01:39:34.176341 master-0 kubenswrapper[19170]: I0313 01:39:34.175906 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-c94846845-ll9w6" Mar 13 01:39:34.219031 master-0 kubenswrapper[19170]: I0313 01:39:34.218965 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-c94846845-ll9w6" podStartSLOduration=2.883552085 podStartE2EDuration="10.218952056s" podCreationTimestamp="2026-03-13 01:39:24 +0000 UTC" firstStartedPulling="2026-03-13 01:39:26.252392668 +0000 UTC m=+1227.060513638" lastFinishedPulling="2026-03-13 01:39:33.587792659 +0000 UTC m=+1234.395913609" observedRunningTime="2026-03-13 01:39:34.214359875 +0000 UTC m=+1235.022480835" watchObservedRunningTime="2026-03-13 01:39:34.218952056 +0000 UTC m=+1235.027073016" Mar 13 01:39:39.034777 master-0 kubenswrapper[19170]: I0313 01:39:39.034692 19170 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-rcljc"]
Mar 13 01:39:39.035597 master-0 kubenswrapper[19170]: I0313 01:39:39.035576 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rcljc"
Mar 13 01:39:39.047427 master-0 kubenswrapper[19170]: I0313 01:39:39.047388 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt"
Mar 13 01:39:39.049112 master-0 kubenswrapper[19170]: I0313 01:39:39.049068 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt"
Mar 13 01:39:39.059790 master-0 kubenswrapper[19170]: I0313 01:39:39.059739 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-rcljc"]
Mar 13 01:39:39.126668 master-0 kubenswrapper[19170]: I0313 01:39:39.124597 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xmzk\" (UniqueName: \"kubernetes.io/projected/82149964-6826-4113-9f1d-31dca19f062e-kube-api-access-8xmzk\") pod \"obo-prometheus-operator-68bc856cb9-rcljc\" (UID: \"82149964-6826-4113-9f1d-31dca19f062e\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rcljc"
Mar 13 01:39:39.228655 master-0 kubenswrapper[19170]: I0313 01:39:39.226539 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xmzk\" (UniqueName: \"kubernetes.io/projected/82149964-6826-4113-9f1d-31dca19f062e-kube-api-access-8xmzk\") pod \"obo-prometheus-operator-68bc856cb9-rcljc\" (UID: \"82149964-6826-4113-9f1d-31dca19f062e\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rcljc"
Mar 13 01:39:39.254375 master-0 kubenswrapper[19170]: I0313 01:39:39.254034 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6d886dcc57-qvf8g"]
Mar 13 01:39:39.255000 master-0 kubenswrapper[19170]: I0313 01:39:39.254980 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d886dcc57-qvf8g"
Mar 13 01:39:39.260553 master-0 kubenswrapper[19170]: I0313 01:39:39.260513 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert"
Mar 13 01:39:39.276703 master-0 kubenswrapper[19170]: I0313 01:39:39.271745 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6d886dcc57-qsvk6"]
Mar 13 01:39:39.279598 master-0 kubenswrapper[19170]: I0313 01:39:39.278855 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xmzk\" (UniqueName: \"kubernetes.io/projected/82149964-6826-4113-9f1d-31dca19f062e-kube-api-access-8xmzk\") pod \"obo-prometheus-operator-68bc856cb9-rcljc\" (UID: \"82149964-6826-4113-9f1d-31dca19f062e\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rcljc"
Mar 13 01:39:39.279598 master-0 kubenswrapper[19170]: I0313 01:39:39.278868 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d886dcc57-qsvk6"
Mar 13 01:39:39.291945 master-0 kubenswrapper[19170]: I0313 01:39:39.287678 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6d886dcc57-qvf8g"]
Mar 13 01:39:39.316979 master-0 kubenswrapper[19170]: I0313 01:39:39.316916 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6d886dcc57-qsvk6"]
Mar 13 01:39:39.327762 master-0 kubenswrapper[19170]: I0313 01:39:39.327683 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/95f7f615-d532-44d6-8307-73a3bbfe78a7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6d886dcc57-qsvk6\" (UID: \"95f7f615-d532-44d6-8307-73a3bbfe78a7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d886dcc57-qsvk6"
Mar 13 01:39:39.327762 master-0 kubenswrapper[19170]: I0313 01:39:39.327777 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4d09a044-f099-44bc-be50-5f7f9ff916cc-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6d886dcc57-qvf8g\" (UID: \"4d09a044-f099-44bc-be50-5f7f9ff916cc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d886dcc57-qvf8g"
Mar 13 01:39:39.328051 master-0 kubenswrapper[19170]: I0313 01:39:39.327795 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/95f7f615-d532-44d6-8307-73a3bbfe78a7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6d886dcc57-qsvk6\" (UID: \"95f7f615-d532-44d6-8307-73a3bbfe78a7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d886dcc57-qsvk6"
Mar 13 01:39:39.328051 master-0 kubenswrapper[19170]: I0313 01:39:39.327832 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4d09a044-f099-44bc-be50-5f7f9ff916cc-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6d886dcc57-qvf8g\" (UID: \"4d09a044-f099-44bc-be50-5f7f9ff916cc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d886dcc57-qvf8g"
Mar 13 01:39:39.350790 master-0 kubenswrapper[19170]: I0313 01:39:39.349661 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rcljc"
Mar 13 01:39:39.429706 master-0 kubenswrapper[19170]: I0313 01:39:39.428527 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4d09a044-f099-44bc-be50-5f7f9ff916cc-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6d886dcc57-qvf8g\" (UID: \"4d09a044-f099-44bc-be50-5f7f9ff916cc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d886dcc57-qvf8g"
Mar 13 01:39:39.429706 master-0 kubenswrapper[19170]: I0313 01:39:39.428655 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/95f7f615-d532-44d6-8307-73a3bbfe78a7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6d886dcc57-qsvk6\" (UID: \"95f7f615-d532-44d6-8307-73a3bbfe78a7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d886dcc57-qsvk6"
Mar 13 01:39:39.429706 master-0 kubenswrapper[19170]: I0313 01:39:39.428703 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4d09a044-f099-44bc-be50-5f7f9ff916cc-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6d886dcc57-qvf8g\" (UID: \"4d09a044-f099-44bc-be50-5f7f9ff916cc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d886dcc57-qvf8g"
Mar 13 01:39:39.429706 master-0 kubenswrapper[19170]: I0313 01:39:39.428720 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/95f7f615-d532-44d6-8307-73a3bbfe78a7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6d886dcc57-qsvk6\" (UID: \"95f7f615-d532-44d6-8307-73a3bbfe78a7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d886dcc57-qsvk6"
Mar 13 01:39:39.431616 master-0 kubenswrapper[19170]: I0313 01:39:39.431575 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/95f7f615-d532-44d6-8307-73a3bbfe78a7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6d886dcc57-qsvk6\" (UID: \"95f7f615-d532-44d6-8307-73a3bbfe78a7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d886dcc57-qsvk6"
Mar 13 01:39:39.434759 master-0 kubenswrapper[19170]: I0313 01:39:39.434732 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/95f7f615-d532-44d6-8307-73a3bbfe78a7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6d886dcc57-qsvk6\" (UID: \"95f7f615-d532-44d6-8307-73a3bbfe78a7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d886dcc57-qsvk6"
Mar 13 01:39:39.435583 master-0 kubenswrapper[19170]: I0313 01:39:39.435490 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4d09a044-f099-44bc-be50-5f7f9ff916cc-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6d886dcc57-qvf8g\" (UID: \"4d09a044-f099-44bc-be50-5f7f9ff916cc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d886dcc57-qvf8g"
Mar 13 01:39:39.438271 master-0 kubenswrapper[19170]: I0313 01:39:39.438214 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4d09a044-f099-44bc-be50-5f7f9ff916cc-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6d886dcc57-qvf8g\" (UID: \"4d09a044-f099-44bc-be50-5f7f9ff916cc\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d886dcc57-qvf8g"
Mar 13 01:39:39.488689 master-0 kubenswrapper[19170]: I0313 01:39:39.477320 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-sfr46"]
Mar 13 01:39:39.488689 master-0 kubenswrapper[19170]: I0313 01:39:39.480548 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-sfr46"
Mar 13 01:39:39.488689 master-0 kubenswrapper[19170]: I0313 01:39:39.487249 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-sfr46"]
Mar 13 01:39:39.488689 master-0 kubenswrapper[19170]: I0313 01:39:39.488067 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls"
Mar 13 01:39:39.538969 master-0 kubenswrapper[19170]: I0313 01:39:39.530356 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/3fd4836b-f37b-4fda-a74d-d6e2e0d0e41d-observability-operator-tls\") pod \"observability-operator-59bdc8b94-sfr46\" (UID: \"3fd4836b-f37b-4fda-a74d-d6e2e0d0e41d\") " pod="openshift-operators/observability-operator-59bdc8b94-sfr46"
Mar 13 01:39:39.538969 master-0 kubenswrapper[19170]: I0313 01:39:39.530510 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t24ph\" (UniqueName: \"kubernetes.io/projected/3fd4836b-f37b-4fda-a74d-d6e2e0d0e41d-kube-api-access-t24ph\") pod \"observability-operator-59bdc8b94-sfr46\" (UID: \"3fd4836b-f37b-4fda-a74d-d6e2e0d0e41d\") " pod="openshift-operators/observability-operator-59bdc8b94-sfr46"
Mar 13 01:39:39.634327 master-0 kubenswrapper[19170]: I0313 01:39:39.631400 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/3fd4836b-f37b-4fda-a74d-d6e2e0d0e41d-observability-operator-tls\") pod \"observability-operator-59bdc8b94-sfr46\" (UID: \"3fd4836b-f37b-4fda-a74d-d6e2e0d0e41d\") " pod="openshift-operators/observability-operator-59bdc8b94-sfr46"
Mar 13 01:39:39.634327 master-0 kubenswrapper[19170]: I0313 01:39:39.631494 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t24ph\" (UniqueName: \"kubernetes.io/projected/3fd4836b-f37b-4fda-a74d-d6e2e0d0e41d-kube-api-access-t24ph\") pod \"observability-operator-59bdc8b94-sfr46\" (UID: \"3fd4836b-f37b-4fda-a74d-d6e2e0d0e41d\") " pod="openshift-operators/observability-operator-59bdc8b94-sfr46"
Mar 13 01:39:39.639727 master-0 kubenswrapper[19170]: I0313 01:39:39.638340 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/3fd4836b-f37b-4fda-a74d-d6e2e0d0e41d-observability-operator-tls\") pod \"observability-operator-59bdc8b94-sfr46\" (UID: \"3fd4836b-f37b-4fda-a74d-d6e2e0d0e41d\") " pod="openshift-operators/observability-operator-59bdc8b94-sfr46"
Mar 13 01:39:39.639727 master-0 kubenswrapper[19170]: I0313 01:39:39.639377 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d886dcc57-qvf8g"
Mar 13 01:39:39.642621 master-0 kubenswrapper[19170]: I0313 01:39:39.642566 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-v6x9f"]
Mar 13 01:39:39.643491 master-0 kubenswrapper[19170]: I0313 01:39:39.643462 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-v6x9f"
Mar 13 01:39:39.656186 master-0 kubenswrapper[19170]: I0313 01:39:39.655556 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t24ph\" (UniqueName: \"kubernetes.io/projected/3fd4836b-f37b-4fda-a74d-d6e2e0d0e41d-kube-api-access-t24ph\") pod \"observability-operator-59bdc8b94-sfr46\" (UID: \"3fd4836b-f37b-4fda-a74d-d6e2e0d0e41d\") " pod="openshift-operators/observability-operator-59bdc8b94-sfr46"
Mar 13 01:39:39.664771 master-0 kubenswrapper[19170]: I0313 01:39:39.664725 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-v6x9f"]
Mar 13 01:39:39.665217 master-0 kubenswrapper[19170]: I0313 01:39:39.665189 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d886dcc57-qsvk6"
Mar 13 01:39:39.841660 master-0 kubenswrapper[19170]: I0313 01:39:39.833555 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-sfr46"
Mar 13 01:39:39.841660 master-0 kubenswrapper[19170]: I0313 01:39:39.834558 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-rcljc"]
Mar 13 01:39:39.841660 master-0 kubenswrapper[19170]: I0313 01:39:39.840703 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/2b9eaece-c8e3-40f8-8d22-a1cf7fc2c1a8-openshift-service-ca\") pod \"perses-operator-5bf474d74f-v6x9f\" (UID: \"2b9eaece-c8e3-40f8-8d22-a1cf7fc2c1a8\") " pod="openshift-operators/perses-operator-5bf474d74f-v6x9f"
Mar 13 01:39:39.842103 master-0 kubenswrapper[19170]: I0313 01:39:39.841774 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7shvs\" (UniqueName: \"kubernetes.io/projected/2b9eaece-c8e3-40f8-8d22-a1cf7fc2c1a8-kube-api-access-7shvs\") pod \"perses-operator-5bf474d74f-v6x9f\" (UID: \"2b9eaece-c8e3-40f8-8d22-a1cf7fc2c1a8\") " pod="openshift-operators/perses-operator-5bf474d74f-v6x9f"
Mar 13 01:39:39.908565 master-0 kubenswrapper[19170]: W0313 01:39:39.908197 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82149964_6826_4113_9f1d_31dca19f062e.slice/crio-7a2ebdbda8d606af9b33b393c3111786fae73888ddfeddba8e108c76a8cc7949 WatchSource:0}: Error finding container 7a2ebdbda8d606af9b33b393c3111786fae73888ddfeddba8e108c76a8cc7949: Status 404 returned error can't find the container with id 7a2ebdbda8d606af9b33b393c3111786fae73888ddfeddba8e108c76a8cc7949
Mar 13 01:39:39.956727 master-0 kubenswrapper[19170]: I0313 01:39:39.945904 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7shvs\" (UniqueName: \"kubernetes.io/projected/2b9eaece-c8e3-40f8-8d22-a1cf7fc2c1a8-kube-api-access-7shvs\") pod \"perses-operator-5bf474d74f-v6x9f\" (UID: \"2b9eaece-c8e3-40f8-8d22-a1cf7fc2c1a8\") " pod="openshift-operators/perses-operator-5bf474d74f-v6x9f"
Mar 13 01:39:39.956988 master-0 kubenswrapper[19170]: I0313 01:39:39.956958 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/2b9eaece-c8e3-40f8-8d22-a1cf7fc2c1a8-openshift-service-ca\") pod \"perses-operator-5bf474d74f-v6x9f\" (UID: \"2b9eaece-c8e3-40f8-8d22-a1cf7fc2c1a8\") " pod="openshift-operators/perses-operator-5bf474d74f-v6x9f"
Mar 13 01:39:39.958093 master-0 kubenswrapper[19170]: I0313 01:39:39.958062 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/2b9eaece-c8e3-40f8-8d22-a1cf7fc2c1a8-openshift-service-ca\") pod \"perses-operator-5bf474d74f-v6x9f\" (UID: \"2b9eaece-c8e3-40f8-8d22-a1cf7fc2c1a8\") " pod="openshift-operators/perses-operator-5bf474d74f-v6x9f"
Mar 13 01:39:39.967484 master-0 kubenswrapper[19170]: I0313 01:39:39.967451 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7shvs\" (UniqueName: \"kubernetes.io/projected/2b9eaece-c8e3-40f8-8d22-a1cf7fc2c1a8-kube-api-access-7shvs\") pod \"perses-operator-5bf474d74f-v6x9f\" (UID: \"2b9eaece-c8e3-40f8-8d22-a1cf7fc2c1a8\") " pod="openshift-operators/perses-operator-5bf474d74f-v6x9f"
Mar 13 01:39:39.990054 master-0 kubenswrapper[19170]: I0313 01:39:39.987658 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-v6x9f"
Mar 13 01:40:40.216288 master-0 kubenswrapper[19170]: W0313 01:39:40.213983 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d09a044_f099_44bc_be50_5f7f9ff916cc.slice/crio-5ce019397e11f7941744cc86d93f6f9f49f02bc59efd8ba51de771318bd05194 WatchSource:0}: Error finding container 5ce019397e11f7941744cc86d93f6f9f49f02bc59efd8ba51de771318bd05194: Status 404 returned error can't find the container with id 5ce019397e11f7941744cc86d93f6f9f49f02bc59efd8ba51de771318bd05194
Mar 13 01:39:40.227713 master-0 kubenswrapper[19170]: I0313 01:39:40.227666 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6d886dcc57-qvf8g"]
Mar 13 01:39:40.239435 master-0 kubenswrapper[19170]: I0313 01:39:40.239365 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6d886dcc57-qsvk6"]
Mar 13 01:39:40.297699 master-0 kubenswrapper[19170]: I0313 01:39:40.288938 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d886dcc57-qsvk6" event={"ID":"95f7f615-d532-44d6-8307-73a3bbfe78a7","Type":"ContainerStarted","Data":"498ee5aeb9037f1848767bbb38eb03b3551a01376b31f55666514c2ab9c91191"}
Mar 13 01:39:40.297699 master-0 kubenswrapper[19170]: I0313 01:39:40.290105 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rcljc" event={"ID":"82149964-6826-4113-9f1d-31dca19f062e","Type":"ContainerStarted","Data":"7a2ebdbda8d606af9b33b393c3111786fae73888ddfeddba8e108c76a8cc7949"}
Mar 13 01:39:40.297699 master-0 kubenswrapper[19170]: I0313 01:39:40.290963 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d886dcc57-qvf8g" event={"ID":"4d09a044-f099-44bc-be50-5f7f9ff916cc","Type":"ContainerStarted","Data":"5ce019397e11f7941744cc86d93f6f9f49f02bc59efd8ba51de771318bd05194"}
Mar 13 01:39:40.345324 master-0 kubenswrapper[19170]: I0313 01:39:40.345275 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-sfr46"]
Mar 13 01:39:40.549604 master-0 kubenswrapper[19170]: I0313 01:39:40.549480 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-v6x9f"]
Mar 13 01:39:40.556312 master-0 kubenswrapper[19170]: W0313 01:39:40.556269 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b9eaece_c8e3_40f8_8d22_a1cf7fc2c1a8.slice/crio-a465fa065a19527c766bf4ad5674818b466fc64755d98dcefb8dd6c232ee6a17 WatchSource:0}: Error finding container a465fa065a19527c766bf4ad5674818b466fc64755d98dcefb8dd6c232ee6a17: Status 404 returned error can't find the container with id a465fa065a19527c766bf4ad5674818b466fc64755d98dcefb8dd6c232ee6a17
Mar 13 01:39:41.300430 master-0 kubenswrapper[19170]: I0313 01:39:41.300327 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-v6x9f" event={"ID":"2b9eaece-c8e3-40f8-8d22-a1cf7fc2c1a8","Type":"ContainerStarted","Data":"a465fa065a19527c766bf4ad5674818b466fc64755d98dcefb8dd6c232ee6a17"}
Mar 13 01:39:41.303418 master-0 kubenswrapper[19170]: I0313 01:39:41.303363 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-sfr46" event={"ID":"3fd4836b-f37b-4fda-a74d-d6e2e0d0e41d","Type":"ContainerStarted","Data":"0c5ae886a6876d0f70964b5d0295a7187223e4dbf7a98f5f94e10f2354561962"}
Mar 13 01:39:44.798785 master-0 kubenswrapper[19170]: I0313 01:39:44.798728 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-c94846845-ll9w6"
Mar 13 01:39:45.316789 master-0 kubenswrapper[19170]: I0313 01:39:45.316724 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-r6hs2"]
Mar 13 01:39:45.317895 master-0 kubenswrapper[19170]: I0313 01:39:45.317873 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-r6hs2"
Mar 13 01:39:45.323495 master-0 kubenswrapper[19170]: I0313 01:39:45.323460 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt"
Mar 13 01:39:45.324299 master-0 kubenswrapper[19170]: I0313 01:39:45.324047 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt"
Mar 13 01:39:45.340390 master-0 kubenswrapper[19170]: I0313 01:39:45.340079 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-r6hs2"]
Mar 13 01:39:45.462981 master-0 kubenswrapper[19170]: I0313 01:39:45.462786 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/55f3702c-4fa4-4686-9f14-39108f96b5f8-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-r6hs2\" (UID: \"55f3702c-4fa4-4686-9f14-39108f96b5f8\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-r6hs2"
Mar 13 01:39:45.462981 master-0 kubenswrapper[19170]: I0313 01:39:45.462874 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmgvb\" (UniqueName: \"kubernetes.io/projected/55f3702c-4fa4-4686-9f14-39108f96b5f8-kube-api-access-tmgvb\") pod \"cert-manager-operator-controller-manager-66c8bdd694-r6hs2\" (UID: \"55f3702c-4fa4-4686-9f14-39108f96b5f8\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-r6hs2"
Mar 13 01:39:45.565273 master-0 kubenswrapper[19170]: I0313 01:39:45.565219 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/55f3702c-4fa4-4686-9f14-39108f96b5f8-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-r6hs2\" (UID: \"55f3702c-4fa4-4686-9f14-39108f96b5f8\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-r6hs2"
Mar 13 01:39:45.566244 master-0 kubenswrapper[19170]: I0313 01:39:45.566201 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmgvb\" (UniqueName: \"kubernetes.io/projected/55f3702c-4fa4-4686-9f14-39108f96b5f8-kube-api-access-tmgvb\") pod \"cert-manager-operator-controller-manager-66c8bdd694-r6hs2\" (UID: \"55f3702c-4fa4-4686-9f14-39108f96b5f8\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-r6hs2"
Mar 13 01:39:45.566593 master-0 kubenswrapper[19170]: I0313 01:39:45.566125 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/55f3702c-4fa4-4686-9f14-39108f96b5f8-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-r6hs2\" (UID: \"55f3702c-4fa4-4686-9f14-39108f96b5f8\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-r6hs2"
Mar 13 01:39:45.581791 master-0 kubenswrapper[19170]: I0313 01:39:45.581699 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmgvb\" (UniqueName: \"kubernetes.io/projected/55f3702c-4fa4-4686-9f14-39108f96b5f8-kube-api-access-tmgvb\") pod \"cert-manager-operator-controller-manager-66c8bdd694-r6hs2\" (UID: \"55f3702c-4fa4-4686-9f14-39108f96b5f8\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-r6hs2"
Mar 13 01:39:45.638116 master-0 kubenswrapper[19170]: I0313 01:39:45.638066 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-r6hs2"
Mar 13 01:39:49.383582 master-0 kubenswrapper[19170]: I0313 01:39:49.383527 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-r6hs2"]
Mar 13 01:39:49.399882 master-0 kubenswrapper[19170]: I0313 01:39:49.399518 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-v6x9f" event={"ID":"2b9eaece-c8e3-40f8-8d22-a1cf7fc2c1a8","Type":"ContainerStarted","Data":"5ad3fc2a9fb5de6615cb67aa63de5f738a894eb23fa885bd3f733f5a3d742d3a"}
Mar 13 01:39:49.401494 master-0 kubenswrapper[19170]: I0313 01:39:49.401451 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d886dcc57-qsvk6" event={"ID":"95f7f615-d532-44d6-8307-73a3bbfe78a7","Type":"ContainerStarted","Data":"770a95be4d274f244ccb403a9eb49e91ce3833927df7a1f3dfa6c563b5f3b29f"}
Mar 13 01:39:49.402665 master-0 kubenswrapper[19170]: I0313 01:39:49.402601 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rcljc" event={"ID":"82149964-6826-4113-9f1d-31dca19f062e","Type":"ContainerStarted","Data":"59ef35fa18c66c8fc384248e7859010e3b1d5eb15dba23ccd2056a832a4be52e"}
Mar 13 01:39:49.405617 master-0 kubenswrapper[19170]: I0313 01:39:49.405577 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d886dcc57-qvf8g" event={"ID":"4d09a044-f099-44bc-be50-5f7f9ff916cc","Type":"ContainerStarted","Data":"19a217dce20af1606672272cd41fba99f744900feded107d7c23bb9b1918cae5"}
Mar 13 01:39:49.407388 master-0 kubenswrapper[19170]: I0313 01:39:49.407353 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-sfr46" event={"ID":"3fd4836b-f37b-4fda-a74d-d6e2e0d0e41d","Type":"ContainerStarted","Data":"ce66c3cb9da18586cc073e25e2c78f1c013be89dfd9169585b854108735c8de5"}
Mar 13 01:39:49.407603 master-0 kubenswrapper[19170]: I0313 01:39:49.407577 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-sfr46"
Mar 13 01:39:49.408977 master-0 kubenswrapper[19170]: I0313 01:39:49.408940 19170 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-sfr46 container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.128.0.131:8081/healthz\": dial tcp 10.128.0.131:8081: connect: connection refused" start-of-body=
Mar 13 01:39:49.409023 master-0 kubenswrapper[19170]: I0313 01:39:49.408995 19170 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-sfr46" podUID="3fd4836b-f37b-4fda-a74d-d6e2e0d0e41d" containerName="operator" probeResult="failure" output="Get \"http://10.128.0.131:8081/healthz\": dial tcp 10.128.0.131:8081: connect: connection refused"
Mar 13 01:39:49.433769 master-0 kubenswrapper[19170]: I0313 01:39:49.433689 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-v6x9f" podStartSLOduration=2.073771749 podStartE2EDuration="10.433668129s" podCreationTimestamp="2026-03-13 01:39:39 +0000 UTC" firstStartedPulling="2026-03-13 01:39:40.558959381 +0000 UTC m=+1241.367080341" lastFinishedPulling="2026-03-13 01:39:48.918855761 +0000 UTC m=+1249.726976721" observedRunningTime="2026-03-13 01:39:49.421361686 +0000 UTC m=+1250.229482666" watchObservedRunningTime="2026-03-13 01:39:49.433668129 +0000 UTC m=+1250.241789099"
Mar 13 01:39:49.449143 master-0 kubenswrapper[19170]: I0313 01:39:49.447710 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rcljc" podStartSLOduration=2.45114435 podStartE2EDuration="11.447684628s" podCreationTimestamp="2026-03-13 01:39:38 +0000 UTC" firstStartedPulling="2026-03-13 01:39:39.922071225 +0000 UTC m=+1240.730192185" lastFinishedPulling="2026-03-13 01:39:48.918611503 +0000 UTC m=+1249.726732463" observedRunningTime="2026-03-13 01:39:49.439892063 +0000 UTC m=+1250.248013043" watchObservedRunningTime="2026-03-13 01:39:49.447684628 +0000 UTC m=+1250.255805598"
Mar 13 01:39:49.509930 master-0 kubenswrapper[19170]: I0313 01:39:49.509860 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-sfr46" podStartSLOduration=1.921148772 podStartE2EDuration="10.509844214s" podCreationTimestamp="2026-03-13 01:39:39 +0000 UTC" firstStartedPulling="2026-03-13 01:39:40.372555107 +0000 UTC m=+1241.180676077" lastFinishedPulling="2026-03-13 01:39:48.961250559 +0000 UTC m=+1249.769371519" observedRunningTime="2026-03-13 01:39:49.48376129 +0000 UTC m=+1250.291882250" watchObservedRunningTime="2026-03-13 01:39:49.509844214 +0000 UTC m=+1250.317965174"
Mar 13 01:39:49.514722 master-0 kubenswrapper[19170]: I0313 01:39:49.514663 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d886dcc57-qvf8g" podStartSLOduration=1.78764677 podStartE2EDuration="10.514646461s" podCreationTimestamp="2026-03-13 01:39:39 +0000 UTC" firstStartedPulling="2026-03-13 01:39:40.234286019 +0000 UTC m=+1241.042406969" lastFinishedPulling="2026-03-13 01:39:48.9612857 +0000 UTC m=+1249.769406660" observedRunningTime="2026-03-13 01:39:49.508773999 +0000 UTC m=+1250.316894959" watchObservedRunningTime="2026-03-13 01:39:49.514646461 +0000 UTC m=+1250.322767421"
Mar 13 01:39:49.579525 master-0 kubenswrapper[19170]: I0313 01:39:49.579442 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6d886dcc57-qsvk6" podStartSLOduration=1.9103962289999998 podStartE2EDuration="10.579425322s" podCreationTimestamp="2026-03-13 01:39:39 +0000 UTC" firstStartedPulling="2026-03-13 01:39:40.2495851 +0000 UTC m=+1241.057706060" lastFinishedPulling="2026-03-13 01:39:48.918614183 +0000 UTC m=+1249.726735153" observedRunningTime="2026-03-13 01:39:49.574609845 +0000 UTC m=+1250.382730815" watchObservedRunningTime="2026-03-13 01:39:49.579425322 +0000 UTC m=+1250.387546302"
Mar 13 01:39:49.836412 master-0 kubenswrapper[19170]: I0313 01:39:49.836338 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-sfr46"
Mar 13 01:39:49.988652 master-0 kubenswrapper[19170]: I0313 01:39:49.988556 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-v6x9f"
Mar 13 01:39:50.423708 master-0 kubenswrapper[19170]: I0313 01:39:50.423599 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-r6hs2" event={"ID":"55f3702c-4fa4-4686-9f14-39108f96b5f8","Type":"ContainerStarted","Data":"6f8118185211edad20cd070e271450441916c8e1ae570f7b9c27dd3259b5bdb5"}
Mar 13 01:39:53.448268 master-0 kubenswrapper[19170]: I0313 01:39:53.448198 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-r6hs2" event={"ID":"55f3702c-4fa4-4686-9f14-39108f96b5f8","Type":"ContainerStarted","Data":"485b92871d6ee771c9343b0751f5e8514084716f41d6316a62664d0f56e416dc"}
Mar 13 01:39:53.491743 master-0 kubenswrapper[19170]: I0313 01:39:53.491618 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-r6hs2" podStartSLOduration=5.154733322 podStartE2EDuration="8.491591474s" podCreationTimestamp="2026-03-13 01:39:45 +0000 UTC" firstStartedPulling="2026-03-13 01:39:49.393043869 +0000 UTC m=+1250.201164829" lastFinishedPulling="2026-03-13 01:39:52.72990198 +0000 UTC m=+1253.538022981" observedRunningTime="2026-03-13 01:39:53.478473604 +0000 UTC m=+1254.286594574" watchObservedRunningTime="2026-03-13 01:39:53.491591474 +0000 UTC m=+1254.299712434"
Mar 13 01:39:55.844945 master-0 kubenswrapper[19170]: I0313 01:39:55.844886 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-zpblj"]
Mar 13 01:39:55.846035 master-0 kubenswrapper[19170]: I0313 01:39:55.846011 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-zpblj"
Mar 13 01:39:55.848092 master-0 kubenswrapper[19170]: I0313 01:39:55.848049 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Mar 13 01:39:55.848299 master-0 kubenswrapper[19170]: I0313 01:39:55.848255 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Mar 13 01:39:55.854741 master-0 kubenswrapper[19170]: I0313 01:39:55.854693 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-zpblj"]
Mar 13 01:39:55.956685 master-0 kubenswrapper[19170]: I0313 01:39:55.956589 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkjs8\" (UniqueName: \"kubernetes.io/projected/4a81e112-2695-4b2f-89c0-b153cd3318c2-kube-api-access-lkjs8\") pod \"cert-manager-webhook-6888856db4-zpblj\" (UID: \"4a81e112-2695-4b2f-89c0-b153cd3318c2\") " pod="cert-manager/cert-manager-webhook-6888856db4-zpblj"
Mar 13 01:39:55.957164 master-0 kubenswrapper[19170]: I0313 01:39:55.956719 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4a81e112-2695-4b2f-89c0-b153cd3318c2-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-zpblj\" (UID: \"4a81e112-2695-4b2f-89c0-b153cd3318c2\") " pod="cert-manager/cert-manager-webhook-6888856db4-zpblj"
Mar 13 01:39:56.058827 master-0 kubenswrapper[19170]: I0313 01:39:56.058742 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkjs8\" (UniqueName: \"kubernetes.io/projected/4a81e112-2695-4b2f-89c0-b153cd3318c2-kube-api-access-lkjs8\") pod \"cert-manager-webhook-6888856db4-zpblj\" (UID: \"4a81e112-2695-4b2f-89c0-b153cd3318c2\") " pod="cert-manager/cert-manager-webhook-6888856db4-zpblj"
Mar 13 01:39:56.058827 master-0 kubenswrapper[19170]: I0313 01:39:56.058834 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4a81e112-2695-4b2f-89c0-b153cd3318c2-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-zpblj\" (UID: \"4a81e112-2695-4b2f-89c0-b153cd3318c2\") " pod="cert-manager/cert-manager-webhook-6888856db4-zpblj"
Mar 13 01:39:56.081780 master-0 kubenswrapper[19170]: I0313 01:39:56.081715 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4a81e112-2695-4b2f-89c0-b153cd3318c2-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-zpblj\" (UID: \"4a81e112-2695-4b2f-89c0-b153cd3318c2\") " pod="cert-manager/cert-manager-webhook-6888856db4-zpblj"
Mar 13 01:39:56.092328 master-0 kubenswrapper[19170]: I0313 01:39:56.092276 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkjs8\" (UniqueName: \"kubernetes.io/projected/4a81e112-2695-4b2f-89c0-b153cd3318c2-kube-api-access-lkjs8\") pod \"cert-manager-webhook-6888856db4-zpblj\" (UID: \"4a81e112-2695-4b2f-89c0-b153cd3318c2\") " pod="cert-manager/cert-manager-webhook-6888856db4-zpblj"
Mar 13 01:39:56.178053 master-0 kubenswrapper[19170]: I0313 01:39:56.177986 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-zpblj"
Mar 13 01:39:56.669835 master-0 kubenswrapper[19170]: I0313 01:39:56.666981 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-zpblj"]
Mar 13 01:39:57.495859 master-0 kubenswrapper[19170]: I0313 01:39:57.495804 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-zpblj" event={"ID":"4a81e112-2695-4b2f-89c0-b153cd3318c2","Type":"ContainerStarted","Data":"8e48846992109cd28c474525d5afc52032787301b0de4ce13650427b7a990f6f"}
Mar 13 01:39:59.990820 master-0 kubenswrapper[19170]: I0313 01:39:59.990741 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-v6x9f"
Mar 13 01:40:02.197892 master-0 kubenswrapper[19170]: I0313 01:40:02.197826 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-rxjws"]
Mar 13 01:40:02.198912 master-0 kubenswrapper[19170]: I0313 01:40:02.198892 19170 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-rxjws" Mar 13 01:40:02.239060 master-0 kubenswrapper[19170]: I0313 01:40:02.208219 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-rxjws"] Mar 13 01:40:02.263161 master-0 kubenswrapper[19170]: I0313 01:40:02.262443 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psfgb\" (UniqueName: \"kubernetes.io/projected/b3b3773b-a450-45c2-bdb5-84889603fde4-kube-api-access-psfgb\") pod \"cert-manager-cainjector-5545bd876-rxjws\" (UID: \"b3b3773b-a450-45c2-bdb5-84889603fde4\") " pod="cert-manager/cert-manager-cainjector-5545bd876-rxjws" Mar 13 01:40:02.263161 master-0 kubenswrapper[19170]: I0313 01:40:02.262491 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b3b3773b-a450-45c2-bdb5-84889603fde4-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-rxjws\" (UID: \"b3b3773b-a450-45c2-bdb5-84889603fde4\") " pod="cert-manager/cert-manager-cainjector-5545bd876-rxjws" Mar 13 01:40:02.364236 master-0 kubenswrapper[19170]: I0313 01:40:02.364165 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psfgb\" (UniqueName: \"kubernetes.io/projected/b3b3773b-a450-45c2-bdb5-84889603fde4-kube-api-access-psfgb\") pod \"cert-manager-cainjector-5545bd876-rxjws\" (UID: \"b3b3773b-a450-45c2-bdb5-84889603fde4\") " pod="cert-manager/cert-manager-cainjector-5545bd876-rxjws" Mar 13 01:40:02.364236 master-0 kubenswrapper[19170]: I0313 01:40:02.364224 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b3b3773b-a450-45c2-bdb5-84889603fde4-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-rxjws\" (UID: \"b3b3773b-a450-45c2-bdb5-84889603fde4\") " 
pod="cert-manager/cert-manager-cainjector-5545bd876-rxjws" Mar 13 01:40:02.381127 master-0 kubenswrapper[19170]: I0313 01:40:02.381069 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psfgb\" (UniqueName: \"kubernetes.io/projected/b3b3773b-a450-45c2-bdb5-84889603fde4-kube-api-access-psfgb\") pod \"cert-manager-cainjector-5545bd876-rxjws\" (UID: \"b3b3773b-a450-45c2-bdb5-84889603fde4\") " pod="cert-manager/cert-manager-cainjector-5545bd876-rxjws" Mar 13 01:40:02.385965 master-0 kubenswrapper[19170]: I0313 01:40:02.385702 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b3b3773b-a450-45c2-bdb5-84889603fde4-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-rxjws\" (UID: \"b3b3773b-a450-45c2-bdb5-84889603fde4\") " pod="cert-manager/cert-manager-cainjector-5545bd876-rxjws" Mar 13 01:40:02.588238 master-0 kubenswrapper[19170]: I0313 01:40:02.588153 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-rxjws" Mar 13 01:40:03.082664 master-0 kubenswrapper[19170]: I0313 01:40:03.082136 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-rxjws"] Mar 13 01:40:03.559167 master-0 kubenswrapper[19170]: I0313 01:40:03.559102 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-rxjws" event={"ID":"b3b3773b-a450-45c2-bdb5-84889603fde4","Type":"ContainerStarted","Data":"4ee9255df5e43b03c6b8b85fe1889b519a871b9710a5e3e8f852bd9d8046d816"} Mar 13 01:40:03.559167 master-0 kubenswrapper[19170]: I0313 01:40:03.559163 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-rxjws" event={"ID":"b3b3773b-a450-45c2-bdb5-84889603fde4","Type":"ContainerStarted","Data":"9add0c81b218afbbe2f34b5e040cf5b95e0802f7b4cc2df274b494b92a74c3b3"} Mar 13 01:40:03.560435 master-0 kubenswrapper[19170]: I0313 01:40:03.560385 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-zpblj" event={"ID":"4a81e112-2695-4b2f-89c0-b153cd3318c2","Type":"ContainerStarted","Data":"b2410d7fd792f440f81c6a9e996e68e4cf06ebf00bd0212aabc0ad253f23edfb"} Mar 13 01:40:03.560565 master-0 kubenswrapper[19170]: I0313 01:40:03.560529 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-zpblj" Mar 13 01:40:03.587764 master-0 kubenswrapper[19170]: I0313 01:40:03.587695 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-rxjws" podStartSLOduration=1.587677238 podStartE2EDuration="1.587677238s" podCreationTimestamp="2026-03-13 01:40:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:40:03.577581778 +0000 UTC 
m=+1264.385702738" watchObservedRunningTime="2026-03-13 01:40:03.587677238 +0000 UTC m=+1264.395798198" Mar 13 01:40:03.598650 master-0 kubenswrapper[19170]: I0313 01:40:03.596774 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-zpblj" podStartSLOduration=2.717452077 podStartE2EDuration="8.596755636s" podCreationTimestamp="2026-03-13 01:39:55 +0000 UTC" firstStartedPulling="2026-03-13 01:39:56.674929048 +0000 UTC m=+1257.483050028" lastFinishedPulling="2026-03-13 01:40:02.554232627 +0000 UTC m=+1263.362353587" observedRunningTime="2026-03-13 01:40:03.593131237 +0000 UTC m=+1264.401252197" watchObservedRunningTime="2026-03-13 01:40:03.596755636 +0000 UTC m=+1264.404876586" Mar 13 01:40:04.545200 master-0 kubenswrapper[19170]: I0313 01:40:04.545150 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-57bc99bf8b-9v2vk" Mar 13 01:40:11.182421 master-0 kubenswrapper[19170]: I0313 01:40:11.182372 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-zpblj" Mar 13 01:40:14.764249 master-0 kubenswrapper[19170]: I0313 01:40:14.764134 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-9rjc4"] Mar 13 01:40:14.766407 master-0 kubenswrapper[19170]: I0313 01:40:14.766343 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-9rjc4" Mar 13 01:40:14.774373 master-0 kubenswrapper[19170]: I0313 01:40:14.774319 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b684c2c4-bca5-40f3-89e5-d148b2483763-bound-sa-token\") pod \"cert-manager-545d4d4674-9rjc4\" (UID: \"b684c2c4-bca5-40f3-89e5-d148b2483763\") " pod="cert-manager/cert-manager-545d4d4674-9rjc4" Mar 13 01:40:14.774830 master-0 kubenswrapper[19170]: I0313 01:40:14.774794 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7kkh\" (UniqueName: \"kubernetes.io/projected/b684c2c4-bca5-40f3-89e5-d148b2483763-kube-api-access-g7kkh\") pod \"cert-manager-545d4d4674-9rjc4\" (UID: \"b684c2c4-bca5-40f3-89e5-d148b2483763\") " pod="cert-manager/cert-manager-545d4d4674-9rjc4" Mar 13 01:40:14.803473 master-0 kubenswrapper[19170]: I0313 01:40:14.803404 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-9rjc4"] Mar 13 01:40:14.877013 master-0 kubenswrapper[19170]: I0313 01:40:14.876922 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7kkh\" (UniqueName: \"kubernetes.io/projected/b684c2c4-bca5-40f3-89e5-d148b2483763-kube-api-access-g7kkh\") pod \"cert-manager-545d4d4674-9rjc4\" (UID: \"b684c2c4-bca5-40f3-89e5-d148b2483763\") " pod="cert-manager/cert-manager-545d4d4674-9rjc4" Mar 13 01:40:14.877192 master-0 kubenswrapper[19170]: I0313 01:40:14.877030 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b684c2c4-bca5-40f3-89e5-d148b2483763-bound-sa-token\") pod \"cert-manager-545d4d4674-9rjc4\" (UID: \"b684c2c4-bca5-40f3-89e5-d148b2483763\") " pod="cert-manager/cert-manager-545d4d4674-9rjc4" Mar 13 01:40:14.899666 master-0 
kubenswrapper[19170]: I0313 01:40:14.899579 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7kkh\" (UniqueName: \"kubernetes.io/projected/b684c2c4-bca5-40f3-89e5-d148b2483763-kube-api-access-g7kkh\") pod \"cert-manager-545d4d4674-9rjc4\" (UID: \"b684c2c4-bca5-40f3-89e5-d148b2483763\") " pod="cert-manager/cert-manager-545d4d4674-9rjc4" Mar 13 01:40:14.909525 master-0 kubenswrapper[19170]: I0313 01:40:14.909447 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b684c2c4-bca5-40f3-89e5-d148b2483763-bound-sa-token\") pod \"cert-manager-545d4d4674-9rjc4\" (UID: \"b684c2c4-bca5-40f3-89e5-d148b2483763\") " pod="cert-manager/cert-manager-545d4d4674-9rjc4" Mar 13 01:40:15.125241 master-0 kubenswrapper[19170]: I0313 01:40:15.125036 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-9rjc4" Mar 13 01:40:15.668869 master-0 kubenswrapper[19170]: I0313 01:40:15.668511 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-9rjc4"] Mar 13 01:40:15.673788 master-0 kubenswrapper[19170]: W0313 01:40:15.673665 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb684c2c4_bca5_40f3_89e5_d148b2483763.slice/crio-c60c4a6c70e216d355e32bbc380faced2c890ce6ef69b5575f46fbd3c7479ac7 WatchSource:0}: Error finding container c60c4a6c70e216d355e32bbc380faced2c890ce6ef69b5575f46fbd3c7479ac7: Status 404 returned error can't find the container with id c60c4a6c70e216d355e32bbc380faced2c890ce6ef69b5575f46fbd3c7479ac7 Mar 13 01:40:16.672773 master-0 kubenswrapper[19170]: I0313 01:40:16.672686 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-9rjc4" 
event={"ID":"b684c2c4-bca5-40f3-89e5-d148b2483763","Type":"ContainerStarted","Data":"b3ada2a3748b6273619a90681e4d9223ae78ce7cd667ec4ec28c75f604ab1604"} Mar 13 01:40:16.672773 master-0 kubenswrapper[19170]: I0313 01:40:16.672767 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-9rjc4" event={"ID":"b684c2c4-bca5-40f3-89e5-d148b2483763","Type":"ContainerStarted","Data":"c60c4a6c70e216d355e32bbc380faced2c890ce6ef69b5575f46fbd3c7479ac7"} Mar 13 01:40:16.717808 master-0 kubenswrapper[19170]: I0313 01:40:16.717695 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-9rjc4" podStartSLOduration=2.717666092 podStartE2EDuration="2.717666092s" podCreationTimestamp="2026-03-13 01:40:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:40:16.711555481 +0000 UTC m=+1277.519676481" watchObservedRunningTime="2026-03-13 01:40:16.717666092 +0000 UTC m=+1277.525787082" Mar 13 01:40:18.776585 master-0 kubenswrapper[19170]: I0313 01:40:18.776513 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-wqmlj"] Mar 13 01:40:18.778186 master-0 kubenswrapper[19170]: I0313 01:40:18.778121 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-wqmlj" Mar 13 01:40:18.786869 master-0 kubenswrapper[19170]: I0313 01:40:18.786800 19170 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 13 01:40:18.826822 master-0 kubenswrapper[19170]: I0313 01:40:18.826753 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-pfmr9"] Mar 13 01:40:18.836183 master-0 kubenswrapper[19170]: I0313 01:40:18.836113 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-pfmr9" Mar 13 01:40:18.837148 master-0 kubenswrapper[19170]: I0313 01:40:18.837107 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-wqmlj"] Mar 13 01:40:18.845028 master-0 kubenswrapper[19170]: I0313 01:40:18.844970 19170 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 13 01:40:18.845272 master-0 kubenswrapper[19170]: I0313 01:40:18.845253 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 13 01:40:18.903009 master-0 kubenswrapper[19170]: I0313 01:40:18.902947 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-qx8lb"] Mar 13 01:40:18.908589 master-0 kubenswrapper[19170]: I0313 01:40:18.908493 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-qx8lb" Mar 13 01:40:18.910061 master-0 kubenswrapper[19170]: I0313 01:40:18.910043 19170 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 13 01:40:18.910497 master-0 kubenswrapper[19170]: I0313 01:40:18.910482 19170 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 13 01:40:18.910658 master-0 kubenswrapper[19170]: I0313 01:40:18.910645 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 13 01:40:18.930882 master-0 kubenswrapper[19170]: I0313 01:40:18.927799 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-667wg"] Mar 13 01:40:18.930882 master-0 kubenswrapper[19170]: I0313 01:40:18.929386 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-667wg" Mar 13 01:40:18.933739 master-0 kubenswrapper[19170]: I0313 01:40:18.931227 19170 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 13 01:40:18.954852 master-0 kubenswrapper[19170]: I0313 01:40:18.954808 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1ef92c3a-7b62-42e8-909b-1cadf7157035-frr-sockets\") pod \"frr-k8s-pfmr9\" (UID: \"1ef92c3a-7b62-42e8-909b-1cadf7157035\") " pod="metallb-system/frr-k8s-pfmr9" Mar 13 01:40:18.954852 master-0 kubenswrapper[19170]: I0313 01:40:18.954854 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1ef92c3a-7b62-42e8-909b-1cadf7157035-reloader\") pod \"frr-k8s-pfmr9\" (UID: \"1ef92c3a-7b62-42e8-909b-1cadf7157035\") " pod="metallb-system/frr-k8s-pfmr9" Mar 13 01:40:18.955116 master-0 kubenswrapper[19170]: I0313 01:40:18.954873 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw45g\" (UniqueName: \"kubernetes.io/projected/1ef92c3a-7b62-42e8-909b-1cadf7157035-kube-api-access-dw45g\") pod \"frr-k8s-pfmr9\" (UID: \"1ef92c3a-7b62-42e8-909b-1cadf7157035\") " pod="metallb-system/frr-k8s-pfmr9" Mar 13 01:40:18.955116 master-0 kubenswrapper[19170]: I0313 01:40:18.954895 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fd247073-2d90-4297-b745-d3b906c5f27d-metrics-certs\") pod \"controller-7bb4cc7c98-667wg\" (UID: \"fd247073-2d90-4297-b745-d3b906c5f27d\") " pod="metallb-system/controller-7bb4cc7c98-667wg" Mar 13 01:40:18.955116 master-0 kubenswrapper[19170]: I0313 01:40:18.954917 19170 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nz2p\" (UniqueName: \"kubernetes.io/projected/d82ad9fb-f003-4e1d-a30a-f029eba41ea9-kube-api-access-8nz2p\") pod \"frr-k8s-webhook-server-bcc4b6f68-wqmlj\" (UID: \"d82ad9fb-f003-4e1d-a30a-f029eba41ea9\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-wqmlj" Mar 13 01:40:18.955116 master-0 kubenswrapper[19170]: I0313 01:40:18.954946 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1ef92c3a-7b62-42e8-909b-1cadf7157035-metrics\") pod \"frr-k8s-pfmr9\" (UID: \"1ef92c3a-7b62-42e8-909b-1cadf7157035\") " pod="metallb-system/frr-k8s-pfmr9" Mar 13 01:40:18.955116 master-0 kubenswrapper[19170]: I0313 01:40:18.954962 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1341d719-fa4e-4ac5-b0ab-536f13b8e3e1-metrics-certs\") pod \"speaker-qx8lb\" (UID: \"1341d719-fa4e-4ac5-b0ab-536f13b8e3e1\") " pod="metallb-system/speaker-qx8lb" Mar 13 01:40:18.955116 master-0 kubenswrapper[19170]: I0313 01:40:18.954977 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1341d719-fa4e-4ac5-b0ab-536f13b8e3e1-memberlist\") pod \"speaker-qx8lb\" (UID: \"1341d719-fa4e-4ac5-b0ab-536f13b8e3e1\") " pod="metallb-system/speaker-qx8lb" Mar 13 01:40:18.955116 master-0 kubenswrapper[19170]: I0313 01:40:18.954992 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ef92c3a-7b62-42e8-909b-1cadf7157035-metrics-certs\") pod \"frr-k8s-pfmr9\" (UID: \"1ef92c3a-7b62-42e8-909b-1cadf7157035\") " pod="metallb-system/frr-k8s-pfmr9" Mar 13 01:40:18.955116 master-0 kubenswrapper[19170]: I0313 01:40:18.955015 19170 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1ef92c3a-7b62-42e8-909b-1cadf7157035-frr-startup\") pod \"frr-k8s-pfmr9\" (UID: \"1ef92c3a-7b62-42e8-909b-1cadf7157035\") " pod="metallb-system/frr-k8s-pfmr9" Mar 13 01:40:18.955116 master-0 kubenswrapper[19170]: I0313 01:40:18.955041 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1341d719-fa4e-4ac5-b0ab-536f13b8e3e1-metallb-excludel2\") pod \"speaker-qx8lb\" (UID: \"1341d719-fa4e-4ac5-b0ab-536f13b8e3e1\") " pod="metallb-system/speaker-qx8lb" Mar 13 01:40:18.955116 master-0 kubenswrapper[19170]: I0313 01:40:18.955056 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd247073-2d90-4297-b745-d3b906c5f27d-cert\") pod \"controller-7bb4cc7c98-667wg\" (UID: \"fd247073-2d90-4297-b745-d3b906c5f27d\") " pod="metallb-system/controller-7bb4cc7c98-667wg" Mar 13 01:40:18.955116 master-0 kubenswrapper[19170]: I0313 01:40:18.955077 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nql8w\" (UniqueName: \"kubernetes.io/projected/1341d719-fa4e-4ac5-b0ab-536f13b8e3e1-kube-api-access-nql8w\") pod \"speaker-qx8lb\" (UID: \"1341d719-fa4e-4ac5-b0ab-536f13b8e3e1\") " pod="metallb-system/speaker-qx8lb" Mar 13 01:40:18.955116 master-0 kubenswrapper[19170]: I0313 01:40:18.955099 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d82ad9fb-f003-4e1d-a30a-f029eba41ea9-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-wqmlj\" (UID: \"d82ad9fb-f003-4e1d-a30a-f029eba41ea9\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-wqmlj" Mar 13 01:40:18.955116 master-0 
kubenswrapper[19170]: I0313 01:40:18.955116 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1ef92c3a-7b62-42e8-909b-1cadf7157035-frr-conf\") pod \"frr-k8s-pfmr9\" (UID: \"1ef92c3a-7b62-42e8-909b-1cadf7157035\") " pod="metallb-system/frr-k8s-pfmr9" Mar 13 01:40:18.957144 master-0 kubenswrapper[19170]: I0313 01:40:18.955138 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68wp8\" (UniqueName: \"kubernetes.io/projected/fd247073-2d90-4297-b745-d3b906c5f27d-kube-api-access-68wp8\") pod \"controller-7bb4cc7c98-667wg\" (UID: \"fd247073-2d90-4297-b745-d3b906c5f27d\") " pod="metallb-system/controller-7bb4cc7c98-667wg" Mar 13 01:40:18.967597 master-0 kubenswrapper[19170]: I0313 01:40:18.967434 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-667wg"] Mar 13 01:40:19.056720 master-0 kubenswrapper[19170]: I0313 01:40:19.056052 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1ef92c3a-7b62-42e8-909b-1cadf7157035-frr-sockets\") pod \"frr-k8s-pfmr9\" (UID: \"1ef92c3a-7b62-42e8-909b-1cadf7157035\") " pod="metallb-system/frr-k8s-pfmr9" Mar 13 01:40:19.056720 master-0 kubenswrapper[19170]: I0313 01:40:19.056114 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1ef92c3a-7b62-42e8-909b-1cadf7157035-reloader\") pod \"frr-k8s-pfmr9\" (UID: \"1ef92c3a-7b62-42e8-909b-1cadf7157035\") " pod="metallb-system/frr-k8s-pfmr9" Mar 13 01:40:19.056720 master-0 kubenswrapper[19170]: I0313 01:40:19.056398 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw45g\" (UniqueName: \"kubernetes.io/projected/1ef92c3a-7b62-42e8-909b-1cadf7157035-kube-api-access-dw45g\") 
pod \"frr-k8s-pfmr9\" (UID: \"1ef92c3a-7b62-42e8-909b-1cadf7157035\") " pod="metallb-system/frr-k8s-pfmr9" Mar 13 01:40:19.056720 master-0 kubenswrapper[19170]: I0313 01:40:19.056492 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fd247073-2d90-4297-b745-d3b906c5f27d-metrics-certs\") pod \"controller-7bb4cc7c98-667wg\" (UID: \"fd247073-2d90-4297-b745-d3b906c5f27d\") " pod="metallb-system/controller-7bb4cc7c98-667wg" Mar 13 01:40:19.056720 master-0 kubenswrapper[19170]: I0313 01:40:19.056507 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1ef92c3a-7b62-42e8-909b-1cadf7157035-frr-sockets\") pod \"frr-k8s-pfmr9\" (UID: \"1ef92c3a-7b62-42e8-909b-1cadf7157035\") " pod="metallb-system/frr-k8s-pfmr9" Mar 13 01:40:19.056720 master-0 kubenswrapper[19170]: I0313 01:40:19.056554 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nz2p\" (UniqueName: \"kubernetes.io/projected/d82ad9fb-f003-4e1d-a30a-f029eba41ea9-kube-api-access-8nz2p\") pod \"frr-k8s-webhook-server-bcc4b6f68-wqmlj\" (UID: \"d82ad9fb-f003-4e1d-a30a-f029eba41ea9\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-wqmlj" Mar 13 01:40:19.056720 master-0 kubenswrapper[19170]: I0313 01:40:19.056671 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1ef92c3a-7b62-42e8-909b-1cadf7157035-metrics\") pod \"frr-k8s-pfmr9\" (UID: \"1ef92c3a-7b62-42e8-909b-1cadf7157035\") " pod="metallb-system/frr-k8s-pfmr9" Mar 13 01:40:19.056720 master-0 kubenswrapper[19170]: I0313 01:40:19.056699 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1341d719-fa4e-4ac5-b0ab-536f13b8e3e1-metrics-certs\") pod \"speaker-qx8lb\" (UID: 
\"1341d719-fa4e-4ac5-b0ab-536f13b8e3e1\") " pod="metallb-system/speaker-qx8lb" Mar 13 01:40:19.056720 master-0 kubenswrapper[19170]: I0313 01:40:19.056718 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1341d719-fa4e-4ac5-b0ab-536f13b8e3e1-memberlist\") pod \"speaker-qx8lb\" (UID: \"1341d719-fa4e-4ac5-b0ab-536f13b8e3e1\") " pod="metallb-system/speaker-qx8lb" Mar 13 01:40:19.057257 master-0 kubenswrapper[19170]: I0313 01:40:19.056738 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ef92c3a-7b62-42e8-909b-1cadf7157035-metrics-certs\") pod \"frr-k8s-pfmr9\" (UID: \"1ef92c3a-7b62-42e8-909b-1cadf7157035\") " pod="metallb-system/frr-k8s-pfmr9" Mar 13 01:40:19.057257 master-0 kubenswrapper[19170]: I0313 01:40:19.056774 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1ef92c3a-7b62-42e8-909b-1cadf7157035-reloader\") pod \"frr-k8s-pfmr9\" (UID: \"1ef92c3a-7b62-42e8-909b-1cadf7157035\") " pod="metallb-system/frr-k8s-pfmr9" Mar 13 01:40:19.057257 master-0 kubenswrapper[19170]: I0313 01:40:19.056794 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1ef92c3a-7b62-42e8-909b-1cadf7157035-frr-startup\") pod \"frr-k8s-pfmr9\" (UID: \"1ef92c3a-7b62-42e8-909b-1cadf7157035\") " pod="metallb-system/frr-k8s-pfmr9" Mar 13 01:40:19.057257 master-0 kubenswrapper[19170]: I0313 01:40:19.056852 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1341d719-fa4e-4ac5-b0ab-536f13b8e3e1-metallb-excludel2\") pod \"speaker-qx8lb\" (UID: \"1341d719-fa4e-4ac5-b0ab-536f13b8e3e1\") " pod="metallb-system/speaker-qx8lb" Mar 13 01:40:19.057257 master-0 kubenswrapper[19170]: I0313 
01:40:19.056880 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd247073-2d90-4297-b745-d3b906c5f27d-cert\") pod \"controller-7bb4cc7c98-667wg\" (UID: \"fd247073-2d90-4297-b745-d3b906c5f27d\") " pod="metallb-system/controller-7bb4cc7c98-667wg" Mar 13 01:40:19.057257 master-0 kubenswrapper[19170]: E0313 01:40:19.056911 19170 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 13 01:40:19.057257 master-0 kubenswrapper[19170]: E0313 01:40:19.056979 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1341d719-fa4e-4ac5-b0ab-536f13b8e3e1-memberlist podName:1341d719-fa4e-4ac5-b0ab-536f13b8e3e1 nodeName:}" failed. No retries permitted until 2026-03-13 01:40:19.556955195 +0000 UTC m=+1280.365076235 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/1341d719-fa4e-4ac5-b0ab-536f13b8e3e1-memberlist") pod "speaker-qx8lb" (UID: "1341d719-fa4e-4ac5-b0ab-536f13b8e3e1") : secret "metallb-memberlist" not found Mar 13 01:40:19.057257 master-0 kubenswrapper[19170]: I0313 01:40:19.057142 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1ef92c3a-7b62-42e8-909b-1cadf7157035-metrics\") pod \"frr-k8s-pfmr9\" (UID: \"1ef92c3a-7b62-42e8-909b-1cadf7157035\") " pod="metallb-system/frr-k8s-pfmr9" Mar 13 01:40:19.057613 master-0 kubenswrapper[19170]: I0313 01:40:19.056914 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nql8w\" (UniqueName: \"kubernetes.io/projected/1341d719-fa4e-4ac5-b0ab-536f13b8e3e1-kube-api-access-nql8w\") pod \"speaker-qx8lb\" (UID: \"1341d719-fa4e-4ac5-b0ab-536f13b8e3e1\") " pod="metallb-system/speaker-qx8lb" Mar 13 01:40:19.057613 master-0 kubenswrapper[19170]: I0313 01:40:19.057341 19170 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d82ad9fb-f003-4e1d-a30a-f029eba41ea9-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-wqmlj\" (UID: \"d82ad9fb-f003-4e1d-a30a-f029eba41ea9\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-wqmlj" Mar 13 01:40:19.057613 master-0 kubenswrapper[19170]: I0313 01:40:19.057366 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1ef92c3a-7b62-42e8-909b-1cadf7157035-frr-conf\") pod \"frr-k8s-pfmr9\" (UID: \"1ef92c3a-7b62-42e8-909b-1cadf7157035\") " pod="metallb-system/frr-k8s-pfmr9" Mar 13 01:40:19.057613 master-0 kubenswrapper[19170]: I0313 01:40:19.057405 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68wp8\" (UniqueName: \"kubernetes.io/projected/fd247073-2d90-4297-b745-d3b906c5f27d-kube-api-access-68wp8\") pod \"controller-7bb4cc7c98-667wg\" (UID: \"fd247073-2d90-4297-b745-d3b906c5f27d\") " pod="metallb-system/controller-7bb4cc7c98-667wg" Mar 13 01:40:19.057822 master-0 kubenswrapper[19170]: I0313 01:40:19.057797 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1341d719-fa4e-4ac5-b0ab-536f13b8e3e1-metallb-excludel2\") pod \"speaker-qx8lb\" (UID: \"1341d719-fa4e-4ac5-b0ab-536f13b8e3e1\") " pod="metallb-system/speaker-qx8lb" Mar 13 01:40:19.058333 master-0 kubenswrapper[19170]: I0313 01:40:19.058065 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1ef92c3a-7b62-42e8-909b-1cadf7157035-frr-startup\") pod \"frr-k8s-pfmr9\" (UID: \"1ef92c3a-7b62-42e8-909b-1cadf7157035\") " pod="metallb-system/frr-k8s-pfmr9" Mar 13 01:40:19.058333 master-0 kubenswrapper[19170]: I0313 01:40:19.058326 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1ef92c3a-7b62-42e8-909b-1cadf7157035-frr-conf\") pod \"frr-k8s-pfmr9\" (UID: \"1ef92c3a-7b62-42e8-909b-1cadf7157035\") " pod="metallb-system/frr-k8s-pfmr9"
Mar 13 01:40:19.060501 master-0 kubenswrapper[19170]: I0313 01:40:19.059738 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ef92c3a-7b62-42e8-909b-1cadf7157035-metrics-certs\") pod \"frr-k8s-pfmr9\" (UID: \"1ef92c3a-7b62-42e8-909b-1cadf7157035\") " pod="metallb-system/frr-k8s-pfmr9"
Mar 13 01:40:19.060501 master-0 kubenswrapper[19170]: I0313 01:40:19.060074 19170 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Mar 13 01:40:19.062985 master-0 kubenswrapper[19170]: I0313 01:40:19.061311 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1341d719-fa4e-4ac5-b0ab-536f13b8e3e1-metrics-certs\") pod \"speaker-qx8lb\" (UID: \"1341d719-fa4e-4ac5-b0ab-536f13b8e3e1\") " pod="metallb-system/speaker-qx8lb"
Mar 13 01:40:19.068526 master-0 kubenswrapper[19170]: I0313 01:40:19.068270 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d82ad9fb-f003-4e1d-a30a-f029eba41ea9-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-wqmlj\" (UID: \"d82ad9fb-f003-4e1d-a30a-f029eba41ea9\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-wqmlj"
Mar 13 01:40:19.068526 master-0 kubenswrapper[19170]: I0313 01:40:19.068314 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fd247073-2d90-4297-b745-d3b906c5f27d-metrics-certs\") pod \"controller-7bb4cc7c98-667wg\" (UID: \"fd247073-2d90-4297-b745-d3b906c5f27d\") " pod="metallb-system/controller-7bb4cc7c98-667wg"
Mar 13 01:40:19.073672 master-0 kubenswrapper[19170]: I0313 01:40:19.071699 19170
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd247073-2d90-4297-b745-d3b906c5f27d-cert\") pod \"controller-7bb4cc7c98-667wg\" (UID: \"fd247073-2d90-4297-b745-d3b906c5f27d\") " pod="metallb-system/controller-7bb4cc7c98-667wg"
Mar 13 01:40:19.073672 master-0 kubenswrapper[19170]: I0313 01:40:19.072851 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68wp8\" (UniqueName: \"kubernetes.io/projected/fd247073-2d90-4297-b745-d3b906c5f27d-kube-api-access-68wp8\") pod \"controller-7bb4cc7c98-667wg\" (UID: \"fd247073-2d90-4297-b745-d3b906c5f27d\") " pod="metallb-system/controller-7bb4cc7c98-667wg"
Mar 13 01:40:19.073920 master-0 kubenswrapper[19170]: I0313 01:40:19.073896 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw45g\" (UniqueName: \"kubernetes.io/projected/1ef92c3a-7b62-42e8-909b-1cadf7157035-kube-api-access-dw45g\") pod \"frr-k8s-pfmr9\" (UID: \"1ef92c3a-7b62-42e8-909b-1cadf7157035\") " pod="metallb-system/frr-k8s-pfmr9"
Mar 13 01:40:19.075231 master-0 kubenswrapper[19170]: I0313 01:40:19.075194 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nql8w\" (UniqueName: \"kubernetes.io/projected/1341d719-fa4e-4ac5-b0ab-536f13b8e3e1-kube-api-access-nql8w\") pod \"speaker-qx8lb\" (UID: \"1341d719-fa4e-4ac5-b0ab-536f13b8e3e1\") " pod="metallb-system/speaker-qx8lb"
Mar 13 01:40:19.075662 master-0 kubenswrapper[19170]: I0313 01:40:19.075623 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nz2p\" (UniqueName: \"kubernetes.io/projected/d82ad9fb-f003-4e1d-a30a-f029eba41ea9-kube-api-access-8nz2p\") pod \"frr-k8s-webhook-server-bcc4b6f68-wqmlj\" (UID: \"d82ad9fb-f003-4e1d-a30a-f029eba41ea9\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-wqmlj"
Mar 13 01:40:19.108413 master-0 kubenswrapper[19170]: I0313 01:40:19.108333 19170
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-wqmlj"
Mar 13 01:40:19.194324 master-0 kubenswrapper[19170]: I0313 01:40:19.194224 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-pfmr9"
Mar 13 01:40:19.284209 master-0 kubenswrapper[19170]: I0313 01:40:19.278092 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-667wg"
Mar 13 01:40:19.557202 master-0 kubenswrapper[19170]: I0313 01:40:19.556742 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-wqmlj"]
Mar 13 01:40:19.560450 master-0 kubenswrapper[19170]: W0313 01:40:19.560367 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd82ad9fb_f003_4e1d_a30a_f029eba41ea9.slice/crio-baee0a370a2260922a1af999060c4d3699a92d9d470543765a252321f75d40ad WatchSource:0}: Error finding container baee0a370a2260922a1af999060c4d3699a92d9d470543765a252321f75d40ad: Status 404 returned error can't find the container with id baee0a370a2260922a1af999060c4d3699a92d9d470543765a252321f75d40ad
Mar 13 01:40:19.569605 master-0 kubenswrapper[19170]: I0313 01:40:19.569560 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1341d719-fa4e-4ac5-b0ab-536f13b8e3e1-memberlist\") pod \"speaker-qx8lb\" (UID: \"1341d719-fa4e-4ac5-b0ab-536f13b8e3e1\") " pod="metallb-system/speaker-qx8lb"
Mar 13 01:40:19.569718 master-0 kubenswrapper[19170]: E0313 01:40:19.569693 19170 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Mar 13 01:40:19.569774 master-0 kubenswrapper[19170]: E0313 01:40:19.569759 19170 nestedpendingoperations.go:348] Operation for
"{volumeName:kubernetes.io/secret/1341d719-fa4e-4ac5-b0ab-536f13b8e3e1-memberlist podName:1341d719-fa4e-4ac5-b0ab-536f13b8e3e1 nodeName:}" failed. No retries permitted until 2026-03-13 01:40:20.569740258 +0000 UTC m=+1281.377861218 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/1341d719-fa4e-4ac5-b0ab-536f13b8e3e1-memberlist") pod "speaker-qx8lb" (UID: "1341d719-fa4e-4ac5-b0ab-536f13b8e3e1") : secret "metallb-memberlist" not found
Mar 13 01:40:19.703005 master-0 kubenswrapper[19170]: I0313 01:40:19.702932 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-wqmlj" event={"ID":"d82ad9fb-f003-4e1d-a30a-f029eba41ea9","Type":"ContainerStarted","Data":"baee0a370a2260922a1af999060c4d3699a92d9d470543765a252321f75d40ad"}
Mar 13 01:40:19.705068 master-0 kubenswrapper[19170]: I0313 01:40:19.705028 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pfmr9" event={"ID":"1ef92c3a-7b62-42e8-909b-1cadf7157035","Type":"ContainerStarted","Data":"5b2b028b82d1790fc58f68a453320dc63c0a17df7da37b47d959cecc38aa94c5"}
Mar 13 01:40:19.732838 master-0 kubenswrapper[19170]: I0313 01:40:19.732769 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-667wg"]
Mar 13 01:40:19.739102 master-0 kubenswrapper[19170]: W0313 01:40:19.739050 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd247073_2d90_4297_b745_d3b906c5f27d.slice/crio-a0b8c2a67f36df6a35ab288903a826c2a48d0c484053c7c48cddec06b4b820ee WatchSource:0}: Error finding container a0b8c2a67f36df6a35ab288903a826c2a48d0c484053c7c48cddec06b4b820ee: Status 404 returned error can't find the container with id a0b8c2a67f36df6a35ab288903a826c2a48d0c484053c7c48cddec06b4b820ee
Mar 13 01:40:20.588047 master-0 kubenswrapper[19170]: I0313 01:40:20.587372 19170
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1341d719-fa4e-4ac5-b0ab-536f13b8e3e1-memberlist\") pod \"speaker-qx8lb\" (UID: \"1341d719-fa4e-4ac5-b0ab-536f13b8e3e1\") " pod="metallb-system/speaker-qx8lb"
Mar 13 01:40:20.593184 master-0 kubenswrapper[19170]: I0313 01:40:20.593144 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1341d719-fa4e-4ac5-b0ab-536f13b8e3e1-memberlist\") pod \"speaker-qx8lb\" (UID: \"1341d719-fa4e-4ac5-b0ab-536f13b8e3e1\") " pod="metallb-system/speaker-qx8lb"
Mar 13 01:40:20.713349 master-0 kubenswrapper[19170]: I0313 01:40:20.713303 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-667wg" event={"ID":"fd247073-2d90-4297-b745-d3b906c5f27d","Type":"ContainerStarted","Data":"92e1ed2cb6ded4a23a52592c7fb43d10e8a9e0995c4fe8880d9c619c824fb214"}
Mar 13 01:40:20.713851 master-0 kubenswrapper[19170]: I0313 01:40:20.713833 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-667wg" event={"ID":"fd247073-2d90-4297-b745-d3b906c5f27d","Type":"ContainerStarted","Data":"a0b8c2a67f36df6a35ab288903a826c2a48d0c484053c7c48cddec06b4b820ee"}
Mar 13 01:40:20.747477 master-0 kubenswrapper[19170]: I0313 01:40:20.747432 19170 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="metallb-system/speaker-qx8lb"
Mar 13 01:40:20.776544 master-0 kubenswrapper[19170]: W0313 01:40:20.776477 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1341d719_fa4e_4ac5_b0ab_536f13b8e3e1.slice/crio-d2d7acaa9978dc81a6af1fcf3a915e06f98158e302fdbe94baa790ec128833a7 WatchSource:0}: Error finding container d2d7acaa9978dc81a6af1fcf3a915e06f98158e302fdbe94baa790ec128833a7: Status 404 returned error can't find the container with id d2d7acaa9978dc81a6af1fcf3a915e06f98158e302fdbe94baa790ec128833a7
Mar 13 01:40:20.918569 master-0 kubenswrapper[19170]: I0313 01:40:20.916229 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-g2t7x"]
Mar 13 01:40:20.918569 master-0 kubenswrapper[19170]: I0313 01:40:20.918068 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-g2t7x"
Mar 13 01:40:20.954447 master-0 kubenswrapper[19170]: I0313 01:40:20.952175 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-mgg76"]
Mar 13 01:40:20.954447 master-0 kubenswrapper[19170]: I0313 01:40:20.953579 19170 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mgg76"
Mar 13 01:40:20.957938 master-0 kubenswrapper[19170]: I0313 01:40:20.955294 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Mar 13 01:40:20.987152 master-0 kubenswrapper[19170]: I0313 01:40:20.987091 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-g2t7x"]
Mar 13 01:40:21.000366 master-0 kubenswrapper[19170]: I0313 01:40:20.999956 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-mgg76"]
Mar 13 01:40:21.012253 master-0 kubenswrapper[19170]: I0313 01:40:21.012180 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-72q4d"]
Mar 13 01:40:21.014561 master-0 kubenswrapper[19170]: I0313 01:40:21.013948 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-72q4d"
Mar 13 01:40:21.071701 master-0 kubenswrapper[19170]: I0313 01:40:21.068670 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-rcf6z"]
Mar 13 01:40:21.071701 master-0 kubenswrapper[19170]: I0313 01:40:21.070020 19170 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-rcf6z"
Mar 13 01:40:21.071926 master-0 kubenswrapper[19170]: I0313 01:40:21.071819 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Mar 13 01:40:21.072688 master-0 kubenswrapper[19170]: I0313 01:40:21.072044 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Mar 13 01:40:21.081211 master-0 kubenswrapper[19170]: I0313 01:40:21.080532 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-rcf6z"]
Mar 13 01:40:21.094585 master-0 kubenswrapper[19170]: I0313 01:40:21.093679 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/28e38789-7aec-4807-bab0-f2cf3f316573-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-mgg76\" (UID: \"28e38789-7aec-4807-bab0-f2cf3f316573\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-mgg76"
Mar 13 01:40:21.094585 master-0 kubenswrapper[19170]: I0313 01:40:21.093750 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sxss\" (UniqueName: \"kubernetes.io/projected/5c56b3e6-e84b-4dae-8005-3d0af50aadfb-kube-api-access-5sxss\") pod \"nmstate-metrics-9b8c8685d-g2t7x\" (UID: \"5c56b3e6-e84b-4dae-8005-3d0af50aadfb\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-g2t7x"
Mar 13 01:40:21.094585 master-0 kubenswrapper[19170]: I0313 01:40:21.093780 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59475\" (UniqueName: \"kubernetes.io/projected/28e38789-7aec-4807-bab0-f2cf3f316573-kube-api-access-59475\") pod \"nmstate-webhook-5f558f5558-mgg76\" (UID: \"28e38789-7aec-4807-bab0-f2cf3f316573\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-mgg76"
Mar 13
01:40:21.195146 master-0 kubenswrapper[19170]: I0313 01:40:21.195034 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr8bd\" (UniqueName: \"kubernetes.io/projected/87f190ab-00ff-47fd-8392-3185fc8bab6f-kube-api-access-tr8bd\") pod \"nmstate-handler-72q4d\" (UID: \"87f190ab-00ff-47fd-8392-3185fc8bab6f\") " pod="openshift-nmstate/nmstate-handler-72q4d"
Mar 13 01:40:21.195146 master-0 kubenswrapper[19170]: I0313 01:40:21.195111 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/28e38789-7aec-4807-bab0-f2cf3f316573-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-mgg76\" (UID: \"28e38789-7aec-4807-bab0-f2cf3f316573\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-mgg76"
Mar 13 01:40:21.195146 master-0 kubenswrapper[19170]: I0313 01:40:21.195139 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/82e70dc1-5983-4831-9b27-9771974d4f47-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-rcf6z\" (UID: \"82e70dc1-5983-4831-9b27-9771974d4f47\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-rcf6z"
Mar 13 01:40:21.195370 master-0 kubenswrapper[19170]: I0313 01:40:21.195164 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/82e70dc1-5983-4831-9b27-9771974d4f47-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-rcf6z\" (UID: \"82e70dc1-5983-4831-9b27-9771974d4f47\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-rcf6z"
Mar 13 01:40:21.197075 master-0 kubenswrapper[19170]: I0313 01:40:21.197033 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName:
\"kubernetes.io/host-path/87f190ab-00ff-47fd-8392-3185fc8bab6f-ovs-socket\") pod \"nmstate-handler-72q4d\" (UID: \"87f190ab-00ff-47fd-8392-3185fc8bab6f\") " pod="openshift-nmstate/nmstate-handler-72q4d"
Mar 13 01:40:21.197075 master-0 kubenswrapper[19170]: I0313 01:40:21.197075 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/87f190ab-00ff-47fd-8392-3185fc8bab6f-dbus-socket\") pod \"nmstate-handler-72q4d\" (UID: \"87f190ab-00ff-47fd-8392-3185fc8bab6f\") " pod="openshift-nmstate/nmstate-handler-72q4d"
Mar 13 01:40:21.197208 master-0 kubenswrapper[19170]: I0313 01:40:21.197100 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sxss\" (UniqueName: \"kubernetes.io/projected/5c56b3e6-e84b-4dae-8005-3d0af50aadfb-kube-api-access-5sxss\") pod \"nmstate-metrics-9b8c8685d-g2t7x\" (UID: \"5c56b3e6-e84b-4dae-8005-3d0af50aadfb\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-g2t7x"
Mar 13 01:40:21.197718 master-0 kubenswrapper[19170]: I0313 01:40:21.197650 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59475\" (UniqueName: \"kubernetes.io/projected/28e38789-7aec-4807-bab0-f2cf3f316573-kube-api-access-59475\") pod \"nmstate-webhook-5f558f5558-mgg76\" (UID: \"28e38789-7aec-4807-bab0-f2cf3f316573\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-mgg76"
Mar 13 01:40:21.197784 master-0 kubenswrapper[19170]: I0313 01:40:21.197770 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/87f190ab-00ff-47fd-8392-3185fc8bab6f-nmstate-lock\") pod \"nmstate-handler-72q4d\" (UID: \"87f190ab-00ff-47fd-8392-3185fc8bab6f\") " pod="openshift-nmstate/nmstate-handler-72q4d"
Mar 13 01:40:21.197915 master-0 kubenswrapper[19170]: I0313 01:40:21.197817 19170 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knkwd\" (UniqueName: \"kubernetes.io/projected/82e70dc1-5983-4831-9b27-9771974d4f47-kube-api-access-knkwd\") pod \"nmstate-console-plugin-86f58fcf4-rcf6z\" (UID: \"82e70dc1-5983-4831-9b27-9771974d4f47\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-rcf6z"
Mar 13 01:40:21.203078 master-0 kubenswrapper[19170]: I0313 01:40:21.203038 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/28e38789-7aec-4807-bab0-f2cf3f316573-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-mgg76\" (UID: \"28e38789-7aec-4807-bab0-f2cf3f316573\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-mgg76"
Mar 13 01:40:21.214286 master-0 kubenswrapper[19170]: I0313 01:40:21.214241 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59475\" (UniqueName: \"kubernetes.io/projected/28e38789-7aec-4807-bab0-f2cf3f316573-kube-api-access-59475\") pod \"nmstate-webhook-5f558f5558-mgg76\" (UID: \"28e38789-7aec-4807-bab0-f2cf3f316573\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-mgg76"
Mar 13 01:40:21.219973 master-0 kubenswrapper[19170]: I0313 01:40:21.219758 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sxss\" (UniqueName: \"kubernetes.io/projected/5c56b3e6-e84b-4dae-8005-3d0af50aadfb-kube-api-access-5sxss\") pod \"nmstate-metrics-9b8c8685d-g2t7x\" (UID: \"5c56b3e6-e84b-4dae-8005-3d0af50aadfb\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-g2t7x"
Mar 13 01:40:21.262334 master-0 kubenswrapper[19170]: I0313 01:40:21.262288 19170 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-g2t7x"
Mar 13 01:40:21.266183 master-0 kubenswrapper[19170]: I0313 01:40:21.266148 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6494dc8c6b-x76zk"]
Mar 13 01:40:21.268924 master-0 kubenswrapper[19170]: I0313 01:40:21.268892 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6494dc8c6b-x76zk"
Mar 13 01:40:21.279745 master-0 kubenswrapper[19170]: I0313 01:40:21.277488 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6494dc8c6b-x76zk"]
Mar 13 01:40:21.292535 master-0 kubenswrapper[19170]: I0313 01:40:21.292396 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mgg76"
Mar 13 01:40:21.298905 master-0 kubenswrapper[19170]: I0313 01:40:21.298865 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/82e70dc1-5983-4831-9b27-9771974d4f47-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-rcf6z\" (UID: \"82e70dc1-5983-4831-9b27-9771974d4f47\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-rcf6z"
Mar 13 01:40:21.298980 master-0 kubenswrapper[19170]: I0313 01:40:21.298919 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/82e70dc1-5983-4831-9b27-9771974d4f47-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-rcf6z\" (UID: \"82e70dc1-5983-4831-9b27-9771974d4f47\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-rcf6z"
Mar 13 01:40:21.298980 master-0 kubenswrapper[19170]: I0313 01:40:21.298947 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/87f190ab-00ff-47fd-8392-3185fc8bab6f-ovs-socket\") pod
\"nmstate-handler-72q4d\" (UID: \"87f190ab-00ff-47fd-8392-3185fc8bab6f\") " pod="openshift-nmstate/nmstate-handler-72q4d"
Mar 13 01:40:21.298980 master-0 kubenswrapper[19170]: I0313 01:40:21.298962 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/87f190ab-00ff-47fd-8392-3185fc8bab6f-dbus-socket\") pod \"nmstate-handler-72q4d\" (UID: \"87f190ab-00ff-47fd-8392-3185fc8bab6f\") " pod="openshift-nmstate/nmstate-handler-72q4d"
Mar 13 01:40:21.299064 master-0 kubenswrapper[19170]: I0313 01:40:21.299001 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/87f190ab-00ff-47fd-8392-3185fc8bab6f-nmstate-lock\") pod \"nmstate-handler-72q4d\" (UID: \"87f190ab-00ff-47fd-8392-3185fc8bab6f\") " pod="openshift-nmstate/nmstate-handler-72q4d"
Mar 13 01:40:21.299064 master-0 kubenswrapper[19170]: I0313 01:40:21.299018 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knkwd\" (UniqueName: \"kubernetes.io/projected/82e70dc1-5983-4831-9b27-9771974d4f47-kube-api-access-knkwd\") pod \"nmstate-console-plugin-86f58fcf4-rcf6z\" (UID: \"82e70dc1-5983-4831-9b27-9771974d4f47\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-rcf6z"
Mar 13 01:40:21.300342 master-0 kubenswrapper[19170]: I0313 01:40:21.299069 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr8bd\" (UniqueName: \"kubernetes.io/projected/87f190ab-00ff-47fd-8392-3185fc8bab6f-kube-api-access-tr8bd\") pod \"nmstate-handler-72q4d\" (UID: \"87f190ab-00ff-47fd-8392-3185fc8bab6f\") " pod="openshift-nmstate/nmstate-handler-72q4d"
Mar 13 01:40:21.300342 master-0 kubenswrapper[19170]: I0313 01:40:21.299274 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName:
\"kubernetes.io/host-path/87f190ab-00ff-47fd-8392-3185fc8bab6f-ovs-socket\") pod \"nmstate-handler-72q4d\" (UID: \"87f190ab-00ff-47fd-8392-3185fc8bab6f\") " pod="openshift-nmstate/nmstate-handler-72q4d"
Mar 13 01:40:21.300444 master-0 kubenswrapper[19170]: I0313 01:40:21.300167 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/87f190ab-00ff-47fd-8392-3185fc8bab6f-nmstate-lock\") pod \"nmstate-handler-72q4d\" (UID: \"87f190ab-00ff-47fd-8392-3185fc8bab6f\") " pod="openshift-nmstate/nmstate-handler-72q4d"
Mar 13 01:40:21.300444 master-0 kubenswrapper[19170]: I0313 01:40:21.300310 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/82e70dc1-5983-4831-9b27-9771974d4f47-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-rcf6z\" (UID: \"82e70dc1-5983-4831-9b27-9771974d4f47\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-rcf6z"
Mar 13 01:40:21.300501 master-0 kubenswrapper[19170]: I0313 01:40:21.300444 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/87f190ab-00ff-47fd-8392-3185fc8bab6f-dbus-socket\") pod \"nmstate-handler-72q4d\" (UID: \"87f190ab-00ff-47fd-8392-3185fc8bab6f\") " pod="openshift-nmstate/nmstate-handler-72q4d"
Mar 13 01:40:21.305770 master-0 kubenswrapper[19170]: I0313 01:40:21.305606 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/82e70dc1-5983-4831-9b27-9771974d4f47-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-rcf6z\" (UID: \"82e70dc1-5983-4831-9b27-9771974d4f47\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-rcf6z"
Mar 13 01:40:21.326890 master-0 kubenswrapper[19170]: I0313 01:40:21.322139 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr8bd\"
(UniqueName: \"kubernetes.io/projected/87f190ab-00ff-47fd-8392-3185fc8bab6f-kube-api-access-tr8bd\") pod \"nmstate-handler-72q4d\" (UID: \"87f190ab-00ff-47fd-8392-3185fc8bab6f\") " pod="openshift-nmstate/nmstate-handler-72q4d"
Mar 13 01:40:21.334293 master-0 kubenswrapper[19170]: I0313 01:40:21.334219 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knkwd\" (UniqueName: \"kubernetes.io/projected/82e70dc1-5983-4831-9b27-9771974d4f47-kube-api-access-knkwd\") pod \"nmstate-console-plugin-86f58fcf4-rcf6z\" (UID: \"82e70dc1-5983-4831-9b27-9771974d4f47\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-rcf6z"
Mar 13 01:40:21.376275 master-0 kubenswrapper[19170]: I0313 01:40:21.376076 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-72q4d"
Mar 13 01:40:21.402388 master-0 kubenswrapper[19170]: I0313 01:40:21.402086 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eda172fc-916c-4817-815a-79dd06704d36-service-ca\") pod \"console-6494dc8c6b-x76zk\" (UID: \"eda172fc-916c-4817-815a-79dd06704d36\") " pod="openshift-console/console-6494dc8c6b-x76zk"
Mar 13 01:40:21.402388 master-0 kubenswrapper[19170]: I0313 01:40:21.402138 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eda172fc-916c-4817-815a-79dd06704d36-console-serving-cert\") pod \"console-6494dc8c6b-x76zk\" (UID: \"eda172fc-916c-4817-815a-79dd06704d36\") " pod="openshift-console/console-6494dc8c6b-x76zk"
Mar 13 01:40:21.402388 master-0 kubenswrapper[19170]: I0313 01:40:21.402163 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName:
\"kubernetes.io/configmap/eda172fc-916c-4817-815a-79dd06704d36-trusted-ca-bundle\") pod \"console-6494dc8c6b-x76zk\" (UID: \"eda172fc-916c-4817-815a-79dd06704d36\") " pod="openshift-console/console-6494dc8c6b-x76zk"
Mar 13 01:40:21.402388 master-0 kubenswrapper[19170]: I0313 01:40:21.402200 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eda172fc-916c-4817-815a-79dd06704d36-console-oauth-config\") pod \"console-6494dc8c6b-x76zk\" (UID: \"eda172fc-916c-4817-815a-79dd06704d36\") " pod="openshift-console/console-6494dc8c6b-x76zk"
Mar 13 01:40:21.402388 master-0 kubenswrapper[19170]: I0313 01:40:21.402214 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eda172fc-916c-4817-815a-79dd06704d36-oauth-serving-cert\") pod \"console-6494dc8c6b-x76zk\" (UID: \"eda172fc-916c-4817-815a-79dd06704d36\") " pod="openshift-console/console-6494dc8c6b-x76zk"
Mar 13 01:40:21.402388 master-0 kubenswrapper[19170]: I0313 01:40:21.402233 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6nsk\" (UniqueName: \"kubernetes.io/projected/eda172fc-916c-4817-815a-79dd06704d36-kube-api-access-m6nsk\") pod \"console-6494dc8c6b-x76zk\" (UID: \"eda172fc-916c-4817-815a-79dd06704d36\") " pod="openshift-console/console-6494dc8c6b-x76zk"
Mar 13 01:40:21.402388 master-0 kubenswrapper[19170]: I0313 01:40:21.402263 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eda172fc-916c-4817-815a-79dd06704d36-console-config\") pod \"console-6494dc8c6b-x76zk\" (UID: \"eda172fc-916c-4817-815a-79dd06704d36\") " pod="openshift-console/console-6494dc8c6b-x76zk"
Mar 13 01:40:21.410038 master-0 kubenswrapper[19170]: I0313
01:40:21.409656 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-rcf6z"
Mar 13 01:40:21.506928 master-0 kubenswrapper[19170]: I0313 01:40:21.503176 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eda172fc-916c-4817-815a-79dd06704d36-service-ca\") pod \"console-6494dc8c6b-x76zk\" (UID: \"eda172fc-916c-4817-815a-79dd06704d36\") " pod="openshift-console/console-6494dc8c6b-x76zk"
Mar 13 01:40:21.506928 master-0 kubenswrapper[19170]: I0313 01:40:21.503248 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eda172fc-916c-4817-815a-79dd06704d36-console-serving-cert\") pod \"console-6494dc8c6b-x76zk\" (UID: \"eda172fc-916c-4817-815a-79dd06704d36\") " pod="openshift-console/console-6494dc8c6b-x76zk"
Mar 13 01:40:21.506928 master-0 kubenswrapper[19170]: I0313 01:40:21.503326 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eda172fc-916c-4817-815a-79dd06704d36-trusted-ca-bundle\") pod \"console-6494dc8c6b-x76zk\" (UID: \"eda172fc-916c-4817-815a-79dd06704d36\") " pod="openshift-console/console-6494dc8c6b-x76zk"
Mar 13 01:40:21.506928 master-0 kubenswrapper[19170]: I0313 01:40:21.503370 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eda172fc-916c-4817-815a-79dd06704d36-console-oauth-config\") pod \"console-6494dc8c6b-x76zk\" (UID: \"eda172fc-916c-4817-815a-79dd06704d36\") " pod="openshift-console/console-6494dc8c6b-x76zk"
Mar 13 01:40:21.506928 master-0 kubenswrapper[19170]: I0313 01:40:21.503391 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName:
\"kubernetes.io/configmap/eda172fc-916c-4817-815a-79dd06704d36-oauth-serving-cert\") pod \"console-6494dc8c6b-x76zk\" (UID: \"eda172fc-916c-4817-815a-79dd06704d36\") " pod="openshift-console/console-6494dc8c6b-x76zk"
Mar 13 01:40:21.506928 master-0 kubenswrapper[19170]: I0313 01:40:21.503411 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6nsk\" (UniqueName: \"kubernetes.io/projected/eda172fc-916c-4817-815a-79dd06704d36-kube-api-access-m6nsk\") pod \"console-6494dc8c6b-x76zk\" (UID: \"eda172fc-916c-4817-815a-79dd06704d36\") " pod="openshift-console/console-6494dc8c6b-x76zk"
Mar 13 01:40:21.506928 master-0 kubenswrapper[19170]: I0313 01:40:21.503452 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eda172fc-916c-4817-815a-79dd06704d36-console-config\") pod \"console-6494dc8c6b-x76zk\" (UID: \"eda172fc-916c-4817-815a-79dd06704d36\") " pod="openshift-console/console-6494dc8c6b-x76zk"
Mar 13 01:40:21.506928 master-0 kubenswrapper[19170]: I0313 01:40:21.505687 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eda172fc-916c-4817-815a-79dd06704d36-console-config\") pod \"console-6494dc8c6b-x76zk\" (UID: \"eda172fc-916c-4817-815a-79dd06704d36\") " pod="openshift-console/console-6494dc8c6b-x76zk"
Mar 13 01:40:21.507282 master-0 kubenswrapper[19170]: I0313 01:40:21.507083 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eda172fc-916c-4817-815a-79dd06704d36-oauth-serving-cert\") pod \"console-6494dc8c6b-x76zk\" (UID: \"eda172fc-916c-4817-815a-79dd06704d36\") " pod="openshift-console/console-6494dc8c6b-x76zk"
Mar 13 01:40:21.507317 master-0 kubenswrapper[19170]: I0313 01:40:21.507270 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\"
(UniqueName: \"kubernetes.io/configmap/eda172fc-916c-4817-815a-79dd06704d36-service-ca\") pod \"console-6494dc8c6b-x76zk\" (UID: \"eda172fc-916c-4817-815a-79dd06704d36\") " pod="openshift-console/console-6494dc8c6b-x76zk" Mar 13 01:40:21.514565 master-0 kubenswrapper[19170]: I0313 01:40:21.507992 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eda172fc-916c-4817-815a-79dd06704d36-trusted-ca-bundle\") pod \"console-6494dc8c6b-x76zk\" (UID: \"eda172fc-916c-4817-815a-79dd06704d36\") " pod="openshift-console/console-6494dc8c6b-x76zk" Mar 13 01:40:21.514565 master-0 kubenswrapper[19170]: I0313 01:40:21.509590 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eda172fc-916c-4817-815a-79dd06704d36-console-serving-cert\") pod \"console-6494dc8c6b-x76zk\" (UID: \"eda172fc-916c-4817-815a-79dd06704d36\") " pod="openshift-console/console-6494dc8c6b-x76zk" Mar 13 01:40:21.514565 master-0 kubenswrapper[19170]: I0313 01:40:21.514466 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eda172fc-916c-4817-815a-79dd06704d36-console-oauth-config\") pod \"console-6494dc8c6b-x76zk\" (UID: \"eda172fc-916c-4817-815a-79dd06704d36\") " pod="openshift-console/console-6494dc8c6b-x76zk" Mar 13 01:40:21.538066 master-0 kubenswrapper[19170]: I0313 01:40:21.530070 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6nsk\" (UniqueName: \"kubernetes.io/projected/eda172fc-916c-4817-815a-79dd06704d36-kube-api-access-m6nsk\") pod \"console-6494dc8c6b-x76zk\" (UID: \"eda172fc-916c-4817-815a-79dd06704d36\") " pod="openshift-console/console-6494dc8c6b-x76zk" Mar 13 01:40:21.657950 master-0 kubenswrapper[19170]: I0313 01:40:21.657888 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6494dc8c6b-x76zk" Mar 13 01:40:21.724503 master-0 kubenswrapper[19170]: I0313 01:40:21.724373 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-72q4d" event={"ID":"87f190ab-00ff-47fd-8392-3185fc8bab6f","Type":"ContainerStarted","Data":"c66177043536cdba365f67114197c2226c71661c481aa0fa4f3c31d7de8061e9"} Mar 13 01:40:21.735356 master-0 kubenswrapper[19170]: I0313 01:40:21.734087 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-qx8lb" event={"ID":"1341d719-fa4e-4ac5-b0ab-536f13b8e3e1","Type":"ContainerStarted","Data":"ba39d0fe6456eb76979198f4553197b3744828c0ed398284c71a9b80bc47a471"} Mar 13 01:40:21.735356 master-0 kubenswrapper[19170]: I0313 01:40:21.734149 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-qx8lb" event={"ID":"1341d719-fa4e-4ac5-b0ab-536f13b8e3e1","Type":"ContainerStarted","Data":"d2d7acaa9978dc81a6af1fcf3a915e06f98158e302fdbe94baa790ec128833a7"} Mar 13 01:40:21.822335 master-0 kubenswrapper[19170]: I0313 01:40:21.822293 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-g2t7x"] Mar 13 01:40:21.909545 master-0 kubenswrapper[19170]: W0313 01:40:21.909492 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28e38789_7aec_4807_bab0_f2cf3f316573.slice/crio-48e1251ba5205bf66b733fb7b8585983fbb07644c54b5d8c6046168b6ee9c67c WatchSource:0}: Error finding container 48e1251ba5205bf66b733fb7b8585983fbb07644c54b5d8c6046168b6ee9c67c: Status 404 returned error can't find the container with id 48e1251ba5205bf66b733fb7b8585983fbb07644c54b5d8c6046168b6ee9c67c Mar 13 01:40:21.923015 master-0 kubenswrapper[19170]: I0313 01:40:21.922966 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-mgg76"] Mar 13 01:40:22.004507 
master-0 kubenswrapper[19170]: I0313 01:40:22.004457 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-rcf6z"] Mar 13 01:40:22.006725 master-0 kubenswrapper[19170]: W0313 01:40:22.006689 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82e70dc1_5983_4831_9b27_9771974d4f47.slice/crio-27276ba2bb5421a04e98c8d5e9d92844770500482740f063fe3cdd6fefa2fad4 WatchSource:0}: Error finding container 27276ba2bb5421a04e98c8d5e9d92844770500482740f063fe3cdd6fefa2fad4: Status 404 returned error can't find the container with id 27276ba2bb5421a04e98c8d5e9d92844770500482740f063fe3cdd6fefa2fad4 Mar 13 01:40:22.108850 master-0 kubenswrapper[19170]: I0313 01:40:22.108822 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6494dc8c6b-x76zk"] Mar 13 01:40:22.746175 master-0 kubenswrapper[19170]: I0313 01:40:22.746108 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-rcf6z" event={"ID":"82e70dc1-5983-4831-9b27-9771974d4f47","Type":"ContainerStarted","Data":"27276ba2bb5421a04e98c8d5e9d92844770500482740f063fe3cdd6fefa2fad4"} Mar 13 01:40:22.747829 master-0 kubenswrapper[19170]: I0313 01:40:22.747805 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-g2t7x" event={"ID":"5c56b3e6-e84b-4dae-8005-3d0af50aadfb","Type":"ContainerStarted","Data":"13c279ff108824545de585393315010a564d63363b69b3b76dd918589a4fac87"} Mar 13 01:40:22.749416 master-0 kubenswrapper[19170]: I0313 01:40:22.749377 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mgg76" event={"ID":"28e38789-7aec-4807-bab0-f2cf3f316573","Type":"ContainerStarted","Data":"48e1251ba5205bf66b733fb7b8585983fbb07644c54b5d8c6046168b6ee9c67c"} Mar 13 01:40:22.751091 master-0 kubenswrapper[19170]: I0313 
01:40:22.751040 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6494dc8c6b-x76zk" event={"ID":"eda172fc-916c-4817-815a-79dd06704d36","Type":"ContainerStarted","Data":"0ae4d910339161ba03609cb82ea85617a23ab77b00c93a3947bb63177e749ab4"} Mar 13 01:40:22.751091 master-0 kubenswrapper[19170]: I0313 01:40:22.751067 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6494dc8c6b-x76zk" event={"ID":"eda172fc-916c-4817-815a-79dd06704d36","Type":"ContainerStarted","Data":"92f4a3664266b73e310db58e131efaff43b4af10143c133b922c9ed5aa7559e7"} Mar 13 01:40:22.773890 master-0 kubenswrapper[19170]: I0313 01:40:22.773777 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6494dc8c6b-x76zk" podStartSLOduration=1.773757349 podStartE2EDuration="1.773757349s" podCreationTimestamp="2026-03-13 01:40:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:40:22.771470694 +0000 UTC m=+1283.579591664" watchObservedRunningTime="2026-03-13 01:40:22.773757349 +0000 UTC m=+1283.581878309" Mar 13 01:40:23.765738 master-0 kubenswrapper[19170]: I0313 01:40:23.765622 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-qx8lb" event={"ID":"1341d719-fa4e-4ac5-b0ab-536f13b8e3e1","Type":"ContainerStarted","Data":"5bef7f16a47d9cb99fbc29353722b98beda1a53fa8fd4f5d6d371022cc9ce02b"} Mar 13 01:40:23.766924 master-0 kubenswrapper[19170]: I0313 01:40:23.765771 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-qx8lb" Mar 13 01:40:23.768509 master-0 kubenswrapper[19170]: I0313 01:40:23.768477 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-667wg" 
event={"ID":"fd247073-2d90-4297-b745-d3b906c5f27d","Type":"ContainerStarted","Data":"15c3f1d376f4b89df8340daaa700aec76d0d5afa8697d610592bf1fbf5313e95"} Mar 13 01:40:24.009562 master-0 kubenswrapper[19170]: I0313 01:40:24.009490 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-qx8lb" podStartSLOduration=3.877886552 podStartE2EDuration="6.009469465s" podCreationTimestamp="2026-03-13 01:40:18 +0000 UTC" firstStartedPulling="2026-03-13 01:40:21.109148348 +0000 UTC m=+1281.917269308" lastFinishedPulling="2026-03-13 01:40:23.240731261 +0000 UTC m=+1284.048852221" observedRunningTime="2026-03-13 01:40:24.005850036 +0000 UTC m=+1284.813970996" watchObservedRunningTime="2026-03-13 01:40:24.009469465 +0000 UTC m=+1284.817590415" Mar 13 01:40:24.778419 master-0 kubenswrapper[19170]: I0313 01:40:24.778343 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-667wg" Mar 13 01:40:29.283127 master-0 kubenswrapper[19170]: I0313 01:40:29.283060 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-667wg" Mar 13 01:40:29.301036 master-0 kubenswrapper[19170]: I0313 01:40:29.300962 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-667wg" podStartSLOduration=7.986540687 podStartE2EDuration="11.300945914s" podCreationTimestamp="2026-03-13 01:40:18 +0000 UTC" firstStartedPulling="2026-03-13 01:40:19.918436876 +0000 UTC m=+1280.726557846" lastFinishedPulling="2026-03-13 01:40:23.232842113 +0000 UTC m=+1284.040963073" observedRunningTime="2026-03-13 01:40:24.02795354 +0000 UTC m=+1284.836074500" watchObservedRunningTime="2026-03-13 01:40:29.300945914 +0000 UTC m=+1290.109066874" Mar 13 01:40:29.853428 master-0 kubenswrapper[19170]: I0313 01:40:29.853273 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-wqmlj" event={"ID":"d82ad9fb-f003-4e1d-a30a-f029eba41ea9","Type":"ContainerStarted","Data":"bd3ee23500eaec25c1f7a3197f9bfabadad897c8f5c37b8d3b7708b1559caada"} Mar 13 01:40:29.853428 master-0 kubenswrapper[19170]: I0313 01:40:29.853348 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-wqmlj" Mar 13 01:40:29.857227 master-0 kubenswrapper[19170]: I0313 01:40:29.857177 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-g2t7x" event={"ID":"5c56b3e6-e84b-4dae-8005-3d0af50aadfb","Type":"ContainerStarted","Data":"8883e9522e308ae211280c826e7462262010a0f4137a10848fff1d2c4bb7ebd2"} Mar 13 01:40:29.857227 master-0 kubenswrapper[19170]: I0313 01:40:29.857230 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-g2t7x" event={"ID":"5c56b3e6-e84b-4dae-8005-3d0af50aadfb","Type":"ContainerStarted","Data":"73e4fe0a0c4373fb9dd3ad404f6044eb276fd1840db3254397d3901f9fe61d00"} Mar 13 01:40:29.859238 master-0 kubenswrapper[19170]: I0313 01:40:29.859171 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mgg76" event={"ID":"28e38789-7aec-4807-bab0-f2cf3f316573","Type":"ContainerStarted","Data":"35cea134fe8918415e59a06c49a6a4fc5c0cb7f0e90b7cb7afc4e54bf0c8f26b"} Mar 13 01:40:29.859401 master-0 kubenswrapper[19170]: I0313 01:40:29.859362 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mgg76" Mar 13 01:40:29.865629 master-0 kubenswrapper[19170]: I0313 01:40:29.864398 19170 generic.go:334] "Generic (PLEG): container finished" podID="1ef92c3a-7b62-42e8-909b-1cadf7157035" containerID="c5e4c0d0573b0d259e5dff452c329ff5d6e70872e6798a2ee01c6aba424c542f" exitCode=0 Mar 13 01:40:29.865629 master-0 kubenswrapper[19170]: I0313 01:40:29.864498 19170 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pfmr9" event={"ID":"1ef92c3a-7b62-42e8-909b-1cadf7157035","Type":"ContainerDied","Data":"c5e4c0d0573b0d259e5dff452c329ff5d6e70872e6798a2ee01c6aba424c542f"} Mar 13 01:40:29.866913 master-0 kubenswrapper[19170]: I0313 01:40:29.866866 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-72q4d" event={"ID":"87f190ab-00ff-47fd-8392-3185fc8bab6f","Type":"ContainerStarted","Data":"3d7d459bb7e5b9dedc166c914e125c0a239b772d6b2b8deefc63b772f31a7715"} Mar 13 01:40:29.867391 master-0 kubenswrapper[19170]: I0313 01:40:29.867343 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-72q4d" Mar 13 01:40:29.869850 master-0 kubenswrapper[19170]: I0313 01:40:29.869780 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-rcf6z" event={"ID":"82e70dc1-5983-4831-9b27-9771974d4f47","Type":"ContainerStarted","Data":"12cef64a2495590b0a5eae8aef273eaebbb2b8b5f874775154c400717524e8a3"} Mar 13 01:40:29.883092 master-0 kubenswrapper[19170]: I0313 01:40:29.882981 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-wqmlj" podStartSLOduration=1.848663452 podStartE2EDuration="11.882960953s" podCreationTimestamp="2026-03-13 01:40:18 +0000 UTC" firstStartedPulling="2026-03-13 01:40:19.562286494 +0000 UTC m=+1280.370407454" lastFinishedPulling="2026-03-13 01:40:29.596583985 +0000 UTC m=+1290.404704955" observedRunningTime="2026-03-13 01:40:29.873767282 +0000 UTC m=+1290.681888252" watchObservedRunningTime="2026-03-13 01:40:29.882960953 +0000 UTC m=+1290.691081913" Mar 13 01:40:29.908334 master-0 kubenswrapper[19170]: I0313 01:40:29.908244 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-72q4d" podStartSLOduration=2.700585032 
podStartE2EDuration="9.90821996s" podCreationTimestamp="2026-03-13 01:40:20 +0000 UTC" firstStartedPulling="2026-03-13 01:40:21.42497249 +0000 UTC m=+1282.233093450" lastFinishedPulling="2026-03-13 01:40:28.632607418 +0000 UTC m=+1289.440728378" observedRunningTime="2026-03-13 01:40:29.899075101 +0000 UTC m=+1290.707196061" watchObservedRunningTime="2026-03-13 01:40:29.90821996 +0000 UTC m=+1290.716340910" Mar 13 01:40:29.944632 master-0 kubenswrapper[19170]: I0313 01:40:29.944546 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-g2t7x" podStartSLOduration=3.160356749 podStartE2EDuration="9.944525739s" podCreationTimestamp="2026-03-13 01:40:20 +0000 UTC" firstStartedPulling="2026-03-13 01:40:21.840139536 +0000 UTC m=+1282.648260496" lastFinishedPulling="2026-03-13 01:40:28.624308516 +0000 UTC m=+1289.432429486" observedRunningTime="2026-03-13 01:40:29.937969194 +0000 UTC m=+1290.746090164" watchObservedRunningTime="2026-03-13 01:40:29.944525739 +0000 UTC m=+1290.752646699" Mar 13 01:40:29.969938 master-0 kubenswrapper[19170]: I0313 01:40:29.969855 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mgg76" podStartSLOduration=3.258585466 podStartE2EDuration="9.969836428s" podCreationTimestamp="2026-03-13 01:40:20 +0000 UTC" firstStartedPulling="2026-03-13 01:40:21.913039133 +0000 UTC m=+1282.721160093" lastFinishedPulling="2026-03-13 01:40:28.624290095 +0000 UTC m=+1289.432411055" observedRunningTime="2026-03-13 01:40:29.962069404 +0000 UTC m=+1290.770190364" watchObservedRunningTime="2026-03-13 01:40:29.969836428 +0000 UTC m=+1290.777957388" Mar 13 01:40:30.031494 master-0 kubenswrapper[19170]: I0313 01:40:30.031415 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-rcf6z" podStartSLOduration=2.409611042 podStartE2EDuration="9.031395344s" 
podCreationTimestamp="2026-03-13 01:40:21 +0000 UTC" firstStartedPulling="2026-03-13 01:40:22.010423362 +0000 UTC m=+1282.818544322" lastFinishedPulling="2026-03-13 01:40:28.632207654 +0000 UTC m=+1289.440328624" observedRunningTime="2026-03-13 01:40:29.992823141 +0000 UTC m=+1290.800944121" watchObservedRunningTime="2026-03-13 01:40:30.031395344 +0000 UTC m=+1290.839516314" Mar 13 01:40:30.880139 master-0 kubenswrapper[19170]: I0313 01:40:30.880067 19170 generic.go:334] "Generic (PLEG): container finished" podID="1ef92c3a-7b62-42e8-909b-1cadf7157035" containerID="680bab52f4c3333495f9217d3b55a66229658fa66d99e5020c98919af275a4bf" exitCode=0 Mar 13 01:40:30.881170 master-0 kubenswrapper[19170]: I0313 01:40:30.880223 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pfmr9" event={"ID":"1ef92c3a-7b62-42e8-909b-1cadf7157035","Type":"ContainerDied","Data":"680bab52f4c3333495f9217d3b55a66229658fa66d99e5020c98919af275a4bf"} Mar 13 01:40:31.659957 master-0 kubenswrapper[19170]: I0313 01:40:31.659894 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6494dc8c6b-x76zk" Mar 13 01:40:31.659957 master-0 kubenswrapper[19170]: I0313 01:40:31.659962 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6494dc8c6b-x76zk" Mar 13 01:40:31.672745 master-0 kubenswrapper[19170]: I0313 01:40:31.672676 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6494dc8c6b-x76zk" Mar 13 01:40:31.892903 master-0 kubenswrapper[19170]: I0313 01:40:31.892852 19170 generic.go:334] "Generic (PLEG): container finished" podID="1ef92c3a-7b62-42e8-909b-1cadf7157035" containerID="d344cce7e8389fdeb32c3f49c111a54cc5f762aeca0ae7248f9bfe4ba9f5164c" exitCode=0 Mar 13 01:40:31.893542 master-0 kubenswrapper[19170]: I0313 01:40:31.892975 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pfmr9" 
event={"ID":"1ef92c3a-7b62-42e8-909b-1cadf7157035","Type":"ContainerDied","Data":"d344cce7e8389fdeb32c3f49c111a54cc5f762aeca0ae7248f9bfe4ba9f5164c"} Mar 13 01:40:31.899749 master-0 kubenswrapper[19170]: I0313 01:40:31.899717 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6494dc8c6b-x76zk" Mar 13 01:40:32.033814 master-0 kubenswrapper[19170]: I0313 01:40:32.033759 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5f9c97b86b-w5fxw"] Mar 13 01:40:32.913823 master-0 kubenswrapper[19170]: I0313 01:40:32.913718 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pfmr9" event={"ID":"1ef92c3a-7b62-42e8-909b-1cadf7157035","Type":"ContainerStarted","Data":"39aff0adf964acac0755bf79e3d46d043444f980e61ec3ef9d96aab434a47f7b"} Mar 13 01:40:32.914678 master-0 kubenswrapper[19170]: I0313 01:40:32.913923 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pfmr9" event={"ID":"1ef92c3a-7b62-42e8-909b-1cadf7157035","Type":"ContainerStarted","Data":"74866bc335b0d6ea81d6d09096f2a3c0ae7d6af550d5b71911e7229e0735e5a7"} Mar 13 01:40:32.914678 master-0 kubenswrapper[19170]: I0313 01:40:32.914078 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pfmr9" event={"ID":"1ef92c3a-7b62-42e8-909b-1cadf7157035","Type":"ContainerStarted","Data":"5500144737d897b6e66d129ee1aa0214d0a9ca55910ed8d2f87379ba2da954a8"} Mar 13 01:40:32.914678 master-0 kubenswrapper[19170]: I0313 01:40:32.914123 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pfmr9" event={"ID":"1ef92c3a-7b62-42e8-909b-1cadf7157035","Type":"ContainerStarted","Data":"e9d1b738845aa630379e94affb373bbb6c0161fcb1cfaf931cbb876121f4043d"} Mar 13 01:40:32.914678 master-0 kubenswrapper[19170]: I0313 01:40:32.914154 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pfmr9" 
event={"ID":"1ef92c3a-7b62-42e8-909b-1cadf7157035","Type":"ContainerStarted","Data":"15509d1936743b868366a08a90f1d5f75149929210a84f2a47b3f26026d98f45"} Mar 13 01:40:33.936821 master-0 kubenswrapper[19170]: I0313 01:40:33.936726 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-pfmr9" event={"ID":"1ef92c3a-7b62-42e8-909b-1cadf7157035","Type":"ContainerStarted","Data":"f6ca2ebe16c0dde5239aebb5abad7a007f33e476372436a22ed0d41e7e16224c"} Mar 13 01:40:33.937972 master-0 kubenswrapper[19170]: I0313 01:40:33.937031 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-pfmr9" Mar 13 01:40:33.984033 master-0 kubenswrapper[19170]: I0313 01:40:33.983918 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-pfmr9" podStartSLOduration=6.6938962889999996 podStartE2EDuration="15.983890326s" podCreationTimestamp="2026-03-13 01:40:18 +0000 UTC" firstStartedPulling="2026-03-13 01:40:19.383314823 +0000 UTC m=+1280.191435773" lastFinishedPulling="2026-03-13 01:40:28.67330885 +0000 UTC m=+1289.481429810" observedRunningTime="2026-03-13 01:40:33.97454475 +0000 UTC m=+1294.782665780" watchObservedRunningTime="2026-03-13 01:40:33.983890326 +0000 UTC m=+1294.792011316" Mar 13 01:40:34.195760 master-0 kubenswrapper[19170]: I0313 01:40:34.195560 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-pfmr9" Mar 13 01:40:34.258956 master-0 kubenswrapper[19170]: I0313 01:40:34.258890 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-pfmr9" Mar 13 01:40:36.418606 master-0 kubenswrapper[19170]: I0313 01:40:36.418542 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-72q4d" Mar 13 01:40:39.111497 master-0 kubenswrapper[19170]: I0313 01:40:39.111461 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-wqmlj" Mar 13 01:40:40.751352 master-0 kubenswrapper[19170]: I0313 01:40:40.751300 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-qx8lb" Mar 13 01:40:41.299566 master-0 kubenswrapper[19170]: I0313 01:40:41.299479 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-mgg76" Mar 13 01:40:46.537499 master-0 kubenswrapper[19170]: I0313 01:40:46.537430 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-storage/vg-manager-2xbbc"] Mar 13 01:40:46.539228 master-0 kubenswrapper[19170]: I0313 01:40:46.539181 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-storage/vg-manager-2xbbc" Mar 13 01:40:46.554560 master-0 kubenswrapper[19170]: I0313 01:40:46.554494 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"vg-manager-metrics-cert" Mar 13 01:40:46.555954 master-0 kubenswrapper[19170]: I0313 01:40:46.555897 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/vg-manager-2xbbc"] Mar 13 01:40:46.661568 master-0 kubenswrapper[19170]: I0313 01:40:46.661502 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/fb6a162b-423c-4701-b52e-7c94383377b3-file-lock-dir\") pod \"vg-manager-2xbbc\" (UID: \"fb6a162b-423c-4701-b52e-7c94383377b3\") " pod="openshift-storage/vg-manager-2xbbc" Mar 13 01:40:46.661568 master-0 kubenswrapper[19170]: I0313 01:40:46.661562 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/fb6a162b-423c-4701-b52e-7c94383377b3-lvmd-config\") pod \"vg-manager-2xbbc\" (UID: \"fb6a162b-423c-4701-b52e-7c94383377b3\") " 
pod="openshift-storage/vg-manager-2xbbc" Mar 13 01:40:46.661904 master-0 kubenswrapper[19170]: I0313 01:40:46.661613 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/fb6a162b-423c-4701-b52e-7c94383377b3-csi-plugin-dir\") pod \"vg-manager-2xbbc\" (UID: \"fb6a162b-423c-4701-b52e-7c94383377b3\") " pod="openshift-storage/vg-manager-2xbbc" Mar 13 01:40:46.661904 master-0 kubenswrapper[19170]: I0313 01:40:46.661654 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fb6a162b-423c-4701-b52e-7c94383377b3-registration-dir\") pod \"vg-manager-2xbbc\" (UID: \"fb6a162b-423c-4701-b52e-7c94383377b3\") " pod="openshift-storage/vg-manager-2xbbc" Mar 13 01:40:46.661904 master-0 kubenswrapper[19170]: I0313 01:40:46.661685 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/fb6a162b-423c-4701-b52e-7c94383377b3-pod-volumes-dir\") pod \"vg-manager-2xbbc\" (UID: \"fb6a162b-423c-4701-b52e-7c94383377b3\") " pod="openshift-storage/vg-manager-2xbbc" Mar 13 01:40:46.661904 master-0 kubenswrapper[19170]: I0313 01:40:46.661719 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/fb6a162b-423c-4701-b52e-7c94383377b3-metrics-cert\") pod \"vg-manager-2xbbc\" (UID: \"fb6a162b-423c-4701-b52e-7c94383377b3\") " pod="openshift-storage/vg-manager-2xbbc" Mar 13 01:40:46.661904 master-0 kubenswrapper[19170]: I0313 01:40:46.661745 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fb6a162b-423c-4701-b52e-7c94383377b3-device-dir\") pod \"vg-manager-2xbbc\" (UID: 
\"fb6a162b-423c-4701-b52e-7c94383377b3\") " pod="openshift-storage/vg-manager-2xbbc" Mar 13 01:40:46.661904 master-0 kubenswrapper[19170]: I0313 01:40:46.661779 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mvlk\" (UniqueName: \"kubernetes.io/projected/fb6a162b-423c-4701-b52e-7c94383377b3-kube-api-access-9mvlk\") pod \"vg-manager-2xbbc\" (UID: \"fb6a162b-423c-4701-b52e-7c94383377b3\") " pod="openshift-storage/vg-manager-2xbbc" Mar 13 01:40:46.661904 master-0 kubenswrapper[19170]: I0313 01:40:46.661821 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/fb6a162b-423c-4701-b52e-7c94383377b3-run-udev\") pod \"vg-manager-2xbbc\" (UID: \"fb6a162b-423c-4701-b52e-7c94383377b3\") " pod="openshift-storage/vg-manager-2xbbc" Mar 13 01:40:46.661904 master-0 kubenswrapper[19170]: I0313 01:40:46.661854 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/fb6a162b-423c-4701-b52e-7c94383377b3-node-plugin-dir\") pod \"vg-manager-2xbbc\" (UID: \"fb6a162b-423c-4701-b52e-7c94383377b3\") " pod="openshift-storage/vg-manager-2xbbc" Mar 13 01:40:46.661904 master-0 kubenswrapper[19170]: I0313 01:40:46.661884 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fb6a162b-423c-4701-b52e-7c94383377b3-sys\") pod \"vg-manager-2xbbc\" (UID: \"fb6a162b-423c-4701-b52e-7c94383377b3\") " pod="openshift-storage/vg-manager-2xbbc" Mar 13 01:40:46.763655 master-0 kubenswrapper[19170]: I0313 01:40:46.763572 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fb6a162b-423c-4701-b52e-7c94383377b3-sys\") pod \"vg-manager-2xbbc\" (UID: 
\"fb6a162b-423c-4701-b52e-7c94383377b3\") " pod="openshift-storage/vg-manager-2xbbc" Mar 13 01:40:46.763904 master-0 kubenswrapper[19170]: I0313 01:40:46.763744 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/fb6a162b-423c-4701-b52e-7c94383377b3-file-lock-dir\") pod \"vg-manager-2xbbc\" (UID: \"fb6a162b-423c-4701-b52e-7c94383377b3\") " pod="openshift-storage/vg-manager-2xbbc" Mar 13 01:40:46.763904 master-0 kubenswrapper[19170]: I0313 01:40:46.763750 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fb6a162b-423c-4701-b52e-7c94383377b3-sys\") pod \"vg-manager-2xbbc\" (UID: \"fb6a162b-423c-4701-b52e-7c94383377b3\") " pod="openshift-storage/vg-manager-2xbbc" Mar 13 01:40:46.763904 master-0 kubenswrapper[19170]: I0313 01:40:46.763784 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/fb6a162b-423c-4701-b52e-7c94383377b3-lvmd-config\") pod \"vg-manager-2xbbc\" (UID: \"fb6a162b-423c-4701-b52e-7c94383377b3\") " pod="openshift-storage/vg-manager-2xbbc" Mar 13 01:40:46.764024 master-0 kubenswrapper[19170]: I0313 01:40:46.763910 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/fb6a162b-423c-4701-b52e-7c94383377b3-csi-plugin-dir\") pod \"vg-manager-2xbbc\" (UID: \"fb6a162b-423c-4701-b52e-7c94383377b3\") " pod="openshift-storage/vg-manager-2xbbc" Mar 13 01:40:46.764024 master-0 kubenswrapper[19170]: I0313 01:40:46.763948 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fb6a162b-423c-4701-b52e-7c94383377b3-registration-dir\") pod \"vg-manager-2xbbc\" (UID: \"fb6a162b-423c-4701-b52e-7c94383377b3\") " pod="openshift-storage/vg-manager-2xbbc" Mar 13 
01:40:46.764024 master-0 kubenswrapper[19170]: I0313 01:40:46.763989 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/fb6a162b-423c-4701-b52e-7c94383377b3-pod-volumes-dir\") pod \"vg-manager-2xbbc\" (UID: \"fb6a162b-423c-4701-b52e-7c94383377b3\") " pod="openshift-storage/vg-manager-2xbbc" Mar 13 01:40:46.764112 master-0 kubenswrapper[19170]: I0313 01:40:46.764077 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/fb6a162b-423c-4701-b52e-7c94383377b3-lvmd-config\") pod \"vg-manager-2xbbc\" (UID: \"fb6a162b-423c-4701-b52e-7c94383377b3\") " pod="openshift-storage/vg-manager-2xbbc" Mar 13 01:40:46.764210 master-0 kubenswrapper[19170]: I0313 01:40:46.764173 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/fb6a162b-423c-4701-b52e-7c94383377b3-file-lock-dir\") pod \"vg-manager-2xbbc\" (UID: \"fb6a162b-423c-4701-b52e-7c94383377b3\") " pod="openshift-storage/vg-manager-2xbbc" Mar 13 01:40:46.764254 master-0 kubenswrapper[19170]: I0313 01:40:46.764182 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fb6a162b-423c-4701-b52e-7c94383377b3-registration-dir\") pod \"vg-manager-2xbbc\" (UID: \"fb6a162b-423c-4701-b52e-7c94383377b3\") " pod="openshift-storage/vg-manager-2xbbc" Mar 13 01:40:46.764254 master-0 kubenswrapper[19170]: I0313 01:40:46.764087 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/fb6a162b-423c-4701-b52e-7c94383377b3-metrics-cert\") pod \"vg-manager-2xbbc\" (UID: \"fb6a162b-423c-4701-b52e-7c94383377b3\") " pod="openshift-storage/vg-manager-2xbbc" Mar 13 01:40:46.764316 master-0 kubenswrapper[19170]: I0313 01:40:46.764270 19170 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fb6a162b-423c-4701-b52e-7c94383377b3-device-dir\") pod \"vg-manager-2xbbc\" (UID: \"fb6a162b-423c-4701-b52e-7c94383377b3\") " pod="openshift-storage/vg-manager-2xbbc" Mar 13 01:40:46.764316 master-0 kubenswrapper[19170]: I0313 01:40:46.764269 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/fb6a162b-423c-4701-b52e-7c94383377b3-csi-plugin-dir\") pod \"vg-manager-2xbbc\" (UID: \"fb6a162b-423c-4701-b52e-7c94383377b3\") " pod="openshift-storage/vg-manager-2xbbc" Mar 13 01:40:46.764379 master-0 kubenswrapper[19170]: I0313 01:40:46.764321 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mvlk\" (UniqueName: \"kubernetes.io/projected/fb6a162b-423c-4701-b52e-7c94383377b3-kube-api-access-9mvlk\") pod \"vg-manager-2xbbc\" (UID: \"fb6a162b-423c-4701-b52e-7c94383377b3\") " pod="openshift-storage/vg-manager-2xbbc" Mar 13 01:40:46.764379 master-0 kubenswrapper[19170]: I0313 01:40:46.764204 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/fb6a162b-423c-4701-b52e-7c94383377b3-pod-volumes-dir\") pod \"vg-manager-2xbbc\" (UID: \"fb6a162b-423c-4701-b52e-7c94383377b3\") " pod="openshift-storage/vg-manager-2xbbc" Mar 13 01:40:46.764446 master-0 kubenswrapper[19170]: I0313 01:40:46.764392 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fb6a162b-423c-4701-b52e-7c94383377b3-device-dir\") pod \"vg-manager-2xbbc\" (UID: \"fb6a162b-423c-4701-b52e-7c94383377b3\") " pod="openshift-storage/vg-manager-2xbbc" Mar 13 01:40:46.764490 master-0 kubenswrapper[19170]: I0313 01:40:46.764466 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-udev\" 
(UniqueName: \"kubernetes.io/host-path/fb6a162b-423c-4701-b52e-7c94383377b3-run-udev\") pod \"vg-manager-2xbbc\" (UID: \"fb6a162b-423c-4701-b52e-7c94383377b3\") " pod="openshift-storage/vg-manager-2xbbc" Mar 13 01:40:46.764568 master-0 kubenswrapper[19170]: I0313 01:40:46.764538 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/fb6a162b-423c-4701-b52e-7c94383377b3-node-plugin-dir\") pod \"vg-manager-2xbbc\" (UID: \"fb6a162b-423c-4701-b52e-7c94383377b3\") " pod="openshift-storage/vg-manager-2xbbc" Mar 13 01:40:46.764744 master-0 kubenswrapper[19170]: I0313 01:40:46.764693 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/fb6a162b-423c-4701-b52e-7c94383377b3-run-udev\") pod \"vg-manager-2xbbc\" (UID: \"fb6a162b-423c-4701-b52e-7c94383377b3\") " pod="openshift-storage/vg-manager-2xbbc" Mar 13 01:40:46.764865 master-0 kubenswrapper[19170]: I0313 01:40:46.764839 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/fb6a162b-423c-4701-b52e-7c94383377b3-node-plugin-dir\") pod \"vg-manager-2xbbc\" (UID: \"fb6a162b-423c-4701-b52e-7c94383377b3\") " pod="openshift-storage/vg-manager-2xbbc" Mar 13 01:40:46.768163 master-0 kubenswrapper[19170]: I0313 01:40:46.768109 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/fb6a162b-423c-4701-b52e-7c94383377b3-metrics-cert\") pod \"vg-manager-2xbbc\" (UID: \"fb6a162b-423c-4701-b52e-7c94383377b3\") " pod="openshift-storage/vg-manager-2xbbc" Mar 13 01:40:46.797605 master-0 kubenswrapper[19170]: I0313 01:40:46.797466 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mvlk\" (UniqueName: \"kubernetes.io/projected/fb6a162b-423c-4701-b52e-7c94383377b3-kube-api-access-9mvlk\") pod 
\"vg-manager-2xbbc\" (UID: \"fb6a162b-423c-4701-b52e-7c94383377b3\") " pod="openshift-storage/vg-manager-2xbbc" Mar 13 01:40:46.913417 master-0 kubenswrapper[19170]: I0313 01:40:46.913350 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-storage/vg-manager-2xbbc" Mar 13 01:40:47.435435 master-0 kubenswrapper[19170]: I0313 01:40:47.435397 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/vg-manager-2xbbc"] Mar 13 01:40:47.442255 master-0 kubenswrapper[19170]: W0313 01:40:47.442216 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb6a162b_423c_4701_b52e_7c94383377b3.slice/crio-925cbdca7b8e89bd156f7c8a0944df56aded2d4c16b9e00146a5f6e7088a0b25 WatchSource:0}: Error finding container 925cbdca7b8e89bd156f7c8a0944df56aded2d4c16b9e00146a5f6e7088a0b25: Status 404 returned error can't find the container with id 925cbdca7b8e89bd156f7c8a0944df56aded2d4c16b9e00146a5f6e7088a0b25 Mar 13 01:40:48.081784 master-0 kubenswrapper[19170]: I0313 01:40:48.081728 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-2xbbc" event={"ID":"fb6a162b-423c-4701-b52e-7c94383377b3","Type":"ContainerStarted","Data":"b1f7ebba10f56833e651fd8a11a5dfc908d159e7483d13fc5c0df15959bbe9c6"} Mar 13 01:40:48.081784 master-0 kubenswrapper[19170]: I0313 01:40:48.081787 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-2xbbc" event={"ID":"fb6a162b-423c-4701-b52e-7c94383377b3","Type":"ContainerStarted","Data":"925cbdca7b8e89bd156f7c8a0944df56aded2d4c16b9e00146a5f6e7088a0b25"} Mar 13 01:40:48.104968 master-0 kubenswrapper[19170]: I0313 01:40:48.104895 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-storage/vg-manager-2xbbc" podStartSLOduration=2.104877833 podStartE2EDuration="2.104877833s" podCreationTimestamp="2026-03-13 01:40:46 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:40:48.099781616 +0000 UTC m=+1308.907902576" watchObservedRunningTime="2026-03-13 01:40:48.104877833 +0000 UTC m=+1308.912998783" Mar 13 01:40:49.207215 master-0 kubenswrapper[19170]: I0313 01:40:49.207153 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-pfmr9" Mar 13 01:40:50.114467 master-0 kubenswrapper[19170]: I0313 01:40:50.114394 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-2xbbc_fb6a162b-423c-4701-b52e-7c94383377b3/vg-manager/0.log" Mar 13 01:40:50.114878 master-0 kubenswrapper[19170]: I0313 01:40:50.114509 19170 generic.go:334] "Generic (PLEG): container finished" podID="fb6a162b-423c-4701-b52e-7c94383377b3" containerID="b1f7ebba10f56833e651fd8a11a5dfc908d159e7483d13fc5c0df15959bbe9c6" exitCode=1 Mar 13 01:40:50.114878 master-0 kubenswrapper[19170]: I0313 01:40:50.114557 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-2xbbc" event={"ID":"fb6a162b-423c-4701-b52e-7c94383377b3","Type":"ContainerDied","Data":"b1f7ebba10f56833e651fd8a11a5dfc908d159e7483d13fc5c0df15959bbe9c6"} Mar 13 01:40:50.115371 master-0 kubenswrapper[19170]: I0313 01:40:50.115326 19170 scope.go:117] "RemoveContainer" containerID="b1f7ebba10f56833e651fd8a11a5dfc908d159e7483d13fc5c0df15959bbe9c6" Mar 13 01:40:50.522508 master-0 kubenswrapper[19170]: I0313 01:40:50.522464 19170 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/topolvm.io-reg.sock" Mar 13 01:40:51.124861 master-0 kubenswrapper[19170]: I0313 01:40:51.124128 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-2xbbc_fb6a162b-423c-4701-b52e-7c94383377b3/vg-manager/0.log" Mar 13 01:40:51.124861 master-0 kubenswrapper[19170]: I0313 
01:40:51.124195 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-2xbbc" event={"ID":"fb6a162b-423c-4701-b52e-7c94383377b3","Type":"ContainerStarted","Data":"c10c46b4441e297f98baddbd83015e3035f5957d44710b761fbd41ac1c73fbf1"} Mar 13 01:40:51.174927 master-0 kubenswrapper[19170]: I0313 01:40:51.174604 19170 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/topolvm.io-reg.sock","Timestamp":"2026-03-13T01:40:50.523296119Z","Handler":null,"Name":""} Mar 13 01:40:51.177790 master-0 kubenswrapper[19170]: I0313 01:40:51.177619 19170 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: topolvm.io endpoint: /var/lib/kubelet/plugins/topolvm.io/node/csi-topolvm.sock versions: 1.0.0 Mar 13 01:40:51.177790 master-0 kubenswrapper[19170]: I0313 01:40:51.177692 19170 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: topolvm.io at endpoint: /var/lib/kubelet/plugins/topolvm.io/node/csi-topolvm.sock Mar 13 01:40:56.914303 master-0 kubenswrapper[19170]: I0313 01:40:56.914209 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-storage/vg-manager-2xbbc" Mar 13 01:40:56.918345 master-0 kubenswrapper[19170]: I0313 01:40:56.918240 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-storage/vg-manager-2xbbc" Mar 13 01:40:57.083941 master-0 kubenswrapper[19170]: I0313 01:40:57.083868 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-5f9c97b86b-w5fxw" podUID="bc107bad-0393-441c-9815-09f27f25888c" containerName="console" containerID="cri-o://6fb5687cf3d8c7503507bd917060ca214ad6ad948c7e54ae535b31c656876566" gracePeriod=15 Mar 13 01:40:57.185835 master-0 kubenswrapper[19170]: I0313 01:40:57.185673 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-storage/vg-manager-2xbbc" 
Mar 13 01:40:57.186957 master-0 kubenswrapper[19170]: I0313 01:40:57.186921 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-storage/vg-manager-2xbbc" Mar 13 01:40:57.669536 master-0 kubenswrapper[19170]: I0313 01:40:57.669464 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f9c97b86b-w5fxw_bc107bad-0393-441c-9815-09f27f25888c/console/0.log" Mar 13 01:40:57.669759 master-0 kubenswrapper[19170]: I0313 01:40:57.669592 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f9c97b86b-w5fxw" Mar 13 01:40:57.813458 master-0 kubenswrapper[19170]: I0313 01:40:57.813376 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcsxq\" (UniqueName: \"kubernetes.io/projected/bc107bad-0393-441c-9815-09f27f25888c-kube-api-access-tcsxq\") pod \"bc107bad-0393-441c-9815-09f27f25888c\" (UID: \"bc107bad-0393-441c-9815-09f27f25888c\") " Mar 13 01:40:57.813676 master-0 kubenswrapper[19170]: I0313 01:40:57.813464 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bc107bad-0393-441c-9815-09f27f25888c-service-ca\") pod \"bc107bad-0393-441c-9815-09f27f25888c\" (UID: \"bc107bad-0393-441c-9815-09f27f25888c\") " Mar 13 01:40:57.813676 master-0 kubenswrapper[19170]: I0313 01:40:57.813515 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bc107bad-0393-441c-9815-09f27f25888c-console-config\") pod \"bc107bad-0393-441c-9815-09f27f25888c\" (UID: \"bc107bad-0393-441c-9815-09f27f25888c\") " Mar 13 01:40:57.813676 master-0 kubenswrapper[19170]: I0313 01:40:57.813614 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/bc107bad-0393-441c-9815-09f27f25888c-oauth-serving-cert\") pod \"bc107bad-0393-441c-9815-09f27f25888c\" (UID: \"bc107bad-0393-441c-9815-09f27f25888c\") " Mar 13 01:40:57.813799 master-0 kubenswrapper[19170]: I0313 01:40:57.813703 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc107bad-0393-441c-9815-09f27f25888c-console-serving-cert\") pod \"bc107bad-0393-441c-9815-09f27f25888c\" (UID: \"bc107bad-0393-441c-9815-09f27f25888c\") " Mar 13 01:40:57.813866 master-0 kubenswrapper[19170]: I0313 01:40:57.813829 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc107bad-0393-441c-9815-09f27f25888c-trusted-ca-bundle\") pod \"bc107bad-0393-441c-9815-09f27f25888c\" (UID: \"bc107bad-0393-441c-9815-09f27f25888c\") " Mar 13 01:40:57.814465 master-0 kubenswrapper[19170]: I0313 01:40:57.814411 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc107bad-0393-441c-9815-09f27f25888c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "bc107bad-0393-441c-9815-09f27f25888c" (UID: "bc107bad-0393-441c-9815-09f27f25888c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:40:57.815053 master-0 kubenswrapper[19170]: I0313 01:40:57.814990 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc107bad-0393-441c-9815-09f27f25888c-console-config" (OuterVolumeSpecName: "console-config") pod "bc107bad-0393-441c-9815-09f27f25888c" (UID: "bc107bad-0393-441c-9815-09f27f25888c"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:40:57.815177 master-0 kubenswrapper[19170]: I0313 01:40:57.815117 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc107bad-0393-441c-9815-09f27f25888c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "bc107bad-0393-441c-9815-09f27f25888c" (UID: "bc107bad-0393-441c-9815-09f27f25888c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:40:57.815328 master-0 kubenswrapper[19170]: I0313 01:40:57.815278 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc107bad-0393-441c-9815-09f27f25888c-service-ca" (OuterVolumeSpecName: "service-ca") pod "bc107bad-0393-441c-9815-09f27f25888c" (UID: "bc107bad-0393-441c-9815-09f27f25888c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:40:57.815500 master-0 kubenswrapper[19170]: I0313 01:40:57.815470 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bc107bad-0393-441c-9815-09f27f25888c-console-oauth-config\") pod \"bc107bad-0393-441c-9815-09f27f25888c\" (UID: \"bc107bad-0393-441c-9815-09f27f25888c\") " Mar 13 01:40:57.816755 master-0 kubenswrapper[19170]: I0313 01:40:57.816670 19170 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc107bad-0393-441c-9815-09f27f25888c-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:40:57.816755 master-0 kubenswrapper[19170]: I0313 01:40:57.816709 19170 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bc107bad-0393-441c-9815-09f27f25888c-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 13 01:40:57.816755 master-0 kubenswrapper[19170]: I0313 01:40:57.816728 19170 
reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bc107bad-0393-441c-9815-09f27f25888c-console-config\") on node \"master-0\" DevicePath \"\"" Mar 13 01:40:57.816755 master-0 kubenswrapper[19170]: I0313 01:40:57.816751 19170 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bc107bad-0393-441c-9815-09f27f25888c-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 13 01:40:57.817837 master-0 kubenswrapper[19170]: I0313 01:40:57.817775 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc107bad-0393-441c-9815-09f27f25888c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "bc107bad-0393-441c-9815-09f27f25888c" (UID: "bc107bad-0393-441c-9815-09f27f25888c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:40:57.820214 master-0 kubenswrapper[19170]: I0313 01:40:57.820165 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc107bad-0393-441c-9815-09f27f25888c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "bc107bad-0393-441c-9815-09f27f25888c" (UID: "bc107bad-0393-441c-9815-09f27f25888c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:40:57.820892 master-0 kubenswrapper[19170]: I0313 01:40:57.820841 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc107bad-0393-441c-9815-09f27f25888c-kube-api-access-tcsxq" (OuterVolumeSpecName: "kube-api-access-tcsxq") pod "bc107bad-0393-441c-9815-09f27f25888c" (UID: "bc107bad-0393-441c-9815-09f27f25888c"). InnerVolumeSpecName "kube-api-access-tcsxq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:40:57.918771 master-0 kubenswrapper[19170]: I0313 01:40:57.918692 19170 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bc107bad-0393-441c-9815-09f27f25888c-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Mar 13 01:40:57.918771 master-0 kubenswrapper[19170]: I0313 01:40:57.918753 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcsxq\" (UniqueName: \"kubernetes.io/projected/bc107bad-0393-441c-9815-09f27f25888c-kube-api-access-tcsxq\") on node \"master-0\" DevicePath \"\"" Mar 13 01:40:57.918771 master-0 kubenswrapper[19170]: I0313 01:40:57.918773 19170 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc107bad-0393-441c-9815-09f27f25888c-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 13 01:40:58.200595 master-0 kubenswrapper[19170]: I0313 01:40:58.200517 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f9c97b86b-w5fxw_bc107bad-0393-441c-9815-09f27f25888c/console/0.log" Mar 13 01:40:58.200891 master-0 kubenswrapper[19170]: I0313 01:40:58.200668 19170 generic.go:334] "Generic (PLEG): container finished" podID="bc107bad-0393-441c-9815-09f27f25888c" containerID="6fb5687cf3d8c7503507bd917060ca214ad6ad948c7e54ae535b31c656876566" exitCode=2 Mar 13 01:40:58.200891 master-0 kubenswrapper[19170]: I0313 01:40:58.200837 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f9c97b86b-w5fxw" event={"ID":"bc107bad-0393-441c-9815-09f27f25888c","Type":"ContainerDied","Data":"6fb5687cf3d8c7503507bd917060ca214ad6ad948c7e54ae535b31c656876566"} Mar 13 01:40:58.201001 master-0 kubenswrapper[19170]: I0313 01:40:58.200897 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f9c97b86b-w5fxw" 
event={"ID":"bc107bad-0393-441c-9815-09f27f25888c","Type":"ContainerDied","Data":"7828912a4b410f6e3e56c4797442ca59575620dabdb98ea0cf4e79ec90797d9e"} Mar 13 01:40:58.201001 master-0 kubenswrapper[19170]: I0313 01:40:58.200888 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f9c97b86b-w5fxw" Mar 13 01:40:58.201090 master-0 kubenswrapper[19170]: I0313 01:40:58.200943 19170 scope.go:117] "RemoveContainer" containerID="6fb5687cf3d8c7503507bd917060ca214ad6ad948c7e54ae535b31c656876566" Mar 13 01:40:58.241754 master-0 kubenswrapper[19170]: I0313 01:40:58.241684 19170 scope.go:117] "RemoveContainer" containerID="6fb5687cf3d8c7503507bd917060ca214ad6ad948c7e54ae535b31c656876566" Mar 13 01:40:58.242501 master-0 kubenswrapper[19170]: E0313 01:40:58.242367 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fb5687cf3d8c7503507bd917060ca214ad6ad948c7e54ae535b31c656876566\": container with ID starting with 6fb5687cf3d8c7503507bd917060ca214ad6ad948c7e54ae535b31c656876566 not found: ID does not exist" containerID="6fb5687cf3d8c7503507bd917060ca214ad6ad948c7e54ae535b31c656876566" Mar 13 01:40:58.242624 master-0 kubenswrapper[19170]: I0313 01:40:58.242549 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fb5687cf3d8c7503507bd917060ca214ad6ad948c7e54ae535b31c656876566"} err="failed to get container status \"6fb5687cf3d8c7503507bd917060ca214ad6ad948c7e54ae535b31c656876566\": rpc error: code = NotFound desc = could not find container \"6fb5687cf3d8c7503507bd917060ca214ad6ad948c7e54ae535b31c656876566\": container with ID starting with 6fb5687cf3d8c7503507bd917060ca214ad6ad948c7e54ae535b31c656876566 not found: ID does not exist" Mar 13 01:40:58.280604 master-0 kubenswrapper[19170]: I0313 01:40:58.277391 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-console/console-5f9c97b86b-w5fxw"] Mar 13 01:40:58.288681 master-0 kubenswrapper[19170]: I0313 01:40:58.287778 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5f9c97b86b-w5fxw"] Mar 13 01:40:59.376051 master-0 kubenswrapper[19170]: I0313 01:40:59.375982 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-v9pfv"] Mar 13 01:40:59.376680 master-0 kubenswrapper[19170]: E0313 01:40:59.376339 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc107bad-0393-441c-9815-09f27f25888c" containerName="console" Mar 13 01:40:59.376680 master-0 kubenswrapper[19170]: I0313 01:40:59.376354 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc107bad-0393-441c-9815-09f27f25888c" containerName="console" Mar 13 01:40:59.376680 master-0 kubenswrapper[19170]: I0313 01:40:59.376575 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc107bad-0393-441c-9815-09f27f25888c" containerName="console" Mar 13 01:40:59.377263 master-0 kubenswrapper[19170]: I0313 01:40:59.377234 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-v9pfv" Mar 13 01:40:59.378771 master-0 kubenswrapper[19170]: I0313 01:40:59.378730 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 13 01:40:59.383356 master-0 kubenswrapper[19170]: I0313 01:40:59.383303 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 13 01:40:59.385342 master-0 kubenswrapper[19170]: I0313 01:40:59.385259 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95tjl\" (UniqueName: \"kubernetes.io/projected/4699d0fc-d3da-41da-804f-5c1762a556bd-kube-api-access-95tjl\") pod \"openstack-operator-index-v9pfv\" (UID: \"4699d0fc-d3da-41da-804f-5c1762a556bd\") " pod="openstack-operators/openstack-operator-index-v9pfv" Mar 13 01:40:59.451438 master-0 kubenswrapper[19170]: I0313 01:40:59.450458 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc107bad-0393-441c-9815-09f27f25888c" path="/var/lib/kubelet/pods/bc107bad-0393-441c-9815-09f27f25888c/volumes" Mar 13 01:40:59.452476 master-0 kubenswrapper[19170]: I0313 01:40:59.452400 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-v9pfv"] Mar 13 01:40:59.486559 master-0 kubenswrapper[19170]: I0313 01:40:59.486513 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95tjl\" (UniqueName: \"kubernetes.io/projected/4699d0fc-d3da-41da-804f-5c1762a556bd-kube-api-access-95tjl\") pod \"openstack-operator-index-v9pfv\" (UID: \"4699d0fc-d3da-41da-804f-5c1762a556bd\") " pod="openstack-operators/openstack-operator-index-v9pfv" Mar 13 01:40:59.513581 master-0 kubenswrapper[19170]: I0313 01:40:59.513523 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95tjl\" (UniqueName: 
\"kubernetes.io/projected/4699d0fc-d3da-41da-804f-5c1762a556bd-kube-api-access-95tjl\") pod \"openstack-operator-index-v9pfv\" (UID: \"4699d0fc-d3da-41da-804f-5c1762a556bd\") " pod="openstack-operators/openstack-operator-index-v9pfv" Mar 13 01:40:59.703887 master-0 kubenswrapper[19170]: I0313 01:40:59.703817 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-v9pfv" Mar 13 01:41:00.142121 master-0 kubenswrapper[19170]: I0313 01:41:00.141954 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-v9pfv"] Mar 13 01:41:00.237500 master-0 kubenswrapper[19170]: I0313 01:41:00.237417 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-v9pfv" event={"ID":"4699d0fc-d3da-41da-804f-5c1762a556bd","Type":"ContainerStarted","Data":"60a7bb7bc3dfda3f12aa0943fe8f85a2014734e5fefefa5809bfb9126d68defe"} Mar 13 01:41:02.266456 master-0 kubenswrapper[19170]: I0313 01:41:02.266370 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-v9pfv" event={"ID":"4699d0fc-d3da-41da-804f-5c1762a556bd","Type":"ContainerStarted","Data":"9967505fb984e2ff9564cb12826f9b8454e647bb9e27494fe15342dcce0eba57"} Mar 13 01:41:02.300424 master-0 kubenswrapper[19170]: I0313 01:41:02.300308 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-v9pfv" podStartSLOduration=2.399933134 podStartE2EDuration="3.300280588s" podCreationTimestamp="2026-03-13 01:40:59 +0000 UTC" firstStartedPulling="2026-03-13 01:41:00.157839619 +0000 UTC m=+1320.965960609" lastFinishedPulling="2026-03-13 01:41:01.058187103 +0000 UTC m=+1321.866308063" observedRunningTime="2026-03-13 01:41:02.289556587 +0000 UTC m=+1323.097677577" watchObservedRunningTime="2026-03-13 01:41:02.300280588 +0000 UTC m=+1323.108401558" Mar 13 01:41:03.346904 master-0 
kubenswrapper[19170]: I0313 01:41:03.346728 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-v9pfv"] Mar 13 01:41:03.965044 master-0 kubenswrapper[19170]: I0313 01:41:03.964913 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-nqdp6"] Mar 13 01:41:03.967170 master-0 kubenswrapper[19170]: I0313 01:41:03.967118 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nqdp6" Mar 13 01:41:03.989261 master-0 kubenswrapper[19170]: I0313 01:41:03.989118 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nqdp6"] Mar 13 01:41:04.079762 master-0 kubenswrapper[19170]: I0313 01:41:04.079679 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xmsb\" (UniqueName: \"kubernetes.io/projected/ee7a0f56-3758-4091-96f0-fe82e7bcc21f-kube-api-access-7xmsb\") pod \"openstack-operator-index-nqdp6\" (UID: \"ee7a0f56-3758-4091-96f0-fe82e7bcc21f\") " pod="openstack-operators/openstack-operator-index-nqdp6" Mar 13 01:41:04.182152 master-0 kubenswrapper[19170]: I0313 01:41:04.182057 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xmsb\" (UniqueName: \"kubernetes.io/projected/ee7a0f56-3758-4091-96f0-fe82e7bcc21f-kube-api-access-7xmsb\") pod \"openstack-operator-index-nqdp6\" (UID: \"ee7a0f56-3758-4091-96f0-fe82e7bcc21f\") " pod="openstack-operators/openstack-operator-index-nqdp6" Mar 13 01:41:04.206817 master-0 kubenswrapper[19170]: I0313 01:41:04.206749 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xmsb\" (UniqueName: \"kubernetes.io/projected/ee7a0f56-3758-4091-96f0-fe82e7bcc21f-kube-api-access-7xmsb\") pod \"openstack-operator-index-nqdp6\" (UID: \"ee7a0f56-3758-4091-96f0-fe82e7bcc21f\") " 
pod="openstack-operators/openstack-operator-index-nqdp6" Mar 13 01:41:04.288206 master-0 kubenswrapper[19170]: I0313 01:41:04.288057 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-v9pfv" podUID="4699d0fc-d3da-41da-804f-5c1762a556bd" containerName="registry-server" containerID="cri-o://9967505fb984e2ff9564cb12826f9b8454e647bb9e27494fe15342dcce0eba57" gracePeriod=2 Mar 13 01:41:04.344102 master-0 kubenswrapper[19170]: I0313 01:41:04.344033 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nqdp6" Mar 13 01:41:04.837002 master-0 kubenswrapper[19170]: I0313 01:41:04.836892 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-v9pfv" Mar 13 01:41:04.917026 master-0 kubenswrapper[19170]: I0313 01:41:04.916915 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95tjl\" (UniqueName: \"kubernetes.io/projected/4699d0fc-d3da-41da-804f-5c1762a556bd-kube-api-access-95tjl\") pod \"4699d0fc-d3da-41da-804f-5c1762a556bd\" (UID: \"4699d0fc-d3da-41da-804f-5c1762a556bd\") " Mar 13 01:41:04.920795 master-0 kubenswrapper[19170]: I0313 01:41:04.920724 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4699d0fc-d3da-41da-804f-5c1762a556bd-kube-api-access-95tjl" (OuterVolumeSpecName: "kube-api-access-95tjl") pod "4699d0fc-d3da-41da-804f-5c1762a556bd" (UID: "4699d0fc-d3da-41da-804f-5c1762a556bd"). InnerVolumeSpecName "kube-api-access-95tjl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:41:04.925722 master-0 kubenswrapper[19170]: I0313 01:41:04.925506 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nqdp6"] Mar 13 01:41:05.018809 master-0 kubenswrapper[19170]: I0313 01:41:05.018755 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95tjl\" (UniqueName: \"kubernetes.io/projected/4699d0fc-d3da-41da-804f-5c1762a556bd-kube-api-access-95tjl\") on node \"master-0\" DevicePath \"\"" Mar 13 01:41:05.297127 master-0 kubenswrapper[19170]: I0313 01:41:05.297059 19170 generic.go:334] "Generic (PLEG): container finished" podID="4699d0fc-d3da-41da-804f-5c1762a556bd" containerID="9967505fb984e2ff9564cb12826f9b8454e647bb9e27494fe15342dcce0eba57" exitCode=0 Mar 13 01:41:05.297321 master-0 kubenswrapper[19170]: I0313 01:41:05.297137 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-v9pfv" event={"ID":"4699d0fc-d3da-41da-804f-5c1762a556bd","Type":"ContainerDied","Data":"9967505fb984e2ff9564cb12826f9b8454e647bb9e27494fe15342dcce0eba57"} Mar 13 01:41:05.297321 master-0 kubenswrapper[19170]: I0313 01:41:05.297218 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-v9pfv" event={"ID":"4699d0fc-d3da-41da-804f-5c1762a556bd","Type":"ContainerDied","Data":"60a7bb7bc3dfda3f12aa0943fe8f85a2014734e5fefefa5809bfb9126d68defe"} Mar 13 01:41:05.297321 master-0 kubenswrapper[19170]: I0313 01:41:05.297254 19170 scope.go:117] "RemoveContainer" containerID="9967505fb984e2ff9564cb12826f9b8454e647bb9e27494fe15342dcce0eba57" Mar 13 01:41:05.297321 master-0 kubenswrapper[19170]: I0313 01:41:05.297153 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-v9pfv" Mar 13 01:41:05.299413 master-0 kubenswrapper[19170]: I0313 01:41:05.298759 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nqdp6" event={"ID":"ee7a0f56-3758-4091-96f0-fe82e7bcc21f","Type":"ContainerStarted","Data":"ee81e8f11ffce578d3dfdeca17650849cf620c5f246d6e915d0de7de519a655b"} Mar 13 01:41:05.315167 master-0 kubenswrapper[19170]: I0313 01:41:05.315113 19170 scope.go:117] "RemoveContainer" containerID="9967505fb984e2ff9564cb12826f9b8454e647bb9e27494fe15342dcce0eba57" Mar 13 01:41:05.315853 master-0 kubenswrapper[19170]: E0313 01:41:05.315724 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9967505fb984e2ff9564cb12826f9b8454e647bb9e27494fe15342dcce0eba57\": container with ID starting with 9967505fb984e2ff9564cb12826f9b8454e647bb9e27494fe15342dcce0eba57 not found: ID does not exist" containerID="9967505fb984e2ff9564cb12826f9b8454e647bb9e27494fe15342dcce0eba57" Mar 13 01:41:05.315853 master-0 kubenswrapper[19170]: I0313 01:41:05.315795 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9967505fb984e2ff9564cb12826f9b8454e647bb9e27494fe15342dcce0eba57"} err="failed to get container status \"9967505fb984e2ff9564cb12826f9b8454e647bb9e27494fe15342dcce0eba57\": rpc error: code = NotFound desc = could not find container \"9967505fb984e2ff9564cb12826f9b8454e647bb9e27494fe15342dcce0eba57\": container with ID starting with 9967505fb984e2ff9564cb12826f9b8454e647bb9e27494fe15342dcce0eba57 not found: ID does not exist" Mar 13 01:41:05.334440 master-0 kubenswrapper[19170]: I0313 01:41:05.334303 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-v9pfv"] Mar 13 01:41:05.344257 master-0 kubenswrapper[19170]: I0313 01:41:05.344123 19170 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-v9pfv"] Mar 13 01:41:05.435531 master-0 kubenswrapper[19170]: I0313 01:41:05.435453 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4699d0fc-d3da-41da-804f-5c1762a556bd" path="/var/lib/kubelet/pods/4699d0fc-d3da-41da-804f-5c1762a556bd/volumes" Mar 13 01:41:06.314617 master-0 kubenswrapper[19170]: I0313 01:41:06.314497 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nqdp6" event={"ID":"ee7a0f56-3758-4091-96f0-fe82e7bcc21f","Type":"ContainerStarted","Data":"fb5c8b803c18df8d508d91f1b4b738d0254909f01cafc582ef86914b87e4392e"} Mar 13 01:41:06.355225 master-0 kubenswrapper[19170]: I0313 01:41:06.355028 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-nqdp6" podStartSLOduration=2.916033401 podStartE2EDuration="3.354992205s" podCreationTimestamp="2026-03-13 01:41:03 +0000 UTC" firstStartedPulling="2026-03-13 01:41:04.929168546 +0000 UTC m=+1325.737289526" lastFinishedPulling="2026-03-13 01:41:05.36812737 +0000 UTC m=+1326.176248330" observedRunningTime="2026-03-13 01:41:06.341515914 +0000 UTC m=+1327.149636904" watchObservedRunningTime="2026-03-13 01:41:06.354992205 +0000 UTC m=+1327.163113205" Mar 13 01:41:14.344616 master-0 kubenswrapper[19170]: I0313 01:41:14.344541 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-nqdp6" Mar 13 01:41:14.344616 master-0 kubenswrapper[19170]: I0313 01:41:14.344613 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-nqdp6" Mar 13 01:41:14.400670 master-0 kubenswrapper[19170]: I0313 01:41:14.397930 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-nqdp6" Mar 13 01:41:14.447322 master-0 
kubenswrapper[19170]: I0313 01:41:14.447233 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-nqdp6" Mar 13 01:41:18.034668 master-0 kubenswrapper[19170]: I0313 01:41:18.032218 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vx5cx"] Mar 13 01:41:18.036922 master-0 kubenswrapper[19170]: E0313 01:41:18.034671 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4699d0fc-d3da-41da-804f-5c1762a556bd" containerName="registry-server" Mar 13 01:41:18.036922 master-0 kubenswrapper[19170]: I0313 01:41:18.034734 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="4699d0fc-d3da-41da-804f-5c1762a556bd" containerName="registry-server" Mar 13 01:41:18.036922 master-0 kubenswrapper[19170]: I0313 01:41:18.036679 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="4699d0fc-d3da-41da-804f-5c1762a556bd" containerName="registry-server" Mar 13 01:41:18.044224 master-0 kubenswrapper[19170]: I0313 01:41:18.044168 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vx5cx" Mar 13 01:41:18.064936 master-0 kubenswrapper[19170]: I0313 01:41:18.064824 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4909a040-a553-4d59-ad84-da6c38ab3acd-util\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vx5cx\" (UID: \"4909a040-a553-4d59-ad84-da6c38ab3acd\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vx5cx" Mar 13 01:41:18.065164 master-0 kubenswrapper[19170]: I0313 01:41:18.064946 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vfgm\" (UniqueName: \"kubernetes.io/projected/4909a040-a553-4d59-ad84-da6c38ab3acd-kube-api-access-4vfgm\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vx5cx\" (UID: \"4909a040-a553-4d59-ad84-da6c38ab3acd\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vx5cx" Mar 13 01:41:18.065164 master-0 kubenswrapper[19170]: I0313 01:41:18.065089 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4909a040-a553-4d59-ad84-da6c38ab3acd-bundle\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vx5cx\" (UID: \"4909a040-a553-4d59-ad84-da6c38ab3acd\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vx5cx" Mar 13 01:41:18.086085 master-0 kubenswrapper[19170]: I0313 01:41:18.085560 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vx5cx"] Mar 13 01:41:18.166721 master-0 kubenswrapper[19170]: I0313 01:41:18.166622 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/4909a040-a553-4d59-ad84-da6c38ab3acd-util\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vx5cx\" (UID: \"4909a040-a553-4d59-ad84-da6c38ab3acd\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vx5cx" Mar 13 01:41:18.166721 master-0 kubenswrapper[19170]: I0313 01:41:18.166702 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vfgm\" (UniqueName: \"kubernetes.io/projected/4909a040-a553-4d59-ad84-da6c38ab3acd-kube-api-access-4vfgm\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vx5cx\" (UID: \"4909a040-a553-4d59-ad84-da6c38ab3acd\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vx5cx" Mar 13 01:41:18.167000 master-0 kubenswrapper[19170]: I0313 01:41:18.166754 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4909a040-a553-4d59-ad84-da6c38ab3acd-bundle\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vx5cx\" (UID: \"4909a040-a553-4d59-ad84-da6c38ab3acd\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vx5cx" Mar 13 01:41:18.167301 master-0 kubenswrapper[19170]: I0313 01:41:18.167254 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4909a040-a553-4d59-ad84-da6c38ab3acd-util\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vx5cx\" (UID: \"4909a040-a553-4d59-ad84-da6c38ab3acd\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vx5cx" Mar 13 01:41:18.167301 master-0 kubenswrapper[19170]: I0313 01:41:18.167282 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4909a040-a553-4d59-ad84-da6c38ab3acd-bundle\") pod 
\"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vx5cx\" (UID: \"4909a040-a553-4d59-ad84-da6c38ab3acd\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vx5cx" Mar 13 01:41:18.188964 master-0 kubenswrapper[19170]: I0313 01:41:18.188885 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vfgm\" (UniqueName: \"kubernetes.io/projected/4909a040-a553-4d59-ad84-da6c38ab3acd-kube-api-access-4vfgm\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vx5cx\" (UID: \"4909a040-a553-4d59-ad84-da6c38ab3acd\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vx5cx" Mar 13 01:41:18.390039 master-0 kubenswrapper[19170]: I0313 01:41:18.389833 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vx5cx" Mar 13 01:41:18.913451 master-0 kubenswrapper[19170]: I0313 01:41:18.913272 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vx5cx"] Mar 13 01:41:18.923374 master-0 kubenswrapper[19170]: W0313 01:41:18.923276 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4909a040_a553_4d59_ad84_da6c38ab3acd.slice/crio-c35c3391b02afac166dae59a829dfbd0d10ef64ade206d34cd72797e607b130d WatchSource:0}: Error finding container c35c3391b02afac166dae59a829dfbd0d10ef64ade206d34cd72797e607b130d: Status 404 returned error can't find the container with id c35c3391b02afac166dae59a829dfbd0d10ef64ade206d34cd72797e607b130d Mar 13 01:41:19.468170 master-0 kubenswrapper[19170]: I0313 01:41:19.467248 19170 generic.go:334] "Generic (PLEG): container finished" podID="4909a040-a553-4d59-ad84-da6c38ab3acd" containerID="c9d4c51f2c37ce64ff96fb5efe2024a7b4dd4435dcca4865d77d77d78b74ef6a" exitCode=0 Mar 13 
01:41:19.468170 master-0 kubenswrapper[19170]: I0313 01:41:19.467335 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vx5cx" event={"ID":"4909a040-a553-4d59-ad84-da6c38ab3acd","Type":"ContainerDied","Data":"c9d4c51f2c37ce64ff96fb5efe2024a7b4dd4435dcca4865d77d77d78b74ef6a"} Mar 13 01:41:19.468170 master-0 kubenswrapper[19170]: I0313 01:41:19.467391 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vx5cx" event={"ID":"4909a040-a553-4d59-ad84-da6c38ab3acd","Type":"ContainerStarted","Data":"c35c3391b02afac166dae59a829dfbd0d10ef64ade206d34cd72797e607b130d"} Mar 13 01:41:21.516755 master-0 kubenswrapper[19170]: I0313 01:41:21.516620 19170 generic.go:334] "Generic (PLEG): container finished" podID="4909a040-a553-4d59-ad84-da6c38ab3acd" containerID="2d06023ad1da72988b74610b615cd178a311ce3f852d4676013c907ce56ea4e5" exitCode=0 Mar 13 01:41:21.516755 master-0 kubenswrapper[19170]: I0313 01:41:21.516688 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vx5cx" event={"ID":"4909a040-a553-4d59-ad84-da6c38ab3acd","Type":"ContainerDied","Data":"2d06023ad1da72988b74610b615cd178a311ce3f852d4676013c907ce56ea4e5"} Mar 13 01:41:22.530839 master-0 kubenswrapper[19170]: I0313 01:41:22.530766 19170 generic.go:334] "Generic (PLEG): container finished" podID="4909a040-a553-4d59-ad84-da6c38ab3acd" containerID="e51660d33d9156f94f00fc2afe985d36e86824f54e1cb5225f81d38c5872e503" exitCode=0 Mar 13 01:41:22.530839 master-0 kubenswrapper[19170]: I0313 01:41:22.530823 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vx5cx" 
event={"ID":"4909a040-a553-4d59-ad84-da6c38ab3acd","Type":"ContainerDied","Data":"e51660d33d9156f94f00fc2afe985d36e86824f54e1cb5225f81d38c5872e503"} Mar 13 01:41:24.011365 master-0 kubenswrapper[19170]: I0313 01:41:24.011299 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vx5cx" Mar 13 01:41:24.082258 master-0 kubenswrapper[19170]: I0313 01:41:24.082103 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4909a040-a553-4d59-ad84-da6c38ab3acd-bundle\") pod \"4909a040-a553-4d59-ad84-da6c38ab3acd\" (UID: \"4909a040-a553-4d59-ad84-da6c38ab3acd\") " Mar 13 01:41:24.083062 master-0 kubenswrapper[19170]: I0313 01:41:24.082520 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4909a040-a553-4d59-ad84-da6c38ab3acd-bundle" (OuterVolumeSpecName: "bundle") pod "4909a040-a553-4d59-ad84-da6c38ab3acd" (UID: "4909a040-a553-4d59-ad84-da6c38ab3acd"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 01:41:24.083184 master-0 kubenswrapper[19170]: I0313 01:41:24.083108 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4909a040-a553-4d59-ad84-da6c38ab3acd-util\") pod \"4909a040-a553-4d59-ad84-da6c38ab3acd\" (UID: \"4909a040-a553-4d59-ad84-da6c38ab3acd\") " Mar 13 01:41:24.083589 master-0 kubenswrapper[19170]: I0313 01:41:24.083527 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vfgm\" (UniqueName: \"kubernetes.io/projected/4909a040-a553-4d59-ad84-da6c38ab3acd-kube-api-access-4vfgm\") pod \"4909a040-a553-4d59-ad84-da6c38ab3acd\" (UID: \"4909a040-a553-4d59-ad84-da6c38ab3acd\") " Mar 13 01:41:24.085404 master-0 kubenswrapper[19170]: I0313 01:41:24.084776 19170 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4909a040-a553-4d59-ad84-da6c38ab3acd-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:41:24.091039 master-0 kubenswrapper[19170]: I0313 01:41:24.090974 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4909a040-a553-4d59-ad84-da6c38ab3acd-kube-api-access-4vfgm" (OuterVolumeSpecName: "kube-api-access-4vfgm") pod "4909a040-a553-4d59-ad84-da6c38ab3acd" (UID: "4909a040-a553-4d59-ad84-da6c38ab3acd"). InnerVolumeSpecName "kube-api-access-4vfgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:41:24.110105 master-0 kubenswrapper[19170]: I0313 01:41:24.110037 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4909a040-a553-4d59-ad84-da6c38ab3acd-util" (OuterVolumeSpecName: "util") pod "4909a040-a553-4d59-ad84-da6c38ab3acd" (UID: "4909a040-a553-4d59-ad84-da6c38ab3acd"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 01:41:24.186719 master-0 kubenswrapper[19170]: I0313 01:41:24.186655 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vfgm\" (UniqueName: \"kubernetes.io/projected/4909a040-a553-4d59-ad84-da6c38ab3acd-kube-api-access-4vfgm\") on node \"master-0\" DevicePath \"\"" Mar 13 01:41:24.187060 master-0 kubenswrapper[19170]: I0313 01:41:24.187036 19170 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4909a040-a553-4d59-ad84-da6c38ab3acd-util\") on node \"master-0\" DevicePath \"\"" Mar 13 01:41:24.572603 master-0 kubenswrapper[19170]: I0313 01:41:24.572525 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vx5cx" event={"ID":"4909a040-a553-4d59-ad84-da6c38ab3acd","Type":"ContainerDied","Data":"c35c3391b02afac166dae59a829dfbd0d10ef64ade206d34cd72797e607b130d"} Mar 13 01:41:24.572603 master-0 kubenswrapper[19170]: I0313 01:41:24.572591 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c35c3391b02afac166dae59a829dfbd0d10ef64ade206d34cd72797e607b130d" Mar 13 01:41:24.573022 master-0 kubenswrapper[19170]: I0313 01:41:24.572673 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477vx5cx" Mar 13 01:41:30.663918 master-0 kubenswrapper[19170]: I0313 01:41:30.663871 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-65b9994cf8-4rkk5"] Mar 13 01:41:30.665561 master-0 kubenswrapper[19170]: E0313 01:41:30.665532 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4909a040-a553-4d59-ad84-da6c38ab3acd" containerName="util" Mar 13 01:41:30.665692 master-0 kubenswrapper[19170]: I0313 01:41:30.665679 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="4909a040-a553-4d59-ad84-da6c38ab3acd" containerName="util" Mar 13 01:41:30.665799 master-0 kubenswrapper[19170]: E0313 01:41:30.665789 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4909a040-a553-4d59-ad84-da6c38ab3acd" containerName="extract" Mar 13 01:41:30.665859 master-0 kubenswrapper[19170]: I0313 01:41:30.665849 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="4909a040-a553-4d59-ad84-da6c38ab3acd" containerName="extract" Mar 13 01:41:30.665921 master-0 kubenswrapper[19170]: E0313 01:41:30.665912 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4909a040-a553-4d59-ad84-da6c38ab3acd" containerName="pull" Mar 13 01:41:30.665975 master-0 kubenswrapper[19170]: I0313 01:41:30.665966 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="4909a040-a553-4d59-ad84-da6c38ab3acd" containerName="pull" Mar 13 01:41:30.666230 master-0 kubenswrapper[19170]: I0313 01:41:30.666212 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="4909a040-a553-4d59-ad84-da6c38ab3acd" containerName="extract" Mar 13 01:41:30.666979 master-0 kubenswrapper[19170]: I0313 01:41:30.666959 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-4rkk5" Mar 13 01:41:30.698479 master-0 kubenswrapper[19170]: I0313 01:41:30.698428 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-65b9994cf8-4rkk5"] Mar 13 01:41:30.749901 master-0 kubenswrapper[19170]: I0313 01:41:30.749847 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbslk\" (UniqueName: \"kubernetes.io/projected/83ee835e-29dc-46f5-b4ae-1b5770efb38a-kube-api-access-pbslk\") pod \"openstack-operator-controller-init-65b9994cf8-4rkk5\" (UID: \"83ee835e-29dc-46f5-b4ae-1b5770efb38a\") " pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-4rkk5" Mar 13 01:41:30.852242 master-0 kubenswrapper[19170]: I0313 01:41:30.852147 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbslk\" (UniqueName: \"kubernetes.io/projected/83ee835e-29dc-46f5-b4ae-1b5770efb38a-kube-api-access-pbslk\") pod \"openstack-operator-controller-init-65b9994cf8-4rkk5\" (UID: \"83ee835e-29dc-46f5-b4ae-1b5770efb38a\") " pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-4rkk5" Mar 13 01:41:30.880325 master-0 kubenswrapper[19170]: I0313 01:41:30.880278 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbslk\" (UniqueName: \"kubernetes.io/projected/83ee835e-29dc-46f5-b4ae-1b5770efb38a-kube-api-access-pbslk\") pod \"openstack-operator-controller-init-65b9994cf8-4rkk5\" (UID: \"83ee835e-29dc-46f5-b4ae-1b5770efb38a\") " pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-4rkk5" Mar 13 01:41:30.989088 master-0 kubenswrapper[19170]: I0313 01:41:30.988977 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-4rkk5" Mar 13 01:41:31.584143 master-0 kubenswrapper[19170]: I0313 01:41:31.584104 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-65b9994cf8-4rkk5"] Mar 13 01:41:31.642622 master-0 kubenswrapper[19170]: I0313 01:41:31.642538 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-4rkk5" event={"ID":"83ee835e-29dc-46f5-b4ae-1b5770efb38a","Type":"ContainerStarted","Data":"aa683915d24a9fc49ef8580ed23ab87aa0f648f8013e5d8e8518b77036617715"} Mar 13 01:41:36.701746 master-0 kubenswrapper[19170]: I0313 01:41:36.701628 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-4rkk5" event={"ID":"83ee835e-29dc-46f5-b4ae-1b5770efb38a","Type":"ContainerStarted","Data":"2f5d54d898569ecfee1ac51fda04a0e1a084cd5a9f312c92bf726419948a6eba"} Mar 13 01:41:36.702330 master-0 kubenswrapper[19170]: I0313 01:41:36.701870 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-4rkk5" Mar 13 01:41:50.995286 master-0 kubenswrapper[19170]: I0313 01:41:50.995198 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-4rkk5" Mar 13 01:41:51.043470 master-0 kubenswrapper[19170]: I0313 01:41:51.041748 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-4rkk5" podStartSLOduration=16.732656113 podStartE2EDuration="21.04171244s" podCreationTimestamp="2026-03-13 01:41:30 +0000 UTC" firstStartedPulling="2026-03-13 01:41:31.579854528 +0000 UTC m=+1352.387975518" lastFinishedPulling="2026-03-13 01:41:35.888910875 +0000 UTC m=+1356.697031845" 
observedRunningTime="2026-03-13 01:41:36.754305664 +0000 UTC m=+1357.562426624" watchObservedRunningTime="2026-03-13 01:41:51.04171244 +0000 UTC m=+1371.849833450" Mar 13 01:42:11.631465 master-0 kubenswrapper[19170]: I0313 01:42:11.630437 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-4xfws"] Mar 13 01:42:11.632083 master-0 kubenswrapper[19170]: I0313 01:42:11.631483 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-4xfws" Mar 13 01:42:11.645653 master-0 kubenswrapper[19170]: I0313 01:42:11.645567 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-lvsxg"] Mar 13 01:42:11.648220 master-0 kubenswrapper[19170]: I0313 01:42:11.646649 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-lvsxg" Mar 13 01:42:11.650755 master-0 kubenswrapper[19170]: I0313 01:42:11.650708 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-4xfws"] Mar 13 01:42:11.661056 master-0 kubenswrapper[19170]: I0313 01:42:11.660830 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-lvsxg"] Mar 13 01:42:11.688820 master-0 kubenswrapper[19170]: I0313 01:42:11.688362 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-q8f8c"] Mar 13 01:42:11.689396 master-0 kubenswrapper[19170]: I0313 01:42:11.689373 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-q8f8c" Mar 13 01:42:11.717596 master-0 kubenswrapper[19170]: I0313 01:42:11.710400 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-q7fhr"] Mar 13 01:42:11.717596 master-0 kubenswrapper[19170]: I0313 01:42:11.711423 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-q7fhr" Mar 13 01:42:11.731071 master-0 kubenswrapper[19170]: I0313 01:42:11.720916 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-q8f8c"] Mar 13 01:42:11.733693 master-0 kubenswrapper[19170]: I0313 01:42:11.733645 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-q7fhr"] Mar 13 01:42:11.756510 master-0 kubenswrapper[19170]: I0313 01:42:11.756299 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-drpz7"] Mar 13 01:42:11.759370 master-0 kubenswrapper[19170]: I0313 01:42:11.758684 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-drpz7" Mar 13 01:42:11.768750 master-0 kubenswrapper[19170]: I0313 01:42:11.765308 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-zrjnv"] Mar 13 01:42:11.768750 master-0 kubenswrapper[19170]: I0313 01:42:11.766350 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-zrjnv" Mar 13 01:42:11.774308 master-0 kubenswrapper[19170]: I0313 01:42:11.774248 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-drpz7"] Mar 13 01:42:11.775506 master-0 kubenswrapper[19170]: I0313 01:42:11.775473 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxchg\" (UniqueName: \"kubernetes.io/projected/ecb67aa7-1260-45ea-9601-8138c6925057-kube-api-access-xxchg\") pod \"cinder-operator-controller-manager-984cd4dcf-lvsxg\" (UID: \"ecb67aa7-1260-45ea-9601-8138c6925057\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-lvsxg" Mar 13 01:42:11.775673 master-0 kubenswrapper[19170]: I0313 01:42:11.775572 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp6l5\" (UniqueName: \"kubernetes.io/projected/293b13d4-44e1-418c-ad20-c5ac6fa75764-kube-api-access-dp6l5\") pod \"barbican-operator-controller-manager-677bd678f7-4xfws\" (UID: \"293b13d4-44e1-418c-ad20-c5ac6fa75764\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-4xfws" Mar 13 01:42:11.793762 master-0 kubenswrapper[19170]: I0313 01:42:11.793667 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-zrjnv"] Mar 13 01:42:11.815646 master-0 kubenswrapper[19170]: I0313 01:42:11.815392 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-b8c8d7cc8-g4gmk"] Mar 13 01:42:11.823473 master-0 kubenswrapper[19170]: I0313 01:42:11.823221 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-g4gmk" Mar 13 01:42:11.826187 master-0 kubenswrapper[19170]: I0313 01:42:11.826137 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 13 01:42:11.840605 master-0 kubenswrapper[19170]: I0313 01:42:11.840559 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-b8c8d7cc8-g4gmk"] Mar 13 01:42:11.883531 master-0 kubenswrapper[19170]: I0313 01:42:11.883434 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7wb8\" (UniqueName: \"kubernetes.io/projected/efb5943d-85e5-4726-8344-1c27e29a47b0-kube-api-access-z7wb8\") pod \"horizon-operator-controller-manager-6d9d6b584d-zrjnv\" (UID: \"efb5943d-85e5-4726-8344-1c27e29a47b0\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-zrjnv" Mar 13 01:42:11.883730 master-0 kubenswrapper[19170]: I0313 01:42:11.883562 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xbfb\" (UniqueName: \"kubernetes.io/projected/fbd141c5-2da8-4531-8898-a87e54e026e4-kube-api-access-4xbfb\") pod \"heat-operator-controller-manager-77b6666d85-drpz7\" (UID: \"fbd141c5-2da8-4531-8898-a87e54e026e4\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-drpz7" Mar 13 01:42:11.883730 master-0 kubenswrapper[19170]: I0313 01:42:11.883608 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzrxp\" (UniqueName: \"kubernetes.io/projected/fb682d6e-900b-446a-8ca6-e64288790b64-kube-api-access-fzrxp\") pod \"designate-operator-controller-manager-66d56f6ff4-q8f8c\" (UID: \"fb682d6e-900b-446a-8ca6-e64288790b64\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-q8f8c" Mar 13 
01:42:11.883806 master-0 kubenswrapper[19170]: I0313 01:42:11.883762 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp6l5\" (UniqueName: \"kubernetes.io/projected/293b13d4-44e1-418c-ad20-c5ac6fa75764-kube-api-access-dp6l5\") pod \"barbican-operator-controller-manager-677bd678f7-4xfws\" (UID: \"293b13d4-44e1-418c-ad20-c5ac6fa75764\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-4xfws"
Mar 13 01:42:11.888400 master-0 kubenswrapper[19170]: I0313 01:42:11.884886 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm6cm\" (UniqueName: \"kubernetes.io/projected/ea0a3b3c-0392-405b-8da0-b09b21608951-kube-api-access-pm6cm\") pod \"glance-operator-controller-manager-5964f64c48-q7fhr\" (UID: \"ea0a3b3c-0392-405b-8da0-b09b21608951\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-q7fhr"
Mar 13 01:42:11.888400 master-0 kubenswrapper[19170]: I0313 01:42:11.885084 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxchg\" (UniqueName: \"kubernetes.io/projected/ecb67aa7-1260-45ea-9601-8138c6925057-kube-api-access-xxchg\") pod \"cinder-operator-controller-manager-984cd4dcf-lvsxg\" (UID: \"ecb67aa7-1260-45ea-9601-8138c6925057\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-lvsxg"
Mar 13 01:42:11.912624 master-0 kubenswrapper[19170]: I0313 01:42:11.912564 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxchg\" (UniqueName: \"kubernetes.io/projected/ecb67aa7-1260-45ea-9601-8138c6925057-kube-api-access-xxchg\") pod \"cinder-operator-controller-manager-984cd4dcf-lvsxg\" (UID: \"ecb67aa7-1260-45ea-9601-8138c6925057\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-lvsxg"
Mar 13 01:42:11.914023 master-0 kubenswrapper[19170]: I0313 01:42:11.913976 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-fb5zw"]
Mar 13 01:42:11.917128 master-0 kubenswrapper[19170]: I0313 01:42:11.915068 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-fb5zw"
Mar 13 01:42:11.928686 master-0 kubenswrapper[19170]: I0313 01:42:11.918033 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp6l5\" (UniqueName: \"kubernetes.io/projected/293b13d4-44e1-418c-ad20-c5ac6fa75764-kube-api-access-dp6l5\") pod \"barbican-operator-controller-manager-677bd678f7-4xfws\" (UID: \"293b13d4-44e1-418c-ad20-c5ac6fa75764\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-4xfws"
Mar 13 01:42:11.936480 master-0 kubenswrapper[19170]: I0313 01:42:11.936177 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-fb5zw"]
Mar 13 01:42:11.971664 master-0 kubenswrapper[19170]: I0313 01:42:11.971613 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-kc6gt"]
Mar 13 01:42:11.973001 master-0 kubenswrapper[19170]: I0313 01:42:11.972983 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-kc6gt"
Mar 13 01:42:11.978106 master-0 kubenswrapper[19170]: I0313 01:42:11.978049 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-4xfws"
Mar 13 01:42:11.992182 master-0 kubenswrapper[19170]: I0313 01:42:11.992137 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw8kc\" (UniqueName: \"kubernetes.io/projected/988fcfec-0870-495e-885c-16c5ac7f2a2a-kube-api-access-rw8kc\") pod \"infra-operator-controller-manager-b8c8d7cc8-g4gmk\" (UID: \"988fcfec-0870-495e-885c-16c5ac7f2a2a\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-g4gmk"
Mar 13 01:42:11.992394 master-0 kubenswrapper[19170]: I0313 01:42:11.992191 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7wb8\" (UniqueName: \"kubernetes.io/projected/efb5943d-85e5-4726-8344-1c27e29a47b0-kube-api-access-z7wb8\") pod \"horizon-operator-controller-manager-6d9d6b584d-zrjnv\" (UID: \"efb5943d-85e5-4726-8344-1c27e29a47b0\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-zrjnv"
Mar 13 01:42:11.992394 master-0 kubenswrapper[19170]: I0313 01:42:11.992234 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xbfb\" (UniqueName: \"kubernetes.io/projected/fbd141c5-2da8-4531-8898-a87e54e026e4-kube-api-access-4xbfb\") pod \"heat-operator-controller-manager-77b6666d85-drpz7\" (UID: \"fbd141c5-2da8-4531-8898-a87e54e026e4\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-drpz7"
Mar 13 01:42:11.992394 master-0 kubenswrapper[19170]: I0313 01:42:11.992263 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzrxp\" (UniqueName: \"kubernetes.io/projected/fb682d6e-900b-446a-8ca6-e64288790b64-kube-api-access-fzrxp\") pod \"designate-operator-controller-manager-66d56f6ff4-q8f8c\" (UID: \"fb682d6e-900b-446a-8ca6-e64288790b64\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-q8f8c"
Mar 13 01:42:11.998662 master-0 kubenswrapper[19170]: I0313 01:42:11.998592 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42nh6\" (UniqueName: \"kubernetes.io/projected/70359a63-b5cb-4798-be47-e92e437baef0-kube-api-access-42nh6\") pod \"keystone-operator-controller-manager-684f77d66d-kc6gt\" (UID: \"70359a63-b5cb-4798-be47-e92e437baef0\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-kc6gt"
Mar 13 01:42:11.998830 master-0 kubenswrapper[19170]: I0313 01:42:11.998710 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm6cm\" (UniqueName: \"kubernetes.io/projected/ea0a3b3c-0392-405b-8da0-b09b21608951-kube-api-access-pm6cm\") pod \"glance-operator-controller-manager-5964f64c48-q7fhr\" (UID: \"ea0a3b3c-0392-405b-8da0-b09b21608951\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-q7fhr"
Mar 13 01:42:11.998830 master-0 kubenswrapper[19170]: I0313 01:42:11.998744 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz6g5\" (UniqueName: \"kubernetes.io/projected/d4daf348-9bda-41ff-8bb5-27cc498f38ae-kube-api-access-dz6g5\") pod \"ironic-operator-controller-manager-6bbb499bbc-fb5zw\" (UID: \"d4daf348-9bda-41ff-8bb5-27cc498f38ae\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-fb5zw"
Mar 13 01:42:11.998830 master-0 kubenswrapper[19170]: I0313 01:42:11.998809 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/988fcfec-0870-495e-885c-16c5ac7f2a2a-cert\") pod \"infra-operator-controller-manager-b8c8d7cc8-g4gmk\" (UID: \"988fcfec-0870-495e-885c-16c5ac7f2a2a\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-g4gmk"
Mar 13 01:42:11.998929 master-0 kubenswrapper[19170]: I0313 01:42:11.993749 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-rpqsl"]
Mar 13 01:42:12.000007 master-0 kubenswrapper[19170]: I0313 01:42:11.999986 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-rpqsl"
Mar 13 01:42:12.009618 master-0 kubenswrapper[19170]: I0313 01:42:12.005654 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-lvsxg"
Mar 13 01:42:12.040649 master-0 kubenswrapper[19170]: I0313 01:42:12.030688 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-kc6gt"]
Mar 13 01:42:12.043286 master-0 kubenswrapper[19170]: I0313 01:42:12.043252 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7wb8\" (UniqueName: \"kubernetes.io/projected/efb5943d-85e5-4726-8344-1c27e29a47b0-kube-api-access-z7wb8\") pod \"horizon-operator-controller-manager-6d9d6b584d-zrjnv\" (UID: \"efb5943d-85e5-4726-8344-1c27e29a47b0\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-zrjnv"
Mar 13 01:42:12.062665 master-0 kubenswrapper[19170]: I0313 01:42:12.062602 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzrxp\" (UniqueName: \"kubernetes.io/projected/fb682d6e-900b-446a-8ca6-e64288790b64-kube-api-access-fzrxp\") pod \"designate-operator-controller-manager-66d56f6ff4-q8f8c\" (UID: \"fb682d6e-900b-446a-8ca6-e64288790b64\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-q8f8c"
Mar 13 01:42:12.062843 master-0 kubenswrapper[19170]: I0313 01:42:12.062706 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-rpqsl"]
Mar 13 01:42:12.063097 master-0 kubenswrapper[19170]: I0313 01:42:12.063076 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xbfb\" (UniqueName: \"kubernetes.io/projected/fbd141c5-2da8-4531-8898-a87e54e026e4-kube-api-access-4xbfb\") pod \"heat-operator-controller-manager-77b6666d85-drpz7\" (UID: \"fbd141c5-2da8-4531-8898-a87e54e026e4\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-drpz7"
Mar 13 01:42:12.079389 master-0 kubenswrapper[19170]: I0313 01:42:12.076287 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm6cm\" (UniqueName: \"kubernetes.io/projected/ea0a3b3c-0392-405b-8da0-b09b21608951-kube-api-access-pm6cm\") pod \"glance-operator-controller-manager-5964f64c48-q7fhr\" (UID: \"ea0a3b3c-0392-405b-8da0-b09b21608951\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-q7fhr"
Mar 13 01:42:12.103652 master-0 kubenswrapper[19170]: I0313 01:42:12.102824 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw8kc\" (UniqueName: \"kubernetes.io/projected/988fcfec-0870-495e-885c-16c5ac7f2a2a-kube-api-access-rw8kc\") pod \"infra-operator-controller-manager-b8c8d7cc8-g4gmk\" (UID: \"988fcfec-0870-495e-885c-16c5ac7f2a2a\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-g4gmk"
Mar 13 01:42:12.103652 master-0 kubenswrapper[19170]: I0313 01:42:12.102915 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlbsw\" (UniqueName: \"kubernetes.io/projected/1171b60d-60b1-49d8-a067-ab6f875993ed-kube-api-access-mlbsw\") pod \"manila-operator-controller-manager-68f45f9d9f-rpqsl\" (UID: \"1171b60d-60b1-49d8-a067-ab6f875993ed\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-rpqsl"
Mar 13 01:42:12.103652 master-0 kubenswrapper[19170]: I0313 01:42:12.102951 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42nh6\" (UniqueName: \"kubernetes.io/projected/70359a63-b5cb-4798-be47-e92e437baef0-kube-api-access-42nh6\") pod \"keystone-operator-controller-manager-684f77d66d-kc6gt\" (UID: \"70359a63-b5cb-4798-be47-e92e437baef0\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-kc6gt"
Mar 13 01:42:12.103652 master-0 kubenswrapper[19170]: I0313 01:42:12.102989 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dz6g5\" (UniqueName: \"kubernetes.io/projected/d4daf348-9bda-41ff-8bb5-27cc498f38ae-kube-api-access-dz6g5\") pod \"ironic-operator-controller-manager-6bbb499bbc-fb5zw\" (UID: \"d4daf348-9bda-41ff-8bb5-27cc498f38ae\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-fb5zw"
Mar 13 01:42:12.103652 master-0 kubenswrapper[19170]: I0313 01:42:12.103020 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/988fcfec-0870-495e-885c-16c5ac7f2a2a-cert\") pod \"infra-operator-controller-manager-b8c8d7cc8-g4gmk\" (UID: \"988fcfec-0870-495e-885c-16c5ac7f2a2a\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-g4gmk"
Mar 13 01:42:12.103652 master-0 kubenswrapper[19170]: E0313 01:42:12.103149 19170 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 13 01:42:12.103652 master-0 kubenswrapper[19170]: E0313 01:42:12.103210 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/988fcfec-0870-495e-885c-16c5ac7f2a2a-cert podName:988fcfec-0870-495e-885c-16c5ac7f2a2a nodeName:}" failed. No retries permitted until 2026-03-13 01:42:12.603192858 +0000 UTC m=+1393.411313818 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/988fcfec-0870-495e-885c-16c5ac7f2a2a-cert") pod "infra-operator-controller-manager-b8c8d7cc8-g4gmk" (UID: "988fcfec-0870-495e-885c-16c5ac7f2a2a") : secret "infra-operator-webhook-server-cert" not found
Mar 13 01:42:12.107516 master-0 kubenswrapper[19170]: I0313 01:42:12.107484 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-p9fmf"]
Mar 13 01:42:12.108151 master-0 kubenswrapper[19170]: I0313 01:42:12.108114 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-drpz7"
Mar 13 01:42:12.123875 master-0 kubenswrapper[19170]: I0313 01:42:12.118674 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-zrjnv"
Mar 13 01:42:12.135565 master-0 kubenswrapper[19170]: I0313 01:42:12.135477 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-p9fmf"
Mar 13 01:42:12.167737 master-0 kubenswrapper[19170]: I0313 01:42:12.159418 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz6g5\" (UniqueName: \"kubernetes.io/projected/d4daf348-9bda-41ff-8bb5-27cc498f38ae-kube-api-access-dz6g5\") pod \"ironic-operator-controller-manager-6bbb499bbc-fb5zw\" (UID: \"d4daf348-9bda-41ff-8bb5-27cc498f38ae\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-fb5zw"
Mar 13 01:42:12.167737 master-0 kubenswrapper[19170]: I0313 01:42:12.162063 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw8kc\" (UniqueName: \"kubernetes.io/projected/988fcfec-0870-495e-885c-16c5ac7f2a2a-kube-api-access-rw8kc\") pod \"infra-operator-controller-manager-b8c8d7cc8-g4gmk\" (UID: \"988fcfec-0870-495e-885c-16c5ac7f2a2a\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-g4gmk"
Mar 13 01:42:12.222653 master-0 kubenswrapper[19170]: I0313 01:42:12.171032 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42nh6\" (UniqueName: \"kubernetes.io/projected/70359a63-b5cb-4798-be47-e92e437baef0-kube-api-access-42nh6\") pod \"keystone-operator-controller-manager-684f77d66d-kc6gt\" (UID: \"70359a63-b5cb-4798-be47-e92e437baef0\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-kc6gt"
Mar 13 01:42:12.222653 master-0 kubenswrapper[19170]: I0313 01:42:12.171136 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-7nf7q"]
Mar 13 01:42:12.222653 master-0 kubenswrapper[19170]: I0313 01:42:12.180240 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-7nf7q"
Mar 13 01:42:12.239675 master-0 kubenswrapper[19170]: I0313 01:42:12.239470 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-p9fmf"]
Mar 13 01:42:12.251755 master-0 kubenswrapper[19170]: I0313 01:42:12.248613 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz4b7\" (UniqueName: \"kubernetes.io/projected/f8a6e5c5-4e1d-40fb-95c8-50ddc6dfaf0e-kube-api-access-dz4b7\") pod \"mariadb-operator-controller-manager-658d4cdd5-p9fmf\" (UID: \"f8a6e5c5-4e1d-40fb-95c8-50ddc6dfaf0e\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-p9fmf"
Mar 13 01:42:12.251755 master-0 kubenswrapper[19170]: I0313 01:42:12.248714 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlbsw\" (UniqueName: \"kubernetes.io/projected/1171b60d-60b1-49d8-a067-ab6f875993ed-kube-api-access-mlbsw\") pod \"manila-operator-controller-manager-68f45f9d9f-rpqsl\" (UID: \"1171b60d-60b1-49d8-a067-ab6f875993ed\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-rpqsl"
Mar 13 01:42:12.251755 master-0 kubenswrapper[19170]: I0313 01:42:12.248810 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m72b\" (UniqueName: \"kubernetes.io/projected/7ea213da-f211-406d-8eac-88426a64c411-kube-api-access-6m72b\") pod \"neutron-operator-controller-manager-776c5696bf-7nf7q\" (UID: \"7ea213da-f211-406d-8eac-88426a64c411\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-7nf7q"
Mar 13 01:42:12.279695 master-0 kubenswrapper[19170]: I0313 01:42:12.279519 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlbsw\" (UniqueName: \"kubernetes.io/projected/1171b60d-60b1-49d8-a067-ab6f875993ed-kube-api-access-mlbsw\") pod \"manila-operator-controller-manager-68f45f9d9f-rpqsl\" (UID: \"1171b60d-60b1-49d8-a067-ab6f875993ed\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-rpqsl"
Mar 13 01:42:12.294651 master-0 kubenswrapper[19170]: I0313 01:42:12.290723 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-7nf7q"]
Mar 13 01:42:12.333904 master-0 kubenswrapper[19170]: I0313 01:42:12.332306 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-q8f8c"
Mar 13 01:42:12.348179 master-0 kubenswrapper[19170]: I0313 01:42:12.345918 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-q7fhr"
Mar 13 01:42:12.366127 master-0 kubenswrapper[19170]: I0313 01:42:12.351886 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-9lfxx"]
Mar 13 01:42:12.366127 master-0 kubenswrapper[19170]: I0313 01:42:12.354077 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-9lfxx"
Mar 13 01:42:12.366127 master-0 kubenswrapper[19170]: I0313 01:42:12.356277 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dz4b7\" (UniqueName: \"kubernetes.io/projected/f8a6e5c5-4e1d-40fb-95c8-50ddc6dfaf0e-kube-api-access-dz4b7\") pod \"mariadb-operator-controller-manager-658d4cdd5-p9fmf\" (UID: \"f8a6e5c5-4e1d-40fb-95c8-50ddc6dfaf0e\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-p9fmf"
Mar 13 01:42:12.366127 master-0 kubenswrapper[19170]: I0313 01:42:12.356353 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6m72b\" (UniqueName: \"kubernetes.io/projected/7ea213da-f211-406d-8eac-88426a64c411-kube-api-access-6m72b\") pod \"neutron-operator-controller-manager-776c5696bf-7nf7q\" (UID: \"7ea213da-f211-406d-8eac-88426a64c411\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-7nf7q"
Mar 13 01:42:12.397227 master-0 kubenswrapper[19170]: I0313 01:42:12.396273 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m72b\" (UniqueName: \"kubernetes.io/projected/7ea213da-f211-406d-8eac-88426a64c411-kube-api-access-6m72b\") pod \"neutron-operator-controller-manager-776c5696bf-7nf7q\" (UID: \"7ea213da-f211-406d-8eac-88426a64c411\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-7nf7q"
Mar 13 01:42:12.399830 master-0 kubenswrapper[19170]: I0313 01:42:12.399571 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-mhw45"]
Mar 13 01:42:12.406956 master-0 kubenswrapper[19170]: I0313 01:42:12.406902 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-mhw45"
Mar 13 01:42:12.408547 master-0 kubenswrapper[19170]: I0313 01:42:12.408520 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz4b7\" (UniqueName: \"kubernetes.io/projected/f8a6e5c5-4e1d-40fb-95c8-50ddc6dfaf0e-kube-api-access-dz4b7\") pod \"mariadb-operator-controller-manager-658d4cdd5-p9fmf\" (UID: \"f8a6e5c5-4e1d-40fb-95c8-50ddc6dfaf0e\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-p9fmf"
Mar 13 01:42:12.410668 master-0 kubenswrapper[19170]: I0313 01:42:12.410622 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-fb5zw"
Mar 13 01:42:12.431093 master-0 kubenswrapper[19170]: I0313 01:42:12.431058 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-kc6gt"
Mar 13 01:42:12.451962 master-0 kubenswrapper[19170]: I0313 01:42:12.451924 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-9lfxx"]
Mar 13 01:42:12.468196 master-0 kubenswrapper[19170]: I0313 01:42:12.468152 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-rpqsl"
Mar 13 01:42:12.470725 master-0 kubenswrapper[19170]: I0313 01:42:12.469794 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-mhw45"]
Mar 13 01:42:12.470725 master-0 kubenswrapper[19170]: I0313 01:42:12.469810 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dwv2\" (UniqueName: \"kubernetes.io/projected/d73229d5-0ef9-4789-b633-0a2df388c2b4-kube-api-access-4dwv2\") pod \"nova-operator-controller-manager-569cc54c5-9lfxx\" (UID: \"d73229d5-0ef9-4789-b633-0a2df388c2b4\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-9lfxx"
Mar 13 01:42:12.489900 master-0 kubenswrapper[19170]: I0313 01:42:12.489839 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ftt2q"]
Mar 13 01:42:12.491846 master-0 kubenswrapper[19170]: I0313 01:42:12.491822 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ftt2q"
Mar 13 01:42:12.505877 master-0 kubenswrapper[19170]: I0313 01:42:12.505831 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Mar 13 01:42:12.506545 master-0 kubenswrapper[19170]: I0313 01:42:12.506146 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hgg8x"]
Mar 13 01:42:12.524439 master-0 kubenswrapper[19170]: I0313 01:42:12.524207 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hgg8x"]
Mar 13 01:42:12.524439 master-0 kubenswrapper[19170]: I0313 01:42:12.524318 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hgg8x"
Mar 13 01:42:12.532902 master-0 kubenswrapper[19170]: I0313 01:42:12.532843 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ftt2q"]
Mar 13 01:42:12.542553 master-0 kubenswrapper[19170]: I0313 01:42:12.541652 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-cq6mb"]
Mar 13 01:42:12.542778 master-0 kubenswrapper[19170]: I0313 01:42:12.542609 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-p9fmf"
Mar 13 01:42:12.556150 master-0 kubenswrapper[19170]: I0313 01:42:12.554371 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-cq6mb"
Mar 13 01:42:12.556150 master-0 kubenswrapper[19170]: I0313 01:42:12.555438 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-wlzls"]
Mar 13 01:42:12.557664 master-0 kubenswrapper[19170]: I0313 01:42:12.557551 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-wlzls"
Mar 13 01:42:12.569848 master-0 kubenswrapper[19170]: I0313 01:42:12.569810 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-cq6mb"]
Mar 13 01:42:12.572068 master-0 kubenswrapper[19170]: I0313 01:42:12.572047 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-7nf7q"
Mar 13 01:42:12.572187 master-0 kubenswrapper[19170]: I0313 01:42:12.572130 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5j9s\" (UniqueName: \"kubernetes.io/projected/0a694c32-74e7-4ef6-bf41-3d67a1e88d4d-kube-api-access-p5j9s\") pod \"octavia-operator-controller-manager-5f4f55cb5c-mhw45\" (UID: \"0a694c32-74e7-4ef6-bf41-3d67a1e88d4d\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-mhw45"
Mar 13 01:42:12.572307 master-0 kubenswrapper[19170]: I0313 01:42:12.572177 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dwv2\" (UniqueName: \"kubernetes.io/projected/d73229d5-0ef9-4789-b633-0a2df388c2b4-kube-api-access-4dwv2\") pod \"nova-operator-controller-manager-569cc54c5-9lfxx\" (UID: \"d73229d5-0ef9-4789-b633-0a2df388c2b4\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-9lfxx"
Mar 13 01:42:12.587063 master-0 kubenswrapper[19170]: I0313 01:42:12.586768 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-wlzls"]
Mar 13 01:42:12.603250 master-0 kubenswrapper[19170]: I0313 01:42:12.603211 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dwv2\" (UniqueName: \"kubernetes.io/projected/d73229d5-0ef9-4789-b633-0a2df388c2b4-kube-api-access-4dwv2\") pod \"nova-operator-controller-manager-569cc54c5-9lfxx\" (UID: \"d73229d5-0ef9-4789-b633-0a2df388c2b4\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-9lfxx"
Mar 13 01:42:12.620354 master-0 kubenswrapper[19170]: I0313 01:42:12.619484 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7cjg5"]
Mar 13 01:42:12.626511 master-0 kubenswrapper[19170]: I0313 01:42:12.626162 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7cjg5"
Mar 13 01:42:12.628080 master-0 kubenswrapper[19170]: I0313 01:42:12.628050 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dvxrf"]
Mar 13 01:42:12.629288 master-0 kubenswrapper[19170]: I0313 01:42:12.628969 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dvxrf"
Mar 13 01:42:12.672999 master-0 kubenswrapper[19170]: I0313 01:42:12.671609 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7cjg5"]
Mar 13 01:42:12.675126 master-0 kubenswrapper[19170]: I0313 01:42:12.674017 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/988fcfec-0870-495e-885c-16c5ac7f2a2a-cert\") pod \"infra-operator-controller-manager-b8c8d7cc8-g4gmk\" (UID: \"988fcfec-0870-495e-885c-16c5ac7f2a2a\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-g4gmk"
Mar 13 01:42:12.675126 master-0 kubenswrapper[19170]: I0313 01:42:12.674100 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5fc3c833-daad-4a2f-b725-f98f7e6e019c-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ftt2q\" (UID: \"5fc3c833-daad-4a2f-b725-f98f7e6e019c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ftt2q"
Mar 13 01:42:12.675126 master-0 kubenswrapper[19170]: I0313 01:42:12.674119 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tg95\" (UniqueName: \"kubernetes.io/projected/5fc3c833-daad-4a2f-b725-f98f7e6e019c-kube-api-access-2tg95\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ftt2q\" (UID: \"5fc3c833-daad-4a2f-b725-f98f7e6e019c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ftt2q"
Mar 13 01:42:12.675126 master-0 kubenswrapper[19170]: I0313 01:42:12.674173 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxvc5\" (UniqueName: \"kubernetes.io/projected/202329f1-88cf-42fe-a501-d7bad4ac5103-kube-api-access-hxvc5\") pod \"swift-operator-controller-manager-677c674df7-wlzls\" (UID: \"202329f1-88cf-42fe-a501-d7bad4ac5103\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-wlzls"
Mar 13 01:42:12.675126 master-0 kubenswrapper[19170]: I0313 01:42:12.674200 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlb5x\" (UniqueName: \"kubernetes.io/projected/4e802e44-89fb-46f6-ba84-1e4a6d50dd6d-kube-api-access-jlb5x\") pod \"ovn-operator-controller-manager-bbc5b68f9-hgg8x\" (UID: \"4e802e44-89fb-46f6-ba84-1e4a6d50dd6d\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hgg8x"
Mar 13 01:42:12.675126 master-0 kubenswrapper[19170]: I0313 01:42:12.674219 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kksm6\" (UniqueName: \"kubernetes.io/projected/239051c7-ef29-440b-9d24-1122b3c5a550-kube-api-access-kksm6\") pod \"placement-operator-controller-manager-574d45c66c-cq6mb\" (UID: \"239051c7-ef29-440b-9d24-1122b3c5a550\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-cq6mb"
Mar 13 01:42:12.675126 master-0 kubenswrapper[19170]: I0313 01:42:12.674244 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5j9s\" (UniqueName: \"kubernetes.io/projected/0a694c32-74e7-4ef6-bf41-3d67a1e88d4d-kube-api-access-p5j9s\") pod \"octavia-operator-controller-manager-5f4f55cb5c-mhw45\" (UID: \"0a694c32-74e7-4ef6-bf41-3d67a1e88d4d\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-mhw45"
Mar 13 01:42:12.675126 master-0 kubenswrapper[19170]: E0313 01:42:12.674474 19170 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 13 01:42:12.675126 master-0 kubenswrapper[19170]: E0313 01:42:12.674514 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/988fcfec-0870-495e-885c-16c5ac7f2a2a-cert podName:988fcfec-0870-495e-885c-16c5ac7f2a2a nodeName:}" failed. No retries permitted until 2026-03-13 01:42:13.674498366 +0000 UTC m=+1394.482619326 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/988fcfec-0870-495e-885c-16c5ac7f2a2a-cert") pod "infra-operator-controller-manager-b8c8d7cc8-g4gmk" (UID: "988fcfec-0870-495e-885c-16c5ac7f2a2a") : secret "infra-operator-webhook-server-cert" not found
Mar 13 01:42:12.681691 master-0 kubenswrapper[19170]: I0313 01:42:12.678903 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-jrqq9"]
Mar 13 01:42:12.681691 master-0 kubenswrapper[19170]: I0313 01:42:12.680198 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-jrqq9"
Mar 13 01:42:12.701354 master-0 kubenswrapper[19170]: I0313 01:42:12.701314 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dvxrf"]
Mar 13 01:42:12.702956 master-0 kubenswrapper[19170]: I0313 01:42:12.702840 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-jrqq9"]
Mar 13 01:42:12.722113 master-0 kubenswrapper[19170]: I0313 01:42:12.722014 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5j9s\" (UniqueName: \"kubernetes.io/projected/0a694c32-74e7-4ef6-bf41-3d67a1e88d4d-kube-api-access-p5j9s\") pod \"octavia-operator-controller-manager-5f4f55cb5c-mhw45\" (UID: \"0a694c32-74e7-4ef6-bf41-3d67a1e88d4d\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-mhw45"
Mar 13 01:42:12.740684 master-0 kubenswrapper[19170]: I0313 01:42:12.740082 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7795b46f77-ptkrt"]
Mar 13 01:42:12.741396 master-0 kubenswrapper[19170]: I0313 01:42:12.741371 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-ptkrt"
Mar 13 01:42:12.744289 master-0 kubenswrapper[19170]: I0313 01:42:12.744228 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert"
Mar 13 01:42:12.744500 master-0 kubenswrapper[19170]: I0313 01:42:12.744415 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Mar 13 01:42:12.749229 master-0 kubenswrapper[19170]: I0313 01:42:12.749190 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-9lfxx"
Mar 13 01:42:12.760462 master-0 kubenswrapper[19170]: I0313 01:42:12.759698 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7795b46f77-ptkrt"]
Mar 13 01:42:12.785129 master-0 kubenswrapper[19170]: I0313 01:42:12.784854 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-ptkrt\" (UID: \"c191eeae-8328-45ad-b079-9df55f82fd92\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-ptkrt"
Mar 13 01:42:12.785129 master-0 kubenswrapper[19170]: I0313 01:42:12.784907 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxxxd\" (UniqueName: \"kubernetes.io/projected/c191eeae-8328-45ad-b079-9df55f82fd92-kube-api-access-lxxxd\") pod \"openstack-operator-controller-manager-7795b46f77-ptkrt\" (UID: \"c191eeae-8328-45ad-b079-9df55f82fd92\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-ptkrt"
Mar 13 01:42:12.785129 master-0 kubenswrapper[19170]: I0313 01:42:12.784935 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-ptkrt\" (UID: \"c191eeae-8328-45ad-b079-9df55f82fd92\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-ptkrt"
Mar 13 01:42:12.785129 master-0 kubenswrapper[19170]: I0313 01:42:12.785079 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tg95\" (UniqueName: \"kubernetes.io/projected/5fc3c833-daad-4a2f-b725-f98f7e6e019c-kube-api-access-2tg95\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ftt2q\" (UID: \"5fc3c833-daad-4a2f-b725-f98f7e6e019c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ftt2q"
Mar 13 01:42:12.785268 master-0 kubenswrapper[19170]: I0313 01:42:12.785140 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5fc3c833-daad-4a2f-b725-f98f7e6e019c-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ftt2q\" (UID: \"5fc3c833-daad-4a2f-b725-f98f7e6e019c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ftt2q"
Mar 13 01:42:12.785268 master-0 kubenswrapper[19170]: I0313 01:42:12.785233 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqrln\" (UniqueName: \"kubernetes.io/projected/474636d6-57d5-4f22-9d44-aabcc798f8a2-kube-api-access-pqrln\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-7cjg5\" (UID: \"474636d6-57d5-4f22-9d44-aabcc798f8a2\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7cjg5"
Mar 13 01:42:12.785444 master-0 kubenswrapper[19170]: I0313 01:42:12.785367 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxvc5\" (UniqueName: \"kubernetes.io/projected/202329f1-88cf-42fe-a501-d7bad4ac5103-kube-api-access-hxvc5\") pod \"swift-operator-controller-manager-677c674df7-wlzls\" (UID: \"202329f1-88cf-42fe-a501-d7bad4ac5103\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-wlzls"
Mar 13 01:42:12.785444 master-0 kubenswrapper[19170]: I0313 01:42:12.785418 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptkcl\" (UniqueName: \"kubernetes.io/projected/a533e22f-4ca4-4c99-b6f2-31a6d199d0a8-kube-api-access-ptkcl\") pod \"test-operator-controller-manager-5c5cb9c4d7-dvxrf\" (UID: \"a533e22f-4ca4-4c99-b6f2-31a6d199d0a8\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dvxrf"
Mar 13 01:42:12.785518 master-0 kubenswrapper[19170]: I0313 01:42:12.785459 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlb5x\" (UniqueName: \"kubernetes.io/projected/4e802e44-89fb-46f6-ba84-1e4a6d50dd6d-kube-api-access-jlb5x\") pod \"ovn-operator-controller-manager-bbc5b68f9-hgg8x\" (UID: \"4e802e44-89fb-46f6-ba84-1e4a6d50dd6d\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hgg8x"
Mar 13 01:42:12.785518 master-0 kubenswrapper[19170]: I0313 01:42:12.785502 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kksm6\" (UniqueName: \"kubernetes.io/projected/239051c7-ef29-440b-9d24-1122b3c5a550-kube-api-access-kksm6\") pod 
\"placement-operator-controller-manager-574d45c66c-cq6mb\" (UID: \"239051c7-ef29-440b-9d24-1122b3c5a550\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-cq6mb" Mar 13 01:42:12.786687 master-0 kubenswrapper[19170]: I0313 01:42:12.785621 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkkbh\" (UniqueName: \"kubernetes.io/projected/831dd36a-211c-4c0a-9cc0-91a94194dd52-kube-api-access-rkkbh\") pod \"watcher-operator-controller-manager-6dd88c6f67-jrqq9\" (UID: \"831dd36a-211c-4c0a-9cc0-91a94194dd52\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-jrqq9" Mar 13 01:42:12.786687 master-0 kubenswrapper[19170]: E0313 01:42:12.786140 19170 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 01:42:12.786687 master-0 kubenswrapper[19170]: E0313 01:42:12.786208 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fc3c833-daad-4a2f-b725-f98f7e6e019c-cert podName:5fc3c833-daad-4a2f-b725-f98f7e6e019c nodeName:}" failed. No retries permitted until 2026-03-13 01:42:13.286192814 +0000 UTC m=+1394.094313774 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5fc3c833-daad-4a2f-b725-f98f7e6e019c-cert") pod "openstack-baremetal-operator-controller-manager-c969dbbcd-ftt2q" (UID: "5fc3c833-daad-4a2f-b725-f98f7e6e019c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 01:42:12.810654 master-0 kubenswrapper[19170]: I0313 01:42:12.810339 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-mhw45" Mar 13 01:42:12.819678 master-0 kubenswrapper[19170]: I0313 01:42:12.817438 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jkx8x"] Mar 13 01:42:12.819678 master-0 kubenswrapper[19170]: I0313 01:42:12.819334 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tg95\" (UniqueName: \"kubernetes.io/projected/5fc3c833-daad-4a2f-b725-f98f7e6e019c-kube-api-access-2tg95\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ftt2q\" (UID: \"5fc3c833-daad-4a2f-b725-f98f7e6e019c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ftt2q" Mar 13 01:42:12.820896 master-0 kubenswrapper[19170]: I0313 01:42:12.820862 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kksm6\" (UniqueName: \"kubernetes.io/projected/239051c7-ef29-440b-9d24-1122b3c5a550-kube-api-access-kksm6\") pod \"placement-operator-controller-manager-574d45c66c-cq6mb\" (UID: \"239051c7-ef29-440b-9d24-1122b3c5a550\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-cq6mb" Mar 13 01:42:12.827667 master-0 kubenswrapper[19170]: I0313 01:42:12.824046 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxvc5\" (UniqueName: \"kubernetes.io/projected/202329f1-88cf-42fe-a501-d7bad4ac5103-kube-api-access-hxvc5\") pod \"swift-operator-controller-manager-677c674df7-wlzls\" (UID: \"202329f1-88cf-42fe-a501-d7bad4ac5103\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-wlzls" Mar 13 01:42:12.827667 master-0 kubenswrapper[19170]: I0313 01:42:12.826407 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jkx8x" Mar 13 01:42:12.827667 master-0 kubenswrapper[19170]: I0313 01:42:12.826537 19170 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 01:42:12.827667 master-0 kubenswrapper[19170]: I0313 01:42:12.827457 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlb5x\" (UniqueName: \"kubernetes.io/projected/4e802e44-89fb-46f6-ba84-1e4a6d50dd6d-kube-api-access-jlb5x\") pod \"ovn-operator-controller-manager-bbc5b68f9-hgg8x\" (UID: \"4e802e44-89fb-46f6-ba84-1e4a6d50dd6d\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hgg8x" Mar 13 01:42:12.832452 master-0 kubenswrapper[19170]: I0313 01:42:12.828970 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jkx8x"] Mar 13 01:42:12.856702 master-0 kubenswrapper[19170]: I0313 01:42:12.855828 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hgg8x" Mar 13 01:42:12.889508 master-0 kubenswrapper[19170]: I0313 01:42:12.888063 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-ptkrt\" (UID: \"c191eeae-8328-45ad-b079-9df55f82fd92\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-ptkrt" Mar 13 01:42:12.889508 master-0 kubenswrapper[19170]: I0313 01:42:12.888174 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqrln\" (UniqueName: \"kubernetes.io/projected/474636d6-57d5-4f22-9d44-aabcc798f8a2-kube-api-access-pqrln\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-7cjg5\" (UID: \"474636d6-57d5-4f22-9d44-aabcc798f8a2\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7cjg5" Mar 13 01:42:12.889508 master-0 kubenswrapper[19170]: I0313 01:42:12.888229 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptkcl\" (UniqueName: \"kubernetes.io/projected/a533e22f-4ca4-4c99-b6f2-31a6d199d0a8-kube-api-access-ptkcl\") pod \"test-operator-controller-manager-5c5cb9c4d7-dvxrf\" (UID: \"a533e22f-4ca4-4c99-b6f2-31a6d199d0a8\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dvxrf" Mar 13 01:42:12.889508 master-0 kubenswrapper[19170]: I0313 01:42:12.888288 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkkbh\" (UniqueName: \"kubernetes.io/projected/831dd36a-211c-4c0a-9cc0-91a94194dd52-kube-api-access-rkkbh\") pod \"watcher-operator-controller-manager-6dd88c6f67-jrqq9\" (UID: \"831dd36a-211c-4c0a-9cc0-91a94194dd52\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-jrqq9" Mar 13 
01:42:12.889508 master-0 kubenswrapper[19170]: I0313 01:42:12.888338 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-ptkrt\" (UID: \"c191eeae-8328-45ad-b079-9df55f82fd92\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-ptkrt" Mar 13 01:42:12.889508 master-0 kubenswrapper[19170]: I0313 01:42:12.888367 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxxxd\" (UniqueName: \"kubernetes.io/projected/c191eeae-8328-45ad-b079-9df55f82fd92-kube-api-access-lxxxd\") pod \"openstack-operator-controller-manager-7795b46f77-ptkrt\" (UID: \"c191eeae-8328-45ad-b079-9df55f82fd92\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-ptkrt" Mar 13 01:42:12.889508 master-0 kubenswrapper[19170]: E0313 01:42:12.889189 19170 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 01:42:12.889508 master-0 kubenswrapper[19170]: E0313 01:42:12.889237 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-metrics-certs podName:c191eeae-8328-45ad-b079-9df55f82fd92 nodeName:}" failed. No retries permitted until 2026-03-13 01:42:13.389221578 +0000 UTC m=+1394.197342538 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-metrics-certs") pod "openstack-operator-controller-manager-7795b46f77-ptkrt" (UID: "c191eeae-8328-45ad-b079-9df55f82fd92") : secret "metrics-server-cert" not found Mar 13 01:42:12.889508 master-0 kubenswrapper[19170]: E0313 01:42:12.889358 19170 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 01:42:12.889508 master-0 kubenswrapper[19170]: E0313 01:42:12.889384 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-webhook-certs podName:c191eeae-8328-45ad-b079-9df55f82fd92 nodeName:}" failed. No retries permitted until 2026-03-13 01:42:13.389377023 +0000 UTC m=+1394.197497983 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-webhook-certs") pod "openstack-operator-controller-manager-7795b46f77-ptkrt" (UID: "c191eeae-8328-45ad-b079-9df55f82fd92") : secret "webhook-server-cert" not found Mar 13 01:42:12.898321 master-0 kubenswrapper[19170]: I0313 01:42:12.896904 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-cq6mb" Mar 13 01:42:12.908154 master-0 kubenswrapper[19170]: I0313 01:42:12.907710 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-wlzls" Mar 13 01:42:12.917947 master-0 kubenswrapper[19170]: I0313 01:42:12.916206 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptkcl\" (UniqueName: \"kubernetes.io/projected/a533e22f-4ca4-4c99-b6f2-31a6d199d0a8-kube-api-access-ptkcl\") pod \"test-operator-controller-manager-5c5cb9c4d7-dvxrf\" (UID: \"a533e22f-4ca4-4c99-b6f2-31a6d199d0a8\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dvxrf" Mar 13 01:42:12.917947 master-0 kubenswrapper[19170]: I0313 01:42:12.917811 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkkbh\" (UniqueName: \"kubernetes.io/projected/831dd36a-211c-4c0a-9cc0-91a94194dd52-kube-api-access-rkkbh\") pod \"watcher-operator-controller-manager-6dd88c6f67-jrqq9\" (UID: \"831dd36a-211c-4c0a-9cc0-91a94194dd52\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-jrqq9" Mar 13 01:42:12.920222 master-0 kubenswrapper[19170]: I0313 01:42:12.920030 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqrln\" (UniqueName: \"kubernetes.io/projected/474636d6-57d5-4f22-9d44-aabcc798f8a2-kube-api-access-pqrln\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-7cjg5\" (UID: \"474636d6-57d5-4f22-9d44-aabcc798f8a2\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7cjg5" Mar 13 01:42:12.921594 master-0 kubenswrapper[19170]: I0313 01:42:12.921569 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxxxd\" (UniqueName: \"kubernetes.io/projected/c191eeae-8328-45ad-b079-9df55f82fd92-kube-api-access-lxxxd\") pod \"openstack-operator-controller-manager-7795b46f77-ptkrt\" (UID: \"c191eeae-8328-45ad-b079-9df55f82fd92\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-ptkrt" Mar 13 01:42:12.926232 
master-0 kubenswrapper[19170]: I0313 01:42:12.926178 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-lvsxg"] Mar 13 01:42:12.980350 master-0 kubenswrapper[19170]: I0313 01:42:12.975990 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7cjg5" Mar 13 01:42:12.982799 master-0 kubenswrapper[19170]: W0313 01:42:12.982714 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod293b13d4_44e1_418c_ad20_c5ac6fa75764.slice/crio-a07282976f9040dc261ddd42627f0bf43c41e6f9489049511de0ebdd1bc3ebc4 WatchSource:0}: Error finding container a07282976f9040dc261ddd42627f0bf43c41e6f9489049511de0ebdd1bc3ebc4: Status 404 returned error can't find the container with id a07282976f9040dc261ddd42627f0bf43c41e6f9489049511de0ebdd1bc3ebc4 Mar 13 01:42:12.991405 master-0 kubenswrapper[19170]: I0313 01:42:12.990928 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w8qf\" (UniqueName: \"kubernetes.io/projected/69460485-b370-4065-a8de-8b7321cc10d8-kube-api-access-4w8qf\") pod \"rabbitmq-cluster-operator-manager-668c99d594-jkx8x\" (UID: \"69460485-b370-4065-a8de-8b7321cc10d8\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jkx8x" Mar 13 01:42:13.014058 master-0 kubenswrapper[19170]: I0313 01:42:13.013599 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dvxrf" Mar 13 01:42:13.023072 master-0 kubenswrapper[19170]: I0313 01:42:13.023029 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-4xfws"] Mar 13 01:42:13.036265 master-0 kubenswrapper[19170]: I0313 01:42:13.036218 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-jrqq9" Mar 13 01:42:13.119244 master-0 kubenswrapper[19170]: I0313 01:42:13.096722 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w8qf\" (UniqueName: \"kubernetes.io/projected/69460485-b370-4065-a8de-8b7321cc10d8-kube-api-access-4w8qf\") pod \"rabbitmq-cluster-operator-manager-668c99d594-jkx8x\" (UID: \"69460485-b370-4065-a8de-8b7321cc10d8\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jkx8x" Mar 13 01:42:13.150590 master-0 kubenswrapper[19170]: I0313 01:42:13.139848 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w8qf\" (UniqueName: \"kubernetes.io/projected/69460485-b370-4065-a8de-8b7321cc10d8-kube-api-access-4w8qf\") pod \"rabbitmq-cluster-operator-manager-668c99d594-jkx8x\" (UID: \"69460485-b370-4065-a8de-8b7321cc10d8\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jkx8x" Mar 13 01:42:13.150590 master-0 kubenswrapper[19170]: I0313 01:42:13.141276 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-4xfws" event={"ID":"293b13d4-44e1-418c-ad20-c5ac6fa75764","Type":"ContainerStarted","Data":"a07282976f9040dc261ddd42627f0bf43c41e6f9489049511de0ebdd1bc3ebc4"} Mar 13 01:42:13.157414 master-0 kubenswrapper[19170]: I0313 01:42:13.157288 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-lvsxg" event={"ID":"ecb67aa7-1260-45ea-9601-8138c6925057","Type":"ContainerStarted","Data":"33f7d5f5f10ca189377b1b5f60364ac79b8d202ad5e2c8952ff168ab287594e5"} Mar 13 01:42:13.167056 master-0 kubenswrapper[19170]: I0313 01:42:13.167000 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jkx8x" Mar 13 01:42:13.193783 master-0 kubenswrapper[19170]: I0313 01:42:13.193720 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-drpz7"] Mar 13 01:42:13.233101 master-0 kubenswrapper[19170]: W0313 01:42:13.233048 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbd141c5_2da8_4531_8898_a87e54e026e4.slice/crio-f38b4588c78827ee5751a90abe777b55fcecf57668c788ea01f4f78af0df5837 WatchSource:0}: Error finding container f38b4588c78827ee5751a90abe777b55fcecf57668c788ea01f4f78af0df5837: Status 404 returned error can't find the container with id f38b4588c78827ee5751a90abe777b55fcecf57668c788ea01f4f78af0df5837 Mar 13 01:42:13.305609 master-0 kubenswrapper[19170]: I0313 01:42:13.305560 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5fc3c833-daad-4a2f-b725-f98f7e6e019c-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ftt2q\" (UID: \"5fc3c833-daad-4a2f-b725-f98f7e6e019c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ftt2q" Mar 13 01:42:13.305797 master-0 kubenswrapper[19170]: E0313 01:42:13.305775 19170 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 01:42:13.305836 master-0 kubenswrapper[19170]: E0313 
01:42:13.305826 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fc3c833-daad-4a2f-b725-f98f7e6e019c-cert podName:5fc3c833-daad-4a2f-b725-f98f7e6e019c nodeName:}" failed. No retries permitted until 2026-03-13 01:42:14.30581102 +0000 UTC m=+1395.113931980 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5fc3c833-daad-4a2f-b725-f98f7e6e019c-cert") pod "openstack-baremetal-operator-controller-manager-c969dbbcd-ftt2q" (UID: "5fc3c833-daad-4a2f-b725-f98f7e6e019c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 01:42:13.406955 master-0 kubenswrapper[19170]: I0313 01:42:13.406859 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-ptkrt\" (UID: \"c191eeae-8328-45ad-b079-9df55f82fd92\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-ptkrt" Mar 13 01:42:13.406955 master-0 kubenswrapper[19170]: I0313 01:42:13.406935 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-ptkrt\" (UID: \"c191eeae-8328-45ad-b079-9df55f82fd92\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-ptkrt" Mar 13 01:42:13.407858 master-0 kubenswrapper[19170]: E0313 01:42:13.407760 19170 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 01:42:13.407858 master-0 kubenswrapper[19170]: E0313 01:42:13.407821 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-webhook-certs 
podName:c191eeae-8328-45ad-b079-9df55f82fd92 nodeName:}" failed. No retries permitted until 2026-03-13 01:42:14.40780485 +0000 UTC m=+1395.215925810 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-webhook-certs") pod "openstack-operator-controller-manager-7795b46f77-ptkrt" (UID: "c191eeae-8328-45ad-b079-9df55f82fd92") : secret "webhook-server-cert" not found Mar 13 01:42:13.407858 master-0 kubenswrapper[19170]: E0313 01:42:13.407841 19170 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 01:42:13.407991 master-0 kubenswrapper[19170]: E0313 01:42:13.407910 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-metrics-certs podName:c191eeae-8328-45ad-b079-9df55f82fd92 nodeName:}" failed. No retries permitted until 2026-03-13 01:42:14.407892873 +0000 UTC m=+1395.216013833 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-metrics-certs") pod "openstack-operator-controller-manager-7795b46f77-ptkrt" (UID: "c191eeae-8328-45ad-b079-9df55f82fd92") : secret "metrics-server-cert" not found Mar 13 01:42:13.579652 master-0 kubenswrapper[19170]: I0313 01:42:13.577587 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-zrjnv"] Mar 13 01:42:13.608364 master-0 kubenswrapper[19170]: I0313 01:42:13.608248 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-q7fhr"] Mar 13 01:42:13.619803 master-0 kubenswrapper[19170]: W0313 01:42:13.619753 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea0a3b3c_0392_405b_8da0_b09b21608951.slice/crio-f2886a354e68d8cad5cdc1f59af3ec3b32746e27f3a91444f4840cd919d0e76a WatchSource:0}: Error finding container f2886a354e68d8cad5cdc1f59af3ec3b32746e27f3a91444f4840cd919d0e76a: Status 404 returned error can't find the container with id f2886a354e68d8cad5cdc1f59af3ec3b32746e27f3a91444f4840cd919d0e76a Mar 13 01:42:13.730967 master-0 kubenswrapper[19170]: I0313 01:42:13.730884 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/988fcfec-0870-495e-885c-16c5ac7f2a2a-cert\") pod \"infra-operator-controller-manager-b8c8d7cc8-g4gmk\" (UID: \"988fcfec-0870-495e-885c-16c5ac7f2a2a\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-g4gmk" Mar 13 01:42:13.731475 master-0 kubenswrapper[19170]: E0313 01:42:13.731081 19170 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 01:42:13.731475 master-0 kubenswrapper[19170]: E0313 
01:42:13.731220 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/988fcfec-0870-495e-885c-16c5ac7f2a2a-cert podName:988fcfec-0870-495e-885c-16c5ac7f2a2a nodeName:}" failed. No retries permitted until 2026-03-13 01:42:15.731175869 +0000 UTC m=+1396.539296829 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/988fcfec-0870-495e-885c-16c5ac7f2a2a-cert") pod "infra-operator-controller-manager-b8c8d7cc8-g4gmk" (UID: "988fcfec-0870-495e-885c-16c5ac7f2a2a") : secret "infra-operator-webhook-server-cert" not found Mar 13 01:42:14.172728 master-0 kubenswrapper[19170]: I0313 01:42:14.172668 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-zrjnv" event={"ID":"efb5943d-85e5-4726-8344-1c27e29a47b0","Type":"ContainerStarted","Data":"15ead7056422c8bf47e157b9e1ac17c538de80ae73506eead7a1f44ad33d10f4"} Mar 13 01:42:14.179874 master-0 kubenswrapper[19170]: I0313 01:42:14.179781 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-q7fhr" event={"ID":"ea0a3b3c-0392-405b-8da0-b09b21608951","Type":"ContainerStarted","Data":"f2886a354e68d8cad5cdc1f59af3ec3b32746e27f3a91444f4840cd919d0e76a"} Mar 13 01:42:14.198689 master-0 kubenswrapper[19170]: I0313 01:42:14.198424 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-drpz7" event={"ID":"fbd141c5-2da8-4531-8898-a87e54e026e4","Type":"ContainerStarted","Data":"f38b4588c78827ee5751a90abe777b55fcecf57668c788ea01f4f78af0df5837"} Mar 13 01:42:14.345582 master-0 kubenswrapper[19170]: I0313 01:42:14.345272 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5fc3c833-daad-4a2f-b725-f98f7e6e019c-cert\") pod 
\"openstack-baremetal-operator-controller-manager-c969dbbcd-ftt2q\" (UID: \"5fc3c833-daad-4a2f-b725-f98f7e6e019c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ftt2q" Mar 13 01:42:14.345582 master-0 kubenswrapper[19170]: E0313 01:42:14.345410 19170 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 01:42:14.345582 master-0 kubenswrapper[19170]: E0313 01:42:14.345481 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fc3c833-daad-4a2f-b725-f98f7e6e019c-cert podName:5fc3c833-daad-4a2f-b725-f98f7e6e019c nodeName:}" failed. No retries permitted until 2026-03-13 01:42:16.345462125 +0000 UTC m=+1397.153583085 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5fc3c833-daad-4a2f-b725-f98f7e6e019c-cert") pod "openstack-baremetal-operator-controller-manager-c969dbbcd-ftt2q" (UID: "5fc3c833-daad-4a2f-b725-f98f7e6e019c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 01:42:14.431774 master-0 kubenswrapper[19170]: I0313 01:42:14.431605 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-9lfxx"] Mar 13 01:42:14.449697 master-0 kubenswrapper[19170]: I0313 01:42:14.449655 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-ptkrt\" (UID: \"c191eeae-8328-45ad-b079-9df55f82fd92\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-ptkrt" Mar 13 01:42:14.449813 master-0 kubenswrapper[19170]: I0313 01:42:14.449722 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-ptkrt\" (UID: \"c191eeae-8328-45ad-b079-9df55f82fd92\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-ptkrt"
Mar 13 01:42:14.449864 master-0 kubenswrapper[19170]: E0313 01:42:14.449844 19170 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 13 01:42:14.449925 master-0 kubenswrapper[19170]: E0313 01:42:14.449899 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-webhook-certs podName:c191eeae-8328-45ad-b079-9df55f82fd92 nodeName:}" failed. No retries permitted until 2026-03-13 01:42:16.449882645 +0000 UTC m=+1397.258003605 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-webhook-certs") pod "openstack-operator-controller-manager-7795b46f77-ptkrt" (UID: "c191eeae-8328-45ad-b079-9df55f82fd92") : secret "webhook-server-cert" not found
Mar 13 01:42:14.450208 master-0 kubenswrapper[19170]: E0313 01:42:14.450190 19170 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 13 01:42:14.450252 master-0 kubenswrapper[19170]: E0313 01:42:14.450222 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-metrics-certs podName:c191eeae-8328-45ad-b079-9df55f82fd92 nodeName:}" failed. No retries permitted until 2026-03-13 01:42:16.450214676 +0000 UTC m=+1397.258335636 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-metrics-certs") pod "openstack-operator-controller-manager-7795b46f77-ptkrt" (UID: "c191eeae-8328-45ad-b079-9df55f82fd92") : secret "metrics-server-cert" not found
Mar 13 01:42:14.456572 master-0 kubenswrapper[19170]: W0313 01:42:14.455835 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd73229d5_0ef9_4789_b633_0a2df388c2b4.slice/crio-e1c8cbc8678661860f371707be436eb8de5c6fa0fd381399c860a856a6979590 WatchSource:0}: Error finding container e1c8cbc8678661860f371707be436eb8de5c6fa0fd381399c860a856a6979590: Status 404 returned error can't find the container with id e1c8cbc8678661860f371707be436eb8de5c6fa0fd381399c860a856a6979590
Mar 13 01:42:14.472665 master-0 kubenswrapper[19170]: I0313 01:42:14.472588 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-kc6gt"]
Mar 13 01:42:14.507495 master-0 kubenswrapper[19170]: I0313 01:42:14.506329 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-p9fmf"]
Mar 13 01:42:14.538537 master-0 kubenswrapper[19170]: I0313 01:42:14.537699 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-rpqsl"]
Mar 13 01:42:14.556444 master-0 kubenswrapper[19170]: I0313 01:42:14.556402 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-mhw45"]
Mar 13 01:42:14.576668 master-0 kubenswrapper[19170]: I0313 01:42:14.576190 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-q8f8c"]
Mar 13 01:42:14.589481 master-0 kubenswrapper[19170]: I0313 01:42:14.589428 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hgg8x"]
Mar 13 01:42:14.620665 master-0 kubenswrapper[19170]: I0313 01:42:14.618042 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-fb5zw"]
Mar 13 01:42:14.645096 master-0 kubenswrapper[19170]: I0313 01:42:14.645031 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-7nf7q"]
Mar 13 01:42:15.048893 master-0 kubenswrapper[19170]: I0313 01:42:15.048597 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jkx8x"]
Mar 13 01:42:15.061658 master-0 kubenswrapper[19170]: I0313 01:42:15.060891 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-wlzls"]
Mar 13 01:42:15.070946 master-0 kubenswrapper[19170]: I0313 01:42:15.069133 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dvxrf"]
Mar 13 01:42:15.090003 master-0 kubenswrapper[19170]: I0313 01:42:15.077169 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7cjg5"]
Mar 13 01:42:15.090003 master-0 kubenswrapper[19170]: I0313 01:42:15.087838 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-cq6mb"]
Mar 13 01:42:15.142651 master-0 kubenswrapper[19170]: I0313 01:42:15.141621 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-jrqq9"]
Mar 13 01:42:15.222310 master-0 kubenswrapper[19170]: I0313 01:42:15.217122 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-kc6gt" event={"ID":"70359a63-b5cb-4798-be47-e92e437baef0","Type":"ContainerStarted","Data":"c2b5001c24fd61c052a4c3129da51efd79b3376769b8aa6f57bde37336c6812c"}
Mar 13 01:42:15.222310 master-0 kubenswrapper[19170]: I0313 01:42:15.218080 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-mhw45" event={"ID":"0a694c32-74e7-4ef6-bf41-3d67a1e88d4d","Type":"ContainerStarted","Data":"8704e46c0fca1b3e93d6e98845399e132305070dc1d046dbd96fee9546c5c1da"}
Mar 13 01:42:15.222310 master-0 kubenswrapper[19170]: I0313 01:42:15.219234 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-p9fmf" event={"ID":"f8a6e5c5-4e1d-40fb-95c8-50ddc6dfaf0e","Type":"ContainerStarted","Data":"3ee48b8cc2ff8cc06bbbfc9cfdb779f317bc91d99323cb55a622e858cec332ed"}
Mar 13 01:42:15.222310 master-0 kubenswrapper[19170]: I0313 01:42:15.220295 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hgg8x" event={"ID":"4e802e44-89fb-46f6-ba84-1e4a6d50dd6d","Type":"ContainerStarted","Data":"04c26206293c840ac5a8319a256a654700b1efcbc613ba0289a99fbbe0279390"}
Mar 13 01:42:15.222310 master-0 kubenswrapper[19170]: I0313 01:42:15.221766 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-fb5zw" event={"ID":"d4daf348-9bda-41ff-8bb5-27cc498f38ae","Type":"ContainerStarted","Data":"3c4c09565379c7e23e3c49693d1a654bbdb10b37b21e210f37bb9a1da6046f8b"}
Mar 13 01:42:15.228837 master-0 kubenswrapper[19170]: I0313 01:42:15.226296 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-7nf7q" event={"ID":"7ea213da-f211-406d-8eac-88426a64c411","Type":"ContainerStarted","Data":"5ba011f535d968aa402f4c9ea77473402964628b052cc4474ca35e931ec63d90"}
Mar 13 01:42:15.239456 master-0 kubenswrapper[19170]: I0313 01:42:15.239114 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-rpqsl" event={"ID":"1171b60d-60b1-49d8-a067-ab6f875993ed","Type":"ContainerStarted","Data":"55995aae532867c442deee61685ff35c3e1d318ae14285ec5325da76b4c12fbe"}
Mar 13 01:42:15.241130 master-0 kubenswrapper[19170]: I0313 01:42:15.241106 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-9lfxx" event={"ID":"d73229d5-0ef9-4789-b633-0a2df388c2b4","Type":"ContainerStarted","Data":"e1c8cbc8678661860f371707be436eb8de5c6fa0fd381399c860a856a6979590"}
Mar 13 01:42:15.242810 master-0 kubenswrapper[19170]: I0313 01:42:15.242778 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-q8f8c" event={"ID":"fb682d6e-900b-446a-8ca6-e64288790b64","Type":"ContainerStarted","Data":"7f138b2f55c75c09ced248ebc469417a9d3fe274c9df25b31929da4820bfb938"}
Mar 13 01:42:15.792651 master-0 kubenswrapper[19170]: I0313 01:42:15.792425 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/988fcfec-0870-495e-885c-16c5ac7f2a2a-cert\") pod \"infra-operator-controller-manager-b8c8d7cc8-g4gmk\" (UID: \"988fcfec-0870-495e-885c-16c5ac7f2a2a\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-g4gmk"
Mar 13 01:42:15.792858 master-0 kubenswrapper[19170]: E0313 01:42:15.792755 19170 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 13 01:42:15.792858 master-0 kubenswrapper[19170]: E0313 01:42:15.792827 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/988fcfec-0870-495e-885c-16c5ac7f2a2a-cert podName:988fcfec-0870-495e-885c-16c5ac7f2a2a nodeName:}" failed. No retries permitted until 2026-03-13 01:42:19.792812151 +0000 UTC m=+1400.600933111 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/988fcfec-0870-495e-885c-16c5ac7f2a2a-cert") pod "infra-operator-controller-manager-b8c8d7cc8-g4gmk" (UID: "988fcfec-0870-495e-885c-16c5ac7f2a2a") : secret "infra-operator-webhook-server-cert" not found
Mar 13 01:42:15.995003 master-0 kubenswrapper[19170]: W0313 01:42:15.994945 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69460485_b370_4065_a8de_8b7321cc10d8.slice/crio-8683f337aca77f8281efc90e68d1f75b8829fe55ab893bd0da8f7712a2b5515c WatchSource:0}: Error finding container 8683f337aca77f8281efc90e68d1f75b8829fe55ab893bd0da8f7712a2b5515c: Status 404 returned error can't find the container with id 8683f337aca77f8281efc90e68d1f75b8829fe55ab893bd0da8f7712a2b5515c
Mar 13 01:42:16.011265 master-0 kubenswrapper[19170]: W0313 01:42:16.011199 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda533e22f_4ca4_4c99_b6f2_31a6d199d0a8.slice/crio-544cd1429aa8be0cf6b962687ed51aa57b2ed0c39d8f3c48135d192827468d11 WatchSource:0}: Error finding container 544cd1429aa8be0cf6b962687ed51aa57b2ed0c39d8f3c48135d192827468d11: Status 404 returned error can't find the container with id 544cd1429aa8be0cf6b962687ed51aa57b2ed0c39d8f3c48135d192827468d11
Mar 13 01:42:16.259603 master-0 kubenswrapper[19170]: I0313 01:42:16.259331 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-jrqq9"
event={"ID":"831dd36a-211c-4c0a-9cc0-91a94194dd52","Type":"ContainerStarted","Data":"bba7c2fd49bf5b188b567eb2e9c933c3b4aa58093af6056ea28e7cc48ba0ac62"}
Mar 13 01:42:16.261044 master-0 kubenswrapper[19170]: I0313 01:42:16.260998 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jkx8x" event={"ID":"69460485-b370-4065-a8de-8b7321cc10d8","Type":"ContainerStarted","Data":"8683f337aca77f8281efc90e68d1f75b8829fe55ab893bd0da8f7712a2b5515c"}
Mar 13 01:42:16.262060 master-0 kubenswrapper[19170]: I0313 01:42:16.262031 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dvxrf" event={"ID":"a533e22f-4ca4-4c99-b6f2-31a6d199d0a8","Type":"ContainerStarted","Data":"544cd1429aa8be0cf6b962687ed51aa57b2ed0c39d8f3c48135d192827468d11"}
Mar 13 01:42:16.263041 master-0 kubenswrapper[19170]: I0313 01:42:16.263013 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-wlzls" event={"ID":"202329f1-88cf-42fe-a501-d7bad4ac5103","Type":"ContainerStarted","Data":"d65fdb1efb3f11616bdc71dfe135bfeb80aa04d93ce6cb17f412a1a14adac08d"}
Mar 13 01:42:16.265798 master-0 kubenswrapper[19170]: I0313 01:42:16.265770 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-cq6mb" event={"ID":"239051c7-ef29-440b-9d24-1122b3c5a550","Type":"ContainerStarted","Data":"581c129eec34d121f3d9a72bedc7acb4812ac79e48221cb227423691324eb53d"}
Mar 13 01:42:16.403485 master-0 kubenswrapper[19170]: I0313 01:42:16.403439 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5fc3c833-daad-4a2f-b725-f98f7e6e019c-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ftt2q\" (UID: \"5fc3c833-daad-4a2f-b725-f98f7e6e019c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ftt2q"
Mar 13 01:42:16.403767 master-0 kubenswrapper[19170]: E0313 01:42:16.403745 19170 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 13 01:42:16.403830 master-0 kubenswrapper[19170]: E0313 01:42:16.403798 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fc3c833-daad-4a2f-b725-f98f7e6e019c-cert podName:5fc3c833-daad-4a2f-b725-f98f7e6e019c nodeName:}" failed. No retries permitted until 2026-03-13 01:42:20.403784789 +0000 UTC m=+1401.211905749 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5fc3c833-daad-4a2f-b725-f98f7e6e019c-cert") pod "openstack-baremetal-operator-controller-manager-c969dbbcd-ftt2q" (UID: "5fc3c833-daad-4a2f-b725-f98f7e6e019c") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 13 01:42:16.504944 master-0 kubenswrapper[19170]: I0313 01:42:16.504893 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-ptkrt\" (UID: \"c191eeae-8328-45ad-b079-9df55f82fd92\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-ptkrt"
Mar 13 01:42:16.505150 master-0 kubenswrapper[19170]: I0313 01:42:16.504977 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-ptkrt\" (UID: \"c191eeae-8328-45ad-b079-9df55f82fd92\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-ptkrt"
Mar 13 01:42:16.505150 master-0 kubenswrapper[19170]: E0313 01:42:16.505015 19170 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 13 01:42:16.505150 master-0 kubenswrapper[19170]: E0313 01:42:16.505086 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-metrics-certs podName:c191eeae-8328-45ad-b079-9df55f82fd92 nodeName:}" failed. No retries permitted until 2026-03-13 01:42:20.505067865 +0000 UTC m=+1401.313188825 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-metrics-certs") pod "openstack-operator-controller-manager-7795b46f77-ptkrt" (UID: "c191eeae-8328-45ad-b079-9df55f82fd92") : secret "metrics-server-cert" not found
Mar 13 01:42:16.506039 master-0 kubenswrapper[19170]: E0313 01:42:16.505997 19170 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 13 01:42:16.506108 master-0 kubenswrapper[19170]: E0313 01:42:16.506087 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-webhook-certs podName:c191eeae-8328-45ad-b079-9df55f82fd92 nodeName:}" failed. No retries permitted until 2026-03-13 01:42:20.506060648 +0000 UTC m=+1401.314181618 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-webhook-certs") pod "openstack-operator-controller-manager-7795b46f77-ptkrt" (UID: "c191eeae-8328-45ad-b079-9df55f82fd92") : secret "webhook-server-cert" not found
Mar 13 01:42:17.535807 master-0 kubenswrapper[19170]: I0313 01:42:17.535751 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7cjg5" event={"ID":"474636d6-57d5-4f22-9d44-aabcc798f8a2","Type":"ContainerStarted","Data":"6f1ee68f42ba9ef8614013290d0d295f4979068275e84c4c2027df00fcc379ad"}
Mar 13 01:42:19.797528 master-0 kubenswrapper[19170]: I0313 01:42:19.797435 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/988fcfec-0870-495e-885c-16c5ac7f2a2a-cert\") pod \"infra-operator-controller-manager-b8c8d7cc8-g4gmk\" (UID: \"988fcfec-0870-495e-885c-16c5ac7f2a2a\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-g4gmk"
Mar 13 01:42:19.798677 master-0 kubenswrapper[19170]: E0313 01:42:19.797686 19170 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 13 01:42:19.798677 master-0 kubenswrapper[19170]: E0313 01:42:19.797832 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/988fcfec-0870-495e-885c-16c5ac7f2a2a-cert podName:988fcfec-0870-495e-885c-16c5ac7f2a2a nodeName:}" failed. No retries permitted until 2026-03-13 01:42:27.797804452 +0000 UTC m=+1408.605925442 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/988fcfec-0870-495e-885c-16c5ac7f2a2a-cert") pod "infra-operator-controller-manager-b8c8d7cc8-g4gmk" (UID: "988fcfec-0870-495e-885c-16c5ac7f2a2a") : secret "infra-operator-webhook-server-cert" not found
Mar 13 01:42:20.420643 master-0 kubenswrapper[19170]: I0313 01:42:20.420567 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5fc3c833-daad-4a2f-b725-f98f7e6e019c-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ftt2q\" (UID: \"5fc3c833-daad-4a2f-b725-f98f7e6e019c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ftt2q"
Mar 13 01:42:20.420863 master-0 kubenswrapper[19170]: E0313 01:42:20.420724 19170 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 13 01:42:20.420863 master-0 kubenswrapper[19170]: E0313 01:42:20.420794 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fc3c833-daad-4a2f-b725-f98f7e6e019c-cert podName:5fc3c833-daad-4a2f-b725-f98f7e6e019c nodeName:}" failed. No retries permitted until 2026-03-13 01:42:28.420778102 +0000 UTC m=+1409.228899142 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5fc3c833-daad-4a2f-b725-f98f7e6e019c-cert") pod "openstack-baremetal-operator-controller-manager-c969dbbcd-ftt2q" (UID: "5fc3c833-daad-4a2f-b725-f98f7e6e019c") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 13 01:42:20.532700 master-0 kubenswrapper[19170]: I0313 01:42:20.521917 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-ptkrt\" (UID: \"c191eeae-8328-45ad-b079-9df55f82fd92\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-ptkrt"
Mar 13 01:42:20.532700 master-0 kubenswrapper[19170]: I0313 01:42:20.522013 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-ptkrt\" (UID: \"c191eeae-8328-45ad-b079-9df55f82fd92\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-ptkrt"
Mar 13 01:42:20.532700 master-0 kubenswrapper[19170]: E0313 01:42:20.522071 19170 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 13 01:42:20.532700 master-0 kubenswrapper[19170]: E0313 01:42:20.522172 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-metrics-certs podName:c191eeae-8328-45ad-b079-9df55f82fd92 nodeName:}" failed. No retries permitted until 2026-03-13 01:42:28.522139811 +0000 UTC m=+1409.330260861 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-metrics-certs") pod "openstack-operator-controller-manager-7795b46f77-ptkrt" (UID: "c191eeae-8328-45ad-b079-9df55f82fd92") : secret "metrics-server-cert" not found
Mar 13 01:42:20.532700 master-0 kubenswrapper[19170]: E0313 01:42:20.522804 19170 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 13 01:42:20.532700 master-0 kubenswrapper[19170]: E0313 01:42:20.522849 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-webhook-certs podName:c191eeae-8328-45ad-b079-9df55f82fd92 nodeName:}" failed. No retries permitted until 2026-03-13 01:42:28.522835244 +0000 UTC m=+1409.330956204 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-webhook-certs") pod "openstack-operator-controller-manager-7795b46f77-ptkrt" (UID: "c191eeae-8328-45ad-b079-9df55f82fd92") : secret "webhook-server-cert" not found
Mar 13 01:42:27.831143 master-0 kubenswrapper[19170]: I0313 01:42:27.831080 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/988fcfec-0870-495e-885c-16c5ac7f2a2a-cert\") pod \"infra-operator-controller-manager-b8c8d7cc8-g4gmk\" (UID: \"988fcfec-0870-495e-885c-16c5ac7f2a2a\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-g4gmk"
Mar 13 01:42:27.835975 master-0 kubenswrapper[19170]: I0313 01:42:27.835943 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/988fcfec-0870-495e-885c-16c5ac7f2a2a-cert\") pod \"infra-operator-controller-manager-b8c8d7cc8-g4gmk\" (UID: \"988fcfec-0870-495e-885c-16c5ac7f2a2a\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-g4gmk"
Mar 13 01:42:28.040590 master-0 kubenswrapper[19170]: I0313 01:42:28.040524 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-g4gmk"
Mar 13 01:42:28.443210 master-0 kubenswrapper[19170]: I0313 01:42:28.443140 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5fc3c833-daad-4a2f-b725-f98f7e6e019c-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ftt2q\" (UID: \"5fc3c833-daad-4a2f-b725-f98f7e6e019c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ftt2q"
Mar 13 01:42:28.448905 master-0 kubenswrapper[19170]: I0313 01:42:28.448854 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5fc3c833-daad-4a2f-b725-f98f7e6e019c-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-ftt2q\" (UID: \"5fc3c833-daad-4a2f-b725-f98f7e6e019c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ftt2q"
Mar 13 01:42:28.544696 master-0 kubenswrapper[19170]: I0313 01:42:28.544609 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-ptkrt\" (UID: \"c191eeae-8328-45ad-b079-9df55f82fd92\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-ptkrt"
Mar 13 01:42:28.544902 master-0 kubenswrapper[19170]: E0313 01:42:28.544816 19170 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 13 01:42:28.544940 master-0 kubenswrapper[19170]: E0313 01:42:28.544921 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-metrics-certs podName:c191eeae-8328-45ad-b079-9df55f82fd92 nodeName:}" failed. No retries permitted until 2026-03-13 01:42:44.544900517 +0000 UTC m=+1425.353021477 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-metrics-certs") pod "openstack-operator-controller-manager-7795b46f77-ptkrt" (UID: "c191eeae-8328-45ad-b079-9df55f82fd92") : secret "metrics-server-cert" not found
Mar 13 01:42:28.545011 master-0 kubenswrapper[19170]: I0313 01:42:28.544941 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-ptkrt\" (UID: \"c191eeae-8328-45ad-b079-9df55f82fd92\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-ptkrt"
Mar 13 01:42:28.545532 master-0 kubenswrapper[19170]: E0313 01:42:28.545472 19170 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 13 01:42:28.545667 master-0 kubenswrapper[19170]: E0313 01:42:28.545596 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-webhook-certs podName:c191eeae-8328-45ad-b079-9df55f82fd92 nodeName:}" failed. No retries permitted until 2026-03-13 01:42:44.545568396 +0000 UTC m=+1425.353689366 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-webhook-certs") pod "openstack-operator-controller-manager-7795b46f77-ptkrt" (UID: "c191eeae-8328-45ad-b079-9df55f82fd92") : secret "webhook-server-cert" not found
Mar 13 01:42:28.738234 master-0 kubenswrapper[19170]: I0313 01:42:28.738119 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ftt2q"
Mar 13 01:42:36.217955 master-0 kubenswrapper[19170]: I0313 01:42:36.215613 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ftt2q"]
Mar 13 01:42:36.287682 master-0 kubenswrapper[19170]: W0313 01:42:36.283810 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fc3c833_daad_4a2f_b725_f98f7e6e019c.slice/crio-457538f3e6b1db9b8166a203d74c0091ec9c87e3353fa9ce965df25ec78d7dc2 WatchSource:0}: Error finding container 457538f3e6b1db9b8166a203d74c0091ec9c87e3353fa9ce965df25ec78d7dc2: Status 404 returned error can't find the container with id 457538f3e6b1db9b8166a203d74c0091ec9c87e3353fa9ce965df25ec78d7dc2
Mar 13 01:42:36.342071 master-0 kubenswrapper[19170]: I0313 01:42:36.342014 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-b8c8d7cc8-g4gmk"]
Mar 13 01:42:36.378879 master-0 kubenswrapper[19170]: I0313 01:42:36.378808 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-lvsxg" event={"ID":"ecb67aa7-1260-45ea-9601-8138c6925057","Type":"ContainerStarted","Data":"8515d57ee8abec28665ac3917492e2503c00b26163d3867223afec02ffdc4550"}
Mar 13 01:42:36.379063 master-0 kubenswrapper[19170]: I0313 01:42:36.378934 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-lvsxg"
Mar 13 01:42:36.388748 master-0 kubenswrapper[19170]: I0313 01:42:36.384990 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-q7fhr" event={"ID":"ea0a3b3c-0392-405b-8da0-b09b21608951","Type":"ContainerStarted","Data":"efcac85fbf422069c784193d0f9c0d2dafbdd264c6ecc67afce351102e314a19"}
Mar 13 01:42:36.388748 master-0 kubenswrapper[19170]: I0313 01:42:36.385768 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-q7fhr"
Mar 13 01:42:36.388748 master-0 kubenswrapper[19170]: I0313 01:42:36.386984 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ftt2q" event={"ID":"5fc3c833-daad-4a2f-b725-f98f7e6e019c","Type":"ContainerStarted","Data":"457538f3e6b1db9b8166a203d74c0091ec9c87e3353fa9ce965df25ec78d7dc2"}
Mar 13 01:42:36.389734 master-0 kubenswrapper[19170]: I0313 01:42:36.389699 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-4xfws" event={"ID":"293b13d4-44e1-418c-ad20-c5ac6fa75764","Type":"ContainerStarted","Data":"96736af0d8db86f56896cda1bdeba61f3fa15c29acb0a5f5865f7966cf4565a3"}
Mar 13 01:42:36.389854 master-0 kubenswrapper[19170]: I0313 01:42:36.389838 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-4xfws"
Mar 13 01:42:36.429093 master-0 kubenswrapper[19170]: I0313 01:42:36.427593 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-lvsxg" podStartSLOduration=11.8035048 podStartE2EDuration="25.427576009s" podCreationTimestamp="2026-03-13 01:42:11 +0000 UTC" firstStartedPulling="2026-03-13 01:42:12.826482353 +0000 UTC m=+1393.634603313" lastFinishedPulling="2026-03-13 01:42:26.450553562 +0000 UTC m=+1407.258674522" observedRunningTime="2026-03-13 01:42:36.405332342 +0000 UTC m=+1417.213453302" watchObservedRunningTime="2026-03-13 01:42:36.427576009 +0000 UTC m=+1417.235696969"
Mar 13 01:42:36.461657 master-0 kubenswrapper[19170]: I0313 01:42:36.459390 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-q7fhr" podStartSLOduration=9.467683436 podStartE2EDuration="25.459362065s" podCreationTimestamp="2026-03-13 01:42:11 +0000 UTC" firstStartedPulling="2026-03-13 01:42:13.621771627 +0000 UTC m=+1394.429892587" lastFinishedPulling="2026-03-13 01:42:29.613450256 +0000 UTC m=+1410.421571216" observedRunningTime="2026-03-13 01:42:36.435895744 +0000 UTC m=+1417.244016704" watchObservedRunningTime="2026-03-13 01:42:36.459362065 +0000 UTC m=+1417.267483025"
Mar 13 01:42:36.476130 master-0 kubenswrapper[19170]: I0313 01:42:36.475744 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-4xfws" podStartSLOduration=12.061125207 podStartE2EDuration="25.475717216s" podCreationTimestamp="2026-03-13 01:42:11 +0000 UTC" firstStartedPulling="2026-03-13 01:42:13.035995974 +0000 UTC m=+1393.844116934" lastFinishedPulling="2026-03-13 01:42:26.450587983 +0000 UTC m=+1407.258708943" observedRunningTime="2026-03-13 01:42:36.455855297 +0000 UTC m=+1417.263976257" watchObservedRunningTime="2026-03-13 01:42:36.475717216 +0000 UTC m=+1417.283838186"
Mar 13 01:42:37.442923 master-0 kubenswrapper[19170]: I0313 01:42:37.442871 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-wlzls"
event={"ID":"202329f1-88cf-42fe-a501-d7bad4ac5103","Type":"ContainerStarted","Data":"edc7556211279c98866cf40ce356e3ec3e97259299944ab1936c2d2618fd80bd"}
Mar 13 01:42:37.443837 master-0 kubenswrapper[19170]: I0313 01:42:37.443815 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-677c674df7-wlzls"
Mar 13 01:42:37.454860 master-0 kubenswrapper[19170]: I0313 01:42:37.454818 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-mhw45" event={"ID":"0a694c32-74e7-4ef6-bf41-3d67a1e88d4d","Type":"ContainerStarted","Data":"7587ae1769dd3bd499d50b51ecdf0dad842b1c393aca94c5dae61b9cd6da23cc"}
Mar 13 01:42:37.459647 master-0 kubenswrapper[19170]: I0313 01:42:37.455627 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-mhw45"
Mar 13 01:42:37.478654 master-0 kubenswrapper[19170]: I0313 01:42:37.476193 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hgg8x" event={"ID":"4e802e44-89fb-46f6-ba84-1e4a6d50dd6d","Type":"ContainerStarted","Data":"af43d579dc5c92b8909a4fdd01b496ccd4dd28d89ea8333f97c13bd9004cd4d1"}
Mar 13 01:42:37.478654 master-0 kubenswrapper[19170]: I0313 01:42:37.477117 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hgg8x"
Mar 13 01:42:37.478654 master-0 kubenswrapper[19170]: I0313 01:42:37.478501 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-7nf7q" event={"ID":"7ea213da-f211-406d-8eac-88426a64c411","Type":"ContainerStarted","Data":"0f6cc149f65fcb8f81961833f8accbeee0676e6895395ac3ca775af803ba9c19"}
Mar 13 01:42:37.478927 master-0 kubenswrapper[19170]: I0313 01:42:37.478880 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-7nf7q"
Mar 13 01:42:37.503655 master-0 kubenswrapper[19170]: I0313 01:42:37.499846 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-677c674df7-wlzls" podStartSLOduration=6.5215569 podStartE2EDuration="25.499829123s" podCreationTimestamp="2026-03-13 01:42:12 +0000 UTC" firstStartedPulling="2026-03-13 01:42:15.988744747 +0000 UTC m=+1396.796865707" lastFinishedPulling="2026-03-13 01:42:34.96701693 +0000 UTC m=+1415.775137930" observedRunningTime="2026-03-13 01:42:37.487006861 +0000 UTC m=+1418.295127821" watchObservedRunningTime="2026-03-13 01:42:37.499829123 +0000 UTC m=+1418.307950083"
Mar 13 01:42:37.503655 master-0 kubenswrapper[19170]: I0313 01:42:37.501012 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-zrjnv" event={"ID":"efb5943d-85e5-4726-8344-1c27e29a47b0","Type":"ContainerStarted","Data":"44d56f34fec91b42511f16635d02925f9a605b20fd099394010dc6045fbc9223"}
Mar 13 01:42:37.503655 master-0 kubenswrapper[19170]: I0313 01:42:37.501757 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-zrjnv"
Mar 13 01:42:37.521824 master-0 kubenswrapper[19170]: I0313 01:42:37.521771 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7cjg5" event={"ID":"474636d6-57d5-4f22-9d44-aabcc798f8a2","Type":"ContainerStarted","Data":"6ffdde4e24d51041ae3f3b0ae60b3259179765d2ac9edd42ea15af8e76e930b0"}
Mar 13 01:42:37.525721 master-0 kubenswrapper[19170]: I0313 01:42:37.522527 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7cjg5"
Mar 13 01:42:37.539653 master-0 kubenswrapper[19170]: I0313 01:42:37.537515 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-rpqsl" event={"ID":"1171b60d-60b1-49d8-a067-ab6f875993ed","Type":"ContainerStarted","Data":"3984d184f370805e1bad1519ccf6d56b11f4c2da03fcad75a5dfb915fd1685ca"}
Mar 13 01:42:37.539653 master-0 kubenswrapper[19170]: I0313 01:42:37.538565 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-rpqsl"
Mar 13 01:42:37.550656 master-0 kubenswrapper[19170]: I0313 01:42:37.549837 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-jrqq9" event={"ID":"831dd36a-211c-4c0a-9cc0-91a94194dd52","Type":"ContainerStarted","Data":"cbfd0f6a8d32088d85ff0f1171f8a82c5b8805f8f5447a2f8c57b6f2623c5cf3"}
Mar 13 01:42:37.550656 master-0 kubenswrapper[19170]: I0313 01:42:37.550613 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-jrqq9"
Mar 13 01:42:37.572416 master-0 kubenswrapper[19170]: I0313 01:42:37.569808 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-cq6mb" event={"ID":"239051c7-ef29-440b-9d24-1122b3c5a550","Type":"ContainerStarted","Data":"cfc2b9095f832e6dc17695dd6b64fa9b17d2a069ba692a3ba7f11888a892ed13"}
Mar 13 01:42:37.572416 master-0 kubenswrapper[19170]: I0313 01:42:37.570534 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-cq6mb"
Mar 13 01:42:37.590894 master-0 kubenswrapper[19170]: I0313 01:42:37.590830 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-p9fmf" event={"ID":"f8a6e5c5-4e1d-40fb-95c8-50ddc6dfaf0e","Type":"ContainerStarted","Data":"5b939c8b4a3b9ea0a1503fdc92a2ea7b3f3fb421bfcb6978c20a91ebcf7c6f99"}
Mar 13 01:42:37.593312 master-0 kubenswrapper[19170]: I0313 01:42:37.593290 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-p9fmf"
Mar 13 01:42:37.597647 master-0 kubenswrapper[19170]: I0313 01:42:37.595944 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-drpz7" event={"ID":"fbd141c5-2da8-4531-8898-a87e54e026e4","Type":"ContainerStarted","Data":"634764d4c026f8897c264019cd2296803b0c66282beebfed59f2320f05773325"}
Mar 13 01:42:37.597647 master-0 kubenswrapper[19170]: I0313 01:42:37.596481 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-drpz7"
Mar 13 01:42:37.597647 master-0 kubenswrapper[19170]: I0313 01:42:37.597560 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jkx8x" event={"ID":"69460485-b370-4065-a8de-8b7321cc10d8","Type":"ContainerStarted","Data":"67964bf858f743a09d49dd75692679132e95517d0915baa62b9f77fd8240b4c4"}
Mar 13 01:42:37.614654 master-0 kubenswrapper[19170]: I0313 01:42:37.613203 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dvxrf" event={"ID":"a533e22f-4ca4-4c99-b6f2-31a6d199d0a8","Type":"ContainerStarted","Data":"658ad4dcee4cacd153ccdcc90369e04091b1a666c1d9dc9a747d8d392e1f906d"}
Mar 13 01:42:37.614654 master-0 kubenswrapper[19170]: I0313 01:42:37.613463 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dvxrf"
Mar 13 01:42:37.615907 master-0 kubenswrapper[19170]: I0313 01:42:37.615851 19170 kubelet.go:2453]
"SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-q8f8c" event={"ID":"fb682d6e-900b-446a-8ca6-e64288790b64","Type":"ContainerStarted","Data":"777fc37c3c3ffcde9287930f5f4652a2d3fa4d290c6b89db5c0fad11e687687e"} Mar 13 01:42:37.616676 master-0 kubenswrapper[19170]: I0313 01:42:37.616646 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-q8f8c" Mar 13 01:42:37.628709 master-0 kubenswrapper[19170]: I0313 01:42:37.628668 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-fb5zw" event={"ID":"d4daf348-9bda-41ff-8bb5-27cc498f38ae","Type":"ContainerStarted","Data":"81f9f2fb7a339cad55a01e8ab6ec62b255a7f5e912d5fc51369071c96378d35d"} Mar 13 01:42:37.632694 master-0 kubenswrapper[19170]: I0313 01:42:37.629242 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-fb5zw" Mar 13 01:42:37.658656 master-0 kubenswrapper[19170]: I0313 01:42:37.654325 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-kc6gt" event={"ID":"70359a63-b5cb-4798-be47-e92e437baef0","Type":"ContainerStarted","Data":"50bd49b3c46520970e453a55d85f5d3c1b65d5ce7a979b33f330c9ee37403f16"} Mar 13 01:42:37.658656 master-0 kubenswrapper[19170]: I0313 01:42:37.655474 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-kc6gt" Mar 13 01:42:37.669686 master-0 kubenswrapper[19170]: I0313 01:42:37.669610 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-9lfxx" 
event={"ID":"d73229d5-0ef9-4789-b633-0a2df388c2b4","Type":"ContainerStarted","Data":"e2302ac8e0ada5fb00abb32395083eb5aea81e68cfd159d724bb490edf8420ba"} Mar 13 01:42:37.674153 master-0 kubenswrapper[19170]: I0313 01:42:37.670358 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-9lfxx" Mar 13 01:42:37.688852 master-0 kubenswrapper[19170]: I0313 01:42:37.686524 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-g4gmk" event={"ID":"988fcfec-0870-495e-885c-16c5ac7f2a2a","Type":"ContainerStarted","Data":"c973e708e031ae5ee27c7eab470b05e42cfab0ae7a24d14e2989eb41432045d5"} Mar 13 01:42:37.757719 master-0 kubenswrapper[19170]: I0313 01:42:37.755414 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-mhw45" podStartSLOduration=6.32858412 podStartE2EDuration="26.755394316s" podCreationTimestamp="2026-03-13 01:42:11 +0000 UTC" firstStartedPulling="2026-03-13 01:42:14.540266155 +0000 UTC m=+1395.348387115" lastFinishedPulling="2026-03-13 01:42:34.967076311 +0000 UTC m=+1415.775197311" observedRunningTime="2026-03-13 01:42:37.53413258 +0000 UTC m=+1418.342253540" watchObservedRunningTime="2026-03-13 01:42:37.755394316 +0000 UTC m=+1418.563515266" Mar 13 01:42:37.757719 master-0 kubenswrapper[19170]: I0313 01:42:37.757536 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hgg8x" podStartSLOduration=5.3305916159999995 podStartE2EDuration="25.757529247s" podCreationTimestamp="2026-03-13 01:42:12 +0000 UTC" firstStartedPulling="2026-03-13 01:42:14.540065788 +0000 UTC m=+1395.348186748" lastFinishedPulling="2026-03-13 01:42:34.967003379 +0000 UTC m=+1415.775124379" observedRunningTime="2026-03-13 01:42:37.753006439 +0000 UTC m=+1418.561127399" 
watchObservedRunningTime="2026-03-13 01:42:37.757529247 +0000 UTC m=+1418.565650207" Mar 13 01:42:38.237084 master-0 kubenswrapper[19170]: I0313 01:42:38.235506 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-7nf7q" podStartSLOduration=6.855924347 podStartE2EDuration="27.235484089s" podCreationTimestamp="2026-03-13 01:42:11 +0000 UTC" firstStartedPulling="2026-03-13 01:42:14.586921443 +0000 UTC m=+1395.395042393" lastFinishedPulling="2026-03-13 01:42:34.966481135 +0000 UTC m=+1415.774602135" observedRunningTime="2026-03-13 01:42:38.230370365 +0000 UTC m=+1419.038491315" watchObservedRunningTime="2026-03-13 01:42:38.235484089 +0000 UTC m=+1419.043605049" Mar 13 01:42:38.440651 master-0 kubenswrapper[19170]: I0313 01:42:38.436893 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-cq6mb" podStartSLOduration=6.749602127 podStartE2EDuration="26.436870676s" podCreationTimestamp="2026-03-13 01:42:12 +0000 UTC" firstStartedPulling="2026-03-13 01:42:15.988494249 +0000 UTC m=+1396.796615209" lastFinishedPulling="2026-03-13 01:42:35.675762788 +0000 UTC m=+1416.483883758" observedRunningTime="2026-03-13 01:42:38.427895313 +0000 UTC m=+1419.236016273" watchObservedRunningTime="2026-03-13 01:42:38.436870676 +0000 UTC m=+1419.244991636" Mar 13 01:42:39.206920 master-0 kubenswrapper[19170]: I0313 01:42:39.206816 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7cjg5" podStartSLOduration=8.839970018 podStartE2EDuration="27.206793817s" podCreationTimestamp="2026-03-13 01:42:12 +0000 UTC" firstStartedPulling="2026-03-13 01:42:16.600250532 +0000 UTC m=+1397.408371492" lastFinishedPulling="2026-03-13 01:42:34.967074291 +0000 UTC m=+1415.775195291" observedRunningTime="2026-03-13 01:42:38.923584195 
+0000 UTC m=+1419.731705185" watchObservedRunningTime="2026-03-13 01:42:39.206793817 +0000 UTC m=+1420.014914777" Mar 13 01:42:39.575704 master-0 kubenswrapper[19170]: I0313 01:42:39.575484 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dvxrf" podStartSLOduration=7.917269387 podStartE2EDuration="27.575428769s" podCreationTimestamp="2026-03-13 01:42:12 +0000 UTC" firstStartedPulling="2026-03-13 01:42:16.022687449 +0000 UTC m=+1396.830808409" lastFinishedPulling="2026-03-13 01:42:35.680846801 +0000 UTC m=+1416.488967791" observedRunningTime="2026-03-13 01:42:39.566724203 +0000 UTC m=+1420.374845173" watchObservedRunningTime="2026-03-13 01:42:39.575428769 +0000 UTC m=+1420.383549729" Mar 13 01:42:40.343918 master-0 kubenswrapper[19170]: I0313 01:42:40.343725 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-jrqq9" podStartSLOduration=8.641720971 podStartE2EDuration="28.343687493s" podCreationTimestamp="2026-03-13 01:42:12 +0000 UTC" firstStartedPulling="2026-03-13 01:42:16.01937699 +0000 UTC m=+1396.827497950" lastFinishedPulling="2026-03-13 01:42:35.721343472 +0000 UTC m=+1416.529464472" observedRunningTime="2026-03-13 01:42:40.329779051 +0000 UTC m=+1421.137900051" watchObservedRunningTime="2026-03-13 01:42:40.343687493 +0000 UTC m=+1421.151808503" Mar 13 01:42:40.869651 master-0 kubenswrapper[19170]: I0313 01:42:40.866154 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-drpz7" podStartSLOduration=16.653422072 podStartE2EDuration="29.86613764s" podCreationTimestamp="2026-03-13 01:42:11 +0000 UTC" firstStartedPulling="2026-03-13 01:42:13.237837324 +0000 UTC m=+1394.045958284" lastFinishedPulling="2026-03-13 01:42:26.450552892 +0000 UTC m=+1407.258673852" observedRunningTime="2026-03-13 
01:42:40.864232186 +0000 UTC m=+1421.672353156" watchObservedRunningTime="2026-03-13 01:42:40.86613764 +0000 UTC m=+1421.674258600" Mar 13 01:42:40.881925 master-0 kubenswrapper[19170]: I0313 01:42:40.881858 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jkx8x" podStartSLOduration=9.057061304 podStartE2EDuration="28.881836032s" podCreationTimestamp="2026-03-13 01:42:12 +0000 UTC" firstStartedPulling="2026-03-13 01:42:15.996432309 +0000 UTC m=+1396.804553269" lastFinishedPulling="2026-03-13 01:42:35.821206997 +0000 UTC m=+1416.629327997" observedRunningTime="2026-03-13 01:42:40.839030006 +0000 UTC m=+1421.647150986" watchObservedRunningTime="2026-03-13 01:42:40.881836032 +0000 UTC m=+1421.689956992" Mar 13 01:42:40.902722 master-0 kubenswrapper[19170]: I0313 01:42:40.902608 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-zrjnv" podStartSLOduration=17.050277781 podStartE2EDuration="29.902585657s" podCreationTimestamp="2026-03-13 01:42:11 +0000 UTC" firstStartedPulling="2026-03-13 01:42:13.598288228 +0000 UTC m=+1394.406409188" lastFinishedPulling="2026-03-13 01:42:26.450596104 +0000 UTC m=+1407.258717064" observedRunningTime="2026-03-13 01:42:40.895915549 +0000 UTC m=+1421.704036509" watchObservedRunningTime="2026-03-13 01:42:40.902585657 +0000 UTC m=+1421.710706617" Mar 13 01:42:40.970111 master-0 kubenswrapper[19170]: I0313 01:42:40.970033 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-9lfxx" podStartSLOduration=8.549347807 podStartE2EDuration="29.970013248s" podCreationTimestamp="2026-03-13 01:42:11 +0000 UTC" firstStartedPulling="2026-03-13 01:42:14.470074826 +0000 UTC m=+1395.278195786" lastFinishedPulling="2026-03-13 01:42:35.890740267 +0000 UTC m=+1416.698861227" 
observedRunningTime="2026-03-13 01:42:40.954854871 +0000 UTC m=+1421.762975841" watchObservedRunningTime="2026-03-13 01:42:40.970013248 +0000 UTC m=+1421.778134208" Mar 13 01:42:40.990712 master-0 kubenswrapper[19170]: I0313 01:42:40.989788 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-q8f8c" podStartSLOduration=9.558452039 podStartE2EDuration="29.989768795s" podCreationTimestamp="2026-03-13 01:42:11 +0000 UTC" firstStartedPulling="2026-03-13 01:42:14.53522595 +0000 UTC m=+1395.343346910" lastFinishedPulling="2026-03-13 01:42:34.966542666 +0000 UTC m=+1415.774663666" observedRunningTime="2026-03-13 01:42:40.98358282 +0000 UTC m=+1421.791703780" watchObservedRunningTime="2026-03-13 01:42:40.989768795 +0000 UTC m=+1421.797889755" Mar 13 01:42:41.022791 master-0 kubenswrapper[19170]: I0313 01:42:41.022724 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-fb5zw" podStartSLOduration=9.607766865 podStartE2EDuration="30.022705923s" podCreationTimestamp="2026-03-13 01:42:11 +0000 UTC" firstStartedPulling="2026-03-13 01:42:14.552160274 +0000 UTC m=+1395.360281234" lastFinishedPulling="2026-03-13 01:42:34.967099302 +0000 UTC m=+1415.775220292" observedRunningTime="2026-03-13 01:42:41.020850001 +0000 UTC m=+1421.828970961" watchObservedRunningTime="2026-03-13 01:42:41.022705923 +0000 UTC m=+1421.830826883" Mar 13 01:42:41.053197 master-0 kubenswrapper[19170]: I0313 01:42:41.053114 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-kc6gt" podStartSLOduration=8.825595336 podStartE2EDuration="30.05308977s" podCreationTimestamp="2026-03-13 01:42:11 +0000 UTC" firstStartedPulling="2026-03-13 01:42:14.518821472 +0000 UTC m=+1395.326942422" lastFinishedPulling="2026-03-13 01:42:35.746315896 
+0000 UTC m=+1416.554436856" observedRunningTime="2026-03-13 01:42:41.049809187 +0000 UTC m=+1421.857930147" watchObservedRunningTime="2026-03-13 01:42:41.05308977 +0000 UTC m=+1421.861210730" Mar 13 01:42:41.081472 master-0 kubenswrapper[19170]: I0313 01:42:41.081406 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-p9fmf" podStartSLOduration=12.14625175 podStartE2EDuration="30.081357816s" podCreationTimestamp="2026-03-13 01:42:11 +0000 UTC" firstStartedPulling="2026-03-13 01:42:14.517363215 +0000 UTC m=+1395.325484175" lastFinishedPulling="2026-03-13 01:42:32.452469281 +0000 UTC m=+1413.260590241" observedRunningTime="2026-03-13 01:42:41.078563448 +0000 UTC m=+1421.886684428" watchObservedRunningTime="2026-03-13 01:42:41.081357816 +0000 UTC m=+1421.889478776" Mar 13 01:42:41.145767 master-0 kubenswrapper[19170]: I0313 01:42:41.144717 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-rpqsl" podStartSLOduration=9.66585108 podStartE2EDuration="30.144702132s" podCreationTimestamp="2026-03-13 01:42:11 +0000 UTC" firstStartedPulling="2026-03-13 01:42:14.488101036 +0000 UTC m=+1395.296221996" lastFinishedPulling="2026-03-13 01:42:34.966952058 +0000 UTC m=+1415.775073048" observedRunningTime="2026-03-13 01:42:41.118339699 +0000 UTC m=+1421.926460659" watchObservedRunningTime="2026-03-13 01:42:41.144702132 +0000 UTC m=+1421.952823092" Mar 13 01:42:41.982737 master-0 kubenswrapper[19170]: I0313 01:42:41.981590 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-4xfws" Mar 13 01:42:42.009783 master-0 kubenswrapper[19170]: I0313 01:42:42.009740 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-lvsxg" 
Mar 13 01:42:42.111774 master-0 kubenswrapper[19170]: I0313 01:42:42.111718 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-drpz7" Mar 13 01:42:42.122904 master-0 kubenswrapper[19170]: I0313 01:42:42.121688 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-zrjnv" Mar 13 01:42:42.335559 master-0 kubenswrapper[19170]: I0313 01:42:42.335435 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-q8f8c" Mar 13 01:42:42.351194 master-0 kubenswrapper[19170]: I0313 01:42:42.351140 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-q7fhr" Mar 13 01:42:42.414555 master-0 kubenswrapper[19170]: I0313 01:42:42.414507 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-fb5zw" Mar 13 01:42:42.434728 master-0 kubenswrapper[19170]: I0313 01:42:42.434685 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-kc6gt" Mar 13 01:42:42.471332 master-0 kubenswrapper[19170]: I0313 01:42:42.471265 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-rpqsl" Mar 13 01:42:42.546326 master-0 kubenswrapper[19170]: I0313 01:42:42.546282 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-p9fmf" Mar 13 01:42:42.586256 master-0 kubenswrapper[19170]: I0313 01:42:42.585980 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-7nf7q" Mar 13 01:42:42.734649 master-0 kubenswrapper[19170]: I0313 01:42:42.734131 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-g4gmk" event={"ID":"988fcfec-0870-495e-885c-16c5ac7f2a2a","Type":"ContainerStarted","Data":"5873ef06263d2e7dbdd4222e54385ac52bf62c8d03fcbddfcfa66c27ee6561cd"} Mar 13 01:42:42.738643 master-0 kubenswrapper[19170]: I0313 01:42:42.735096 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-g4gmk" Mar 13 01:42:42.743433 master-0 kubenswrapper[19170]: I0313 01:42:42.743390 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ftt2q" event={"ID":"5fc3c833-daad-4a2f-b725-f98f7e6e019c","Type":"ContainerStarted","Data":"e06c8fe205f4aefdfa3093c678e6a425c0c9905a638382aa2ce3974de4497c7b"} Mar 13 01:42:42.750644 master-0 kubenswrapper[19170]: I0313 01:42:42.746802 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ftt2q" Mar 13 01:42:42.759946 master-0 kubenswrapper[19170]: I0313 01:42:42.759902 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-9lfxx" Mar 13 01:42:42.762115 master-0 kubenswrapper[19170]: I0313 01:42:42.762062 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-g4gmk" podStartSLOduration=25.765443213 podStartE2EDuration="31.762042001s" podCreationTimestamp="2026-03-13 01:42:11 +0000 UTC" firstStartedPulling="2026-03-13 01:42:36.367755213 +0000 UTC m=+1417.175876173" lastFinishedPulling="2026-03-13 01:42:42.364353981 +0000 UTC 
m=+1423.172474961" observedRunningTime="2026-03-13 01:42:42.759775527 +0000 UTC m=+1423.567896487" watchObservedRunningTime="2026-03-13 01:42:42.762042001 +0000 UTC m=+1423.570162961" Mar 13 01:42:42.811877 master-0 kubenswrapper[19170]: I0313 01:42:42.811710 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ftt2q" podStartSLOduration=24.73769195 podStartE2EDuration="30.81169349s" podCreationTimestamp="2026-03-13 01:42:12 +0000 UTC" firstStartedPulling="2026-03-13 01:42:36.291332709 +0000 UTC m=+1417.099453669" lastFinishedPulling="2026-03-13 01:42:42.365334239 +0000 UTC m=+1423.173455209" observedRunningTime="2026-03-13 01:42:42.790134633 +0000 UTC m=+1423.598255613" watchObservedRunningTime="2026-03-13 01:42:42.81169349 +0000 UTC m=+1423.619814450" Mar 13 01:42:42.817241 master-0 kubenswrapper[19170]: I0313 01:42:42.817193 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-mhw45" Mar 13 01:42:42.861934 master-0 kubenswrapper[19170]: I0313 01:42:42.860878 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hgg8x" Mar 13 01:42:42.909655 master-0 kubenswrapper[19170]: I0313 01:42:42.909592 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-cq6mb" Mar 13 01:42:42.912845 master-0 kubenswrapper[19170]: I0313 01:42:42.912802 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-677c674df7-wlzls" Mar 13 01:42:42.986764 master-0 kubenswrapper[19170]: I0313 01:42:42.985974 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-7cjg5" Mar 13 01:42:43.018398 master-0 kubenswrapper[19170]: I0313 01:42:43.018048 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-dvxrf" Mar 13 01:42:43.047359 master-0 kubenswrapper[19170]: I0313 01:42:43.047120 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-jrqq9" Mar 13 01:42:44.578986 master-0 kubenswrapper[19170]: I0313 01:42:44.578917 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-ptkrt\" (UID: \"c191eeae-8328-45ad-b079-9df55f82fd92\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-ptkrt" Mar 13 01:42:44.579884 master-0 kubenswrapper[19170]: I0313 01:42:44.579046 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-ptkrt\" (UID: \"c191eeae-8328-45ad-b079-9df55f82fd92\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-ptkrt" Mar 13 01:42:44.582707 master-0 kubenswrapper[19170]: I0313 01:42:44.582543 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-ptkrt\" (UID: \"c191eeae-8328-45ad-b079-9df55f82fd92\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-ptkrt" Mar 13 01:42:44.585799 master-0 kubenswrapper[19170]: I0313 01:42:44.582986 19170 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c191eeae-8328-45ad-b079-9df55f82fd92-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-ptkrt\" (UID: \"c191eeae-8328-45ad-b079-9df55f82fd92\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-ptkrt" Mar 13 01:42:44.943017 master-0 kubenswrapper[19170]: I0313 01:42:44.942975 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-ptkrt" Mar 13 01:42:45.521414 master-0 kubenswrapper[19170]: I0313 01:42:45.521325 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7795b46f77-ptkrt"] Mar 13 01:42:45.533003 master-0 kubenswrapper[19170]: W0313 01:42:45.532944 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc191eeae_8328_45ad_b079_9df55f82fd92.slice/crio-6724c01fd2f98cc8fa81e95bee6b7c41f94fd014468350a23c0788bb22886944 WatchSource:0}: Error finding container 6724c01fd2f98cc8fa81e95bee6b7c41f94fd014468350a23c0788bb22886944: Status 404 returned error can't find the container with id 6724c01fd2f98cc8fa81e95bee6b7c41f94fd014468350a23c0788bb22886944 Mar 13 01:42:45.788670 master-0 kubenswrapper[19170]: I0313 01:42:45.786007 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-ptkrt" event={"ID":"c191eeae-8328-45ad-b079-9df55f82fd92","Type":"ContainerStarted","Data":"a5d4320a8f162a635f14ed3ce1d591650a1199623a34d1372fd011f5be4371de"} Mar 13 01:42:45.788670 master-0 kubenswrapper[19170]: I0313 01:42:45.786087 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-ptkrt" 
event={"ID":"c191eeae-8328-45ad-b079-9df55f82fd92","Type":"ContainerStarted","Data":"6724c01fd2f98cc8fa81e95bee6b7c41f94fd014468350a23c0788bb22886944"} Mar 13 01:42:45.788670 master-0 kubenswrapper[19170]: I0313 01:42:45.786771 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-ptkrt" Mar 13 01:42:45.827473 master-0 kubenswrapper[19170]: I0313 01:42:45.827399 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-ptkrt" podStartSLOduration=33.827379325 podStartE2EDuration="33.827379325s" podCreationTimestamp="2026-03-13 01:42:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:42:45.824746721 +0000 UTC m=+1426.632867701" watchObservedRunningTime="2026-03-13 01:42:45.827379325 +0000 UTC m=+1426.635500295" Mar 13 01:42:48.050621 master-0 kubenswrapper[19170]: I0313 01:42:48.050546 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-g4gmk" Mar 13 01:42:48.748713 master-0 kubenswrapper[19170]: I0313 01:42:48.748582 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-ftt2q" Mar 13 01:42:54.952781 master-0 kubenswrapper[19170]: I0313 01:42:54.952695 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-ptkrt" Mar 13 01:43:38.256035 master-0 kubenswrapper[19170]: I0313 01:43:38.251407 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-685c76cf85-dkz47"] Mar 13 01:43:38.256035 master-0 kubenswrapper[19170]: I0313 01:43:38.253061 19170 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-685c76cf85-dkz47"
Mar 13 01:43:38.258367 master-0 kubenswrapper[19170]: I0313 01:43:38.257082 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Mar 13 01:43:38.258367 master-0 kubenswrapper[19170]: I0313 01:43:38.257164 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Mar 13 01:43:38.258367 master-0 kubenswrapper[19170]: I0313 01:43:38.257204 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Mar 13 01:43:38.290987 master-0 kubenswrapper[19170]: I0313 01:43:38.290947 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-685c76cf85-dkz47"]
Mar 13 01:43:38.326736 master-0 kubenswrapper[19170]: I0313 01:43:38.324870 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c936ebc7-422c-4dc5-8048-4bb275739793-config\") pod \"dnsmasq-dns-685c76cf85-dkz47\" (UID: \"c936ebc7-422c-4dc5-8048-4bb275739793\") " pod="openstack/dnsmasq-dns-685c76cf85-dkz47"
Mar 13 01:43:38.326736 master-0 kubenswrapper[19170]: I0313 01:43:38.324941 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2m94\" (UniqueName: \"kubernetes.io/projected/c936ebc7-422c-4dc5-8048-4bb275739793-kube-api-access-b2m94\") pod \"dnsmasq-dns-685c76cf85-dkz47\" (UID: \"c936ebc7-422c-4dc5-8048-4bb275739793\") " pod="openstack/dnsmasq-dns-685c76cf85-dkz47"
Mar 13 01:43:38.343459 master-0 kubenswrapper[19170]: I0313 01:43:38.343160 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8476fd89bc-t78wt"]
Mar 13 01:43:38.345543 master-0 kubenswrapper[19170]: I0313 01:43:38.345509 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8476fd89bc-t78wt"
Mar 13 01:43:38.347996 master-0 kubenswrapper[19170]: I0313 01:43:38.347536 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Mar 13 01:43:38.358512 master-0 kubenswrapper[19170]: I0313 01:43:38.358440 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8476fd89bc-t78wt"]
Mar 13 01:43:38.426718 master-0 kubenswrapper[19170]: I0313 01:43:38.426680 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e09f95aa-60a6-467e-92a3-e9714796ed46-config\") pod \"dnsmasq-dns-8476fd89bc-t78wt\" (UID: \"e09f95aa-60a6-467e-92a3-e9714796ed46\") " pod="openstack/dnsmasq-dns-8476fd89bc-t78wt"
Mar 13 01:43:38.427262 master-0 kubenswrapper[19170]: I0313 01:43:38.427049 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqxhw\" (UniqueName: \"kubernetes.io/projected/e09f95aa-60a6-467e-92a3-e9714796ed46-kube-api-access-tqxhw\") pod \"dnsmasq-dns-8476fd89bc-t78wt\" (UID: \"e09f95aa-60a6-467e-92a3-e9714796ed46\") " pod="openstack/dnsmasq-dns-8476fd89bc-t78wt"
Mar 13 01:43:38.427262 master-0 kubenswrapper[19170]: I0313 01:43:38.427165 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c936ebc7-422c-4dc5-8048-4bb275739793-config\") pod \"dnsmasq-dns-685c76cf85-dkz47\" (UID: \"c936ebc7-422c-4dc5-8048-4bb275739793\") " pod="openstack/dnsmasq-dns-685c76cf85-dkz47"
Mar 13 01:43:38.427262 master-0 kubenswrapper[19170]: I0313 01:43:38.427228 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2m94\" (UniqueName: \"kubernetes.io/projected/c936ebc7-422c-4dc5-8048-4bb275739793-kube-api-access-b2m94\") pod \"dnsmasq-dns-685c76cf85-dkz47\" (UID: \"c936ebc7-422c-4dc5-8048-4bb275739793\") " pod="openstack/dnsmasq-dns-685c76cf85-dkz47"
Mar 13 01:43:38.427475 master-0 kubenswrapper[19170]: I0313 01:43:38.427412 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e09f95aa-60a6-467e-92a3-e9714796ed46-dns-svc\") pod \"dnsmasq-dns-8476fd89bc-t78wt\" (UID: \"e09f95aa-60a6-467e-92a3-e9714796ed46\") " pod="openstack/dnsmasq-dns-8476fd89bc-t78wt"
Mar 13 01:43:38.429650 master-0 kubenswrapper[19170]: I0313 01:43:38.429610 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c936ebc7-422c-4dc5-8048-4bb275739793-config\") pod \"dnsmasq-dns-685c76cf85-dkz47\" (UID: \"c936ebc7-422c-4dc5-8048-4bb275739793\") " pod="openstack/dnsmasq-dns-685c76cf85-dkz47"
Mar 13 01:43:38.443028 master-0 kubenswrapper[19170]: I0313 01:43:38.442977 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2m94\" (UniqueName: \"kubernetes.io/projected/c936ebc7-422c-4dc5-8048-4bb275739793-kube-api-access-b2m94\") pod \"dnsmasq-dns-685c76cf85-dkz47\" (UID: \"c936ebc7-422c-4dc5-8048-4bb275739793\") " pod="openstack/dnsmasq-dns-685c76cf85-dkz47"
Mar 13 01:43:38.529109 master-0 kubenswrapper[19170]: I0313 01:43:38.528981 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e09f95aa-60a6-467e-92a3-e9714796ed46-config\") pod \"dnsmasq-dns-8476fd89bc-t78wt\" (UID: \"e09f95aa-60a6-467e-92a3-e9714796ed46\") " pod="openstack/dnsmasq-dns-8476fd89bc-t78wt"
Mar 13 01:43:38.529109 master-0 kubenswrapper[19170]: I0313 01:43:38.529074 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqxhw\" (UniqueName: \"kubernetes.io/projected/e09f95aa-60a6-467e-92a3-e9714796ed46-kube-api-access-tqxhw\") pod \"dnsmasq-dns-8476fd89bc-t78wt\" (UID: \"e09f95aa-60a6-467e-92a3-e9714796ed46\") " pod="openstack/dnsmasq-dns-8476fd89bc-t78wt"
Mar 13 01:43:38.529511 master-0 kubenswrapper[19170]: I0313 01:43:38.529342 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e09f95aa-60a6-467e-92a3-e9714796ed46-dns-svc\") pod \"dnsmasq-dns-8476fd89bc-t78wt\" (UID: \"e09f95aa-60a6-467e-92a3-e9714796ed46\") " pod="openstack/dnsmasq-dns-8476fd89bc-t78wt"
Mar 13 01:43:38.529915 master-0 kubenswrapper[19170]: I0313 01:43:38.529884 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e09f95aa-60a6-467e-92a3-e9714796ed46-config\") pod \"dnsmasq-dns-8476fd89bc-t78wt\" (UID: \"e09f95aa-60a6-467e-92a3-e9714796ed46\") " pod="openstack/dnsmasq-dns-8476fd89bc-t78wt"
Mar 13 01:43:38.530230 master-0 kubenswrapper[19170]: I0313 01:43:38.530199 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e09f95aa-60a6-467e-92a3-e9714796ed46-dns-svc\") pod \"dnsmasq-dns-8476fd89bc-t78wt\" (UID: \"e09f95aa-60a6-467e-92a3-e9714796ed46\") " pod="openstack/dnsmasq-dns-8476fd89bc-t78wt"
Mar 13 01:43:38.554663 master-0 kubenswrapper[19170]: I0313 01:43:38.552343 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqxhw\" (UniqueName: \"kubernetes.io/projected/e09f95aa-60a6-467e-92a3-e9714796ed46-kube-api-access-tqxhw\") pod \"dnsmasq-dns-8476fd89bc-t78wt\" (UID: \"e09f95aa-60a6-467e-92a3-e9714796ed46\") " pod="openstack/dnsmasq-dns-8476fd89bc-t78wt"
Mar 13 01:43:38.591977 master-0 kubenswrapper[19170]: I0313 01:43:38.591922 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-685c76cf85-dkz47"
Mar 13 01:43:38.670107 master-0 kubenswrapper[19170]: I0313 01:43:38.670032 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8476fd89bc-t78wt"
Mar 13 01:43:39.038263 master-0 kubenswrapper[19170]: I0313 01:43:39.038135 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-685c76cf85-dkz47"]
Mar 13 01:43:39.165653 master-0 kubenswrapper[19170]: W0313 01:43:39.165572 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode09f95aa_60a6_467e_92a3_e9714796ed46.slice/crio-5df286cddb2ed6431303084c0bfae76a8affd639ace5f6ffff4a775082c12ec7 WatchSource:0}: Error finding container 5df286cddb2ed6431303084c0bfae76a8affd639ace5f6ffff4a775082c12ec7: Status 404 returned error can't find the container with id 5df286cddb2ed6431303084c0bfae76a8affd639ace5f6ffff4a775082c12ec7
Mar 13 01:43:39.170956 master-0 kubenswrapper[19170]: I0313 01:43:39.170724 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8476fd89bc-t78wt"]
Mar 13 01:43:39.434354 master-0 kubenswrapper[19170]: I0313 01:43:39.434270 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8476fd89bc-t78wt" event={"ID":"e09f95aa-60a6-467e-92a3-e9714796ed46","Type":"ContainerStarted","Data":"5df286cddb2ed6431303084c0bfae76a8affd639ace5f6ffff4a775082c12ec7"}
Mar 13 01:43:39.434354 master-0 kubenswrapper[19170]: I0313 01:43:39.434323 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685c76cf85-dkz47" event={"ID":"c936ebc7-422c-4dc5-8048-4bb275739793","Type":"ContainerStarted","Data":"59b2a937ec689629f20b1e919229982ba6869f93b92e4cca1d1c775be7af5dca"}
Mar 13 01:43:41.227508 master-0 kubenswrapper[19170]: I0313 01:43:41.221328 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-685c76cf85-dkz47"]
Mar 13 01:43:41.257431 master-0 kubenswrapper[19170]: I0313 01:43:41.255905 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76849d6659-mnb4b"]
Mar 13 01:43:41.276379 master-0 kubenswrapper[19170]: I0313 01:43:41.276294 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76849d6659-mnb4b"
Mar 13 01:43:41.280194 master-0 kubenswrapper[19170]: I0313 01:43:41.277249 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76849d6659-mnb4b"]
Mar 13 01:43:41.390931 master-0 kubenswrapper[19170]: I0313 01:43:41.389844 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17b50b1c-b9dc-4656-a138-f68b9985c8b4-dns-svc\") pod \"dnsmasq-dns-76849d6659-mnb4b\" (UID: \"17b50b1c-b9dc-4656-a138-f68b9985c8b4\") " pod="openstack/dnsmasq-dns-76849d6659-mnb4b"
Mar 13 01:43:41.390931 master-0 kubenswrapper[19170]: I0313 01:43:41.390413 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f62qr\" (UniqueName: \"kubernetes.io/projected/17b50b1c-b9dc-4656-a138-f68b9985c8b4-kube-api-access-f62qr\") pod \"dnsmasq-dns-76849d6659-mnb4b\" (UID: \"17b50b1c-b9dc-4656-a138-f68b9985c8b4\") " pod="openstack/dnsmasq-dns-76849d6659-mnb4b"
Mar 13 01:43:41.390931 master-0 kubenswrapper[19170]: I0313 01:43:41.390506 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17b50b1c-b9dc-4656-a138-f68b9985c8b4-config\") pod \"dnsmasq-dns-76849d6659-mnb4b\" (UID: \"17b50b1c-b9dc-4656-a138-f68b9985c8b4\") " pod="openstack/dnsmasq-dns-76849d6659-mnb4b"
Mar 13 01:43:41.513146 master-0 kubenswrapper[19170]: I0313 01:43:41.513027 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17b50b1c-b9dc-4656-a138-f68b9985c8b4-dns-svc\") pod \"dnsmasq-dns-76849d6659-mnb4b\" (UID: \"17b50b1c-b9dc-4656-a138-f68b9985c8b4\") " pod="openstack/dnsmasq-dns-76849d6659-mnb4b"
Mar 13 01:43:41.513146 master-0 kubenswrapper[19170]: I0313 01:43:41.513123 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f62qr\" (UniqueName: \"kubernetes.io/projected/17b50b1c-b9dc-4656-a138-f68b9985c8b4-kube-api-access-f62qr\") pod \"dnsmasq-dns-76849d6659-mnb4b\" (UID: \"17b50b1c-b9dc-4656-a138-f68b9985c8b4\") " pod="openstack/dnsmasq-dns-76849d6659-mnb4b"
Mar 13 01:43:41.513345 master-0 kubenswrapper[19170]: I0313 01:43:41.513177 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17b50b1c-b9dc-4656-a138-f68b9985c8b4-config\") pod \"dnsmasq-dns-76849d6659-mnb4b\" (UID: \"17b50b1c-b9dc-4656-a138-f68b9985c8b4\") " pod="openstack/dnsmasq-dns-76849d6659-mnb4b"
Mar 13 01:43:41.514041 master-0 kubenswrapper[19170]: I0313 01:43:41.514021 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17b50b1c-b9dc-4656-a138-f68b9985c8b4-config\") pod \"dnsmasq-dns-76849d6659-mnb4b\" (UID: \"17b50b1c-b9dc-4656-a138-f68b9985c8b4\") " pod="openstack/dnsmasq-dns-76849d6659-mnb4b"
Mar 13 01:43:41.515409 master-0 kubenswrapper[19170]: I0313 01:43:41.515362 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17b50b1c-b9dc-4656-a138-f68b9985c8b4-dns-svc\") pod \"dnsmasq-dns-76849d6659-mnb4b\" (UID: \"17b50b1c-b9dc-4656-a138-f68b9985c8b4\") " pod="openstack/dnsmasq-dns-76849d6659-mnb4b"
Mar 13 01:43:41.560765 master-0 kubenswrapper[19170]: I0313 01:43:41.560703 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f62qr\" (UniqueName: \"kubernetes.io/projected/17b50b1c-b9dc-4656-a138-f68b9985c8b4-kube-api-access-f62qr\") pod \"dnsmasq-dns-76849d6659-mnb4b\" (UID: \"17b50b1c-b9dc-4656-a138-f68b9985c8b4\") " pod="openstack/dnsmasq-dns-76849d6659-mnb4b"
Mar 13 01:43:41.578848 master-0 kubenswrapper[19170]: I0313 01:43:41.578713 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8476fd89bc-t78wt"]
Mar 13 01:43:41.610377 master-0 kubenswrapper[19170]: I0313 01:43:41.610326 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6ff8fd9d5c-bpv5h"]
Mar 13 01:43:41.613063 master-0 kubenswrapper[19170]: I0313 01:43:41.613029 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ff8fd9d5c-bpv5h"
Mar 13 01:43:41.619125 master-0 kubenswrapper[19170]: I0313 01:43:41.618895 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76849d6659-mnb4b"
Mar 13 01:43:41.733958 master-0 kubenswrapper[19170]: I0313 01:43:41.717306 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/689b3b6a-c7b8-4600-9b61-bfb9ff1eba33-config\") pod \"dnsmasq-dns-6ff8fd9d5c-bpv5h\" (UID: \"689b3b6a-c7b8-4600-9b61-bfb9ff1eba33\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-bpv5h"
Mar 13 01:43:41.733958 master-0 kubenswrapper[19170]: I0313 01:43:41.717412 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ds5k\" (UniqueName: \"kubernetes.io/projected/689b3b6a-c7b8-4600-9b61-bfb9ff1eba33-kube-api-access-2ds5k\") pod \"dnsmasq-dns-6ff8fd9d5c-bpv5h\" (UID: \"689b3b6a-c7b8-4600-9b61-bfb9ff1eba33\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-bpv5h"
Mar 13 01:43:41.733958 master-0 kubenswrapper[19170]: I0313 01:43:41.717490 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/689b3b6a-c7b8-4600-9b61-bfb9ff1eba33-dns-svc\") pod \"dnsmasq-dns-6ff8fd9d5c-bpv5h\" (UID: \"689b3b6a-c7b8-4600-9b61-bfb9ff1eba33\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-bpv5h"
Mar 13 01:43:41.733958 master-0 kubenswrapper[19170]: I0313 01:43:41.721653 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6ff8fd9d5c-bpv5h"]
Mar 13 01:43:41.826393 master-0 kubenswrapper[19170]: I0313 01:43:41.826351 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/689b3b6a-c7b8-4600-9b61-bfb9ff1eba33-dns-svc\") pod \"dnsmasq-dns-6ff8fd9d5c-bpv5h\" (UID: \"689b3b6a-c7b8-4600-9b61-bfb9ff1eba33\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-bpv5h"
Mar 13 01:43:41.826739 master-0 kubenswrapper[19170]: I0313 01:43:41.826449 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/689b3b6a-c7b8-4600-9b61-bfb9ff1eba33-config\") pod \"dnsmasq-dns-6ff8fd9d5c-bpv5h\" (UID: \"689b3b6a-c7b8-4600-9b61-bfb9ff1eba33\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-bpv5h"
Mar 13 01:43:41.826739 master-0 kubenswrapper[19170]: I0313 01:43:41.826500 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ds5k\" (UniqueName: \"kubernetes.io/projected/689b3b6a-c7b8-4600-9b61-bfb9ff1eba33-kube-api-access-2ds5k\") pod \"dnsmasq-dns-6ff8fd9d5c-bpv5h\" (UID: \"689b3b6a-c7b8-4600-9b61-bfb9ff1eba33\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-bpv5h"
Mar 13 01:43:41.832265 master-0 kubenswrapper[19170]: I0313 01:43:41.827543 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/689b3b6a-c7b8-4600-9b61-bfb9ff1eba33-config\") pod \"dnsmasq-dns-6ff8fd9d5c-bpv5h\" (UID: \"689b3b6a-c7b8-4600-9b61-bfb9ff1eba33\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-bpv5h"
Mar 13 01:43:41.832265 master-0 kubenswrapper[19170]: I0313 01:43:41.828864 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/689b3b6a-c7b8-4600-9b61-bfb9ff1eba33-dns-svc\") pod \"dnsmasq-dns-6ff8fd9d5c-bpv5h\" (UID: \"689b3b6a-c7b8-4600-9b61-bfb9ff1eba33\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-bpv5h"
Mar 13 01:43:41.860882 master-0 kubenswrapper[19170]: I0313 01:43:41.860411 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ds5k\" (UniqueName: \"kubernetes.io/projected/689b3b6a-c7b8-4600-9b61-bfb9ff1eba33-kube-api-access-2ds5k\") pod \"dnsmasq-dns-6ff8fd9d5c-bpv5h\" (UID: \"689b3b6a-c7b8-4600-9b61-bfb9ff1eba33\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-bpv5h"
Mar 13 01:43:41.946687 master-0 kubenswrapper[19170]: I0313 01:43:41.946525 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ff8fd9d5c-bpv5h"
Mar 13 01:43:42.374968 master-0 kubenswrapper[19170]: I0313 01:43:42.373776 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76849d6659-mnb4b"]
Mar 13 01:43:42.453142 master-0 kubenswrapper[19170]: I0313 01:43:42.453069 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6ff8fd9d5c-bpv5h"]
Mar 13 01:43:42.523306 master-0 kubenswrapper[19170]: I0313 01:43:42.523248 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76849d6659-mnb4b" event={"ID":"17b50b1c-b9dc-4656-a138-f68b9985c8b4","Type":"ContainerStarted","Data":"07872e9b4c0f2a5cf27dcb0fe5d3c9b7116234f3d7024c9accc6286e69d2a269"}
Mar 13 01:43:45.278464 master-0 kubenswrapper[19170]: I0313 01:43:45.278263 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Mar 13 01:43:45.281497 master-0 kubenswrapper[19170]: I0313 01:43:45.279616 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Mar 13 01:43:45.289654 master-0 kubenswrapper[19170]: I0313 01:43:45.282850 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc"
Mar 13 01:43:45.289654 master-0 kubenswrapper[19170]: I0313 01:43:45.283611 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Mar 13 01:43:45.289835 master-0 kubenswrapper[19170]: I0313 01:43:45.289788 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Mar 13 01:43:45.308162 master-0 kubenswrapper[19170]: I0313 01:43:45.308091 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Mar 13 01:43:45.390033 master-0 kubenswrapper[19170]: I0313 01:43:45.389976 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnvs2\" (UniqueName: \"kubernetes.io/projected/f711404d-1d64-49f4-817f-7e7152a54481-kube-api-access-lnvs2\") pod \"memcached-0\" (UID: \"f711404d-1d64-49f4-817f-7e7152a54481\") " pod="openstack/memcached-0"
Mar 13 01:43:45.390248 master-0 kubenswrapper[19170]: I0313 01:43:45.390064 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f711404d-1d64-49f4-817f-7e7152a54481-config-data\") pod \"memcached-0\" (UID: \"f711404d-1d64-49f4-817f-7e7152a54481\") " pod="openstack/memcached-0"
Mar 13 01:43:45.390248 master-0 kubenswrapper[19170]: I0313 01:43:45.390093 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f711404d-1d64-49f4-817f-7e7152a54481-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f711404d-1d64-49f4-817f-7e7152a54481\") " pod="openstack/memcached-0"
Mar 13 01:43:45.390248 master-0 kubenswrapper[19170]: I0313 01:43:45.390124 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f711404d-1d64-49f4-817f-7e7152a54481-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f711404d-1d64-49f4-817f-7e7152a54481\") " pod="openstack/memcached-0"
Mar 13 01:43:45.390248 master-0 kubenswrapper[19170]: I0313 01:43:45.390200 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f711404d-1d64-49f4-817f-7e7152a54481-kolla-config\") pod \"memcached-0\" (UID: \"f711404d-1d64-49f4-817f-7e7152a54481\") " pod="openstack/memcached-0"
Mar 13 01:43:45.391197 master-0 kubenswrapper[19170]: I0313 01:43:45.391150 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 13 01:43:45.392965 master-0 kubenswrapper[19170]: I0313 01:43:45.392929 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 13 01:43:45.398180 master-0 kubenswrapper[19170]: I0313 01:43:45.397578 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Mar 13 01:43:45.407459 master-0 kubenswrapper[19170]: I0313 01:43:45.407405 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 13 01:43:45.407669 master-0 kubenswrapper[19170]: I0313 01:43:45.407568 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Mar 13 01:43:45.407865 master-0 kubenswrapper[19170]: I0313 01:43:45.407846 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Mar 13 01:43:45.407974 master-0 kubenswrapper[19170]: I0313 01:43:45.407958 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Mar 13 01:43:45.408102 master-0 kubenswrapper[19170]: I0313 01:43:45.408083 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Mar 13 01:43:45.408230 master-0 kubenswrapper[19170]: I0313 01:43:45.408209 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Mar 13 01:43:45.495112 master-0 kubenswrapper[19170]: I0313 01:43:45.492215 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e9844d28-b0df-4563-a746-76fde502f19a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9844d28-b0df-4563-a746-76fde502f19a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 01:43:45.495112 master-0 kubenswrapper[19170]: I0313 01:43:45.492320 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f711404d-1d64-49f4-817f-7e7152a54481-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f711404d-1d64-49f4-817f-7e7152a54481\") " pod="openstack/memcached-0"
Mar 13 01:43:45.495112 master-0 kubenswrapper[19170]: I0313 01:43:45.492389 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f711404d-1d64-49f4-817f-7e7152a54481-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f711404d-1d64-49f4-817f-7e7152a54481\") " pod="openstack/memcached-0"
Mar 13 01:43:45.495112 master-0 kubenswrapper[19170]: I0313 01:43:45.492420 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ef182256-760a-49bd-8d53-8f302c46206f\" (UniqueName: \"kubernetes.io/csi/topolvm.io^486ba11a-2ada-48fa-bcfd-753e6330f452\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9844d28-b0df-4563-a746-76fde502f19a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 01:43:45.495112 master-0 kubenswrapper[19170]: I0313 01:43:45.492473 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e9844d28-b0df-4563-a746-76fde502f19a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9844d28-b0df-4563-a746-76fde502f19a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 01:43:45.495112 master-0 kubenswrapper[19170]: I0313 01:43:45.492503 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e9844d28-b0df-4563-a746-76fde502f19a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9844d28-b0df-4563-a746-76fde502f19a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 01:43:45.495112 master-0 kubenswrapper[19170]: I0313 01:43:45.492565 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jzmw\" (UniqueName: \"kubernetes.io/projected/e9844d28-b0df-4563-a746-76fde502f19a-kube-api-access-8jzmw\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9844d28-b0df-4563-a746-76fde502f19a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 01:43:45.495112 master-0 kubenswrapper[19170]: I0313 01:43:45.492591 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e9844d28-b0df-4563-a746-76fde502f19a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9844d28-b0df-4563-a746-76fde502f19a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 01:43:45.495112 master-0 kubenswrapper[19170]: I0313 01:43:45.492704 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e9844d28-b0df-4563-a746-76fde502f19a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9844d28-b0df-4563-a746-76fde502f19a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 01:43:45.495112 master-0 kubenswrapper[19170]: I0313 01:43:45.492777 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e9844d28-b0df-4563-a746-76fde502f19a-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9844d28-b0df-4563-a746-76fde502f19a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 01:43:45.495112 master-0 kubenswrapper[19170]: I0313 01:43:45.492821 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e9844d28-b0df-4563-a746-76fde502f19a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9844d28-b0df-4563-a746-76fde502f19a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 01:43:45.495112 master-0 kubenswrapper[19170]: I0313 01:43:45.492849 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f711404d-1d64-49f4-817f-7e7152a54481-kolla-config\") pod \"memcached-0\" (UID: \"f711404d-1d64-49f4-817f-7e7152a54481\") " pod="openstack/memcached-0"
Mar 13 01:43:45.495112 master-0 kubenswrapper[19170]: I0313 01:43:45.492865 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e9844d28-b0df-4563-a746-76fde502f19a-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9844d28-b0df-4563-a746-76fde502f19a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 01:43:45.495112 master-0 kubenswrapper[19170]: I0313 01:43:45.492910 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnvs2\" (UniqueName: \"kubernetes.io/projected/f711404d-1d64-49f4-817f-7e7152a54481-kube-api-access-lnvs2\") pod \"memcached-0\" (UID: \"f711404d-1d64-49f4-817f-7e7152a54481\") " pod="openstack/memcached-0"
Mar 13 01:43:45.495112 master-0 kubenswrapper[19170]: I0313 01:43:45.492934 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e9844d28-b0df-4563-a746-76fde502f19a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9844d28-b0df-4563-a746-76fde502f19a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 01:43:45.495112 master-0 kubenswrapper[19170]: I0313 01:43:45.492988 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f711404d-1d64-49f4-817f-7e7152a54481-config-data\") pod \"memcached-0\" (UID: \"f711404d-1d64-49f4-817f-7e7152a54481\") " pod="openstack/memcached-0"
Mar 13 01:43:45.495112 master-0 kubenswrapper[19170]: I0313 01:43:45.493862 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f711404d-1d64-49f4-817f-7e7152a54481-config-data\") pod \"memcached-0\" (UID: \"f711404d-1d64-49f4-817f-7e7152a54481\") " pod="openstack/memcached-0"
Mar 13 01:43:45.496025 master-0 kubenswrapper[19170]: I0313 01:43:45.495949 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f711404d-1d64-49f4-817f-7e7152a54481-kolla-config\") pod \"memcached-0\" (UID: \"f711404d-1d64-49f4-817f-7e7152a54481\") " pod="openstack/memcached-0"
Mar 13 01:43:45.503094 master-0 kubenswrapper[19170]: I0313 01:43:45.503057 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f711404d-1d64-49f4-817f-7e7152a54481-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f711404d-1d64-49f4-817f-7e7152a54481\") " pod="openstack/memcached-0"
Mar 13 01:43:45.504375 master-0 kubenswrapper[19170]: I0313 01:43:45.504335 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f711404d-1d64-49f4-817f-7e7152a54481-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f711404d-1d64-49f4-817f-7e7152a54481\") " pod="openstack/memcached-0"
Mar 13 01:43:45.512113 master-0 kubenswrapper[19170]: I0313 01:43:45.512074 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnvs2\" (UniqueName: \"kubernetes.io/projected/f711404d-1d64-49f4-817f-7e7152a54481-kube-api-access-lnvs2\") pod \"memcached-0\" (UID: \"f711404d-1d64-49f4-817f-7e7152a54481\") " pod="openstack/memcached-0"
Mar 13 01:43:45.597284 master-0 kubenswrapper[19170]: I0313 01:43:45.595068 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ef182256-760a-49bd-8d53-8f302c46206f\" (UniqueName: \"kubernetes.io/csi/topolvm.io^486ba11a-2ada-48fa-bcfd-753e6330f452\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9844d28-b0df-4563-a746-76fde502f19a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 01:43:45.597284 master-0 kubenswrapper[19170]: I0313 01:43:45.595126 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e9844d28-b0df-4563-a746-76fde502f19a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9844d28-b0df-4563-a746-76fde502f19a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 01:43:45.597284 master-0 kubenswrapper[19170]: I0313 01:43:45.595154 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e9844d28-b0df-4563-a746-76fde502f19a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9844d28-b0df-4563-a746-76fde502f19a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 01:43:45.597284 master-0 kubenswrapper[19170]: I0313 01:43:45.595175 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jzmw\" (UniqueName: \"kubernetes.io/projected/e9844d28-b0df-4563-a746-76fde502f19a-kube-api-access-8jzmw\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9844d28-b0df-4563-a746-76fde502f19a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 01:43:45.597284 master-0 kubenswrapper[19170]: I0313 01:43:45.595196 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e9844d28-b0df-4563-a746-76fde502f19a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9844d28-b0df-4563-a746-76fde502f19a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 01:43:45.597284 master-0 kubenswrapper[19170]: I0313 01:43:45.595218 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e9844d28-b0df-4563-a746-76fde502f19a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9844d28-b0df-4563-a746-76fde502f19a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 01:43:45.597284 master-0 kubenswrapper[19170]: I0313 01:43:45.595248 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e9844d28-b0df-4563-a746-76fde502f19a-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9844d28-b0df-4563-a746-76fde502f19a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 01:43:45.597284 master-0 kubenswrapper[19170]: I0313 01:43:45.595274 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e9844d28-b0df-4563-a746-76fde502f19a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9844d28-b0df-4563-a746-76fde502f19a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 01:43:45.597284 master-0 kubenswrapper[19170]: I0313 01:43:45.595294 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e9844d28-b0df-4563-a746-76fde502f19a-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9844d28-b0df-4563-a746-76fde502f19a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 01:43:45.597284 master-0 kubenswrapper[19170]: I0313 01:43:45.595325 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e9844d28-b0df-4563-a746-76fde502f19a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9844d28-b0df-4563-a746-76fde502f19a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 01:43:45.597284 master-0 kubenswrapper[19170]: I0313 01:43:45.595389 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e9844d28-b0df-4563-a746-76fde502f19a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9844d28-b0df-4563-a746-76fde502f19a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 01:43:45.597284 master-0 kubenswrapper[19170]: I0313 01:43:45.596162 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e9844d28-b0df-4563-a746-76fde502f19a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9844d28-b0df-4563-a746-76fde502f19a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 01:43:45.597284 master-0 kubenswrapper[19170]: I0313 01:43:45.596923 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e9844d28-b0df-4563-a746-76fde502f19a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9844d28-b0df-4563-a746-76fde502f19a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 01:43:45.598609 master-0 kubenswrapper[19170]: I0313 01:43:45.597609 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e9844d28-b0df-4563-a746-76fde502f19a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9844d28-b0df-4563-a746-76fde502f19a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 01:43:45.598609 master-0 kubenswrapper[19170]: I0313 01:43:45.598349 19170 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 13 01:43:45.598609 master-0 kubenswrapper[19170]: I0313 01:43:45.598375 19170 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ef182256-760a-49bd-8d53-8f302c46206f\" (UniqueName: \"kubernetes.io/csi/topolvm.io^486ba11a-2ada-48fa-bcfd-753e6330f452\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9844d28-b0df-4563-a746-76fde502f19a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/59ea2cd7e74007314dffb9b83b2999debed456c23469b271ac9e5e4f5502e3ae/globalmount\"" pod="openstack/rabbitmq-cell1-server-0"
Mar 13 01:43:45.598878 master-0 kubenswrapper[19170]: I0313 01:43:45.598744 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e9844d28-b0df-4563-a746-76fde502f19a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9844d28-b0df-4563-a746-76fde502f19a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 01:43:45.599769 master-0 kubenswrapper[19170]: I0313 01:43:45.599663 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e9844d28-b0df-4563-a746-76fde502f19a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9844d28-b0df-4563-a746-76fde502f19a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 01:43:45.600382 master-0 kubenswrapper[19170]: I0313 01:43:45.600347 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e9844d28-b0df-4563-a746-76fde502f19a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9844d28-b0df-4563-a746-76fde502f19a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 01:43:45.615167 master-0 kubenswrapper[19170]: I0313 01:43:45.600718 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e9844d28-b0df-4563-a746-76fde502f19a-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"e9844d28-b0df-4563-a746-76fde502f19a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 01:43:45.615167 master-0 kubenswrapper[19170]: I0313 01:43:45.601538 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e9844d28-b0df-4563-a746-76fde502f19a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9844d28-b0df-4563-a746-76fde502f19a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 01:43:45.615167 master-0 kubenswrapper[19170]: I0313 01:43:45.602231 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e9844d28-b0df-4563-a746-76fde502f19a-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9844d28-b0df-4563-a746-76fde502f19a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 01:43:45.615167 master-0 kubenswrapper[19170]: I0313 01:43:45.607326 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 13 01:43:45.623773 master-0 kubenswrapper[19170]: I0313 01:43:45.623560 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jzmw\" (UniqueName: \"kubernetes.io/projected/e9844d28-b0df-4563-a746-76fde502f19a-kube-api-access-8jzmw\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9844d28-b0df-4563-a746-76fde502f19a\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 01:43:45.743733 master-0 kubenswrapper[19170]: I0313 01:43:45.742858 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 01:43:45.751714 master-0 kubenswrapper[19170]: I0313 01:43:45.747251 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 01:43:45.751714 master-0 kubenswrapper[19170]: I0313 01:43:45.749603 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 13 01:43:45.751714 master-0 kubenswrapper[19170]: I0313 01:43:45.749874 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 13 01:43:45.751714 master-0 kubenswrapper[19170]: I0313 01:43:45.750666 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 13 01:43:45.751714 master-0 kubenswrapper[19170]: I0313 01:43:45.751155 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 13 01:43:45.751714 master-0 kubenswrapper[19170]: I0313 01:43:45.751403 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 13 01:43:45.751714 master-0 kubenswrapper[19170]: I0313 01:43:45.751537 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 13 01:43:45.787677 master-0 kubenswrapper[19170]: I0313 01:43:45.783006 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 01:43:45.801732 master-0 kubenswrapper[19170]: I0313 01:43:45.800505 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-dc90c5fd-58fd-4117-b64d-551aa9562e3c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^dcbaf51f-bae5-43b0-9645-03b4a8414b54\") pod \"rabbitmq-server-0\" (UID: \"0707bd16-09db-4d83-af8a-f8e7b78fad40\") " pod="openstack/rabbitmq-server-0" Mar 13 01:43:45.801732 master-0 kubenswrapper[19170]: I0313 01:43:45.800562 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/0707bd16-09db-4d83-af8a-f8e7b78fad40-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0707bd16-09db-4d83-af8a-f8e7b78fad40\") " pod="openstack/rabbitmq-server-0" Mar 13 01:43:45.801732 master-0 kubenswrapper[19170]: I0313 01:43:45.800612 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0707bd16-09db-4d83-af8a-f8e7b78fad40-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0707bd16-09db-4d83-af8a-f8e7b78fad40\") " pod="openstack/rabbitmq-server-0" Mar 13 01:43:45.801732 master-0 kubenswrapper[19170]: I0313 01:43:45.800664 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0707bd16-09db-4d83-af8a-f8e7b78fad40-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0707bd16-09db-4d83-af8a-f8e7b78fad40\") " pod="openstack/rabbitmq-server-0" Mar 13 01:43:45.801732 master-0 kubenswrapper[19170]: I0313 01:43:45.800693 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0707bd16-09db-4d83-af8a-f8e7b78fad40-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0707bd16-09db-4d83-af8a-f8e7b78fad40\") " pod="openstack/rabbitmq-server-0" Mar 13 01:43:45.801732 master-0 kubenswrapper[19170]: I0313 01:43:45.800725 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0707bd16-09db-4d83-af8a-f8e7b78fad40-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0707bd16-09db-4d83-af8a-f8e7b78fad40\") " pod="openstack/rabbitmq-server-0" Mar 13 01:43:45.801732 master-0 kubenswrapper[19170]: I0313 01:43:45.800756 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" 
(UniqueName: \"kubernetes.io/projected/0707bd16-09db-4d83-af8a-f8e7b78fad40-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0707bd16-09db-4d83-af8a-f8e7b78fad40\") " pod="openstack/rabbitmq-server-0" Mar 13 01:43:45.801732 master-0 kubenswrapper[19170]: I0313 01:43:45.800804 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0707bd16-09db-4d83-af8a-f8e7b78fad40-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0707bd16-09db-4d83-af8a-f8e7b78fad40\") " pod="openstack/rabbitmq-server-0" Mar 13 01:43:45.801732 master-0 kubenswrapper[19170]: I0313 01:43:45.800832 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0707bd16-09db-4d83-af8a-f8e7b78fad40-config-data\") pod \"rabbitmq-server-0\" (UID: \"0707bd16-09db-4d83-af8a-f8e7b78fad40\") " pod="openstack/rabbitmq-server-0" Mar 13 01:43:45.801732 master-0 kubenswrapper[19170]: I0313 01:43:45.800861 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0707bd16-09db-4d83-af8a-f8e7b78fad40-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0707bd16-09db-4d83-af8a-f8e7b78fad40\") " pod="openstack/rabbitmq-server-0" Mar 13 01:43:45.801732 master-0 kubenswrapper[19170]: I0313 01:43:45.800876 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csh9m\" (UniqueName: \"kubernetes.io/projected/0707bd16-09db-4d83-af8a-f8e7b78fad40-kube-api-access-csh9m\") pod \"rabbitmq-server-0\" (UID: \"0707bd16-09db-4d83-af8a-f8e7b78fad40\") " pod="openstack/rabbitmq-server-0" Mar 13 01:43:45.902868 master-0 kubenswrapper[19170]: I0313 01:43:45.902735 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/0707bd16-09db-4d83-af8a-f8e7b78fad40-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0707bd16-09db-4d83-af8a-f8e7b78fad40\") " pod="openstack/rabbitmq-server-0" Mar 13 01:43:45.902868 master-0 kubenswrapper[19170]: I0313 01:43:45.902798 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0707bd16-09db-4d83-af8a-f8e7b78fad40-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0707bd16-09db-4d83-af8a-f8e7b78fad40\") " pod="openstack/rabbitmq-server-0" Mar 13 01:43:45.903692 master-0 kubenswrapper[19170]: I0313 01:43:45.903163 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0707bd16-09db-4d83-af8a-f8e7b78fad40-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0707bd16-09db-4d83-af8a-f8e7b78fad40\") " pod="openstack/rabbitmq-server-0" Mar 13 01:43:45.903692 master-0 kubenswrapper[19170]: I0313 01:43:45.903262 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0707bd16-09db-4d83-af8a-f8e7b78fad40-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0707bd16-09db-4d83-af8a-f8e7b78fad40\") " pod="openstack/rabbitmq-server-0" Mar 13 01:43:45.903692 master-0 kubenswrapper[19170]: I0313 01:43:45.903330 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0707bd16-09db-4d83-af8a-f8e7b78fad40-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0707bd16-09db-4d83-af8a-f8e7b78fad40\") " pod="openstack/rabbitmq-server-0" Mar 13 01:43:45.903692 master-0 kubenswrapper[19170]: I0313 01:43:45.903378 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0707bd16-09db-4d83-af8a-f8e7b78fad40-config-data\") pod 
\"rabbitmq-server-0\" (UID: \"0707bd16-09db-4d83-af8a-f8e7b78fad40\") " pod="openstack/rabbitmq-server-0" Mar 13 01:43:45.903692 master-0 kubenswrapper[19170]: I0313 01:43:45.903437 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0707bd16-09db-4d83-af8a-f8e7b78fad40-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0707bd16-09db-4d83-af8a-f8e7b78fad40\") " pod="openstack/rabbitmq-server-0" Mar 13 01:43:45.903692 master-0 kubenswrapper[19170]: I0313 01:43:45.903457 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csh9m\" (UniqueName: \"kubernetes.io/projected/0707bd16-09db-4d83-af8a-f8e7b78fad40-kube-api-access-csh9m\") pod \"rabbitmq-server-0\" (UID: \"0707bd16-09db-4d83-af8a-f8e7b78fad40\") " pod="openstack/rabbitmq-server-0" Mar 13 01:43:45.903692 master-0 kubenswrapper[19170]: I0313 01:43:45.903597 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0707bd16-09db-4d83-af8a-f8e7b78fad40-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0707bd16-09db-4d83-af8a-f8e7b78fad40\") " pod="openstack/rabbitmq-server-0" Mar 13 01:43:45.903692 master-0 kubenswrapper[19170]: I0313 01:43:45.903613 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-dc90c5fd-58fd-4117-b64d-551aa9562e3c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^dcbaf51f-bae5-43b0-9645-03b4a8414b54\") pod \"rabbitmq-server-0\" (UID: \"0707bd16-09db-4d83-af8a-f8e7b78fad40\") " pod="openstack/rabbitmq-server-0" Mar 13 01:43:45.903692 master-0 kubenswrapper[19170]: I0313 01:43:45.903638 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0707bd16-09db-4d83-af8a-f8e7b78fad40-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0707bd16-09db-4d83-af8a-f8e7b78fad40\") " 
pod="openstack/rabbitmq-server-0" Mar 13 01:43:45.903692 master-0 kubenswrapper[19170]: I0313 01:43:45.903661 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0707bd16-09db-4d83-af8a-f8e7b78fad40-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0707bd16-09db-4d83-af8a-f8e7b78fad40\") " pod="openstack/rabbitmq-server-0" Mar 13 01:43:45.904197 master-0 kubenswrapper[19170]: I0313 01:43:45.903790 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0707bd16-09db-4d83-af8a-f8e7b78fad40-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0707bd16-09db-4d83-af8a-f8e7b78fad40\") " pod="openstack/rabbitmq-server-0" Mar 13 01:43:45.904197 master-0 kubenswrapper[19170]: I0313 01:43:45.904013 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0707bd16-09db-4d83-af8a-f8e7b78fad40-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0707bd16-09db-4d83-af8a-f8e7b78fad40\") " pod="openstack/rabbitmq-server-0" Mar 13 01:43:45.904571 master-0 kubenswrapper[19170]: I0313 01:43:45.904536 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0707bd16-09db-4d83-af8a-f8e7b78fad40-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0707bd16-09db-4d83-af8a-f8e7b78fad40\") " pod="openstack/rabbitmq-server-0" Mar 13 01:43:45.904672 master-0 kubenswrapper[19170]: I0313 01:43:45.904651 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0707bd16-09db-4d83-af8a-f8e7b78fad40-config-data\") pod \"rabbitmq-server-0\" (UID: \"0707bd16-09db-4d83-af8a-f8e7b78fad40\") " pod="openstack/rabbitmq-server-0" Mar 13 01:43:45.908607 master-0 kubenswrapper[19170]: I0313 
01:43:45.908496 19170 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 01:43:45.909083 master-0 kubenswrapper[19170]: I0313 01:43:45.908657 19170 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-dc90c5fd-58fd-4117-b64d-551aa9562e3c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^dcbaf51f-bae5-43b0-9645-03b4a8414b54\") pod \"rabbitmq-server-0\" (UID: \"0707bd16-09db-4d83-af8a-f8e7b78fad40\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/ab0bdaab80990ae75ebf31930a89339850744859309e310e598a60a375dba1e1/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 13 01:43:45.909083 master-0 kubenswrapper[19170]: I0313 01:43:45.909028 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0707bd16-09db-4d83-af8a-f8e7b78fad40-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0707bd16-09db-4d83-af8a-f8e7b78fad40\") " pod="openstack/rabbitmq-server-0" Mar 13 01:43:45.909269 master-0 kubenswrapper[19170]: I0313 01:43:45.909231 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0707bd16-09db-4d83-af8a-f8e7b78fad40-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0707bd16-09db-4d83-af8a-f8e7b78fad40\") " pod="openstack/rabbitmq-server-0" Mar 13 01:43:45.920364 master-0 kubenswrapper[19170]: I0313 01:43:45.920327 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0707bd16-09db-4d83-af8a-f8e7b78fad40-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0707bd16-09db-4d83-af8a-f8e7b78fad40\") " pod="openstack/rabbitmq-server-0" Mar 13 01:43:45.921360 master-0 kubenswrapper[19170]: I0313 01:43:45.921329 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/0707bd16-09db-4d83-af8a-f8e7b78fad40-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0707bd16-09db-4d83-af8a-f8e7b78fad40\") " pod="openstack/rabbitmq-server-0" Mar 13 01:43:45.924991 master-0 kubenswrapper[19170]: I0313 01:43:45.924962 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csh9m\" (UniqueName: \"kubernetes.io/projected/0707bd16-09db-4d83-af8a-f8e7b78fad40-kube-api-access-csh9m\") pod \"rabbitmq-server-0\" (UID: \"0707bd16-09db-4d83-af8a-f8e7b78fad40\") " pod="openstack/rabbitmq-server-0" Mar 13 01:43:46.754815 master-0 kubenswrapper[19170]: I0313 01:43:46.754672 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 13 01:43:46.770577 master-0 kubenswrapper[19170]: I0313 01:43:46.762687 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 13 01:43:46.770577 master-0 kubenswrapper[19170]: I0313 01:43:46.767427 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 13 01:43:46.770577 master-0 kubenswrapper[19170]: I0313 01:43:46.767609 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 13 01:43:46.770577 master-0 kubenswrapper[19170]: I0313 01:43:46.767756 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 13 01:43:46.779420 master-0 kubenswrapper[19170]: I0313 01:43:46.777599 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 13 01:43:46.829741 master-0 kubenswrapper[19170]: I0313 01:43:46.827309 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4715cc17-276f-413c-a0e1-98b150ce558a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: 
\"4715cc17-276f-413c-a0e1-98b150ce558a\") " pod="openstack/openstack-galera-0" Mar 13 01:43:46.829741 master-0 kubenswrapper[19170]: I0313 01:43:46.827366 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-aa04e1f9-fa52-4ca9-aa81-fe8388275c6f\" (UniqueName: \"kubernetes.io/csi/topolvm.io^79f4ff3b-30bc-4cbe-93f4-fd9dba6d819f\") pod \"openstack-galera-0\" (UID: \"4715cc17-276f-413c-a0e1-98b150ce558a\") " pod="openstack/openstack-galera-0" Mar 13 01:43:46.829741 master-0 kubenswrapper[19170]: I0313 01:43:46.827503 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzklv\" (UniqueName: \"kubernetes.io/projected/4715cc17-276f-413c-a0e1-98b150ce558a-kube-api-access-kzklv\") pod \"openstack-galera-0\" (UID: \"4715cc17-276f-413c-a0e1-98b150ce558a\") " pod="openstack/openstack-galera-0" Mar 13 01:43:46.829741 master-0 kubenswrapper[19170]: I0313 01:43:46.827542 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4715cc17-276f-413c-a0e1-98b150ce558a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4715cc17-276f-413c-a0e1-98b150ce558a\") " pod="openstack/openstack-galera-0" Mar 13 01:43:46.829741 master-0 kubenswrapper[19170]: I0313 01:43:46.827564 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4715cc17-276f-413c-a0e1-98b150ce558a-config-data-default\") pod \"openstack-galera-0\" (UID: \"4715cc17-276f-413c-a0e1-98b150ce558a\") " pod="openstack/openstack-galera-0" Mar 13 01:43:46.829741 master-0 kubenswrapper[19170]: I0313 01:43:46.827650 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4715cc17-276f-413c-a0e1-98b150ce558a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4715cc17-276f-413c-a0e1-98b150ce558a\") " pod="openstack/openstack-galera-0" Mar 13 01:43:46.829741 master-0 kubenswrapper[19170]: I0313 01:43:46.827679 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4715cc17-276f-413c-a0e1-98b150ce558a-kolla-config\") pod \"openstack-galera-0\" (UID: \"4715cc17-276f-413c-a0e1-98b150ce558a\") " pod="openstack/openstack-galera-0" Mar 13 01:43:46.829741 master-0 kubenswrapper[19170]: I0313 01:43:46.827739 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4715cc17-276f-413c-a0e1-98b150ce558a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4715cc17-276f-413c-a0e1-98b150ce558a\") " pod="openstack/openstack-galera-0" Mar 13 01:43:46.943742 master-0 kubenswrapper[19170]: I0313 01:43:46.943603 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4715cc17-276f-413c-a0e1-98b150ce558a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4715cc17-276f-413c-a0e1-98b150ce558a\") " pod="openstack/openstack-galera-0" Mar 13 01:43:46.943742 master-0 kubenswrapper[19170]: I0313 01:43:46.943747 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4715cc17-276f-413c-a0e1-98b150ce558a-config-data-default\") pod \"openstack-galera-0\" (UID: \"4715cc17-276f-413c-a0e1-98b150ce558a\") " pod="openstack/openstack-galera-0" Mar 13 01:43:46.944001 master-0 kubenswrapper[19170]: I0313 01:43:46.943834 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4715cc17-276f-413c-a0e1-98b150ce558a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4715cc17-276f-413c-a0e1-98b150ce558a\") " pod="openstack/openstack-galera-0" Mar 13 01:43:46.944001 master-0 kubenswrapper[19170]: I0313 01:43:46.943862 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4715cc17-276f-413c-a0e1-98b150ce558a-kolla-config\") pod \"openstack-galera-0\" (UID: \"4715cc17-276f-413c-a0e1-98b150ce558a\") " pod="openstack/openstack-galera-0" Mar 13 01:43:46.944001 master-0 kubenswrapper[19170]: I0313 01:43:46.943966 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4715cc17-276f-413c-a0e1-98b150ce558a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4715cc17-276f-413c-a0e1-98b150ce558a\") " pod="openstack/openstack-galera-0" Mar 13 01:43:46.944095 master-0 kubenswrapper[19170]: I0313 01:43:46.944058 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4715cc17-276f-413c-a0e1-98b150ce558a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4715cc17-276f-413c-a0e1-98b150ce558a\") " pod="openstack/openstack-galera-0" Mar 13 01:43:46.944140 master-0 kubenswrapper[19170]: I0313 01:43:46.944121 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-aa04e1f9-fa52-4ca9-aa81-fe8388275c6f\" (UniqueName: \"kubernetes.io/csi/topolvm.io^79f4ff3b-30bc-4cbe-93f4-fd9dba6d819f\") pod \"openstack-galera-0\" (UID: \"4715cc17-276f-413c-a0e1-98b150ce558a\") " pod="openstack/openstack-galera-0" Mar 13 01:43:46.944216 master-0 kubenswrapper[19170]: I0313 01:43:46.944197 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzklv\" (UniqueName: 
\"kubernetes.io/projected/4715cc17-276f-413c-a0e1-98b150ce558a-kube-api-access-kzklv\") pod \"openstack-galera-0\" (UID: \"4715cc17-276f-413c-a0e1-98b150ce558a\") " pod="openstack/openstack-galera-0" Mar 13 01:43:46.944906 master-0 kubenswrapper[19170]: I0313 01:43:46.944882 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4715cc17-276f-413c-a0e1-98b150ce558a-kolla-config\") pod \"openstack-galera-0\" (UID: \"4715cc17-276f-413c-a0e1-98b150ce558a\") " pod="openstack/openstack-galera-0" Mar 13 01:43:46.945712 master-0 kubenswrapper[19170]: I0313 01:43:46.945615 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4715cc17-276f-413c-a0e1-98b150ce558a-config-data-default\") pod \"openstack-galera-0\" (UID: \"4715cc17-276f-413c-a0e1-98b150ce558a\") " pod="openstack/openstack-galera-0" Mar 13 01:43:46.946530 master-0 kubenswrapper[19170]: I0313 01:43:46.946510 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4715cc17-276f-413c-a0e1-98b150ce558a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4715cc17-276f-413c-a0e1-98b150ce558a\") " pod="openstack/openstack-galera-0" Mar 13 01:43:46.956836 master-0 kubenswrapper[19170]: I0313 01:43:46.956706 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4715cc17-276f-413c-a0e1-98b150ce558a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4715cc17-276f-413c-a0e1-98b150ce558a\") " pod="openstack/openstack-galera-0" Mar 13 01:43:46.958262 master-0 kubenswrapper[19170]: I0313 01:43:46.958216 19170 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 13 01:43:46.958328 master-0 kubenswrapper[19170]: I0313 01:43:46.958275 19170 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-aa04e1f9-fa52-4ca9-aa81-fe8388275c6f\" (UniqueName: \"kubernetes.io/csi/topolvm.io^79f4ff3b-30bc-4cbe-93f4-fd9dba6d819f\") pod \"openstack-galera-0\" (UID: \"4715cc17-276f-413c-a0e1-98b150ce558a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/0f89de614a9eb13bc0144847fdc8ca810c5b0b6bd4f7e39f3b6bbc20566b9237/globalmount\"" pod="openstack/openstack-galera-0" Mar 13 01:43:46.963567 master-0 kubenswrapper[19170]: I0313 01:43:46.963520 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4715cc17-276f-413c-a0e1-98b150ce558a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4715cc17-276f-413c-a0e1-98b150ce558a\") " pod="openstack/openstack-galera-0" Mar 13 01:43:46.976221 master-0 kubenswrapper[19170]: I0313 01:43:46.976127 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4715cc17-276f-413c-a0e1-98b150ce558a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4715cc17-276f-413c-a0e1-98b150ce558a\") " pod="openstack/openstack-galera-0" Mar 13 01:43:46.976769 master-0 kubenswrapper[19170]: I0313 01:43:46.976729 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzklv\" (UniqueName: \"kubernetes.io/projected/4715cc17-276f-413c-a0e1-98b150ce558a-kube-api-access-kzklv\") pod \"openstack-galera-0\" (UID: \"4715cc17-276f-413c-a0e1-98b150ce558a\") " pod="openstack/openstack-galera-0" Mar 13 01:43:47.272696 master-0 kubenswrapper[19170]: I0313 01:43:47.272620 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ef182256-760a-49bd-8d53-8f302c46206f\" (UniqueName: \"kubernetes.io/csi/topolvm.io^486ba11a-2ada-48fa-bcfd-753e6330f452\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"e9844d28-b0df-4563-a746-76fde502f19a\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 13 01:43:47.542954 master-0 kubenswrapper[19170]: I0313 01:43:47.542818 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 13 01:43:47.955254 master-0 kubenswrapper[19170]: W0313 01:43:47.955191 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod689b3b6a_c7b8_4600_9b61_bfb9ff1eba33.slice/crio-ec03b38fd13e59ba75c91d3857f06091791bc53ca1c0413d0e0e5cf6cbdb4fa5 WatchSource:0}: Error finding container ec03b38fd13e59ba75c91d3857f06091791bc53ca1c0413d0e0e5cf6cbdb4fa5: Status 404 returned error can't find the container with id ec03b38fd13e59ba75c91d3857f06091791bc53ca1c0413d0e0e5cf6cbdb4fa5
Mar 13 01:43:48.021753 master-0 kubenswrapper[19170]: I0313 01:43:48.020811 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 13 01:43:48.023261 master-0 kubenswrapper[19170]: I0313 01:43:48.023218 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 13 01:43:48.029485 master-0 kubenswrapper[19170]: I0313 01:43:48.029415 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 13 01:43:48.035547 master-0 kubenswrapper[19170]: I0313 01:43:48.035500 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Mar 13 01:43:48.035785 master-0 kubenswrapper[19170]: I0313 01:43:48.035766 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Mar 13 01:43:48.036757 master-0 kubenswrapper[19170]: I0313 01:43:48.036735 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Mar 13 01:43:48.198843 master-0 kubenswrapper[19170]: I0313 01:43:48.198754 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/649a1497-ac21-43c0-bfdc-c997ee7c8a81-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"649a1497-ac21-43c0-bfdc-c997ee7c8a81\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 01:43:48.198843 master-0 kubenswrapper[19170]: I0313 01:43:48.198798 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/649a1497-ac21-43c0-bfdc-c997ee7c8a81-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"649a1497-ac21-43c0-bfdc-c997ee7c8a81\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 01:43:48.198843 master-0 kubenswrapper[19170]: I0313 01:43:48.198819 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/649a1497-ac21-43c0-bfdc-c997ee7c8a81-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"649a1497-ac21-43c0-bfdc-c997ee7c8a81\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 01:43:48.198843 master-0 kubenswrapper[19170]: I0313 01:43:48.198842 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a1155b05-d04d-4a95-9313-1df4fe7afceb\" (UniqueName: \"kubernetes.io/csi/topolvm.io^46688452-b95b-4bd8-8872-a073a550e059\") pod \"openstack-cell1-galera-0\" (UID: \"649a1497-ac21-43c0-bfdc-c997ee7c8a81\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 01:43:48.200399 master-0 kubenswrapper[19170]: I0313 01:43:48.198889 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/649a1497-ac21-43c0-bfdc-c997ee7c8a81-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"649a1497-ac21-43c0-bfdc-c997ee7c8a81\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 01:43:48.200399 master-0 kubenswrapper[19170]: I0313 01:43:48.198914 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/649a1497-ac21-43c0-bfdc-c997ee7c8a81-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"649a1497-ac21-43c0-bfdc-c997ee7c8a81\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 01:43:48.200399 master-0 kubenswrapper[19170]: I0313 01:43:48.198940 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6ldz\" (UniqueName: \"kubernetes.io/projected/649a1497-ac21-43c0-bfdc-c997ee7c8a81-kube-api-access-x6ldz\") pod \"openstack-cell1-galera-0\" (UID: \"649a1497-ac21-43c0-bfdc-c997ee7c8a81\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 01:43:48.200399 master-0 kubenswrapper[19170]: I0313 01:43:48.198969 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/649a1497-ac21-43c0-bfdc-c997ee7c8a81-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"649a1497-ac21-43c0-bfdc-c997ee7c8a81\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 01:43:48.301210 master-0 kubenswrapper[19170]: I0313 01:43:48.301069 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/649a1497-ac21-43c0-bfdc-c997ee7c8a81-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"649a1497-ac21-43c0-bfdc-c997ee7c8a81\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 01:43:48.301449 master-0 kubenswrapper[19170]: I0313 01:43:48.301216 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/649a1497-ac21-43c0-bfdc-c997ee7c8a81-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"649a1497-ac21-43c0-bfdc-c997ee7c8a81\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 01:43:48.301449 master-0 kubenswrapper[19170]: I0313 01:43:48.301236 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/649a1497-ac21-43c0-bfdc-c997ee7c8a81-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"649a1497-ac21-43c0-bfdc-c997ee7c8a81\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 01:43:48.301449 master-0 kubenswrapper[19170]: I0313 01:43:48.301254 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/649a1497-ac21-43c0-bfdc-c997ee7c8a81-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"649a1497-ac21-43c0-bfdc-c997ee7c8a81\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 01:43:48.301449 master-0 kubenswrapper[19170]: I0313 01:43:48.301275 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a1155b05-d04d-4a95-9313-1df4fe7afceb\" (UniqueName: \"kubernetes.io/csi/topolvm.io^46688452-b95b-4bd8-8872-a073a550e059\") pod \"openstack-cell1-galera-0\" (UID: \"649a1497-ac21-43c0-bfdc-c997ee7c8a81\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 01:43:48.301449 master-0 kubenswrapper[19170]: I0313 01:43:48.301320 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/649a1497-ac21-43c0-bfdc-c997ee7c8a81-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"649a1497-ac21-43c0-bfdc-c997ee7c8a81\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 01:43:48.301449 master-0 kubenswrapper[19170]: I0313 01:43:48.301343 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/649a1497-ac21-43c0-bfdc-c997ee7c8a81-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"649a1497-ac21-43c0-bfdc-c997ee7c8a81\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 01:43:48.301449 master-0 kubenswrapper[19170]: I0313 01:43:48.301383 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6ldz\" (UniqueName: \"kubernetes.io/projected/649a1497-ac21-43c0-bfdc-c997ee7c8a81-kube-api-access-x6ldz\") pod \"openstack-cell1-galera-0\" (UID: \"649a1497-ac21-43c0-bfdc-c997ee7c8a81\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 01:43:48.331202 master-0 kubenswrapper[19170]: I0313 01:43:48.307234 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/649a1497-ac21-43c0-bfdc-c997ee7c8a81-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"649a1497-ac21-43c0-bfdc-c997ee7c8a81\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 01:43:48.331202 master-0 kubenswrapper[19170]: I0313 01:43:48.307389 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/649a1497-ac21-43c0-bfdc-c997ee7c8a81-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"649a1497-ac21-43c0-bfdc-c997ee7c8a81\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 01:43:48.331202 master-0 kubenswrapper[19170]: I0313 01:43:48.307492 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/649a1497-ac21-43c0-bfdc-c997ee7c8a81-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"649a1497-ac21-43c0-bfdc-c997ee7c8a81\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 01:43:48.331202 master-0 kubenswrapper[19170]: I0313 01:43:48.308356 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/649a1497-ac21-43c0-bfdc-c997ee7c8a81-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"649a1497-ac21-43c0-bfdc-c997ee7c8a81\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 01:43:48.331202 master-0 kubenswrapper[19170]: I0313 01:43:48.308404 19170 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 13 01:43:48.331202 master-0 kubenswrapper[19170]: I0313 01:43:48.308429 19170 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a1155b05-d04d-4a95-9313-1df4fe7afceb\" (UniqueName: \"kubernetes.io/csi/topolvm.io^46688452-b95b-4bd8-8872-a073a550e059\") pod \"openstack-cell1-galera-0\" (UID: \"649a1497-ac21-43c0-bfdc-c997ee7c8a81\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/0cdb2b0683faf5547d53ee4f58582121cc2bd4e2085e3ac3a11173509a8a1ed5/globalmount\"" pod="openstack/openstack-cell1-galera-0"
Mar 13 01:43:48.331202 master-0 kubenswrapper[19170]: I0313 01:43:48.311326 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/649a1497-ac21-43c0-bfdc-c997ee7c8a81-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"649a1497-ac21-43c0-bfdc-c997ee7c8a81\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 01:43:48.331202 master-0 kubenswrapper[19170]: I0313 01:43:48.324278 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/649a1497-ac21-43c0-bfdc-c997ee7c8a81-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"649a1497-ac21-43c0-bfdc-c997ee7c8a81\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 01:43:48.331202 master-0 kubenswrapper[19170]: I0313 01:43:48.324611 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6ldz\" (UniqueName: \"kubernetes.io/projected/649a1497-ac21-43c0-bfdc-c997ee7c8a81-kube-api-access-x6ldz\") pod \"openstack-cell1-galera-0\" (UID: \"649a1497-ac21-43c0-bfdc-c997ee7c8a81\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 01:43:48.578564 master-0 kubenswrapper[19170]: I0313 01:43:48.577031 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-dc90c5fd-58fd-4117-b64d-551aa9562e3c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^dcbaf51f-bae5-43b0-9645-03b4a8414b54\") pod \"rabbitmq-server-0\" (UID: \"0707bd16-09db-4d83-af8a-f8e7b78fad40\") " pod="openstack/rabbitmq-server-0"
Mar 13 01:43:48.613342 master-0 kubenswrapper[19170]: I0313 01:43:48.613290 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ff8fd9d5c-bpv5h" event={"ID":"689b3b6a-c7b8-4600-9b61-bfb9ff1eba33","Type":"ContainerStarted","Data":"ec03b38fd13e59ba75c91d3857f06091791bc53ca1c0413d0e0e5cf6cbdb4fa5"}
Mar 13 01:43:48.776080 master-0 kubenswrapper[19170]: I0313 01:43:48.776001 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 13 01:43:49.623241 master-0 kubenswrapper[19170]: I0313 01:43:49.623128 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-aa04e1f9-fa52-4ca9-aa81-fe8388275c6f\" (UniqueName: \"kubernetes.io/csi/topolvm.io^79f4ff3b-30bc-4cbe-93f4-fd9dba6d819f\") pod \"openstack-galera-0\" (UID: \"4715cc17-276f-413c-a0e1-98b150ce558a\") " pod="openstack/openstack-galera-0"
Mar 13 01:43:49.827387 master-0 kubenswrapper[19170]: I0313 01:43:49.827260 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Mar 13 01:43:50.688137 master-0 kubenswrapper[19170]: I0313 01:43:50.688076 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a1155b05-d04d-4a95-9313-1df4fe7afceb\" (UniqueName: \"kubernetes.io/csi/topolvm.io^46688452-b95b-4bd8-8872-a073a550e059\") pod \"openstack-cell1-galera-0\" (UID: \"649a1497-ac21-43c0-bfdc-c997ee7c8a81\") " pod="openstack/openstack-cell1-galera-0"
Mar 13 01:43:50.800149 master-0 kubenswrapper[19170]: I0313 01:43:50.800062 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 13 01:43:51.119192 master-0 kubenswrapper[19170]: I0313 01:43:51.118734 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-gvjkt"]
Mar 13 01:43:51.122648 master-0 kubenswrapper[19170]: I0313 01:43:51.122001 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-gvjkt"
Mar 13 01:43:51.124620 master-0 kubenswrapper[19170]: I0313 01:43:51.124507 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Mar 13 01:43:51.124620 master-0 kubenswrapper[19170]: I0313 01:43:51.124545 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Mar 13 01:43:51.137198 master-0 kubenswrapper[19170]: I0313 01:43:51.137108 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-p9j9l"]
Mar 13 01:43:51.140599 master-0 kubenswrapper[19170]: I0313 01:43:51.140566 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-p9j9l"
Mar 13 01:43:51.155432 master-0 kubenswrapper[19170]: I0313 01:43:51.155239 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-gvjkt"]
Mar 13 01:43:51.170806 master-0 kubenswrapper[19170]: I0313 01:43:51.170749 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-p9j9l"]
Mar 13 01:43:51.192994 master-0 kubenswrapper[19170]: I0313 01:43:51.192939 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/db2c7f91-4b25-4e56-9d4a-ce6a885121f9-scripts\") pod \"ovn-controller-gvjkt\" (UID: \"db2c7f91-4b25-4e56-9d4a-ce6a885121f9\") " pod="openstack/ovn-controller-gvjkt"
Mar 13 01:43:51.193210 master-0 kubenswrapper[19170]: I0313 01:43:51.193005 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzj9q\" (UniqueName: \"kubernetes.io/projected/db2c7f91-4b25-4e56-9d4a-ce6a885121f9-kube-api-access-mzj9q\") pod \"ovn-controller-gvjkt\" (UID: \"db2c7f91-4b25-4e56-9d4a-ce6a885121f9\") " pod="openstack/ovn-controller-gvjkt"
Mar 13 01:43:51.193210 master-0 kubenswrapper[19170]: I0313 01:43:51.193030 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/db2c7f91-4b25-4e56-9d4a-ce6a885121f9-var-log-ovn\") pod \"ovn-controller-gvjkt\" (UID: \"db2c7f91-4b25-4e56-9d4a-ce6a885121f9\") " pod="openstack/ovn-controller-gvjkt"
Mar 13 01:43:51.193210 master-0 kubenswrapper[19170]: I0313 01:43:51.193046 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/db2c7f91-4b25-4e56-9d4a-ce6a885121f9-var-run-ovn\") pod \"ovn-controller-gvjkt\" (UID: \"db2c7f91-4b25-4e56-9d4a-ce6a885121f9\") " pod="openstack/ovn-controller-gvjkt"
Mar 13 01:43:51.193210 master-0 kubenswrapper[19170]: I0313 01:43:51.193066 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/db2c7f91-4b25-4e56-9d4a-ce6a885121f9-var-run\") pod \"ovn-controller-gvjkt\" (UID: \"db2c7f91-4b25-4e56-9d4a-ce6a885121f9\") " pod="openstack/ovn-controller-gvjkt"
Mar 13 01:43:51.193210 master-0 kubenswrapper[19170]: I0313 01:43:51.193091 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db2c7f91-4b25-4e56-9d4a-ce6a885121f9-combined-ca-bundle\") pod \"ovn-controller-gvjkt\" (UID: \"db2c7f91-4b25-4e56-9d4a-ce6a885121f9\") " pod="openstack/ovn-controller-gvjkt"
Mar 13 01:43:51.193210 master-0 kubenswrapper[19170]: I0313 01:43:51.193114 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8290d408-116c-4ac7-9b8b-9f8e50f4355d-scripts\") pod \"ovn-controller-ovs-p9j9l\" (UID: \"8290d408-116c-4ac7-9b8b-9f8e50f4355d\") " pod="openstack/ovn-controller-ovs-p9j9l"
Mar 13 01:43:51.193210 master-0 kubenswrapper[19170]: I0313 01:43:51.193138 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z9v2\" (UniqueName: \"kubernetes.io/projected/8290d408-116c-4ac7-9b8b-9f8e50f4355d-kube-api-access-2z9v2\") pod \"ovn-controller-ovs-p9j9l\" (UID: \"8290d408-116c-4ac7-9b8b-9f8e50f4355d\") " pod="openstack/ovn-controller-ovs-p9j9l"
Mar 13 01:43:51.193210 master-0 kubenswrapper[19170]: I0313 01:43:51.193189 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8290d408-116c-4ac7-9b8b-9f8e50f4355d-var-log\") pod \"ovn-controller-ovs-p9j9l\" (UID: \"8290d408-116c-4ac7-9b8b-9f8e50f4355d\") " pod="openstack/ovn-controller-ovs-p9j9l"
Mar 13 01:43:51.193210 master-0 kubenswrapper[19170]: I0313 01:43:51.193205 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8290d408-116c-4ac7-9b8b-9f8e50f4355d-var-lib\") pod \"ovn-controller-ovs-p9j9l\" (UID: \"8290d408-116c-4ac7-9b8b-9f8e50f4355d\") " pod="openstack/ovn-controller-ovs-p9j9l"
Mar 13 01:43:51.193467 master-0 kubenswrapper[19170]: I0313 01:43:51.193252 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8290d408-116c-4ac7-9b8b-9f8e50f4355d-etc-ovs\") pod \"ovn-controller-ovs-p9j9l\" (UID: \"8290d408-116c-4ac7-9b8b-9f8e50f4355d\") " pod="openstack/ovn-controller-ovs-p9j9l"
Mar 13 01:43:51.193467 master-0 kubenswrapper[19170]: I0313 01:43:51.193271 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8290d408-116c-4ac7-9b8b-9f8e50f4355d-var-run\") pod \"ovn-controller-ovs-p9j9l\" (UID: \"8290d408-116c-4ac7-9b8b-9f8e50f4355d\") " pod="openstack/ovn-controller-ovs-p9j9l"
Mar 13 01:43:51.193467 master-0 kubenswrapper[19170]: I0313 01:43:51.193288 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/db2c7f91-4b25-4e56-9d4a-ce6a885121f9-ovn-controller-tls-certs\") pod \"ovn-controller-gvjkt\" (UID: \"db2c7f91-4b25-4e56-9d4a-ce6a885121f9\") " pod="openstack/ovn-controller-gvjkt"
Mar 13 01:43:51.294431 master-0 kubenswrapper[19170]: I0313 01:43:51.294368 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8290d408-116c-4ac7-9b8b-9f8e50f4355d-etc-ovs\") pod \"ovn-controller-ovs-p9j9l\" (UID: \"8290d408-116c-4ac7-9b8b-9f8e50f4355d\") " pod="openstack/ovn-controller-ovs-p9j9l"
Mar 13 01:43:51.294657 master-0 kubenswrapper[19170]: I0313 01:43:51.294486 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8290d408-116c-4ac7-9b8b-9f8e50f4355d-var-run\") pod \"ovn-controller-ovs-p9j9l\" (UID: \"8290d408-116c-4ac7-9b8b-9f8e50f4355d\") " pod="openstack/ovn-controller-ovs-p9j9l"
Mar 13 01:43:51.294657 master-0 kubenswrapper[19170]: I0313 01:43:51.294527 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/db2c7f91-4b25-4e56-9d4a-ce6a885121f9-ovn-controller-tls-certs\") pod \"ovn-controller-gvjkt\" (UID: \"db2c7f91-4b25-4e56-9d4a-ce6a885121f9\") " pod="openstack/ovn-controller-gvjkt"
Mar 13 01:43:51.295354 master-0 kubenswrapper[19170]: I0313 01:43:51.295288 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8290d408-116c-4ac7-9b8b-9f8e50f4355d-etc-ovs\") pod \"ovn-controller-ovs-p9j9l\" (UID: \"8290d408-116c-4ac7-9b8b-9f8e50f4355d\") " pod="openstack/ovn-controller-ovs-p9j9l"
Mar 13 01:43:51.295410 master-0 kubenswrapper[19170]: I0313 01:43:51.295359 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/db2c7f91-4b25-4e56-9d4a-ce6a885121f9-scripts\") pod \"ovn-controller-gvjkt\" (UID: \"db2c7f91-4b25-4e56-9d4a-ce6a885121f9\") " pod="openstack/ovn-controller-gvjkt"
Mar 13 01:43:51.295453 master-0 kubenswrapper[19170]: I0313 01:43:51.295420 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzj9q\" (UniqueName: \"kubernetes.io/projected/db2c7f91-4b25-4e56-9d4a-ce6a885121f9-kube-api-access-mzj9q\") pod \"ovn-controller-gvjkt\" (UID: \"db2c7f91-4b25-4e56-9d4a-ce6a885121f9\") " pod="openstack/ovn-controller-gvjkt"
Mar 13 01:43:51.295453 master-0 kubenswrapper[19170]: I0313 01:43:51.295444 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/db2c7f91-4b25-4e56-9d4a-ce6a885121f9-var-log-ovn\") pod \"ovn-controller-gvjkt\" (UID: \"db2c7f91-4b25-4e56-9d4a-ce6a885121f9\") " pod="openstack/ovn-controller-gvjkt"
Mar 13 01:43:51.295517 master-0 kubenswrapper[19170]: I0313 01:43:51.295460 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/db2c7f91-4b25-4e56-9d4a-ce6a885121f9-var-run-ovn\") pod \"ovn-controller-gvjkt\" (UID: \"db2c7f91-4b25-4e56-9d4a-ce6a885121f9\") " pod="openstack/ovn-controller-gvjkt"
Mar 13 01:43:51.295517 master-0 kubenswrapper[19170]: I0313 01:43:51.295509 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/db2c7f91-4b25-4e56-9d4a-ce6a885121f9-var-run\") pod \"ovn-controller-gvjkt\" (UID: \"db2c7f91-4b25-4e56-9d4a-ce6a885121f9\") " pod="openstack/ovn-controller-gvjkt"
Mar 13 01:43:51.295574 master-0 kubenswrapper[19170]: I0313 01:43:51.295532 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db2c7f91-4b25-4e56-9d4a-ce6a885121f9-combined-ca-bundle\") pod \"ovn-controller-gvjkt\" (UID: \"db2c7f91-4b25-4e56-9d4a-ce6a885121f9\") " pod="openstack/ovn-controller-gvjkt"
Mar 13 01:43:51.295652 master-0 kubenswrapper[19170]: I0313 01:43:51.295616 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8290d408-116c-4ac7-9b8b-9f8e50f4355d-scripts\") pod \"ovn-controller-ovs-p9j9l\" (UID: \"8290d408-116c-4ac7-9b8b-9f8e50f4355d\") " pod="openstack/ovn-controller-ovs-p9j9l"
Mar 13 01:43:51.295765 master-0 kubenswrapper[19170]: I0313 01:43:51.295703 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z9v2\" (UniqueName: \"kubernetes.io/projected/8290d408-116c-4ac7-9b8b-9f8e50f4355d-kube-api-access-2z9v2\") pod \"ovn-controller-ovs-p9j9l\" (UID: \"8290d408-116c-4ac7-9b8b-9f8e50f4355d\") " pod="openstack/ovn-controller-ovs-p9j9l"
Mar 13 01:43:51.295821 master-0 kubenswrapper[19170]: I0313 01:43:51.295791 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8290d408-116c-4ac7-9b8b-9f8e50f4355d-var-log\") pod \"ovn-controller-ovs-p9j9l\" (UID: \"8290d408-116c-4ac7-9b8b-9f8e50f4355d\") " pod="openstack/ovn-controller-ovs-p9j9l"
Mar 13 01:43:51.295821 master-0 kubenswrapper[19170]: I0313 01:43:51.295810 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8290d408-116c-4ac7-9b8b-9f8e50f4355d-var-lib\") pod \"ovn-controller-ovs-p9j9l\" (UID: \"8290d408-116c-4ac7-9b8b-9f8e50f4355d\") " pod="openstack/ovn-controller-ovs-p9j9l"
Mar 13 01:43:51.296177 master-0 kubenswrapper[19170]: I0313 01:43:51.296154 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8290d408-116c-4ac7-9b8b-9f8e50f4355d-var-lib\") pod \"ovn-controller-ovs-p9j9l\" (UID: \"8290d408-116c-4ac7-9b8b-9f8e50f4355d\") " pod="openstack/ovn-controller-ovs-p9j9l"
Mar 13 01:43:51.296647 master-0 kubenswrapper[19170]: I0313 01:43:51.296558 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/db2c7f91-4b25-4e56-9d4a-ce6a885121f9-var-run\") pod \"ovn-controller-gvjkt\" (UID: \"db2c7f91-4b25-4e56-9d4a-ce6a885121f9\") " pod="openstack/ovn-controller-gvjkt"
Mar 13 01:43:51.298073 master-0 kubenswrapper[19170]: I0313 01:43:51.297044 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/db2c7f91-4b25-4e56-9d4a-ce6a885121f9-var-log-ovn\") pod \"ovn-controller-gvjkt\" (UID: \"db2c7f91-4b25-4e56-9d4a-ce6a885121f9\") " pod="openstack/ovn-controller-gvjkt"
Mar 13 01:43:51.298073 master-0 kubenswrapper[19170]: I0313 01:43:51.297136 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/db2c7f91-4b25-4e56-9d4a-ce6a885121f9-var-run-ovn\") pod \"ovn-controller-gvjkt\" (UID: \"db2c7f91-4b25-4e56-9d4a-ce6a885121f9\") " pod="openstack/ovn-controller-gvjkt"
Mar 13 01:43:51.298073 master-0 kubenswrapper[19170]: I0313 01:43:51.295474 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8290d408-116c-4ac7-9b8b-9f8e50f4355d-var-run\") pod \"ovn-controller-ovs-p9j9l\" (UID: \"8290d408-116c-4ac7-9b8b-9f8e50f4355d\") " pod="openstack/ovn-controller-ovs-p9j9l"
Mar 13 01:43:51.299386 master-0 kubenswrapper[19170]: I0313 01:43:51.299295 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8290d408-116c-4ac7-9b8b-9f8e50f4355d-scripts\") pod \"ovn-controller-ovs-p9j9l\" (UID: \"8290d408-116c-4ac7-9b8b-9f8e50f4355d\") " pod="openstack/ovn-controller-ovs-p9j9l"
Mar 13 01:43:51.299443 master-0 kubenswrapper[19170]: I0313 01:43:51.299391 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8290d408-116c-4ac7-9b8b-9f8e50f4355d-var-log\") pod \"ovn-controller-ovs-p9j9l\" (UID: \"8290d408-116c-4ac7-9b8b-9f8e50f4355d\") " pod="openstack/ovn-controller-ovs-p9j9l"
Mar 13 01:43:51.300051 master-0 kubenswrapper[19170]: I0313 01:43:51.300004 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/db2c7f91-4b25-4e56-9d4a-ce6a885121f9-scripts\") pod \"ovn-controller-gvjkt\" (UID: \"db2c7f91-4b25-4e56-9d4a-ce6a885121f9\") " pod="openstack/ovn-controller-gvjkt"
Mar 13 01:43:51.300286 master-0 kubenswrapper[19170]: I0313 01:43:51.300251 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/db2c7f91-4b25-4e56-9d4a-ce6a885121f9-ovn-controller-tls-certs\") pod \"ovn-controller-gvjkt\" (UID: \"db2c7f91-4b25-4e56-9d4a-ce6a885121f9\") " pod="openstack/ovn-controller-gvjkt"
Mar 13 01:43:51.303832 master-0 kubenswrapper[19170]: I0313 01:43:51.302092 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db2c7f91-4b25-4e56-9d4a-ce6a885121f9-combined-ca-bundle\") pod \"ovn-controller-gvjkt\" (UID: \"db2c7f91-4b25-4e56-9d4a-ce6a885121f9\") " pod="openstack/ovn-controller-gvjkt"
Mar 13 01:43:51.319507 master-0 kubenswrapper[19170]: I0313 01:43:51.319323 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z9v2\" (UniqueName: \"kubernetes.io/projected/8290d408-116c-4ac7-9b8b-9f8e50f4355d-kube-api-access-2z9v2\") pod \"ovn-controller-ovs-p9j9l\" (UID: \"8290d408-116c-4ac7-9b8b-9f8e50f4355d\") " pod="openstack/ovn-controller-ovs-p9j9l"
Mar 13 01:43:51.319747 master-0 kubenswrapper[19170]: I0313 01:43:51.319711 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzj9q\" (UniqueName: \"kubernetes.io/projected/db2c7f91-4b25-4e56-9d4a-ce6a885121f9-kube-api-access-mzj9q\") pod \"ovn-controller-gvjkt\" (UID: \"db2c7f91-4b25-4e56-9d4a-ce6a885121f9\") " pod="openstack/ovn-controller-gvjkt"
Mar 13 01:43:51.451699 master-0 kubenswrapper[19170]: I0313 01:43:51.451592 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-gvjkt"
Mar 13 01:43:51.461739 master-0 kubenswrapper[19170]: I0313 01:43:51.461588 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-p9j9l"
Mar 13 01:43:52.196466 master-0 kubenswrapper[19170]: I0313 01:43:52.196406 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 13 01:43:52.198876 master-0 kubenswrapper[19170]: I0313 01:43:52.198836 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 13 01:43:52.202299 master-0 kubenswrapper[19170]: I0313 01:43:52.202258 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Mar 13 01:43:52.202440 master-0 kubenswrapper[19170]: I0313 01:43:52.202278 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Mar 13 01:43:52.204315 master-0 kubenswrapper[19170]: I0313 01:43:52.202412 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Mar 13 01:43:52.204315 master-0 kubenswrapper[19170]: I0313 01:43:52.203192 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Mar 13 01:43:52.221100 master-0 kubenswrapper[19170]: I0313 01:43:52.219592 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 13 01:43:52.225218 master-0 kubenswrapper[19170]: I0313 01:43:52.225153 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/841fec7e-d35c-45d6-8b12-5f5d51270e1b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"841fec7e-d35c-45d6-8b12-5f5d51270e1b\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 01:43:52.225302 master-0 kubenswrapper[19170]: I0313 01:43:52.225231 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/841fec7e-d35c-45d6-8b12-5f5d51270e1b-config\") pod \"ovsdbserver-nb-0\" (UID: \"841fec7e-d35c-45d6-8b12-5f5d51270e1b\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 01:43:52.225302 master-0 kubenswrapper[19170]: I0313 01:43:52.225264 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/841fec7e-d35c-45d6-8b12-5f5d51270e1b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"841fec7e-d35c-45d6-8b12-5f5d51270e1b\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 01:43:52.225394 master-0 kubenswrapper[19170]: I0313 01:43:52.225338 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5a9a4c32-3f32-4f88-8d49-99a35d347b15\" (UniqueName: \"kubernetes.io/csi/topolvm.io^ed97b5ff-bc31-4626-92d1-ac6df5fef269\") pod \"ovsdbserver-nb-0\" (UID: \"841fec7e-d35c-45d6-8b12-5f5d51270e1b\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 01:43:52.225451 master-0 kubenswrapper[19170]: I0313 01:43:52.225390 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9bgg\" (UniqueName: \"kubernetes.io/projected/841fec7e-d35c-45d6-8b12-5f5d51270e1b-kube-api-access-s9bgg\") pod \"ovsdbserver-nb-0\" (UID: \"841fec7e-d35c-45d6-8b12-5f5d51270e1b\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 01:43:52.225451 master-0 kubenswrapper[19170]: I0313 01:43:52.225421 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/841fec7e-d35c-45d6-8b12-5f5d51270e1b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"841fec7e-d35c-45d6-8b12-5f5d51270e1b\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 01:43:52.225451 master-0 kubenswrapper[19170]: I0313 01:43:52.225448 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/841fec7e-d35c-45d6-8b12-5f5d51270e1b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"841fec7e-d35c-45d6-8b12-5f5d51270e1b\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 01:43:52.225589 master-0 kubenswrapper[19170]: I0313 01:43:52.225564 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/841fec7e-d35c-45d6-8b12-5f5d51270e1b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"841fec7e-d35c-45d6-8b12-5f5d51270e1b\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 01:43:52.327556 master-0 kubenswrapper[19170]: I0313 01:43:52.327477 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/841fec7e-d35c-45d6-8b12-5f5d51270e1b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"841fec7e-d35c-45d6-8b12-5f5d51270e1b\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 01:43:52.327556 master-0 kubenswrapper[19170]: I0313 01:43:52.327564 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/841fec7e-d35c-45d6-8b12-5f5d51270e1b-config\") pod \"ovsdbserver-nb-0\" (UID: \"841fec7e-d35c-45d6-8b12-5f5d51270e1b\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 01:43:52.327926 master-0 kubenswrapper[19170]: I0313 01:43:52.327856 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/841fec7e-d35c-45d6-8b12-5f5d51270e1b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"841fec7e-d35c-45d6-8b12-5f5d51270e1b\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 01:43:52.328129 master-0 kubenswrapper[19170]: I0313 01:43:52.328095 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5a9a4c32-3f32-4f88-8d49-99a35d347b15\" (UniqueName: \"kubernetes.io/csi/topolvm.io^ed97b5ff-bc31-4626-92d1-ac6df5fef269\") pod \"ovsdbserver-nb-0\" (UID: \"841fec7e-d35c-45d6-8b12-5f5d51270e1b\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 01:43:52.328250 master-0 kubenswrapper[19170]: I0313 01:43:52.328214 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9bgg\" (UniqueName: \"kubernetes.io/projected/841fec7e-d35c-45d6-8b12-5f5d51270e1b-kube-api-access-s9bgg\") pod \"ovsdbserver-nb-0\" (UID: \"841fec7e-d35c-45d6-8b12-5f5d51270e1b\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 01:43:52.328338 master-0 kubenswrapper[19170]: I0313 01:43:52.328249 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/841fec7e-d35c-45d6-8b12-5f5d51270e1b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"841fec7e-d35c-45d6-8b12-5f5d51270e1b\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 01:43:52.328338 master-0 kubenswrapper[19170]: I0313 01:43:52.328279 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/841fec7e-d35c-45d6-8b12-5f5d51270e1b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"841fec7e-d35c-45d6-8b12-5f5d51270e1b\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 01:43:52.328432 master-0 kubenswrapper[19170]: I0313 01:43:52.328391 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/841fec7e-d35c-45d6-8b12-5f5d51270e1b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"841fec7e-d35c-45d6-8b12-5f5d51270e1b\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 01:43:52.328478 master-0 kubenswrapper[19170]: I0313 01:43:52.328450 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/841fec7e-d35c-45d6-8b12-5f5d51270e1b-config\") pod \"ovsdbserver-nb-0\" (UID: \"841fec7e-d35c-45d6-8b12-5f5d51270e1b\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 01:43:52.329752 master-0 kubenswrapper[19170]: I0313 01:43:52.329166 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/841fec7e-d35c-45d6-8b12-5f5d51270e1b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"841fec7e-d35c-45d6-8b12-5f5d51270e1b\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 01:43:52.330252 master-0 kubenswrapper[19170]: I0313 01:43:52.330193 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/841fec7e-d35c-45d6-8b12-5f5d51270e1b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"841fec7e-d35c-45d6-8b12-5f5d51270e1b\") " pod="openstack/ovsdbserver-nb-0"
Mar 13 01:43:52.331353 master-0 kubenswrapper[19170]: I0313 01:43:52.331321 19170 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 13 01:43:52.331438 master-0 kubenswrapper[19170]: I0313 01:43:52.331352 19170 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5a9a4c32-3f32-4f88-8d49-99a35d347b15\" (UniqueName: \"kubernetes.io/csi/topolvm.io^ed97b5ff-bc31-4626-92d1-ac6df5fef269\") pod \"ovsdbserver-nb-0\" (UID: \"841fec7e-d35c-45d6-8b12-5f5d51270e1b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/4261964f326ebca14feda841c55ffb401039a7d54cfba14b206d4b3a988a2ad5/globalmount\"" pod="openstack/ovsdbserver-nb-0" Mar 13 01:43:52.334476 master-0 kubenswrapper[19170]: I0313 01:43:52.334439 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/841fec7e-d35c-45d6-8b12-5f5d51270e1b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"841fec7e-d35c-45d6-8b12-5f5d51270e1b\") " pod="openstack/ovsdbserver-nb-0" Mar 13 01:43:52.335530 master-0 kubenswrapper[19170]: I0313 01:43:52.335496 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/841fec7e-d35c-45d6-8b12-5f5d51270e1b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"841fec7e-d35c-45d6-8b12-5f5d51270e1b\") " pod="openstack/ovsdbserver-nb-0" Mar 13 01:43:52.337958 master-0 kubenswrapper[19170]: I0313 01:43:52.336989 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/841fec7e-d35c-45d6-8b12-5f5d51270e1b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"841fec7e-d35c-45d6-8b12-5f5d51270e1b\") " pod="openstack/ovsdbserver-nb-0" Mar 13 01:43:52.346765 master-0 kubenswrapper[19170]: I0313 01:43:52.345509 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9bgg\" (UniqueName: \"kubernetes.io/projected/841fec7e-d35c-45d6-8b12-5f5d51270e1b-kube-api-access-s9bgg\") pod \"ovsdbserver-nb-0\" 
(UID: \"841fec7e-d35c-45d6-8b12-5f5d51270e1b\") " pod="openstack/ovsdbserver-nb-0" Mar 13 01:43:53.828831 master-0 kubenswrapper[19170]: I0313 01:43:53.828747 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5a9a4c32-3f32-4f88-8d49-99a35d347b15\" (UniqueName: \"kubernetes.io/csi/topolvm.io^ed97b5ff-bc31-4626-92d1-ac6df5fef269\") pod \"ovsdbserver-nb-0\" (UID: \"841fec7e-d35c-45d6-8b12-5f5d51270e1b\") " pod="openstack/ovsdbserver-nb-0" Mar 13 01:43:54.026438 master-0 kubenswrapper[19170]: I0313 01:43:54.026208 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 13 01:43:55.775191 master-0 kubenswrapper[19170]: I0313 01:43:55.775029 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 13 01:43:55.777726 master-0 kubenswrapper[19170]: I0313 01:43:55.777659 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 13 01:43:55.780337 master-0 kubenswrapper[19170]: I0313 01:43:55.780278 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 13 01:43:55.781175 master-0 kubenswrapper[19170]: I0313 01:43:55.781103 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 13 01:43:55.781175 master-0 kubenswrapper[19170]: I0313 01:43:55.781149 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 13 01:43:55.797354 master-0 kubenswrapper[19170]: I0313 01:43:55.797276 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 13 01:43:55.924966 master-0 kubenswrapper[19170]: I0313 01:43:55.924883 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d1b4c1ba-50fb-4b80-82bc-d815b452c580-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d1b4c1ba-50fb-4b80-82bc-d815b452c580\") " pod="openstack/ovsdbserver-sb-0" Mar 13 01:43:55.925313 master-0 kubenswrapper[19170]: I0313 01:43:55.925057 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d1b4c1ba-50fb-4b80-82bc-d815b452c580-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d1b4c1ba-50fb-4b80-82bc-d815b452c580\") " pod="openstack/ovsdbserver-sb-0" Mar 13 01:43:55.925313 master-0 kubenswrapper[19170]: I0313 01:43:55.925103 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-661368bb-fd70-4139-b220-523c1707d8f4\" (UniqueName: \"kubernetes.io/csi/topolvm.io^55292b96-5c65-4061-a103-a256777e7fac\") pod \"ovsdbserver-sb-0\" (UID: \"d1b4c1ba-50fb-4b80-82bc-d815b452c580\") " pod="openstack/ovsdbserver-sb-0" Mar 13 01:43:55.925313 master-0 kubenswrapper[19170]: I0313 01:43:55.925137 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1b4c1ba-50fb-4b80-82bc-d815b452c580-config\") pod \"ovsdbserver-sb-0\" (UID: \"d1b4c1ba-50fb-4b80-82bc-d815b452c580\") " pod="openstack/ovsdbserver-sb-0" Mar 13 01:43:55.925313 master-0 kubenswrapper[19170]: I0313 01:43:55.925195 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1b4c1ba-50fb-4b80-82bc-d815b452c580-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d1b4c1ba-50fb-4b80-82bc-d815b452c580\") " pod="openstack/ovsdbserver-sb-0" Mar 13 01:43:55.925313 master-0 kubenswrapper[19170]: I0313 01:43:55.925244 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d1b4c1ba-50fb-4b80-82bc-d815b452c580-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d1b4c1ba-50fb-4b80-82bc-d815b452c580\") " pod="openstack/ovsdbserver-sb-0" Mar 13 01:43:55.925677 master-0 kubenswrapper[19170]: I0313 01:43:55.925378 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l549x\" (UniqueName: \"kubernetes.io/projected/d1b4c1ba-50fb-4b80-82bc-d815b452c580-kube-api-access-l549x\") pod \"ovsdbserver-sb-0\" (UID: \"d1b4c1ba-50fb-4b80-82bc-d815b452c580\") " pod="openstack/ovsdbserver-sb-0" Mar 13 01:43:55.925677 master-0 kubenswrapper[19170]: I0313 01:43:55.925452 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d1b4c1ba-50fb-4b80-82bc-d815b452c580-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d1b4c1ba-50fb-4b80-82bc-d815b452c580\") " pod="openstack/ovsdbserver-sb-0" Mar 13 01:43:56.027859 master-0 kubenswrapper[19170]: I0313 01:43:56.027732 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1b4c1ba-50fb-4b80-82bc-d815b452c580-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d1b4c1ba-50fb-4b80-82bc-d815b452c580\") " pod="openstack/ovsdbserver-sb-0" Mar 13 01:43:56.027859 master-0 kubenswrapper[19170]: I0313 01:43:56.027803 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b4c1ba-50fb-4b80-82bc-d815b452c580-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d1b4c1ba-50fb-4b80-82bc-d815b452c580\") " pod="openstack/ovsdbserver-sb-0" Mar 13 01:43:56.027859 master-0 kubenswrapper[19170]: I0313 01:43:56.027861 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l549x\" (UniqueName: 
\"kubernetes.io/projected/d1b4c1ba-50fb-4b80-82bc-d815b452c580-kube-api-access-l549x\") pod \"ovsdbserver-sb-0\" (UID: \"d1b4c1ba-50fb-4b80-82bc-d815b452c580\") " pod="openstack/ovsdbserver-sb-0" Mar 13 01:43:56.028193 master-0 kubenswrapper[19170]: I0313 01:43:56.027902 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d1b4c1ba-50fb-4b80-82bc-d815b452c580-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d1b4c1ba-50fb-4b80-82bc-d815b452c580\") " pod="openstack/ovsdbserver-sb-0" Mar 13 01:43:56.028193 master-0 kubenswrapper[19170]: I0313 01:43:56.027954 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b4c1ba-50fb-4b80-82bc-d815b452c580-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d1b4c1ba-50fb-4b80-82bc-d815b452c580\") " pod="openstack/ovsdbserver-sb-0" Mar 13 01:43:56.028193 master-0 kubenswrapper[19170]: I0313 01:43:56.027994 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d1b4c1ba-50fb-4b80-82bc-d815b452c580-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d1b4c1ba-50fb-4b80-82bc-d815b452c580\") " pod="openstack/ovsdbserver-sb-0" Mar 13 01:43:56.028193 master-0 kubenswrapper[19170]: I0313 01:43:56.028019 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-661368bb-fd70-4139-b220-523c1707d8f4\" (UniqueName: \"kubernetes.io/csi/topolvm.io^55292b96-5c65-4061-a103-a256777e7fac\") pod \"ovsdbserver-sb-0\" (UID: \"d1b4c1ba-50fb-4b80-82bc-d815b452c580\") " pod="openstack/ovsdbserver-sb-0" Mar 13 01:43:56.028193 master-0 kubenswrapper[19170]: I0313 01:43:56.028037 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1b4c1ba-50fb-4b80-82bc-d815b452c580-config\") pod 
\"ovsdbserver-sb-0\" (UID: \"d1b4c1ba-50fb-4b80-82bc-d815b452c580\") " pod="openstack/ovsdbserver-sb-0" Mar 13 01:43:56.028963 master-0 kubenswrapper[19170]: I0313 01:43:56.028938 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1b4c1ba-50fb-4b80-82bc-d815b452c580-config\") pod \"ovsdbserver-sb-0\" (UID: \"d1b4c1ba-50fb-4b80-82bc-d815b452c580\") " pod="openstack/ovsdbserver-sb-0" Mar 13 01:43:56.029260 master-0 kubenswrapper[19170]: I0313 01:43:56.029225 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d1b4c1ba-50fb-4b80-82bc-d815b452c580-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d1b4c1ba-50fb-4b80-82bc-d815b452c580\") " pod="openstack/ovsdbserver-sb-0" Mar 13 01:43:56.031055 master-0 kubenswrapper[19170]: I0313 01:43:56.030994 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d1b4c1ba-50fb-4b80-82bc-d815b452c580-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d1b4c1ba-50fb-4b80-82bc-d815b452c580\") " pod="openstack/ovsdbserver-sb-0" Mar 13 01:43:56.033973 master-0 kubenswrapper[19170]: I0313 01:43:56.033944 19170 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 13 01:43:56.034094 master-0 kubenswrapper[19170]: I0313 01:43:56.034012 19170 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-661368bb-fd70-4139-b220-523c1707d8f4\" (UniqueName: \"kubernetes.io/csi/topolvm.io^55292b96-5c65-4061-a103-a256777e7fac\") pod \"ovsdbserver-sb-0\" (UID: \"d1b4c1ba-50fb-4b80-82bc-d815b452c580\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/ee59c5ab616a72e5f6a129cefea6f24d90370f185e1f433c3a9c1e6d34372b2d/globalmount\"" pod="openstack/ovsdbserver-sb-0" Mar 13 01:43:56.034186 master-0 kubenswrapper[19170]: I0313 01:43:56.034156 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1b4c1ba-50fb-4b80-82bc-d815b452c580-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d1b4c1ba-50fb-4b80-82bc-d815b452c580\") " pod="openstack/ovsdbserver-sb-0" Mar 13 01:43:56.034958 master-0 kubenswrapper[19170]: I0313 01:43:56.034918 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b4c1ba-50fb-4b80-82bc-d815b452c580-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d1b4c1ba-50fb-4b80-82bc-d815b452c580\") " pod="openstack/ovsdbserver-sb-0" Mar 13 01:43:56.038445 master-0 kubenswrapper[19170]: I0313 01:43:56.038380 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b4c1ba-50fb-4b80-82bc-d815b452c580-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d1b4c1ba-50fb-4b80-82bc-d815b452c580\") " pod="openstack/ovsdbserver-sb-0" Mar 13 01:43:56.045668 master-0 kubenswrapper[19170]: I0313 01:43:56.045140 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l549x\" (UniqueName: \"kubernetes.io/projected/d1b4c1ba-50fb-4b80-82bc-d815b452c580-kube-api-access-l549x\") pod \"ovsdbserver-sb-0\" 
(UID: \"d1b4c1ba-50fb-4b80-82bc-d815b452c580\") " pod="openstack/ovsdbserver-sb-0" Mar 13 01:43:57.065861 master-0 kubenswrapper[19170]: I0313 01:43:57.065775 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 13 01:43:57.446019 master-0 kubenswrapper[19170]: I0313 01:43:57.445956 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-661368bb-fd70-4139-b220-523c1707d8f4\" (UniqueName: \"kubernetes.io/csi/topolvm.io^55292b96-5c65-4061-a103-a256777e7fac\") pod \"ovsdbserver-sb-0\" (UID: \"d1b4c1ba-50fb-4b80-82bc-d815b452c580\") " pod="openstack/ovsdbserver-sb-0" Mar 13 01:43:57.464680 master-0 kubenswrapper[19170]: I0313 01:43:57.462367 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 01:43:57.487432 master-0 kubenswrapper[19170]: I0313 01:43:57.487376 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 01:43:57.549231 master-0 kubenswrapper[19170]: W0313 01:43:57.549168 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9844d28_b0df_4563_a746_76fde502f19a.slice/crio-7875c142ce877136524816af56b4b5da7f1d0d361b83caef6554ad166736424a WatchSource:0}: Error finding container 7875c142ce877136524816af56b4b5da7f1d0d361b83caef6554ad166736424a: Status 404 returned error can't find the container with id 7875c142ce877136524816af56b4b5da7f1d0d361b83caef6554ad166736424a Mar 13 01:43:57.667460 master-0 kubenswrapper[19170]: I0313 01:43:57.667402 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 13 01:43:57.728914 master-0 kubenswrapper[19170]: I0313 01:43:57.728851 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e9844d28-b0df-4563-a746-76fde502f19a","Type":"ContainerStarted","Data":"7875c142ce877136524816af56b4b5da7f1d0d361b83caef6554ad166736424a"} Mar 13 01:43:57.732731 master-0 kubenswrapper[19170]: I0313 01:43:57.732688 19170 generic.go:334] "Generic (PLEG): container finished" podID="e09f95aa-60a6-467e-92a3-e9714796ed46" containerID="8550d2b229ebc8a72bb2017705ef785ac6593f759db57b148f2d3fb553d3b7ca" exitCode=0 Mar 13 01:43:57.733057 master-0 kubenswrapper[19170]: I0313 01:43:57.732800 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8476fd89bc-t78wt" event={"ID":"e09f95aa-60a6-467e-92a3-e9714796ed46","Type":"ContainerDied","Data":"8550d2b229ebc8a72bb2017705ef785ac6593f759db57b148f2d3fb553d3b7ca"} Mar 13 01:43:57.736679 master-0 kubenswrapper[19170]: I0313 01:43:57.736574 19170 generic.go:334] "Generic (PLEG): container finished" podID="17b50b1c-b9dc-4656-a138-f68b9985c8b4" containerID="9f8bc1af3d40418fe8dbec896d01e1a9253cf92879349f4f21f0ecfbc997041f" exitCode=0 Mar 13 01:43:57.736761 master-0 kubenswrapper[19170]: I0313 01:43:57.736706 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76849d6659-mnb4b" event={"ID":"17b50b1c-b9dc-4656-a138-f68b9985c8b4","Type":"ContainerDied","Data":"9f8bc1af3d40418fe8dbec896d01e1a9253cf92879349f4f21f0ecfbc997041f"} Mar 13 01:43:57.738081 master-0 kubenswrapper[19170]: I0313 01:43:57.738056 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"f711404d-1d64-49f4-817f-7e7152a54481","Type":"ContainerStarted","Data":"7f50ccfacafebe169027fd52f73f8fc115b815bceb22d67e9109f119d98e2b40"} Mar 13 01:43:57.740150 master-0 kubenswrapper[19170]: I0313 01:43:57.740113 19170 generic.go:334] "Generic 
(PLEG): container finished" podID="689b3b6a-c7b8-4600-9b61-bfb9ff1eba33" containerID="918ae0793867409d25dd7c950908c952d2b9a8e6aa731cfb131a8fe9ea587b7e" exitCode=0 Mar 13 01:43:57.740250 master-0 kubenswrapper[19170]: I0313 01:43:57.740161 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ff8fd9d5c-bpv5h" event={"ID":"689b3b6a-c7b8-4600-9b61-bfb9ff1eba33","Type":"ContainerDied","Data":"918ae0793867409d25dd7c950908c952d2b9a8e6aa731cfb131a8fe9ea587b7e"} Mar 13 01:43:57.747827 master-0 kubenswrapper[19170]: I0313 01:43:57.747782 19170 generic.go:334] "Generic (PLEG): container finished" podID="c936ebc7-422c-4dc5-8048-4bb275739793" containerID="0048d4bb42948e52d5f1585e163c2ebafbf1c8bcdb46130c95accafdfe4bc12d" exitCode=0 Mar 13 01:43:57.753866 master-0 kubenswrapper[19170]: I0313 01:43:57.752118 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685c76cf85-dkz47" event={"ID":"c936ebc7-422c-4dc5-8048-4bb275739793","Type":"ContainerDied","Data":"0048d4bb42948e52d5f1585e163c2ebafbf1c8bcdb46130c95accafdfe4bc12d"} Mar 13 01:43:57.759334 master-0 kubenswrapper[19170]: I0313 01:43:57.758929 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0707bd16-09db-4d83-af8a-f8e7b78fad40","Type":"ContainerStarted","Data":"8df698a2a601ab3eed85667a884efff181c88ad85b6bfc01b7e232b97556fcb6"} Mar 13 01:43:57.937814 master-0 kubenswrapper[19170]: I0313 01:43:57.937774 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 13 01:43:57.946431 master-0 kubenswrapper[19170]: I0313 01:43:57.946395 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-gvjkt"] Mar 13 01:43:57.966537 master-0 kubenswrapper[19170]: W0313 01:43:57.966488 19170 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb2c7f91_4b25_4e56_9d4a_ce6a885121f9.slice/crio-53a85b25ab44f653107a6b8504df1373174271221aaa2a04a42c9ac0c0fcffb0 WatchSource:0}: Error finding container 53a85b25ab44f653107a6b8504df1373174271221aaa2a04a42c9ac0c0fcffb0: Status 404 returned error can't find the container with id 53a85b25ab44f653107a6b8504df1373174271221aaa2a04a42c9ac0c0fcffb0 Mar 13 01:43:58.035083 master-0 kubenswrapper[19170]: I0313 01:43:58.035045 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 13 01:43:58.043432 master-0 kubenswrapper[19170]: W0313 01:43:58.043389 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4715cc17_276f_413c_a0e1_98b150ce558a.slice/crio-f3e09aaadf197357500f39146ee386c66437ed6f83f7253a19b644bc74b5b66f WatchSource:0}: Error finding container f3e09aaadf197357500f39146ee386c66437ed6f83f7253a19b644bc74b5b66f: Status 404 returned error can't find the container with id f3e09aaadf197357500f39146ee386c66437ed6f83f7253a19b644bc74b5b66f Mar 13 01:43:58.181211 master-0 kubenswrapper[19170]: I0313 01:43:58.180268 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 13 01:43:58.548818 master-0 kubenswrapper[19170]: I0313 01:43:58.542122 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 13 01:43:58.772436 master-0 kubenswrapper[19170]: I0313 01:43:58.772316 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4715cc17-276f-413c-a0e1-98b150ce558a","Type":"ContainerStarted","Data":"f3e09aaadf197357500f39146ee386c66437ed6f83f7253a19b644bc74b5b66f"} Mar 13 01:43:58.774148 master-0 kubenswrapper[19170]: I0313 01:43:58.774089 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"841fec7e-d35c-45d6-8b12-5f5d51270e1b","Type":"ContainerStarted","Data":"b121d5deaffafe8727c8995bf72cee14f5cf9a83bb1236d107d00e4dfa758c75"} Mar 13 01:43:58.775614 master-0 kubenswrapper[19170]: I0313 01:43:58.775565 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"649a1497-ac21-43c0-bfdc-c997ee7c8a81","Type":"ContainerStarted","Data":"a53fbcf491a9081b9dd37b4cf225abc955605eba14b09cf6b51d40f93ae960b0"} Mar 13 01:43:58.777160 master-0 kubenswrapper[19170]: I0313 01:43:58.777128 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gvjkt" event={"ID":"db2c7f91-4b25-4e56-9d4a-ce6a885121f9","Type":"ContainerStarted","Data":"53a85b25ab44f653107a6b8504df1373174271221aaa2a04a42c9ac0c0fcffb0"} Mar 13 01:43:58.781310 master-0 kubenswrapper[19170]: I0313 01:43:58.781280 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76849d6659-mnb4b" event={"ID":"17b50b1c-b9dc-4656-a138-f68b9985c8b4","Type":"ContainerStarted","Data":"781c7ff54b544471e5d81d4ca9de014fcc8215d43fe4bb0e2ec91e6f4dc241bc"} Mar 13 01:43:58.781974 master-0 kubenswrapper[19170]: I0313 01:43:58.781944 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76849d6659-mnb4b" Mar 13 01:43:58.798673 master-0 kubenswrapper[19170]: I0313 01:43:58.798602 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ff8fd9d5c-bpv5h" event={"ID":"689b3b6a-c7b8-4600-9b61-bfb9ff1eba33","Type":"ContainerStarted","Data":"b44fa36482453123a462b48d1b01202ea060ff040aa652aa9e5aedde3cbaa155"} Mar 13 01:43:58.798846 master-0 kubenswrapper[19170]: I0313 01:43:58.798806 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6ff8fd9d5c-bpv5h" Mar 13 01:43:58.829929 master-0 kubenswrapper[19170]: I0313 01:43:58.829838 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-76849d6659-mnb4b" podStartSLOduration=3.356564958 podStartE2EDuration="17.829821121s" podCreationTimestamp="2026-03-13 01:43:41 +0000 UTC" firstStartedPulling="2026-03-13 01:43:42.395570023 +0000 UTC m=+1483.203690973" lastFinishedPulling="2026-03-13 01:43:56.868826176 +0000 UTC m=+1497.676947136" observedRunningTime="2026-03-13 01:43:58.827929638 +0000 UTC m=+1499.636050608" watchObservedRunningTime="2026-03-13 01:43:58.829821121 +0000 UTC m=+1499.637942091" Mar 13 01:43:58.850363 master-0 kubenswrapper[19170]: I0313 01:43:58.850165 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6ff8fd9d5c-bpv5h" podStartSLOduration=8.938993283 podStartE2EDuration="17.850148034s" podCreationTimestamp="2026-03-13 01:43:41 +0000 UTC" firstStartedPulling="2026-03-13 01:43:47.958067466 +0000 UTC m=+1488.766188426" lastFinishedPulling="2026-03-13 01:43:56.869222217 +0000 UTC m=+1497.677343177" observedRunningTime="2026-03-13 01:43:58.845825962 +0000 UTC m=+1499.653946922" watchObservedRunningTime="2026-03-13 01:43:58.850148034 +0000 UTC m=+1499.658268994" Mar 13 01:43:59.188113 master-0 kubenswrapper[19170]: I0313 01:43:59.188048 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-p9j9l"] Mar 13 01:43:59.217268 master-0 kubenswrapper[19170]: W0313 01:43:59.217025 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1b4c1ba_50fb_4b80_82bc_d815b452c580.slice/crio-f3bfadd82040fe843485fafc7f1fa1b774ce1b83005428213808fb9d7bf6763a WatchSource:0}: Error finding container f3bfadd82040fe843485fafc7f1fa1b774ce1b83005428213808fb9d7bf6763a: Status 404 returned error can't find the container with id f3bfadd82040fe843485fafc7f1fa1b774ce1b83005428213808fb9d7bf6763a Mar 13 01:43:59.223158 master-0 kubenswrapper[19170]: W0313 01:43:59.223118 19170 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8290d408_116c_4ac7_9b8b_9f8e50f4355d.slice/crio-36d472bc352f7bf64b365b85bdbd91f31104bc2ff68fa29ddc0a02cbdca9833d WatchSource:0}: Error finding container 36d472bc352f7bf64b365b85bdbd91f31104bc2ff68fa29ddc0a02cbdca9833d: Status 404 returned error can't find the container with id 36d472bc352f7bf64b365b85bdbd91f31104bc2ff68fa29ddc0a02cbdca9833d Mar 13 01:43:59.407049 master-0 kubenswrapper[19170]: I0313 01:43:59.406725 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-685c76cf85-dkz47" Mar 13 01:43:59.412497 master-0 kubenswrapper[19170]: I0313 01:43:59.412063 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8476fd89bc-t78wt" Mar 13 01:43:59.552490 master-0 kubenswrapper[19170]: I0313 01:43:59.551594 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e09f95aa-60a6-467e-92a3-e9714796ed46-config\") pod \"e09f95aa-60a6-467e-92a3-e9714796ed46\" (UID: \"e09f95aa-60a6-467e-92a3-e9714796ed46\") " Mar 13 01:43:59.552490 master-0 kubenswrapper[19170]: I0313 01:43:59.551735 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c936ebc7-422c-4dc5-8048-4bb275739793-config\") pod \"c936ebc7-422c-4dc5-8048-4bb275739793\" (UID: \"c936ebc7-422c-4dc5-8048-4bb275739793\") " Mar 13 01:43:59.552490 master-0 kubenswrapper[19170]: I0313 01:43:59.551805 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e09f95aa-60a6-467e-92a3-e9714796ed46-dns-svc\") pod \"e09f95aa-60a6-467e-92a3-e9714796ed46\" (UID: \"e09f95aa-60a6-467e-92a3-e9714796ed46\") " Mar 13 01:43:59.552490 master-0 kubenswrapper[19170]: I0313 01:43:59.551896 19170 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-tqxhw\" (UniqueName: \"kubernetes.io/projected/e09f95aa-60a6-467e-92a3-e9714796ed46-kube-api-access-tqxhw\") pod \"e09f95aa-60a6-467e-92a3-e9714796ed46\" (UID: \"e09f95aa-60a6-467e-92a3-e9714796ed46\") " Mar 13 01:43:59.552490 master-0 kubenswrapper[19170]: I0313 01:43:59.551934 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2m94\" (UniqueName: \"kubernetes.io/projected/c936ebc7-422c-4dc5-8048-4bb275739793-kube-api-access-b2m94\") pod \"c936ebc7-422c-4dc5-8048-4bb275739793\" (UID: \"c936ebc7-422c-4dc5-8048-4bb275739793\") " Mar 13 01:43:59.587416 master-0 kubenswrapper[19170]: I0313 01:43:59.587134 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c936ebc7-422c-4dc5-8048-4bb275739793-config" (OuterVolumeSpecName: "config") pod "c936ebc7-422c-4dc5-8048-4bb275739793" (UID: "c936ebc7-422c-4dc5-8048-4bb275739793"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:43:59.609174 master-0 kubenswrapper[19170]: I0313 01:43:59.609078 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c936ebc7-422c-4dc5-8048-4bb275739793-kube-api-access-b2m94" (OuterVolumeSpecName: "kube-api-access-b2m94") pod "c936ebc7-422c-4dc5-8048-4bb275739793" (UID: "c936ebc7-422c-4dc5-8048-4bb275739793"). InnerVolumeSpecName "kube-api-access-b2m94". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:43:59.609295 master-0 kubenswrapper[19170]: I0313 01:43:59.609243 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e09f95aa-60a6-467e-92a3-e9714796ed46-kube-api-access-tqxhw" (OuterVolumeSpecName: "kube-api-access-tqxhw") pod "e09f95aa-60a6-467e-92a3-e9714796ed46" (UID: "e09f95aa-60a6-467e-92a3-e9714796ed46"). InnerVolumeSpecName "kube-api-access-tqxhw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:43:59.611339 master-0 kubenswrapper[19170]: I0313 01:43:59.611293 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e09f95aa-60a6-467e-92a3-e9714796ed46-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e09f95aa-60a6-467e-92a3-e9714796ed46" (UID: "e09f95aa-60a6-467e-92a3-e9714796ed46"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:43:59.618471 master-0 kubenswrapper[19170]: I0313 01:43:59.618433 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e09f95aa-60a6-467e-92a3-e9714796ed46-config" (OuterVolumeSpecName: "config") pod "e09f95aa-60a6-467e-92a3-e9714796ed46" (UID: "e09f95aa-60a6-467e-92a3-e9714796ed46"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:43:59.656563 master-0 kubenswrapper[19170]: I0313 01:43:59.656512 19170 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c936ebc7-422c-4dc5-8048-4bb275739793-config\") on node \"master-0\" DevicePath \"\"" Mar 13 01:43:59.656807 master-0 kubenswrapper[19170]: I0313 01:43:59.656789 19170 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e09f95aa-60a6-467e-92a3-e9714796ed46-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 13 01:43:59.656982 master-0 kubenswrapper[19170]: I0313 01:43:59.656851 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqxhw\" (UniqueName: \"kubernetes.io/projected/e09f95aa-60a6-467e-92a3-e9714796ed46-kube-api-access-tqxhw\") on node \"master-0\" DevicePath \"\"" Mar 13 01:43:59.656982 master-0 kubenswrapper[19170]: I0313 01:43:59.656979 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2m94\" (UniqueName: 
\"kubernetes.io/projected/c936ebc7-422c-4dc5-8048-4bb275739793-kube-api-access-b2m94\") on node \"master-0\" DevicePath \"\"" Mar 13 01:43:59.657081 master-0 kubenswrapper[19170]: I0313 01:43:59.656989 19170 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e09f95aa-60a6-467e-92a3-e9714796ed46-config\") on node \"master-0\" DevicePath \"\"" Mar 13 01:43:59.812380 master-0 kubenswrapper[19170]: I0313 01:43:59.812297 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685c76cf85-dkz47" event={"ID":"c936ebc7-422c-4dc5-8048-4bb275739793","Type":"ContainerDied","Data":"59b2a937ec689629f20b1e919229982ba6869f93b92e4cca1d1c775be7af5dca"} Mar 13 01:43:59.812730 master-0 kubenswrapper[19170]: I0313 01:43:59.812410 19170 scope.go:117] "RemoveContainer" containerID="0048d4bb42948e52d5f1585e163c2ebafbf1c8bcdb46130c95accafdfe4bc12d" Mar 13 01:43:59.812730 master-0 kubenswrapper[19170]: I0313 01:43:59.812556 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-685c76cf85-dkz47" Mar 13 01:43:59.817508 master-0 kubenswrapper[19170]: I0313 01:43:59.817470 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-p9j9l" event={"ID":"8290d408-116c-4ac7-9b8b-9f8e50f4355d","Type":"ContainerStarted","Data":"36d472bc352f7bf64b365b85bdbd91f31104bc2ff68fa29ddc0a02cbdca9833d"} Mar 13 01:43:59.819977 master-0 kubenswrapper[19170]: I0313 01:43:59.819923 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8476fd89bc-t78wt" event={"ID":"e09f95aa-60a6-467e-92a3-e9714796ed46","Type":"ContainerDied","Data":"5df286cddb2ed6431303084c0bfae76a8affd639ace5f6ffff4a775082c12ec7"} Mar 13 01:43:59.820055 master-0 kubenswrapper[19170]: I0313 01:43:59.820018 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8476fd89bc-t78wt" Mar 13 01:43:59.823888 master-0 kubenswrapper[19170]: I0313 01:43:59.823818 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d1b4c1ba-50fb-4b80-82bc-d815b452c580","Type":"ContainerStarted","Data":"f3bfadd82040fe843485fafc7f1fa1b774ce1b83005428213808fb9d7bf6763a"} Mar 13 01:43:59.830618 master-0 kubenswrapper[19170]: I0313 01:43:59.830564 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"f711404d-1d64-49f4-817f-7e7152a54481","Type":"ContainerStarted","Data":"bfee3326269e9f10f7093dc2daf2889f1c8f250f720e6c37e175702c4d93803b"} Mar 13 01:43:59.831244 master-0 kubenswrapper[19170]: I0313 01:43:59.831163 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 13 01:43:59.895286 master-0 kubenswrapper[19170]: I0313 01:43:59.895070 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=12.77753316 podStartE2EDuration="14.895044537s" podCreationTimestamp="2026-03-13 01:43:45 +0000 UTC" firstStartedPulling="2026-03-13 01:43:57.148288003 +0000 UTC m=+1497.956408963" lastFinishedPulling="2026-03-13 01:43:59.26579939 +0000 UTC m=+1500.073920340" observedRunningTime="2026-03-13 01:43:59.875558568 +0000 UTC m=+1500.683679538" watchObservedRunningTime="2026-03-13 01:43:59.895044537 +0000 UTC m=+1500.703165497" Mar 13 01:43:59.983211 master-0 kubenswrapper[19170]: I0313 01:43:59.983144 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-685c76cf85-dkz47"] Mar 13 01:43:59.990917 master-0 kubenswrapper[19170]: I0313 01:43:59.990213 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-685c76cf85-dkz47"] Mar 13 01:44:00.004682 master-0 kubenswrapper[19170]: I0313 01:44:00.004612 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-8476fd89bc-t78wt"] Mar 13 01:44:00.014494 master-0 kubenswrapper[19170]: I0313 01:44:00.014462 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8476fd89bc-t78wt"] Mar 13 01:44:01.436353 master-0 kubenswrapper[19170]: I0313 01:44:01.436270 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c936ebc7-422c-4dc5-8048-4bb275739793" path="/var/lib/kubelet/pods/c936ebc7-422c-4dc5-8048-4bb275739793/volumes" Mar 13 01:44:01.437071 master-0 kubenswrapper[19170]: I0313 01:44:01.437036 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e09f95aa-60a6-467e-92a3-e9714796ed46" path="/var/lib/kubelet/pods/e09f95aa-60a6-467e-92a3-e9714796ed46/volumes" Mar 13 01:44:02.642552 master-0 kubenswrapper[19170]: I0313 01:44:02.642204 19170 scope.go:117] "RemoveContainer" containerID="8550d2b229ebc8a72bb2017705ef785ac6593f759db57b148f2d3fb553d3b7ca" Mar 13 01:44:05.609863 master-0 kubenswrapper[19170]: I0313 01:44:05.608852 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 13 01:44:05.896142 master-0 kubenswrapper[19170]: I0313 01:44:05.896038 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"649a1497-ac21-43c0-bfdc-c997ee7c8a81","Type":"ContainerStarted","Data":"4c0111703c151c36c74000c3b6b5b1d57f6d860d2889c1eeafd83fa91c100684"} Mar 13 01:44:05.899960 master-0 kubenswrapper[19170]: I0313 01:44:05.899907 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gvjkt" event={"ID":"db2c7f91-4b25-4e56-9d4a-ce6a885121f9","Type":"ContainerStarted","Data":"3cd5b9801451c59e26d27103c89aec33337c021d0990074f1a2213a35bf87369"} Mar 13 01:44:05.900507 master-0 kubenswrapper[19170]: I0313 01:44:05.900479 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-gvjkt" Mar 13 01:44:05.902610 master-0 
kubenswrapper[19170]: I0313 01:44:05.902568 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-p9j9l" event={"ID":"8290d408-116c-4ac7-9b8b-9f8e50f4355d","Type":"ContainerStarted","Data":"53ab5c7f7fcdb440c384858cb0002ee70fd29cf0568599938a9ca48b2a09a266"} Mar 13 01:44:05.906652 master-0 kubenswrapper[19170]: I0313 01:44:05.906281 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4715cc17-276f-413c-a0e1-98b150ce558a","Type":"ContainerStarted","Data":"69849c65855087b0288027a28805264109b2098057cc427e1730d599acdd3076"} Mar 13 01:44:05.909355 master-0 kubenswrapper[19170]: I0313 01:44:05.909316 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"841fec7e-d35c-45d6-8b12-5f5d51270e1b","Type":"ContainerStarted","Data":"d6c242332d7a6d1537066b515d7a201235d89ec1dd9572c9c9ba55aba904b9c1"} Mar 13 01:44:05.912024 master-0 kubenswrapper[19170]: I0313 01:44:05.910955 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d1b4c1ba-50fb-4b80-82bc-d815b452c580","Type":"ContainerStarted","Data":"63cd694ee5ce3e540a975c299b9185e307e1ab2c7df30e71dfb1dbe754d414dd"} Mar 13 01:44:05.979024 master-0 kubenswrapper[19170]: I0313 01:44:05.977954 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-gvjkt" podStartSLOduration=7.894907568 podStartE2EDuration="14.977932159s" podCreationTimestamp="2026-03-13 01:43:51 +0000 UTC" firstStartedPulling="2026-03-13 01:43:57.974646227 +0000 UTC m=+1498.782767187" lastFinishedPulling="2026-03-13 01:44:05.057670818 +0000 UTC m=+1505.865791778" observedRunningTime="2026-03-13 01:44:05.969268354 +0000 UTC m=+1506.777389314" watchObservedRunningTime="2026-03-13 01:44:05.977932159 +0000 UTC m=+1506.786053119" Mar 13 01:44:06.621793 master-0 kubenswrapper[19170]: I0313 01:44:06.621739 19170 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76849d6659-mnb4b" Mar 13 01:44:06.921968 master-0 kubenswrapper[19170]: I0313 01:44:06.921905 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e9844d28-b0df-4563-a746-76fde502f19a","Type":"ContainerStarted","Data":"457f1a43de4ea348a7aa7ead6c14fa234cefc27e0071dda81f902416e3163f34"} Mar 13 01:44:06.923321 master-0 kubenswrapper[19170]: I0313 01:44:06.923278 19170 generic.go:334] "Generic (PLEG): container finished" podID="8290d408-116c-4ac7-9b8b-9f8e50f4355d" containerID="53ab5c7f7fcdb440c384858cb0002ee70fd29cf0568599938a9ca48b2a09a266" exitCode=0 Mar 13 01:44:06.923397 master-0 kubenswrapper[19170]: I0313 01:44:06.923343 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-p9j9l" event={"ID":"8290d408-116c-4ac7-9b8b-9f8e50f4355d","Type":"ContainerDied","Data":"53ab5c7f7fcdb440c384858cb0002ee70fd29cf0568599938a9ca48b2a09a266"} Mar 13 01:44:06.926408 master-0 kubenswrapper[19170]: I0313 01:44:06.925669 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0707bd16-09db-4d83-af8a-f8e7b78fad40","Type":"ContainerStarted","Data":"bb6bffbc30d351de963311b4228c6f118e59823900a8efd9445d02d2fe1a59f0"} Mar 13 01:44:06.948079 master-0 kubenswrapper[19170]: I0313 01:44:06.947759 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6ff8fd9d5c-bpv5h" Mar 13 01:44:07.100690 master-0 kubenswrapper[19170]: I0313 01:44:07.099220 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76849d6659-mnb4b"] Mar 13 01:44:07.100690 master-0 kubenswrapper[19170]: I0313 01:44:07.099436 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76849d6659-mnb4b" podUID="17b50b1c-b9dc-4656-a138-f68b9985c8b4" containerName="dnsmasq-dns" 
containerID="cri-o://781c7ff54b544471e5d81d4ca9de014fcc8215d43fe4bb0e2ec91e6f4dc241bc" gracePeriod=10 Mar 13 01:44:07.942217 master-0 kubenswrapper[19170]: I0313 01:44:07.942129 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-p9j9l" event={"ID":"8290d408-116c-4ac7-9b8b-9f8e50f4355d","Type":"ContainerStarted","Data":"5605f8c88a445d58bbc593dc2a8e600e636a3cdd320107c4a552d58f9846f184"} Mar 13 01:44:07.942217 master-0 kubenswrapper[19170]: I0313 01:44:07.942199 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-p9j9l" event={"ID":"8290d408-116c-4ac7-9b8b-9f8e50f4355d","Type":"ContainerStarted","Data":"dd4cb760a6d1a34caf2fdc64f85f455473d6d08478a6be863648fc8e7ad385ed"} Mar 13 01:44:07.942217 master-0 kubenswrapper[19170]: I0313 01:44:07.942220 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-p9j9l" Mar 13 01:44:07.942217 master-0 kubenswrapper[19170]: I0313 01:44:07.942233 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-p9j9l" Mar 13 01:44:07.965915 master-0 kubenswrapper[19170]: I0313 01:44:07.965844 19170 generic.go:334] "Generic (PLEG): container finished" podID="17b50b1c-b9dc-4656-a138-f68b9985c8b4" containerID="781c7ff54b544471e5d81d4ca9de014fcc8215d43fe4bb0e2ec91e6f4dc241bc" exitCode=0 Mar 13 01:44:07.966234 master-0 kubenswrapper[19170]: I0313 01:44:07.966172 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76849d6659-mnb4b" event={"ID":"17b50b1c-b9dc-4656-a138-f68b9985c8b4","Type":"ContainerDied","Data":"781c7ff54b544471e5d81d4ca9de014fcc8215d43fe4bb0e2ec91e6f4dc241bc"} Mar 13 01:44:08.140087 master-0 kubenswrapper[19170]: I0313 01:44:08.140001 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-p9j9l" podStartSLOduration=11.26110818 podStartE2EDuration="17.139981211s" 
podCreationTimestamp="2026-03-13 01:43:51 +0000 UTC" firstStartedPulling="2026-03-13 01:43:59.226041059 +0000 UTC m=+1500.034162019" lastFinishedPulling="2026-03-13 01:44:05.10491409 +0000 UTC m=+1505.913035050" observedRunningTime="2026-03-13 01:44:08.130004029 +0000 UTC m=+1508.938124999" watchObservedRunningTime="2026-03-13 01:44:08.139981211 +0000 UTC m=+1508.948102171" Mar 13 01:44:08.688772 master-0 kubenswrapper[19170]: I0313 01:44:08.688692 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bb8ffc699-rzlmb"] Mar 13 01:44:08.689331 master-0 kubenswrapper[19170]: E0313 01:44:08.689301 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e09f95aa-60a6-467e-92a3-e9714796ed46" containerName="init" Mar 13 01:44:08.689331 master-0 kubenswrapper[19170]: I0313 01:44:08.689324 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="e09f95aa-60a6-467e-92a3-e9714796ed46" containerName="init" Mar 13 01:44:08.689444 master-0 kubenswrapper[19170]: E0313 01:44:08.689354 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c936ebc7-422c-4dc5-8048-4bb275739793" containerName="init" Mar 13 01:44:08.689444 master-0 kubenswrapper[19170]: I0313 01:44:08.689362 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="c936ebc7-422c-4dc5-8048-4bb275739793" containerName="init" Mar 13 01:44:08.689670 master-0 kubenswrapper[19170]: I0313 01:44:08.689616 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="c936ebc7-422c-4dc5-8048-4bb275739793" containerName="init" Mar 13 01:44:08.689670 master-0 kubenswrapper[19170]: I0313 01:44:08.689659 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="e09f95aa-60a6-467e-92a3-e9714796ed46" containerName="init" Mar 13 01:44:08.691177 master-0 kubenswrapper[19170]: I0313 01:44:08.691151 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bb8ffc699-rzlmb" Mar 13 01:44:08.787821 master-0 kubenswrapper[19170]: I0313 01:44:08.784938 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh727\" (UniqueName: \"kubernetes.io/projected/4cf27610-c39b-47d8-ae28-e5691381bfbb-kube-api-access-rh727\") pod \"dnsmasq-dns-7bb8ffc699-rzlmb\" (UID: \"4cf27610-c39b-47d8-ae28-e5691381bfbb\") " pod="openstack/dnsmasq-dns-7bb8ffc699-rzlmb" Mar 13 01:44:08.787821 master-0 kubenswrapper[19170]: I0313 01:44:08.785048 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cf27610-c39b-47d8-ae28-e5691381bfbb-config\") pod \"dnsmasq-dns-7bb8ffc699-rzlmb\" (UID: \"4cf27610-c39b-47d8-ae28-e5691381bfbb\") " pod="openstack/dnsmasq-dns-7bb8ffc699-rzlmb" Mar 13 01:44:08.787821 master-0 kubenswrapper[19170]: I0313 01:44:08.785112 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cf27610-c39b-47d8-ae28-e5691381bfbb-dns-svc\") pod \"dnsmasq-dns-7bb8ffc699-rzlmb\" (UID: \"4cf27610-c39b-47d8-ae28-e5691381bfbb\") " pod="openstack/dnsmasq-dns-7bb8ffc699-rzlmb" Mar 13 01:44:08.803518 master-0 kubenswrapper[19170]: I0313 01:44:08.803451 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bb8ffc699-rzlmb"] Mar 13 01:44:08.888359 master-0 kubenswrapper[19170]: I0313 01:44:08.887363 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cf27610-c39b-47d8-ae28-e5691381bfbb-config\") pod \"dnsmasq-dns-7bb8ffc699-rzlmb\" (UID: \"4cf27610-c39b-47d8-ae28-e5691381bfbb\") " pod="openstack/dnsmasq-dns-7bb8ffc699-rzlmb" Mar 13 01:44:08.888585 master-0 kubenswrapper[19170]: I0313 01:44:08.888463 19170 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cf27610-c39b-47d8-ae28-e5691381bfbb-config\") pod \"dnsmasq-dns-7bb8ffc699-rzlmb\" (UID: \"4cf27610-c39b-47d8-ae28-e5691381bfbb\") " pod="openstack/dnsmasq-dns-7bb8ffc699-rzlmb" Mar 13 01:44:08.888585 master-0 kubenswrapper[19170]: I0313 01:44:08.888514 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cf27610-c39b-47d8-ae28-e5691381bfbb-dns-svc\") pod \"dnsmasq-dns-7bb8ffc699-rzlmb\" (UID: \"4cf27610-c39b-47d8-ae28-e5691381bfbb\") " pod="openstack/dnsmasq-dns-7bb8ffc699-rzlmb" Mar 13 01:44:08.888763 master-0 kubenswrapper[19170]: I0313 01:44:08.888652 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh727\" (UniqueName: \"kubernetes.io/projected/4cf27610-c39b-47d8-ae28-e5691381bfbb-kube-api-access-rh727\") pod \"dnsmasq-dns-7bb8ffc699-rzlmb\" (UID: \"4cf27610-c39b-47d8-ae28-e5691381bfbb\") " pod="openstack/dnsmasq-dns-7bb8ffc699-rzlmb" Mar 13 01:44:08.891903 master-0 kubenswrapper[19170]: I0313 01:44:08.890504 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cf27610-c39b-47d8-ae28-e5691381bfbb-dns-svc\") pod \"dnsmasq-dns-7bb8ffc699-rzlmb\" (UID: \"4cf27610-c39b-47d8-ae28-e5691381bfbb\") " pod="openstack/dnsmasq-dns-7bb8ffc699-rzlmb" Mar 13 01:44:08.917806 master-0 kubenswrapper[19170]: I0313 01:44:08.914218 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh727\" (UniqueName: \"kubernetes.io/projected/4cf27610-c39b-47d8-ae28-e5691381bfbb-kube-api-access-rh727\") pod \"dnsmasq-dns-7bb8ffc699-rzlmb\" (UID: \"4cf27610-c39b-47d8-ae28-e5691381bfbb\") " pod="openstack/dnsmasq-dns-7bb8ffc699-rzlmb" Mar 13 01:44:09.023756 master-0 kubenswrapper[19170]: I0313 01:44:09.023600 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bb8ffc699-rzlmb" Mar 13 01:44:10.388390 master-0 kubenswrapper[19170]: I0313 01:44:10.388313 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76849d6659-mnb4b" Mar 13 01:44:10.427969 master-0 kubenswrapper[19170]: I0313 01:44:10.427804 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17b50b1c-b9dc-4656-a138-f68b9985c8b4-config\") pod \"17b50b1c-b9dc-4656-a138-f68b9985c8b4\" (UID: \"17b50b1c-b9dc-4656-a138-f68b9985c8b4\") " Mar 13 01:44:10.428260 master-0 kubenswrapper[19170]: I0313 01:44:10.428190 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17b50b1c-b9dc-4656-a138-f68b9985c8b4-dns-svc\") pod \"17b50b1c-b9dc-4656-a138-f68b9985c8b4\" (UID: \"17b50b1c-b9dc-4656-a138-f68b9985c8b4\") " Mar 13 01:44:10.431344 master-0 kubenswrapper[19170]: I0313 01:44:10.428398 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f62qr\" (UniqueName: \"kubernetes.io/projected/17b50b1c-b9dc-4656-a138-f68b9985c8b4-kube-api-access-f62qr\") pod \"17b50b1c-b9dc-4656-a138-f68b9985c8b4\" (UID: \"17b50b1c-b9dc-4656-a138-f68b9985c8b4\") " Mar 13 01:44:10.432437 master-0 kubenswrapper[19170]: I0313 01:44:10.432350 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17b50b1c-b9dc-4656-a138-f68b9985c8b4-kube-api-access-f62qr" (OuterVolumeSpecName: "kube-api-access-f62qr") pod "17b50b1c-b9dc-4656-a138-f68b9985c8b4" (UID: "17b50b1c-b9dc-4656-a138-f68b9985c8b4"). InnerVolumeSpecName "kube-api-access-f62qr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:44:10.469878 master-0 kubenswrapper[19170]: I0313 01:44:10.469804 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17b50b1c-b9dc-4656-a138-f68b9985c8b4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "17b50b1c-b9dc-4656-a138-f68b9985c8b4" (UID: "17b50b1c-b9dc-4656-a138-f68b9985c8b4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:44:10.474285 master-0 kubenswrapper[19170]: I0313 01:44:10.474202 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17b50b1c-b9dc-4656-a138-f68b9985c8b4-config" (OuterVolumeSpecName: "config") pod "17b50b1c-b9dc-4656-a138-f68b9985c8b4" (UID: "17b50b1c-b9dc-4656-a138-f68b9985c8b4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:44:10.533606 master-0 kubenswrapper[19170]: I0313 01:44:10.533550 19170 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17b50b1c-b9dc-4656-a138-f68b9985c8b4-config\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:10.533606 master-0 kubenswrapper[19170]: I0313 01:44:10.533589 19170 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17b50b1c-b9dc-4656-a138-f68b9985c8b4-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:10.533606 master-0 kubenswrapper[19170]: I0313 01:44:10.533600 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f62qr\" (UniqueName: \"kubernetes.io/projected/17b50b1c-b9dc-4656-a138-f68b9985c8b4-kube-api-access-f62qr\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:10.807170 master-0 kubenswrapper[19170]: I0313 01:44:10.807108 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 13 01:44:10.807554 master-0 kubenswrapper[19170]: E0313 
01:44:10.807519 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17b50b1c-b9dc-4656-a138-f68b9985c8b4" containerName="init" Mar 13 01:44:10.807554 master-0 kubenswrapper[19170]: I0313 01:44:10.807539 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="17b50b1c-b9dc-4656-a138-f68b9985c8b4" containerName="init" Mar 13 01:44:10.807783 master-0 kubenswrapper[19170]: E0313 01:44:10.807579 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17b50b1c-b9dc-4656-a138-f68b9985c8b4" containerName="dnsmasq-dns" Mar 13 01:44:10.807783 master-0 kubenswrapper[19170]: I0313 01:44:10.807589 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="17b50b1c-b9dc-4656-a138-f68b9985c8b4" containerName="dnsmasq-dns" Mar 13 01:44:10.807921 master-0 kubenswrapper[19170]: I0313 01:44:10.807811 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="17b50b1c-b9dc-4656-a138-f68b9985c8b4" containerName="dnsmasq-dns" Mar 13 01:44:10.813094 master-0 kubenswrapper[19170]: I0313 01:44:10.813045 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 13 01:44:10.816835 master-0 kubenswrapper[19170]: I0313 01:44:10.816419 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 13 01:44:10.816835 master-0 kubenswrapper[19170]: I0313 01:44:10.816549 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 13 01:44:10.818230 master-0 kubenswrapper[19170]: I0313 01:44:10.817438 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 13 01:44:10.962854 master-0 kubenswrapper[19170]: I0313 01:44:10.962812 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 13 01:44:11.012141 master-0 kubenswrapper[19170]: I0313 01:44:11.012099 19170 generic.go:334] "Generic (PLEG): container finished" podID="4715cc17-276f-413c-a0e1-98b150ce558a" containerID="69849c65855087b0288027a28805264109b2098057cc427e1730d599acdd3076" exitCode=0 Mar 13 01:44:11.012417 master-0 kubenswrapper[19170]: I0313 01:44:11.012170 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4715cc17-276f-413c-a0e1-98b150ce558a","Type":"ContainerDied","Data":"69849c65855087b0288027a28805264109b2098057cc427e1730d599acdd3076"} Mar 13 01:44:11.014320 master-0 kubenswrapper[19170]: I0313 01:44:11.014290 19170 generic.go:334] "Generic (PLEG): container finished" podID="649a1497-ac21-43c0-bfdc-c997ee7c8a81" containerID="4c0111703c151c36c74000c3b6b5b1d57f6d860d2889c1eeafd83fa91c100684" exitCode=0 Mar 13 01:44:11.014385 master-0 kubenswrapper[19170]: I0313 01:44:11.014314 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"649a1497-ac21-43c0-bfdc-c997ee7c8a81","Type":"ContainerDied","Data":"4c0111703c151c36c74000c3b6b5b1d57f6d860d2889c1eeafd83fa91c100684"} Mar 13 01:44:11.017137 master-0 kubenswrapper[19170]: I0313 
01:44:11.017101 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76849d6659-mnb4b" event={"ID":"17b50b1c-b9dc-4656-a138-f68b9985c8b4","Type":"ContainerDied","Data":"07872e9b4c0f2a5cf27dcb0fe5d3c9b7116234f3d7024c9accc6286e69d2a269"} Mar 13 01:44:11.017137 master-0 kubenswrapper[19170]: I0313 01:44:11.017134 19170 scope.go:117] "RemoveContainer" containerID="781c7ff54b544471e5d81d4ca9de014fcc8215d43fe4bb0e2ec91e6f4dc241bc" Mar 13 01:44:11.017252 master-0 kubenswrapper[19170]: I0313 01:44:11.017233 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76849d6659-mnb4b" Mar 13 01:44:11.050378 master-0 kubenswrapper[19170]: I0313 01:44:11.050330 19170 scope.go:117] "RemoveContainer" containerID="9f8bc1af3d40418fe8dbec896d01e1a9253cf92879349f4f21f0ecfbc997041f" Mar 13 01:44:11.266935 master-0 kubenswrapper[19170]: I0313 01:44:11.266183 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6d54c519-9366-45f7-86e2-ebcdd8bb2477-etc-swift\") pod \"swift-storage-0\" (UID: \"6d54c519-9366-45f7-86e2-ebcdd8bb2477\") " pod="openstack/swift-storage-0" Mar 13 01:44:11.266935 master-0 kubenswrapper[19170]: I0313 01:44:11.266379 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2370cdba-b395-4cba-9f33-1bc8b6628290\" (UniqueName: \"kubernetes.io/csi/topolvm.io^5962ec6f-ad79-4989-9ba0-258fd50d7f97\") pod \"swift-storage-0\" (UID: \"6d54c519-9366-45f7-86e2-ebcdd8bb2477\") " pod="openstack/swift-storage-0" Mar 13 01:44:11.266935 master-0 kubenswrapper[19170]: I0313 01:44:11.266431 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6d54c519-9366-45f7-86e2-ebcdd8bb2477-cache\") pod \"swift-storage-0\" (UID: \"6d54c519-9366-45f7-86e2-ebcdd8bb2477\") " 
pod="openstack/swift-storage-0" Mar 13 01:44:11.266935 master-0 kubenswrapper[19170]: I0313 01:44:11.266448 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87sqx\" (UniqueName: \"kubernetes.io/projected/6d54c519-9366-45f7-86e2-ebcdd8bb2477-kube-api-access-87sqx\") pod \"swift-storage-0\" (UID: \"6d54c519-9366-45f7-86e2-ebcdd8bb2477\") " pod="openstack/swift-storage-0" Mar 13 01:44:11.266935 master-0 kubenswrapper[19170]: I0313 01:44:11.266505 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d54c519-9366-45f7-86e2-ebcdd8bb2477-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"6d54c519-9366-45f7-86e2-ebcdd8bb2477\") " pod="openstack/swift-storage-0" Mar 13 01:44:11.266935 master-0 kubenswrapper[19170]: I0313 01:44:11.266521 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6d54c519-9366-45f7-86e2-ebcdd8bb2477-lock\") pod \"swift-storage-0\" (UID: \"6d54c519-9366-45f7-86e2-ebcdd8bb2477\") " pod="openstack/swift-storage-0" Mar 13 01:44:11.336464 master-0 kubenswrapper[19170]: I0313 01:44:11.336186 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76849d6659-mnb4b"] Mar 13 01:44:11.347277 master-0 kubenswrapper[19170]: I0313 01:44:11.347218 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76849d6659-mnb4b"] Mar 13 01:44:11.372892 master-0 kubenswrapper[19170]: I0313 01:44:11.371439 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2370cdba-b395-4cba-9f33-1bc8b6628290\" (UniqueName: \"kubernetes.io/csi/topolvm.io^5962ec6f-ad79-4989-9ba0-258fd50d7f97\") pod \"swift-storage-0\" (UID: \"6d54c519-9366-45f7-86e2-ebcdd8bb2477\") " pod="openstack/swift-storage-0" Mar 13 01:44:11.372892 master-0 
kubenswrapper[19170]: I0313 01:44:11.371506 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6d54c519-9366-45f7-86e2-ebcdd8bb2477-cache\") pod \"swift-storage-0\" (UID: \"6d54c519-9366-45f7-86e2-ebcdd8bb2477\") " pod="openstack/swift-storage-0" Mar 13 01:44:11.372892 master-0 kubenswrapper[19170]: I0313 01:44:11.371529 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87sqx\" (UniqueName: \"kubernetes.io/projected/6d54c519-9366-45f7-86e2-ebcdd8bb2477-kube-api-access-87sqx\") pod \"swift-storage-0\" (UID: \"6d54c519-9366-45f7-86e2-ebcdd8bb2477\") " pod="openstack/swift-storage-0" Mar 13 01:44:11.372892 master-0 kubenswrapper[19170]: I0313 01:44:11.371588 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d54c519-9366-45f7-86e2-ebcdd8bb2477-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"6d54c519-9366-45f7-86e2-ebcdd8bb2477\") " pod="openstack/swift-storage-0" Mar 13 01:44:11.372892 master-0 kubenswrapper[19170]: I0313 01:44:11.371607 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6d54c519-9366-45f7-86e2-ebcdd8bb2477-lock\") pod \"swift-storage-0\" (UID: \"6d54c519-9366-45f7-86e2-ebcdd8bb2477\") " pod="openstack/swift-storage-0" Mar 13 01:44:11.372892 master-0 kubenswrapper[19170]: I0313 01:44:11.371661 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6d54c519-9366-45f7-86e2-ebcdd8bb2477-etc-swift\") pod \"swift-storage-0\" (UID: \"6d54c519-9366-45f7-86e2-ebcdd8bb2477\") " pod="openstack/swift-storage-0" Mar 13 01:44:11.372892 master-0 kubenswrapper[19170]: E0313 01:44:11.371864 19170 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap 
"swift-ring-files" not found Mar 13 01:44:11.372892 master-0 kubenswrapper[19170]: E0313 01:44:11.371885 19170 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 01:44:11.372892 master-0 kubenswrapper[19170]: E0313 01:44:11.371940 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6d54c519-9366-45f7-86e2-ebcdd8bb2477-etc-swift podName:6d54c519-9366-45f7-86e2-ebcdd8bb2477 nodeName:}" failed. No retries permitted until 2026-03-13 01:44:11.871919191 +0000 UTC m=+1512.680040151 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6d54c519-9366-45f7-86e2-ebcdd8bb2477-etc-swift") pod "swift-storage-0" (UID: "6d54c519-9366-45f7-86e2-ebcdd8bb2477") : configmap "swift-ring-files" not found Mar 13 01:44:11.372892 master-0 kubenswrapper[19170]: I0313 01:44:11.372738 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6d54c519-9366-45f7-86e2-ebcdd8bb2477-lock\") pod \"swift-storage-0\" (UID: \"6d54c519-9366-45f7-86e2-ebcdd8bb2477\") " pod="openstack/swift-storage-0" Mar 13 01:44:11.376762 master-0 kubenswrapper[19170]: I0313 01:44:11.375116 19170 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 13 01:44:11.376762 master-0 kubenswrapper[19170]: I0313 01:44:11.375143 19170 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2370cdba-b395-4cba-9f33-1bc8b6628290\" (UniqueName: \"kubernetes.io/csi/topolvm.io^5962ec6f-ad79-4989-9ba0-258fd50d7f97\") pod \"swift-storage-0\" (UID: \"6d54c519-9366-45f7-86e2-ebcdd8bb2477\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/dd9e387c423dbbcee50ced6120de20f78eb0da6fdef0526e45be22d0a67c8ffe/globalmount\"" pod="openstack/swift-storage-0"
Mar 13 01:44:11.376762 master-0 kubenswrapper[19170]: I0313 01:44:11.375267 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6d54c519-9366-45f7-86e2-ebcdd8bb2477-cache\") pod \"swift-storage-0\" (UID: \"6d54c519-9366-45f7-86e2-ebcdd8bb2477\") " pod="openstack/swift-storage-0"
Mar 13 01:44:11.377855 master-0 kubenswrapper[19170]: I0313 01:44:11.377811 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d54c519-9366-45f7-86e2-ebcdd8bb2477-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"6d54c519-9366-45f7-86e2-ebcdd8bb2477\") " pod="openstack/swift-storage-0"
Mar 13 01:44:11.440752 master-0 kubenswrapper[19170]: I0313 01:44:11.436813 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17b50b1c-b9dc-4656-a138-f68b9985c8b4" path="/var/lib/kubelet/pods/17b50b1c-b9dc-4656-a138-f68b9985c8b4/volumes"
Mar 13 01:44:11.712081 master-0 kubenswrapper[19170]: I0313 01:44:11.711826 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87sqx\" (UniqueName: \"kubernetes.io/projected/6d54c519-9366-45f7-86e2-ebcdd8bb2477-kube-api-access-87sqx\") pod \"swift-storage-0\" (UID: \"6d54c519-9366-45f7-86e2-ebcdd8bb2477\") " pod="openstack/swift-storage-0"
Mar 13 01:44:11.748868 master-0 kubenswrapper[19170]: I0313 01:44:11.748792 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bb8ffc699-rzlmb"]
Mar 13 01:44:11.892804 master-0 kubenswrapper[19170]: I0313 01:44:11.892748 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6d54c519-9366-45f7-86e2-ebcdd8bb2477-etc-swift\") pod \"swift-storage-0\" (UID: \"6d54c519-9366-45f7-86e2-ebcdd8bb2477\") " pod="openstack/swift-storage-0"
Mar 13 01:44:11.893722 master-0 kubenswrapper[19170]: E0313 01:44:11.893679 19170 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 13 01:44:11.893722 master-0 kubenswrapper[19170]: E0313 01:44:11.893719 19170 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 13 01:44:11.893818 master-0 kubenswrapper[19170]: E0313 01:44:11.893770 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6d54c519-9366-45f7-86e2-ebcdd8bb2477-etc-swift podName:6d54c519-9366-45f7-86e2-ebcdd8bb2477 nodeName:}" failed. No retries permitted until 2026-03-13 01:44:12.89374983 +0000 UTC m=+1513.701870880 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6d54c519-9366-45f7-86e2-ebcdd8bb2477-etc-swift") pod "swift-storage-0" (UID: "6d54c519-9366-45f7-86e2-ebcdd8bb2477") : configmap "swift-ring-files" not found
Mar 13 01:44:12.033413 master-0 kubenswrapper[19170]: I0313 01:44:12.033055 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4715cc17-276f-413c-a0e1-98b150ce558a","Type":"ContainerStarted","Data":"2a0a7b5fe1ec301eeed27fc7e0606a746d73cb0ee589471a911c312df101a5bb"}
Mar 13 01:44:12.041804 master-0 kubenswrapper[19170]: I0313 01:44:12.041706 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d1b4c1ba-50fb-4b80-82bc-d815b452c580","Type":"ContainerStarted","Data":"5fde656bca03136184412c88e46b6fd4cfbedd6141d551ea2c600e55984cc902"}
Mar 13 01:44:12.046290 master-0 kubenswrapper[19170]: I0313 01:44:12.046253 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"649a1497-ac21-43c0-bfdc-c997ee7c8a81","Type":"ContainerStarted","Data":"807377ed6942794fdea70b831f6a63cd636aa9d56a52e80f14e8becef1172710"}
Mar 13 01:44:12.051051 master-0 kubenswrapper[19170]: I0313 01:44:12.049151 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bb8ffc699-rzlmb" event={"ID":"4cf27610-c39b-47d8-ae28-e5691381bfbb","Type":"ContainerStarted","Data":"29bde0b69cc84bba57145017942e17a22fc075998737e55c27fd2b2decf5433f"}
Mar 13 01:44:12.057938 master-0 kubenswrapper[19170]: I0313 01:44:12.057858 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=23.035436152 podStartE2EDuration="30.057840915s" podCreationTimestamp="2026-03-13 01:43:42 +0000 UTC" firstStartedPulling="2026-03-13 01:43:58.04748221 +0000 UTC m=+1498.855603160" lastFinishedPulling="2026-03-13 01:44:05.069886963 +0000 UTC m=+1505.878007923" observedRunningTime="2026-03-13 01:44:12.056405035 +0000 UTC m=+1512.864525995" watchObservedRunningTime="2026-03-13 01:44:12.057840915 +0000 UTC m=+1512.865961875"
Mar 13 01:44:12.095057 master-0 kubenswrapper[19170]: I0313 01:44:12.094981 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=6.587122857 podStartE2EDuration="19.094961621s" podCreationTimestamp="2026-03-13 01:43:53 +0000 UTC" firstStartedPulling="2026-03-13 01:43:59.222310614 +0000 UTC m=+1500.030431574" lastFinishedPulling="2026-03-13 01:44:11.730149378 +0000 UTC m=+1512.538270338" observedRunningTime="2026-03-13 01:44:12.081042599 +0000 UTC m=+1512.889163579" watchObservedRunningTime="2026-03-13 01:44:12.094961621 +0000 UTC m=+1512.903082581"
Mar 13 01:44:12.152661 master-0 kubenswrapper[19170]: I0313 01:44:12.149822 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=8.536421291 podStartE2EDuration="22.149803397s" podCreationTimestamp="2026-03-13 01:43:50 +0000 UTC" firstStartedPulling="2026-03-13 01:43:58.140981995 +0000 UTC m=+1498.949102955" lastFinishedPulling="2026-03-13 01:44:11.754364111 +0000 UTC m=+1512.562485061" observedRunningTime="2026-03-13 01:44:12.102798692 +0000 UTC m=+1512.910919652" watchObservedRunningTime="2026-03-13 01:44:12.149803397 +0000 UTC m=+1512.957924357"
Mar 13 01:44:12.165041 master-0 kubenswrapper[19170]: I0313 01:44:12.162358 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=22.001755733 podStartE2EDuration="29.162338441s" podCreationTimestamp="2026-03-13 01:43:43 +0000 UTC" firstStartedPulling="2026-03-13 01:43:57.93929335 +0000 UTC m=+1498.747414310" lastFinishedPulling="2026-03-13 01:44:05.099876068 +0000 UTC m=+1505.907997018" observedRunningTime="2026-03-13 01:44:12.148985354 +0000 UTC m=+1512.957106314" watchObservedRunningTime="2026-03-13 01:44:12.162338441 +0000 UTC m=+1512.970459401"
Mar 13 01:44:12.667654 master-0 kubenswrapper[19170]: I0313 01:44:12.667573 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Mar 13 01:44:12.668264 master-0 kubenswrapper[19170]: I0313 01:44:12.667777 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Mar 13 01:44:12.728686 master-0 kubenswrapper[19170]: I0313 01:44:12.727775 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Mar 13 01:44:12.843677 master-0 kubenswrapper[19170]: I0313 01:44:12.843613 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2370cdba-b395-4cba-9f33-1bc8b6628290\" (UniqueName: \"kubernetes.io/csi/topolvm.io^5962ec6f-ad79-4989-9ba0-258fd50d7f97\") pod \"swift-storage-0\" (UID: \"6d54c519-9366-45f7-86e2-ebcdd8bb2477\") " pod="openstack/swift-storage-0"
Mar 13 01:44:12.919778 master-0 kubenswrapper[19170]: I0313 01:44:12.919662 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6d54c519-9366-45f7-86e2-ebcdd8bb2477-etc-swift\") pod \"swift-storage-0\" (UID: \"6d54c519-9366-45f7-86e2-ebcdd8bb2477\") " pod="openstack/swift-storage-0"
Mar 13 01:44:12.919944 master-0 kubenswrapper[19170]: E0313 01:44:12.919917 19170 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 13 01:44:12.919944 master-0 kubenswrapper[19170]: E0313 01:44:12.919934 19170 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 13 01:44:12.920021 master-0 kubenswrapper[19170]: E0313 01:44:12.919977 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6d54c519-9366-45f7-86e2-ebcdd8bb2477-etc-swift podName:6d54c519-9366-45f7-86e2-ebcdd8bb2477 nodeName:}" failed. No retries permitted until 2026-03-13 01:44:14.919961056 +0000 UTC m=+1515.728082016 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6d54c519-9366-45f7-86e2-ebcdd8bb2477-etc-swift") pod "swift-storage-0" (UID: "6d54c519-9366-45f7-86e2-ebcdd8bb2477") : configmap "swift-ring-files" not found
Mar 13 01:44:13.060618 master-0 kubenswrapper[19170]: I0313 01:44:13.060159 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"841fec7e-d35c-45d6-8b12-5f5d51270e1b","Type":"ContainerStarted","Data":"09f1a2e96a1f5264f4240a94b1dd02ede278a9c7347d874f5196e93401f0ef7c"}
Mar 13 01:44:13.061816 master-0 kubenswrapper[19170]: I0313 01:44:13.061769 19170 generic.go:334] "Generic (PLEG): container finished" podID="4cf27610-c39b-47d8-ae28-e5691381bfbb" containerID="4b7584e1266442efe1db5b5657210c80f3acf66fcbace0cab419b11afff14c9c" exitCode=0
Mar 13 01:44:13.061962 master-0 kubenswrapper[19170]: I0313 01:44:13.061909 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bb8ffc699-rzlmb" event={"ID":"4cf27610-c39b-47d8-ae28-e5691381bfbb","Type":"ContainerDied","Data":"4b7584e1266442efe1db5b5657210c80f3acf66fcbace0cab419b11afff14c9c"}
Mar 13 01:44:13.125219 master-0 kubenswrapper[19170]: I0313 01:44:13.119570 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Mar 13 01:44:13.202750 master-0 kubenswrapper[19170]: I0313 01:44:13.202690 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-n9bgl"]
Mar 13 01:44:13.211420 master-0 kubenswrapper[19170]: I0313 01:44:13.204110 19170 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/swift-ring-rebalance-n9bgl"
Mar 13 01:44:13.211420 master-0 kubenswrapper[19170]: I0313 01:44:13.207353 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Mar 13 01:44:13.211420 master-0 kubenswrapper[19170]: I0313 01:44:13.207522 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Mar 13 01:44:13.211420 master-0 kubenswrapper[19170]: I0313 01:44:13.207747 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Mar 13 01:44:13.218064 master-0 kubenswrapper[19170]: I0313 01:44:13.218013 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-n9bgl"]
Mar 13 01:44:13.238078 master-0 kubenswrapper[19170]: I0313 01:44:13.237850 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/291c0e40-0b9b-4bd3-b8d6-1983878467ae-ring-data-devices\") pod \"swift-ring-rebalance-n9bgl\" (UID: \"291c0e40-0b9b-4bd3-b8d6-1983878467ae\") " pod="openstack/swift-ring-rebalance-n9bgl"
Mar 13 01:44:13.238078 master-0 kubenswrapper[19170]: I0313 01:44:13.237910 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66pw5\" (UniqueName: \"kubernetes.io/projected/291c0e40-0b9b-4bd3-b8d6-1983878467ae-kube-api-access-66pw5\") pod \"swift-ring-rebalance-n9bgl\" (UID: \"291c0e40-0b9b-4bd3-b8d6-1983878467ae\") " pod="openstack/swift-ring-rebalance-n9bgl"
Mar 13 01:44:13.238078 master-0 kubenswrapper[19170]: I0313 01:44:13.237955 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/291c0e40-0b9b-4bd3-b8d6-1983878467ae-combined-ca-bundle\") pod \"swift-ring-rebalance-n9bgl\" (UID: \"291c0e40-0b9b-4bd3-b8d6-1983878467ae\") " pod="openstack/swift-ring-rebalance-n9bgl"
Mar 13 01:44:13.238078 master-0 kubenswrapper[19170]: I0313 01:44:13.237993 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/291c0e40-0b9b-4bd3-b8d6-1983878467ae-etc-swift\") pod \"swift-ring-rebalance-n9bgl\" (UID: \"291c0e40-0b9b-4bd3-b8d6-1983878467ae\") " pod="openstack/swift-ring-rebalance-n9bgl"
Mar 13 01:44:13.238078 master-0 kubenswrapper[19170]: I0313 01:44:13.238013 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/291c0e40-0b9b-4bd3-b8d6-1983878467ae-dispersionconf\") pod \"swift-ring-rebalance-n9bgl\" (UID: \"291c0e40-0b9b-4bd3-b8d6-1983878467ae\") " pod="openstack/swift-ring-rebalance-n9bgl"
Mar 13 01:44:13.238078 master-0 kubenswrapper[19170]: I0313 01:44:13.238038 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/291c0e40-0b9b-4bd3-b8d6-1983878467ae-scripts\") pod \"swift-ring-rebalance-n9bgl\" (UID: \"291c0e40-0b9b-4bd3-b8d6-1983878467ae\") " pod="openstack/swift-ring-rebalance-n9bgl"
Mar 13 01:44:13.238078 master-0 kubenswrapper[19170]: I0313 01:44:13.238065 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/291c0e40-0b9b-4bd3-b8d6-1983878467ae-swiftconf\") pod \"swift-ring-rebalance-n9bgl\" (UID: \"291c0e40-0b9b-4bd3-b8d6-1983878467ae\") " pod="openstack/swift-ring-rebalance-n9bgl"
Mar 13 01:44:13.338787 master-0 kubenswrapper[19170]: I0313 01:44:13.338671 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/291c0e40-0b9b-4bd3-b8d6-1983878467ae-ring-data-devices\") pod \"swift-ring-rebalance-n9bgl\" (UID: \"291c0e40-0b9b-4bd3-b8d6-1983878467ae\") " pod="openstack/swift-ring-rebalance-n9bgl"
Mar 13 01:44:13.338787 master-0 kubenswrapper[19170]: I0313 01:44:13.338732 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66pw5\" (UniqueName: \"kubernetes.io/projected/291c0e40-0b9b-4bd3-b8d6-1983878467ae-kube-api-access-66pw5\") pod \"swift-ring-rebalance-n9bgl\" (UID: \"291c0e40-0b9b-4bd3-b8d6-1983878467ae\") " pod="openstack/swift-ring-rebalance-n9bgl"
Mar 13 01:44:13.338787 master-0 kubenswrapper[19170]: I0313 01:44:13.338780 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/291c0e40-0b9b-4bd3-b8d6-1983878467ae-combined-ca-bundle\") pod \"swift-ring-rebalance-n9bgl\" (UID: \"291c0e40-0b9b-4bd3-b8d6-1983878467ae\") " pod="openstack/swift-ring-rebalance-n9bgl"
Mar 13 01:44:13.339087 master-0 kubenswrapper[19170]: I0313 01:44:13.338823 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/291c0e40-0b9b-4bd3-b8d6-1983878467ae-etc-swift\") pod \"swift-ring-rebalance-n9bgl\" (UID: \"291c0e40-0b9b-4bd3-b8d6-1983878467ae\") " pod="openstack/swift-ring-rebalance-n9bgl"
Mar 13 01:44:13.339087 master-0 kubenswrapper[19170]: I0313 01:44:13.338841 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/291c0e40-0b9b-4bd3-b8d6-1983878467ae-dispersionconf\") pod \"swift-ring-rebalance-n9bgl\" (UID: \"291c0e40-0b9b-4bd3-b8d6-1983878467ae\") " pod="openstack/swift-ring-rebalance-n9bgl"
Mar 13 01:44:13.339087 master-0 kubenswrapper[19170]: I0313 01:44:13.338864 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/291c0e40-0b9b-4bd3-b8d6-1983878467ae-scripts\") pod \"swift-ring-rebalance-n9bgl\" (UID: \"291c0e40-0b9b-4bd3-b8d6-1983878467ae\") " pod="openstack/swift-ring-rebalance-n9bgl"
Mar 13 01:44:13.339087 master-0 kubenswrapper[19170]: I0313 01:44:13.338892 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/291c0e40-0b9b-4bd3-b8d6-1983878467ae-swiftconf\") pod \"swift-ring-rebalance-n9bgl\" (UID: \"291c0e40-0b9b-4bd3-b8d6-1983878467ae\") " pod="openstack/swift-ring-rebalance-n9bgl"
Mar 13 01:44:13.348270 master-0 kubenswrapper[19170]: I0313 01:44:13.343938 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/291c0e40-0b9b-4bd3-b8d6-1983878467ae-combined-ca-bundle\") pod \"swift-ring-rebalance-n9bgl\" (UID: \"291c0e40-0b9b-4bd3-b8d6-1983878467ae\") " pod="openstack/swift-ring-rebalance-n9bgl"
Mar 13 01:44:13.348270 master-0 kubenswrapper[19170]: I0313 01:44:13.344428 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/291c0e40-0b9b-4bd3-b8d6-1983878467ae-ring-data-devices\") pod \"swift-ring-rebalance-n9bgl\" (UID: \"291c0e40-0b9b-4bd3-b8d6-1983878467ae\") " pod="openstack/swift-ring-rebalance-n9bgl"
Mar 13 01:44:13.348270 master-0 kubenswrapper[19170]: I0313 01:44:13.345745 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/291c0e40-0b9b-4bd3-b8d6-1983878467ae-scripts\") pod \"swift-ring-rebalance-n9bgl\" (UID: \"291c0e40-0b9b-4bd3-b8d6-1983878467ae\") " pod="openstack/swift-ring-rebalance-n9bgl"
Mar 13 01:44:13.348270 master-0 kubenswrapper[19170]: I0313 01:44:13.346213 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/291c0e40-0b9b-4bd3-b8d6-1983878467ae-etc-swift\") pod \"swift-ring-rebalance-n9bgl\" (UID: \"291c0e40-0b9b-4bd3-b8d6-1983878467ae\") " pod="openstack/swift-ring-rebalance-n9bgl"
Mar 13 01:44:13.348270 master-0 kubenswrapper[19170]: I0313 01:44:13.347251 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/291c0e40-0b9b-4bd3-b8d6-1983878467ae-dispersionconf\") pod \"swift-ring-rebalance-n9bgl\" (UID: \"291c0e40-0b9b-4bd3-b8d6-1983878467ae\") " pod="openstack/swift-ring-rebalance-n9bgl"
Mar 13 01:44:13.363655 master-0 kubenswrapper[19170]: I0313 01:44:13.359799 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/291c0e40-0b9b-4bd3-b8d6-1983878467ae-swiftconf\") pod \"swift-ring-rebalance-n9bgl\" (UID: \"291c0e40-0b9b-4bd3-b8d6-1983878467ae\") " pod="openstack/swift-ring-rebalance-n9bgl"
Mar 13 01:44:13.381656 master-0 kubenswrapper[19170]: I0313 01:44:13.378197 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66pw5\" (UniqueName: \"kubernetes.io/projected/291c0e40-0b9b-4bd3-b8d6-1983878467ae-kube-api-access-66pw5\") pod \"swift-ring-rebalance-n9bgl\" (UID: \"291c0e40-0b9b-4bd3-b8d6-1983878467ae\") " pod="openstack/swift-ring-rebalance-n9bgl"
Mar 13 01:44:13.387315 master-0 kubenswrapper[19170]: I0313 01:44:13.387273 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bb8ffc699-rzlmb"]
Mar 13 01:44:13.422001 master-0 kubenswrapper[19170]: I0313 01:44:13.420007 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85859d6ccf-kpj6t"]
Mar 13 01:44:13.424559 master-0 kubenswrapper[19170]: I0313 01:44:13.424528 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85859d6ccf-kpj6t"
Mar 13 01:44:13.434511 master-0 kubenswrapper[19170]: I0313 01:44:13.434010 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Mar 13 01:44:13.443200 master-0 kubenswrapper[19170]: I0313 01:44:13.443130 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4bed0665-465b-42b5-8413-767afd28078c-ovsdbserver-sb\") pod \"dnsmasq-dns-85859d6ccf-kpj6t\" (UID: \"4bed0665-465b-42b5-8413-767afd28078c\") " pod="openstack/dnsmasq-dns-85859d6ccf-kpj6t"
Mar 13 01:44:13.443421 master-0 kubenswrapper[19170]: I0313 01:44:13.443248 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwjtr\" (UniqueName: \"kubernetes.io/projected/4bed0665-465b-42b5-8413-767afd28078c-kube-api-access-mwjtr\") pod \"dnsmasq-dns-85859d6ccf-kpj6t\" (UID: \"4bed0665-465b-42b5-8413-767afd28078c\") " pod="openstack/dnsmasq-dns-85859d6ccf-kpj6t"
Mar 13 01:44:13.443421 master-0 kubenswrapper[19170]: I0313 01:44:13.443310 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bed0665-465b-42b5-8413-767afd28078c-config\") pod \"dnsmasq-dns-85859d6ccf-kpj6t\" (UID: \"4bed0665-465b-42b5-8413-767afd28078c\") " pod="openstack/dnsmasq-dns-85859d6ccf-kpj6t"
Mar 13 01:44:13.443421 master-0 kubenswrapper[19170]: I0313 01:44:13.443352 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4bed0665-465b-42b5-8413-767afd28078c-dns-svc\") pod \"dnsmasq-dns-85859d6ccf-kpj6t\" (UID: \"4bed0665-465b-42b5-8413-767afd28078c\") " pod="openstack/dnsmasq-dns-85859d6ccf-kpj6t"
Mar 13 01:44:13.460288 master-0 kubenswrapper[19170]: I0313 01:44:13.460156 19170
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85859d6ccf-kpj6t"]
Mar 13 01:44:13.486332 master-0 kubenswrapper[19170]: I0313 01:44:13.483266 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-4tvwx"]
Mar 13 01:44:13.486332 master-0 kubenswrapper[19170]: I0313 01:44:13.484573 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-4tvwx"
Mar 13 01:44:13.486332 master-0 kubenswrapper[19170]: I0313 01:44:13.486156 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Mar 13 01:44:13.549983 master-0 kubenswrapper[19170]: I0313 01:44:13.546753 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4bed0665-465b-42b5-8413-767afd28078c-ovsdbserver-sb\") pod \"dnsmasq-dns-85859d6ccf-kpj6t\" (UID: \"4bed0665-465b-42b5-8413-767afd28078c\") " pod="openstack/dnsmasq-dns-85859d6ccf-kpj6t"
Mar 13 01:44:13.549983 master-0 kubenswrapper[19170]: I0313 01:44:13.546874 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwjtr\" (UniqueName: \"kubernetes.io/projected/4bed0665-465b-42b5-8413-767afd28078c-kube-api-access-mwjtr\") pod \"dnsmasq-dns-85859d6ccf-kpj6t\" (UID: \"4bed0665-465b-42b5-8413-767afd28078c\") " pod="openstack/dnsmasq-dns-85859d6ccf-kpj6t"
Mar 13 01:44:13.549983 master-0 kubenswrapper[19170]: I0313 01:44:13.546961 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bed0665-465b-42b5-8413-767afd28078c-config\") pod \"dnsmasq-dns-85859d6ccf-kpj6t\" (UID: \"4bed0665-465b-42b5-8413-767afd28078c\") " pod="openstack/dnsmasq-dns-85859d6ccf-kpj6t"
Mar 13 01:44:13.549983 master-0 kubenswrapper[19170]: I0313 01:44:13.546994 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4bed0665-465b-42b5-8413-767afd28078c-dns-svc\") pod \"dnsmasq-dns-85859d6ccf-kpj6t\" (UID: \"4bed0665-465b-42b5-8413-767afd28078c\") " pod="openstack/dnsmasq-dns-85859d6ccf-kpj6t"
Mar 13 01:44:13.549983 master-0 kubenswrapper[19170]: I0313 01:44:13.548251 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4bed0665-465b-42b5-8413-767afd28078c-ovsdbserver-sb\") pod \"dnsmasq-dns-85859d6ccf-kpj6t\" (UID: \"4bed0665-465b-42b5-8413-767afd28078c\") " pod="openstack/dnsmasq-dns-85859d6ccf-kpj6t"
Mar 13 01:44:13.565652 master-0 kubenswrapper[19170]: I0313 01:44:13.550434 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4bed0665-465b-42b5-8413-767afd28078c-dns-svc\") pod \"dnsmasq-dns-85859d6ccf-kpj6t\" (UID: \"4bed0665-465b-42b5-8413-767afd28078c\") " pod="openstack/dnsmasq-dns-85859d6ccf-kpj6t"
Mar 13 01:44:13.565652 master-0 kubenswrapper[19170]: I0313 01:44:13.550971 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bed0665-465b-42b5-8413-767afd28078c-config\") pod \"dnsmasq-dns-85859d6ccf-kpj6t\" (UID: \"4bed0665-465b-42b5-8413-767afd28078c\") " pod="openstack/dnsmasq-dns-85859d6ccf-kpj6t"
Mar 13 01:44:13.565652 master-0 kubenswrapper[19170]: I0313 01:44:13.556685 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-4tvwx"]
Mar 13 01:44:13.572437 master-0 kubenswrapper[19170]: I0313 01:44:13.570662 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-n9bgl"
Mar 13 01:44:13.589797 master-0 kubenswrapper[19170]: I0313 01:44:13.580427 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwjtr\" (UniqueName: \"kubernetes.io/projected/4bed0665-465b-42b5-8413-767afd28078c-kube-api-access-mwjtr\") pod \"dnsmasq-dns-85859d6ccf-kpj6t\" (UID: \"4bed0665-465b-42b5-8413-767afd28078c\") " pod="openstack/dnsmasq-dns-85859d6ccf-kpj6t"
Mar 13 01:44:13.652827 master-0 kubenswrapper[19170]: I0313 01:44:13.652730 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx9rr\" (UniqueName: \"kubernetes.io/projected/53c4105f-634f-4ddc-8b80-34df87fad4ec-kube-api-access-jx9rr\") pod \"ovn-controller-metrics-4tvwx\" (UID: \"53c4105f-634f-4ddc-8b80-34df87fad4ec\") " pod="openstack/ovn-controller-metrics-4tvwx"
Mar 13 01:44:13.652827 master-0 kubenswrapper[19170]: I0313 01:44:13.652799 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53c4105f-634f-4ddc-8b80-34df87fad4ec-config\") pod \"ovn-controller-metrics-4tvwx\" (UID: \"53c4105f-634f-4ddc-8b80-34df87fad4ec\") " pod="openstack/ovn-controller-metrics-4tvwx"
Mar 13 01:44:13.653048 master-0 kubenswrapper[19170]: I0313 01:44:13.652926 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53c4105f-634f-4ddc-8b80-34df87fad4ec-combined-ca-bundle\") pod \"ovn-controller-metrics-4tvwx\" (UID: \"53c4105f-634f-4ddc-8b80-34df87fad4ec\") " pod="openstack/ovn-controller-metrics-4tvwx"
Mar 13 01:44:13.653048 master-0 kubenswrapper[19170]: I0313 01:44:13.652956 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/53c4105f-634f-4ddc-8b80-34df87fad4ec-ovn-rundir\") pod \"ovn-controller-metrics-4tvwx\" (UID: \"53c4105f-634f-4ddc-8b80-34df87fad4ec\") " pod="openstack/ovn-controller-metrics-4tvwx"
Mar 13 01:44:13.653048 master-0 kubenswrapper[19170]: I0313 01:44:13.653010 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/53c4105f-634f-4ddc-8b80-34df87fad4ec-ovs-rundir\") pod \"ovn-controller-metrics-4tvwx\" (UID: \"53c4105f-634f-4ddc-8b80-34df87fad4ec\") " pod="openstack/ovn-controller-metrics-4tvwx"
Mar 13 01:44:13.653048 master-0 kubenswrapper[19170]: I0313 01:44:13.653030 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/53c4105f-634f-4ddc-8b80-34df87fad4ec-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-4tvwx\" (UID: \"53c4105f-634f-4ddc-8b80-34df87fad4ec\") " pod="openstack/ovn-controller-metrics-4tvwx"
Mar 13 01:44:13.662913 master-0 kubenswrapper[19170]: I0313 01:44:13.659501 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85859d6ccf-kpj6t"]
Mar 13 01:44:13.664474 master-0 kubenswrapper[19170]: I0313 01:44:13.664432 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85859d6ccf-kpj6t"
Mar 13 01:44:13.729748 master-0 kubenswrapper[19170]: I0313 01:44:13.724736 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf8b865dc-ljb9g"]
Mar 13 01:44:13.729748 master-0 kubenswrapper[19170]: I0313 01:44:13.728114 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf8b865dc-ljb9g"
Mar 13 01:44:13.742349 master-0 kubenswrapper[19170]: I0313 01:44:13.742288 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Mar 13 01:44:13.774857 master-0 kubenswrapper[19170]: I0313 01:44:13.750485 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf8b865dc-ljb9g"]
Mar 13 01:44:13.774857 master-0 kubenswrapper[19170]: I0313 01:44:13.761874 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b-dns-svc\") pod \"dnsmasq-dns-5bf8b865dc-ljb9g\" (UID: \"55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b\") " pod="openstack/dnsmasq-dns-5bf8b865dc-ljb9g"
Mar 13 01:44:13.774857 master-0 kubenswrapper[19170]: I0313 01:44:13.761919 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b-config\") pod \"dnsmasq-dns-5bf8b865dc-ljb9g\" (UID: \"55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b\") " pod="openstack/dnsmasq-dns-5bf8b865dc-ljb9g"
Mar 13 01:44:13.774857 master-0 kubenswrapper[19170]: I0313 01:44:13.761937 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj6x8\" (UniqueName: \"kubernetes.io/projected/55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b-kube-api-access-qj6x8\") pod \"dnsmasq-dns-5bf8b865dc-ljb9g\" (UID: \"55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b\") " pod="openstack/dnsmasq-dns-5bf8b865dc-ljb9g"
Mar 13 01:44:13.774857 master-0 kubenswrapper[19170]: I0313 01:44:13.761986 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf8b865dc-ljb9g\" (UID: \"55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b\") " pod="openstack/dnsmasq-dns-5bf8b865dc-ljb9g"
Mar 13 01:44:13.774857 master-0 kubenswrapper[19170]: I0313 01:44:13.762009 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b-ovsdbserver-sb\") pod \"dnsmasq-dns-5bf8b865dc-ljb9g\" (UID: \"55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b\") " pod="openstack/dnsmasq-dns-5bf8b865dc-ljb9g"
Mar 13 01:44:13.774857 master-0 kubenswrapper[19170]: I0313 01:44:13.762031 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx9rr\" (UniqueName: \"kubernetes.io/projected/53c4105f-634f-4ddc-8b80-34df87fad4ec-kube-api-access-jx9rr\") pod \"ovn-controller-metrics-4tvwx\" (UID: \"53c4105f-634f-4ddc-8b80-34df87fad4ec\") " pod="openstack/ovn-controller-metrics-4tvwx"
Mar 13 01:44:13.774857 master-0 kubenswrapper[19170]: I0313 01:44:13.762048 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53c4105f-634f-4ddc-8b80-34df87fad4ec-config\") pod \"ovn-controller-metrics-4tvwx\" (UID: \"53c4105f-634f-4ddc-8b80-34df87fad4ec\") " pod="openstack/ovn-controller-metrics-4tvwx"
Mar 13 01:44:13.774857 master-0 kubenswrapper[19170]: I0313 01:44:13.762111 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53c4105f-634f-4ddc-8b80-34df87fad4ec-combined-ca-bundle\") pod \"ovn-controller-metrics-4tvwx\" (UID: \"53c4105f-634f-4ddc-8b80-34df87fad4ec\") " pod="openstack/ovn-controller-metrics-4tvwx"
Mar 13 01:44:13.774857 master-0 kubenswrapper[19170]: I0313 01:44:13.762134 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/53c4105f-634f-4ddc-8b80-34df87fad4ec-ovn-rundir\") pod \"ovn-controller-metrics-4tvwx\" (UID: \"53c4105f-634f-4ddc-8b80-34df87fad4ec\") " pod="openstack/ovn-controller-metrics-4tvwx"
Mar 13 01:44:13.774857 master-0 kubenswrapper[19170]: I0313 01:44:13.762196 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/53c4105f-634f-4ddc-8b80-34df87fad4ec-ovs-rundir\") pod \"ovn-controller-metrics-4tvwx\" (UID: \"53c4105f-634f-4ddc-8b80-34df87fad4ec\") " pod="openstack/ovn-controller-metrics-4tvwx"
Mar 13 01:44:13.774857 master-0 kubenswrapper[19170]: I0313 01:44:13.762222 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/53c4105f-634f-4ddc-8b80-34df87fad4ec-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-4tvwx\" (UID: \"53c4105f-634f-4ddc-8b80-34df87fad4ec\") " pod="openstack/ovn-controller-metrics-4tvwx"
Mar 13 01:44:13.774857 master-0 kubenswrapper[19170]: I0313 01:44:13.770997 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/53c4105f-634f-4ddc-8b80-34df87fad4ec-ovs-rundir\") pod \"ovn-controller-metrics-4tvwx\" (UID: \"53c4105f-634f-4ddc-8b80-34df87fad4ec\") " pod="openstack/ovn-controller-metrics-4tvwx"
Mar 13 01:44:13.774857 master-0 kubenswrapper[19170]: I0313 01:44:13.771494 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53c4105f-634f-4ddc-8b80-34df87fad4ec-config\") pod \"ovn-controller-metrics-4tvwx\" (UID: \"53c4105f-634f-4ddc-8b80-34df87fad4ec\") " pod="openstack/ovn-controller-metrics-4tvwx"
Mar 13 01:44:13.774857 master-0 kubenswrapper[19170]: I0313 01:44:13.771570 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName:
\"kubernetes.io/host-path/53c4105f-634f-4ddc-8b80-34df87fad4ec-ovn-rundir\") pod \"ovn-controller-metrics-4tvwx\" (UID: \"53c4105f-634f-4ddc-8b80-34df87fad4ec\") " pod="openstack/ovn-controller-metrics-4tvwx" Mar 13 01:44:13.774857 master-0 kubenswrapper[19170]: I0313 01:44:13.772753 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/53c4105f-634f-4ddc-8b80-34df87fad4ec-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-4tvwx\" (UID: \"53c4105f-634f-4ddc-8b80-34df87fad4ec\") " pod="openstack/ovn-controller-metrics-4tvwx" Mar 13 01:44:13.782878 master-0 kubenswrapper[19170]: I0313 01:44:13.780468 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53c4105f-634f-4ddc-8b80-34df87fad4ec-combined-ca-bundle\") pod \"ovn-controller-metrics-4tvwx\" (UID: \"53c4105f-634f-4ddc-8b80-34df87fad4ec\") " pod="openstack/ovn-controller-metrics-4tvwx" Mar 13 01:44:13.797804 master-0 kubenswrapper[19170]: I0313 01:44:13.797011 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx9rr\" (UniqueName: \"kubernetes.io/projected/53c4105f-634f-4ddc-8b80-34df87fad4ec-kube-api-access-jx9rr\") pod \"ovn-controller-metrics-4tvwx\" (UID: \"53c4105f-634f-4ddc-8b80-34df87fad4ec\") " pod="openstack/ovn-controller-metrics-4tvwx" Mar 13 01:44:13.835848 master-0 kubenswrapper[19170]: I0313 01:44:13.835524 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-4tvwx" Mar 13 01:44:13.863658 master-0 kubenswrapper[19170]: I0313 01:44:13.863586 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b-dns-svc\") pod \"dnsmasq-dns-5bf8b865dc-ljb9g\" (UID: \"55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b\") " pod="openstack/dnsmasq-dns-5bf8b865dc-ljb9g" Mar 13 01:44:13.863839 master-0 kubenswrapper[19170]: I0313 01:44:13.863767 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b-config\") pod \"dnsmasq-dns-5bf8b865dc-ljb9g\" (UID: \"55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b\") " pod="openstack/dnsmasq-dns-5bf8b865dc-ljb9g" Mar 13 01:44:13.863839 master-0 kubenswrapper[19170]: I0313 01:44:13.863791 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj6x8\" (UniqueName: \"kubernetes.io/projected/55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b-kube-api-access-qj6x8\") pod \"dnsmasq-dns-5bf8b865dc-ljb9g\" (UID: \"55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b\") " pod="openstack/dnsmasq-dns-5bf8b865dc-ljb9g" Mar 13 01:44:13.863944 master-0 kubenswrapper[19170]: I0313 01:44:13.863836 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf8b865dc-ljb9g\" (UID: \"55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b\") " pod="openstack/dnsmasq-dns-5bf8b865dc-ljb9g" Mar 13 01:44:13.863944 master-0 kubenswrapper[19170]: I0313 01:44:13.863861 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b-ovsdbserver-sb\") pod \"dnsmasq-dns-5bf8b865dc-ljb9g\" (UID: 
\"55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b\") " pod="openstack/dnsmasq-dns-5bf8b865dc-ljb9g" Mar 13 01:44:13.865236 master-0 kubenswrapper[19170]: I0313 01:44:13.864827 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b-ovsdbserver-sb\") pod \"dnsmasq-dns-5bf8b865dc-ljb9g\" (UID: \"55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b\") " pod="openstack/dnsmasq-dns-5bf8b865dc-ljb9g" Mar 13 01:44:13.865646 master-0 kubenswrapper[19170]: I0313 01:44:13.865581 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b-dns-svc\") pod \"dnsmasq-dns-5bf8b865dc-ljb9g\" (UID: \"55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b\") " pod="openstack/dnsmasq-dns-5bf8b865dc-ljb9g" Mar 13 01:44:13.866241 master-0 kubenswrapper[19170]: I0313 01:44:13.865722 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf8b865dc-ljb9g\" (UID: \"55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b\") " pod="openstack/dnsmasq-dns-5bf8b865dc-ljb9g" Mar 13 01:44:13.866562 master-0 kubenswrapper[19170]: I0313 01:44:13.866545 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b-config\") pod \"dnsmasq-dns-5bf8b865dc-ljb9g\" (UID: \"55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b\") " pod="openstack/dnsmasq-dns-5bf8b865dc-ljb9g" Mar 13 01:44:13.883622 master-0 kubenswrapper[19170]: I0313 01:44:13.883385 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj6x8\" (UniqueName: \"kubernetes.io/projected/55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b-kube-api-access-qj6x8\") pod \"dnsmasq-dns-5bf8b865dc-ljb9g\" (UID: 
\"55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b\") " pod="openstack/dnsmasq-dns-5bf8b865dc-ljb9g" Mar 13 01:44:14.027442 master-0 kubenswrapper[19170]: I0313 01:44:14.027370 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 13 01:44:14.075418 master-0 kubenswrapper[19170]: I0313 01:44:14.073264 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf8b865dc-ljb9g" Mar 13 01:44:14.085338 master-0 kubenswrapper[19170]: I0313 01:44:14.081610 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bb8ffc699-rzlmb" event={"ID":"4cf27610-c39b-47d8-ae28-e5691381bfbb","Type":"ContainerStarted","Data":"b5a78775a7f82b9c74a2b0ee4e2939a24a224762a55a7ca9fc494d9ad03c04a3"} Mar 13 01:44:14.085338 master-0 kubenswrapper[19170]: I0313 01:44:14.081700 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7bb8ffc699-rzlmb" Mar 13 01:44:14.140563 master-0 kubenswrapper[19170]: I0313 01:44:14.138698 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-n9bgl"] Mar 13 01:44:14.140563 master-0 kubenswrapper[19170]: I0313 01:44:14.139760 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7bb8ffc699-rzlmb" podStartSLOduration=6.139738778 podStartE2EDuration="6.139738778s" podCreationTimestamp="2026-03-13 01:44:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:44:14.109323181 +0000 UTC m=+1514.917444141" watchObservedRunningTime="2026-03-13 01:44:14.139738778 +0000 UTC m=+1514.947859738" Mar 13 01:44:14.312072 master-0 kubenswrapper[19170]: W0313 01:44:14.312030 19170 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4bed0665_465b_42b5_8413_767afd28078c.slice/crio-23870274d3551ab7f1edcadb307e44934ca71bc6637254ccf18477e95972b271 WatchSource:0}: Error finding container 23870274d3551ab7f1edcadb307e44934ca71bc6637254ccf18477e95972b271: Status 404 returned error can't find the container with id 23870274d3551ab7f1edcadb307e44934ca71bc6637254ccf18477e95972b271 Mar 13 01:44:14.326497 master-0 kubenswrapper[19170]: I0313 01:44:14.326451 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85859d6ccf-kpj6t"] Mar 13 01:44:14.444567 master-0 kubenswrapper[19170]: I0313 01:44:14.444455 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-4tvwx"] Mar 13 01:44:14.458151 master-0 kubenswrapper[19170]: W0313 01:44:14.458099 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53c4105f_634f_4ddc_8b80_34df87fad4ec.slice/crio-bee3530ad859ddf899b6a36c739f59f16587d47595035f92a1b26359cedea7b7 WatchSource:0}: Error finding container bee3530ad859ddf899b6a36c739f59f16587d47595035f92a1b26359cedea7b7: Status 404 returned error can't find the container with id bee3530ad859ddf899b6a36c739f59f16587d47595035f92a1b26359cedea7b7 Mar 13 01:44:14.702135 master-0 kubenswrapper[19170]: I0313 01:44:14.702059 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf8b865dc-ljb9g"] Mar 13 01:44:14.735609 master-0 kubenswrapper[19170]: W0313 01:44:14.734785 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55e0a8e0_bf5b_4aaa_bff7_de32dc6b325b.slice/crio-b08d16da1d51cb153bf053222a1e4a880477c9c24197ddc6f6bc51a85aeb45d4 WatchSource:0}: Error finding container b08d16da1d51cb153bf053222a1e4a880477c9c24197ddc6f6bc51a85aeb45d4: Status 404 returned error can't find the container with id 
b08d16da1d51cb153bf053222a1e4a880477c9c24197ddc6f6bc51a85aeb45d4 Mar 13 01:44:14.990189 master-0 kubenswrapper[19170]: I0313 01:44:14.990128 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6d54c519-9366-45f7-86e2-ebcdd8bb2477-etc-swift\") pod \"swift-storage-0\" (UID: \"6d54c519-9366-45f7-86e2-ebcdd8bb2477\") " pod="openstack/swift-storage-0" Mar 13 01:44:14.990522 master-0 kubenswrapper[19170]: E0313 01:44:14.990470 19170 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 01:44:14.990578 master-0 kubenswrapper[19170]: E0313 01:44:14.990542 19170 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 01:44:14.990669 master-0 kubenswrapper[19170]: E0313 01:44:14.990624 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6d54c519-9366-45f7-86e2-ebcdd8bb2477-etc-swift podName:6d54c519-9366-45f7-86e2-ebcdd8bb2477 nodeName:}" failed. No retries permitted until 2026-03-13 01:44:18.990589181 +0000 UTC m=+1519.798710141 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6d54c519-9366-45f7-86e2-ebcdd8bb2477-etc-swift") pod "swift-storage-0" (UID: "6d54c519-9366-45f7-86e2-ebcdd8bb2477") : configmap "swift-ring-files" not found Mar 13 01:44:15.029087 master-0 kubenswrapper[19170]: I0313 01:44:15.029016 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 13 01:44:15.073343 master-0 kubenswrapper[19170]: I0313 01:44:15.073228 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 13 01:44:15.126032 master-0 kubenswrapper[19170]: I0313 01:44:15.125916 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-n9bgl" event={"ID":"291c0e40-0b9b-4bd3-b8d6-1983878467ae","Type":"ContainerStarted","Data":"3d9a15510e1f852b6520b9cb6cdd65c36bd69ecb1629fde588e31a15c5d91bf0"} Mar 13 01:44:15.133985 master-0 kubenswrapper[19170]: I0313 01:44:15.128325 19170 generic.go:334] "Generic (PLEG): container finished" podID="4bed0665-465b-42b5-8413-767afd28078c" containerID="ac78b142e1f2c1db893c7408745665754613314fbf564dfe978f4e4c3cd4ba48" exitCode=0 Mar 13 01:44:15.133985 master-0 kubenswrapper[19170]: I0313 01:44:15.128432 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85859d6ccf-kpj6t" event={"ID":"4bed0665-465b-42b5-8413-767afd28078c","Type":"ContainerDied","Data":"ac78b142e1f2c1db893c7408745665754613314fbf564dfe978f4e4c3cd4ba48"} Mar 13 01:44:15.133985 master-0 kubenswrapper[19170]: I0313 01:44:15.128844 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85859d6ccf-kpj6t" event={"ID":"4bed0665-465b-42b5-8413-767afd28078c","Type":"ContainerStarted","Data":"23870274d3551ab7f1edcadb307e44934ca71bc6637254ccf18477e95972b271"} Mar 13 01:44:15.133985 master-0 kubenswrapper[19170]: I0313 01:44:15.133938 19170 generic.go:334] "Generic (PLEG): container 
finished" podID="55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b" containerID="02cdd6f32ee65067323abe3e1e90e4c8c07158bba64fabd407ef891ee5771394" exitCode=0 Mar 13 01:44:15.133985 master-0 kubenswrapper[19170]: I0313 01:44:15.133999 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf8b865dc-ljb9g" event={"ID":"55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b","Type":"ContainerDied","Data":"02cdd6f32ee65067323abe3e1e90e4c8c07158bba64fabd407ef891ee5771394"} Mar 13 01:44:15.134848 master-0 kubenswrapper[19170]: I0313 01:44:15.134028 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf8b865dc-ljb9g" event={"ID":"55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b","Type":"ContainerStarted","Data":"b08d16da1d51cb153bf053222a1e4a880477c9c24197ddc6f6bc51a85aeb45d4"} Mar 13 01:44:15.149154 master-0 kubenswrapper[19170]: I0313 01:44:15.149072 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-4tvwx" event={"ID":"53c4105f-634f-4ddc-8b80-34df87fad4ec","Type":"ContainerStarted","Data":"056e918e5496c403bd87fcdd239be6d23ed6dcba3cb63cf2bdbc6dcb9b7ce4f7"} Mar 13 01:44:15.149154 master-0 kubenswrapper[19170]: I0313 01:44:15.149160 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-4tvwx" event={"ID":"53c4105f-634f-4ddc-8b80-34df87fad4ec","Type":"ContainerStarted","Data":"bee3530ad859ddf899b6a36c739f59f16587d47595035f92a1b26359cedea7b7"} Mar 13 01:44:15.149399 master-0 kubenswrapper[19170]: I0313 01:44:15.149325 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7bb8ffc699-rzlmb" podUID="4cf27610-c39b-47d8-ae28-e5691381bfbb" containerName="dnsmasq-dns" containerID="cri-o://b5a78775a7f82b9c74a2b0ee4e2939a24a224762a55a7ca9fc494d9ad03c04a3" gracePeriod=10 Mar 13 01:44:15.203091 master-0 kubenswrapper[19170]: I0313 01:44:15.203008 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovn-controller-metrics-4tvwx" podStartSLOduration=2.202983878 podStartE2EDuration="2.202983878s" podCreationTimestamp="2026-03-13 01:44:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:44:15.200418506 +0000 UTC m=+1516.008539466" watchObservedRunningTime="2026-03-13 01:44:15.202983878 +0000 UTC m=+1516.011104838" Mar 13 01:44:15.238092 master-0 kubenswrapper[19170]: I0313 01:44:15.238034 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 13 01:44:15.538797 master-0 kubenswrapper[19170]: I0313 01:44:15.531882 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 13 01:44:15.545308 master-0 kubenswrapper[19170]: I0313 01:44:15.544711 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 13 01:44:15.545308 master-0 kubenswrapper[19170]: I0313 01:44:15.544815 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 13 01:44:15.547077 master-0 kubenswrapper[19170]: I0313 01:44:15.546981 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 13 01:44:15.547527 master-0 kubenswrapper[19170]: I0313 01:44:15.547163 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 13 01:44:15.547527 master-0 kubenswrapper[19170]: I0313 01:44:15.547320 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 13 01:44:15.622876 master-0 kubenswrapper[19170]: I0313 01:44:15.622385 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a735da0-4dae-4c25-9dd0-a277c75fc1b8-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"1a735da0-4dae-4c25-9dd0-a277c75fc1b8\") " pod="openstack/ovn-northd-0" Mar 13 01:44:15.622876 master-0 kubenswrapper[19170]: I0313 01:44:15.622442 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf29z\" (UniqueName: \"kubernetes.io/projected/1a735da0-4dae-4c25-9dd0-a277c75fc1b8-kube-api-access-zf29z\") pod \"ovn-northd-0\" (UID: \"1a735da0-4dae-4c25-9dd0-a277c75fc1b8\") " pod="openstack/ovn-northd-0" Mar 13 01:44:15.622876 master-0 kubenswrapper[19170]: I0313 01:44:15.622472 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1a735da0-4dae-4c25-9dd0-a277c75fc1b8-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"1a735da0-4dae-4c25-9dd0-a277c75fc1b8\") " pod="openstack/ovn-northd-0" Mar 13 01:44:15.622876 master-0 kubenswrapper[19170]: I0313 01:44:15.622548 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1a735da0-4dae-4c25-9dd0-a277c75fc1b8-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"1a735da0-4dae-4c25-9dd0-a277c75fc1b8\") " pod="openstack/ovn-northd-0" Mar 13 01:44:15.622876 master-0 kubenswrapper[19170]: I0313 01:44:15.622592 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a735da0-4dae-4c25-9dd0-a277c75fc1b8-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"1a735da0-4dae-4c25-9dd0-a277c75fc1b8\") " pod="openstack/ovn-northd-0" Mar 13 01:44:15.622876 master-0 kubenswrapper[19170]: I0313 01:44:15.622610 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a735da0-4dae-4c25-9dd0-a277c75fc1b8-scripts\") pod \"ovn-northd-0\" (UID: \"1a735da0-4dae-4c25-9dd0-a277c75fc1b8\") " pod="openstack/ovn-northd-0" Mar 13 01:44:15.622876 master-0 kubenswrapper[19170]: I0313 01:44:15.622665 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a735da0-4dae-4c25-9dd0-a277c75fc1b8-config\") pod \"ovn-northd-0\" (UID: \"1a735da0-4dae-4c25-9dd0-a277c75fc1b8\") " pod="openstack/ovn-northd-0" Mar 13 01:44:15.751753 master-0 kubenswrapper[19170]: I0313 01:44:15.751557 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a735da0-4dae-4c25-9dd0-a277c75fc1b8-config\") pod \"ovn-northd-0\" (UID: \"1a735da0-4dae-4c25-9dd0-a277c75fc1b8\") " pod="openstack/ovn-northd-0" Mar 13 01:44:15.751753 master-0 kubenswrapper[19170]: I0313 01:44:15.751683 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a735da0-4dae-4c25-9dd0-a277c75fc1b8-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: 
\"1a735da0-4dae-4c25-9dd0-a277c75fc1b8\") " pod="openstack/ovn-northd-0" Mar 13 01:44:15.751753 master-0 kubenswrapper[19170]: I0313 01:44:15.751710 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf29z\" (UniqueName: \"kubernetes.io/projected/1a735da0-4dae-4c25-9dd0-a277c75fc1b8-kube-api-access-zf29z\") pod \"ovn-northd-0\" (UID: \"1a735da0-4dae-4c25-9dd0-a277c75fc1b8\") " pod="openstack/ovn-northd-0" Mar 13 01:44:15.751753 master-0 kubenswrapper[19170]: I0313 01:44:15.751736 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1a735da0-4dae-4c25-9dd0-a277c75fc1b8-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"1a735da0-4dae-4c25-9dd0-a277c75fc1b8\") " pod="openstack/ovn-northd-0" Mar 13 01:44:15.753317 master-0 kubenswrapper[19170]: I0313 01:44:15.751795 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a735da0-4dae-4c25-9dd0-a277c75fc1b8-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"1a735da0-4dae-4c25-9dd0-a277c75fc1b8\") " pod="openstack/ovn-northd-0" Mar 13 01:44:15.753317 master-0 kubenswrapper[19170]: I0313 01:44:15.751833 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a735da0-4dae-4c25-9dd0-a277c75fc1b8-scripts\") pod \"ovn-northd-0\" (UID: \"1a735da0-4dae-4c25-9dd0-a277c75fc1b8\") " pod="openstack/ovn-northd-0" Mar 13 01:44:15.753317 master-0 kubenswrapper[19170]: I0313 01:44:15.751850 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a735da0-4dae-4c25-9dd0-a277c75fc1b8-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"1a735da0-4dae-4c25-9dd0-a277c75fc1b8\") " pod="openstack/ovn-northd-0" Mar 13 01:44:15.756981 master-0 kubenswrapper[19170]: 
I0313 01:44:15.755938 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1a735da0-4dae-4c25-9dd0-a277c75fc1b8-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"1a735da0-4dae-4c25-9dd0-a277c75fc1b8\") " pod="openstack/ovn-northd-0" Mar 13 01:44:15.756981 master-0 kubenswrapper[19170]: I0313 01:44:15.756481 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a735da0-4dae-4c25-9dd0-a277c75fc1b8-config\") pod \"ovn-northd-0\" (UID: \"1a735da0-4dae-4c25-9dd0-a277c75fc1b8\") " pod="openstack/ovn-northd-0" Mar 13 01:44:15.756981 master-0 kubenswrapper[19170]: I0313 01:44:15.756482 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a735da0-4dae-4c25-9dd0-a277c75fc1b8-scripts\") pod \"ovn-northd-0\" (UID: \"1a735da0-4dae-4c25-9dd0-a277c75fc1b8\") " pod="openstack/ovn-northd-0" Mar 13 01:44:15.759143 master-0 kubenswrapper[19170]: I0313 01:44:15.758550 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a735da0-4dae-4c25-9dd0-a277c75fc1b8-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"1a735da0-4dae-4c25-9dd0-a277c75fc1b8\") " pod="openstack/ovn-northd-0" Mar 13 01:44:15.762174 master-0 kubenswrapper[19170]: I0313 01:44:15.762143 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a735da0-4dae-4c25-9dd0-a277c75fc1b8-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"1a735da0-4dae-4c25-9dd0-a277c75fc1b8\") " pod="openstack/ovn-northd-0" Mar 13 01:44:15.762312 master-0 kubenswrapper[19170]: I0313 01:44:15.762288 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1a735da0-4dae-4c25-9dd0-a277c75fc1b8-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"1a735da0-4dae-4c25-9dd0-a277c75fc1b8\") " pod="openstack/ovn-northd-0" Mar 13 01:44:15.779459 master-0 kubenswrapper[19170]: I0313 01:44:15.779409 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf29z\" (UniqueName: \"kubernetes.io/projected/1a735da0-4dae-4c25-9dd0-a277c75fc1b8-kube-api-access-zf29z\") pod \"ovn-northd-0\" (UID: \"1a735da0-4dae-4c25-9dd0-a277c75fc1b8\") " pod="openstack/ovn-northd-0" Mar 13 01:44:15.917531 master-0 kubenswrapper[19170]: I0313 01:44:15.917491 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85859d6ccf-kpj6t" Mar 13 01:44:15.926605 master-0 kubenswrapper[19170]: I0313 01:44:15.925434 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bb8ffc699-rzlmb" Mar 13 01:44:15.949742 master-0 kubenswrapper[19170]: I0313 01:44:15.949366 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 13 01:44:16.072546 master-0 kubenswrapper[19170]: I0313 01:44:16.072493 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bed0665-465b-42b5-8413-767afd28078c-config\") pod \"4bed0665-465b-42b5-8413-767afd28078c\" (UID: \"4bed0665-465b-42b5-8413-767afd28078c\") " Mar 13 01:44:16.072873 master-0 kubenswrapper[19170]: I0313 01:44:16.072846 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rh727\" (UniqueName: \"kubernetes.io/projected/4cf27610-c39b-47d8-ae28-e5691381bfbb-kube-api-access-rh727\") pod \"4cf27610-c39b-47d8-ae28-e5691381bfbb\" (UID: \"4cf27610-c39b-47d8-ae28-e5691381bfbb\") " Mar 13 01:44:16.072936 master-0 kubenswrapper[19170]: I0313 01:44:16.072889 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cf27610-c39b-47d8-ae28-e5691381bfbb-config\") pod \"4cf27610-c39b-47d8-ae28-e5691381bfbb\" (UID: \"4cf27610-c39b-47d8-ae28-e5691381bfbb\") " Mar 13 01:44:16.074156 master-0 kubenswrapper[19170]: I0313 01:44:16.073989 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwjtr\" (UniqueName: \"kubernetes.io/projected/4bed0665-465b-42b5-8413-767afd28078c-kube-api-access-mwjtr\") pod \"4bed0665-465b-42b5-8413-767afd28078c\" (UID: \"4bed0665-465b-42b5-8413-767afd28078c\") " Mar 13 01:44:16.074224 master-0 kubenswrapper[19170]: I0313 01:44:16.074175 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4bed0665-465b-42b5-8413-767afd28078c-dns-svc\") pod \"4bed0665-465b-42b5-8413-767afd28078c\" (UID: \"4bed0665-465b-42b5-8413-767afd28078c\") " Mar 13 01:44:16.074266 master-0 kubenswrapper[19170]: I0313 01:44:16.074224 19170 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4bed0665-465b-42b5-8413-767afd28078c-ovsdbserver-sb\") pod \"4bed0665-465b-42b5-8413-767afd28078c\" (UID: \"4bed0665-465b-42b5-8413-767afd28078c\") " Mar 13 01:44:16.074337 master-0 kubenswrapper[19170]: I0313 01:44:16.074318 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cf27610-c39b-47d8-ae28-e5691381bfbb-dns-svc\") pod \"4cf27610-c39b-47d8-ae28-e5691381bfbb\" (UID: \"4cf27610-c39b-47d8-ae28-e5691381bfbb\") " Mar 13 01:44:16.083567 master-0 kubenswrapper[19170]: I0313 01:44:16.081366 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bed0665-465b-42b5-8413-767afd28078c-kube-api-access-mwjtr" (OuterVolumeSpecName: "kube-api-access-mwjtr") pod "4bed0665-465b-42b5-8413-767afd28078c" (UID: "4bed0665-465b-42b5-8413-767afd28078c"). InnerVolumeSpecName "kube-api-access-mwjtr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:44:16.083567 master-0 kubenswrapper[19170]: I0313 01:44:16.082523 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cf27610-c39b-47d8-ae28-e5691381bfbb-kube-api-access-rh727" (OuterVolumeSpecName: "kube-api-access-rh727") pod "4cf27610-c39b-47d8-ae28-e5691381bfbb" (UID: "4cf27610-c39b-47d8-ae28-e5691381bfbb"). InnerVolumeSpecName "kube-api-access-rh727". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:44:16.101738 master-0 kubenswrapper[19170]: I0313 01:44:16.101637 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bed0665-465b-42b5-8413-767afd28078c-config" (OuterVolumeSpecName: "config") pod "4bed0665-465b-42b5-8413-767afd28078c" (UID: "4bed0665-465b-42b5-8413-767afd28078c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:44:16.111119 master-0 kubenswrapper[19170]: I0313 01:44:16.111065 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bed0665-465b-42b5-8413-767afd28078c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4bed0665-465b-42b5-8413-767afd28078c" (UID: "4bed0665-465b-42b5-8413-767afd28078c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:44:16.111768 master-0 kubenswrapper[19170]: I0313 01:44:16.111737 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bed0665-465b-42b5-8413-767afd28078c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4bed0665-465b-42b5-8413-767afd28078c" (UID: "4bed0665-465b-42b5-8413-767afd28078c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:44:16.120275 master-0 kubenswrapper[19170]: I0313 01:44:16.120221 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cf27610-c39b-47d8-ae28-e5691381bfbb-config" (OuterVolumeSpecName: "config") pod "4cf27610-c39b-47d8-ae28-e5691381bfbb" (UID: "4cf27610-c39b-47d8-ae28-e5691381bfbb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:44:16.146096 master-0 kubenswrapper[19170]: I0313 01:44:16.145900 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cf27610-c39b-47d8-ae28-e5691381bfbb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4cf27610-c39b-47d8-ae28-e5691381bfbb" (UID: "4cf27610-c39b-47d8-ae28-e5691381bfbb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:44:16.164505 master-0 kubenswrapper[19170]: I0313 01:44:16.163922 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85859d6ccf-kpj6t" Mar 13 01:44:16.164505 master-0 kubenswrapper[19170]: I0313 01:44:16.163976 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85859d6ccf-kpj6t" event={"ID":"4bed0665-465b-42b5-8413-767afd28078c","Type":"ContainerDied","Data":"23870274d3551ab7f1edcadb307e44934ca71bc6637254ccf18477e95972b271"} Mar 13 01:44:16.164505 master-0 kubenswrapper[19170]: I0313 01:44:16.164032 19170 scope.go:117] "RemoveContainer" containerID="ac78b142e1f2c1db893c7408745665754613314fbf564dfe978f4e4c3cd4ba48" Mar 13 01:44:16.167286 master-0 kubenswrapper[19170]: I0313 01:44:16.167245 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf8b865dc-ljb9g" event={"ID":"55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b","Type":"ContainerStarted","Data":"0d82351f63d0ab803727023253fabb13522a1338dabec64bb79add14abe96261"} Mar 13 01:44:16.168394 master-0 kubenswrapper[19170]: I0313 01:44:16.168329 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bf8b865dc-ljb9g" Mar 13 01:44:16.170943 master-0 kubenswrapper[19170]: I0313 01:44:16.170919 19170 generic.go:334] "Generic (PLEG): container finished" podID="4cf27610-c39b-47d8-ae28-e5691381bfbb" containerID="b5a78775a7f82b9c74a2b0ee4e2939a24a224762a55a7ca9fc494d9ad03c04a3" exitCode=0 Mar 13 01:44:16.171713 master-0 kubenswrapper[19170]: I0313 01:44:16.171617 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bb8ffc699-rzlmb" Mar 13 01:44:16.173033 master-0 kubenswrapper[19170]: I0313 01:44:16.173010 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bb8ffc699-rzlmb" event={"ID":"4cf27610-c39b-47d8-ae28-e5691381bfbb","Type":"ContainerDied","Data":"b5a78775a7f82b9c74a2b0ee4e2939a24a224762a55a7ca9fc494d9ad03c04a3"} Mar 13 01:44:16.173087 master-0 kubenswrapper[19170]: I0313 01:44:16.173038 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bb8ffc699-rzlmb" event={"ID":"4cf27610-c39b-47d8-ae28-e5691381bfbb","Type":"ContainerDied","Data":"29bde0b69cc84bba57145017942e17a22fc075998737e55c27fd2b2decf5433f"} Mar 13 01:44:16.176285 master-0 kubenswrapper[19170]: I0313 01:44:16.176221 19170 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bed0665-465b-42b5-8413-767afd28078c-config\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:16.176285 master-0 kubenswrapper[19170]: I0313 01:44:16.176281 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rh727\" (UniqueName: \"kubernetes.io/projected/4cf27610-c39b-47d8-ae28-e5691381bfbb-kube-api-access-rh727\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:16.176438 master-0 kubenswrapper[19170]: I0313 01:44:16.176294 19170 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cf27610-c39b-47d8-ae28-e5691381bfbb-config\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:16.176438 master-0 kubenswrapper[19170]: I0313 01:44:16.176303 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwjtr\" (UniqueName: \"kubernetes.io/projected/4bed0665-465b-42b5-8413-767afd28078c-kube-api-access-mwjtr\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:16.176438 master-0 kubenswrapper[19170]: I0313 01:44:16.176315 19170 reconciler_common.go:293] "Volume detached for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4bed0665-465b-42b5-8413-767afd28078c-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:16.176438 master-0 kubenswrapper[19170]: I0313 01:44:16.176323 19170 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4bed0665-465b-42b5-8413-767afd28078c-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:16.176438 master-0 kubenswrapper[19170]: I0313 01:44:16.176332 19170 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cf27610-c39b-47d8-ae28-e5691381bfbb-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:16.214780 master-0 kubenswrapper[19170]: I0313 01:44:16.212983 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5bf8b865dc-ljb9g" podStartSLOduration=3.212964677 podStartE2EDuration="3.212964677s" podCreationTimestamp="2026-03-13 01:44:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:44:16.196958646 +0000 UTC m=+1517.005079606" watchObservedRunningTime="2026-03-13 01:44:16.212964677 +0000 UTC m=+1517.021085627" Mar 13 01:44:16.293695 master-0 kubenswrapper[19170]: I0313 01:44:16.291127 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85859d6ccf-kpj6t"] Mar 13 01:44:16.312269 master-0 kubenswrapper[19170]: I0313 01:44:16.311689 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85859d6ccf-kpj6t"] Mar 13 01:44:16.326675 master-0 kubenswrapper[19170]: I0313 01:44:16.323411 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bb8ffc699-rzlmb"] Mar 13 01:44:16.333129 master-0 kubenswrapper[19170]: I0313 01:44:16.332734 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bb8ffc699-rzlmb"] 
Mar 13 01:44:16.380701 master-0 kubenswrapper[19170]: E0313 01:44:16.374749 19170 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4cf27610_c39b_47d8_ae28_e5691381bfbb.slice/crio-29bde0b69cc84bba57145017942e17a22fc075998737e55c27fd2b2decf5433f\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4bed0665_465b_42b5_8413_767afd28078c.slice\": RecentStats: unable to find data in memory cache]" Mar 13 01:44:17.454776 master-0 kubenswrapper[19170]: I0313 01:44:17.450031 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bed0665-465b-42b5-8413-767afd28078c" path="/var/lib/kubelet/pods/4bed0665-465b-42b5-8413-767afd28078c/volumes" Mar 13 01:44:17.454776 master-0 kubenswrapper[19170]: I0313 01:44:17.451350 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cf27610-c39b-47d8-ae28-e5691381bfbb" path="/var/lib/kubelet/pods/4cf27610-c39b-47d8-ae28-e5691381bfbb/volumes" Mar 13 01:44:18.434686 master-0 kubenswrapper[19170]: I0313 01:44:18.434626 19170 scope.go:117] "RemoveContainer" containerID="b5a78775a7f82b9c74a2b0ee4e2939a24a224762a55a7ca9fc494d9ad03c04a3" Mar 13 01:44:18.499656 master-0 kubenswrapper[19170]: I0313 01:44:18.499594 19170 scope.go:117] "RemoveContainer" containerID="4b7584e1266442efe1db5b5657210c80f3acf66fcbace0cab419b11afff14c9c" Mar 13 01:44:18.615834 master-0 kubenswrapper[19170]: I0313 01:44:18.615594 19170 scope.go:117] "RemoveContainer" containerID="b5a78775a7f82b9c74a2b0ee4e2939a24a224762a55a7ca9fc494d9ad03c04a3" Mar 13 01:44:18.616291 master-0 kubenswrapper[19170]: E0313 01:44:18.616200 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5a78775a7f82b9c74a2b0ee4e2939a24a224762a55a7ca9fc494d9ad03c04a3\": container with ID starting with 
b5a78775a7f82b9c74a2b0ee4e2939a24a224762a55a7ca9fc494d9ad03c04a3 not found: ID does not exist" containerID="b5a78775a7f82b9c74a2b0ee4e2939a24a224762a55a7ca9fc494d9ad03c04a3" Mar 13 01:44:18.616291 master-0 kubenswrapper[19170]: I0313 01:44:18.616259 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5a78775a7f82b9c74a2b0ee4e2939a24a224762a55a7ca9fc494d9ad03c04a3"} err="failed to get container status \"b5a78775a7f82b9c74a2b0ee4e2939a24a224762a55a7ca9fc494d9ad03c04a3\": rpc error: code = NotFound desc = could not find container \"b5a78775a7f82b9c74a2b0ee4e2939a24a224762a55a7ca9fc494d9ad03c04a3\": container with ID starting with b5a78775a7f82b9c74a2b0ee4e2939a24a224762a55a7ca9fc494d9ad03c04a3 not found: ID does not exist" Mar 13 01:44:18.616291 master-0 kubenswrapper[19170]: I0313 01:44:18.616295 19170 scope.go:117] "RemoveContainer" containerID="4b7584e1266442efe1db5b5657210c80f3acf66fcbace0cab419b11afff14c9c" Mar 13 01:44:18.616740 master-0 kubenswrapper[19170]: E0313 01:44:18.616671 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b7584e1266442efe1db5b5657210c80f3acf66fcbace0cab419b11afff14c9c\": container with ID starting with 4b7584e1266442efe1db5b5657210c80f3acf66fcbace0cab419b11afff14c9c not found: ID does not exist" containerID="4b7584e1266442efe1db5b5657210c80f3acf66fcbace0cab419b11afff14c9c" Mar 13 01:44:18.616740 master-0 kubenswrapper[19170]: I0313 01:44:18.616721 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b7584e1266442efe1db5b5657210c80f3acf66fcbace0cab419b11afff14c9c"} err="failed to get container status \"4b7584e1266442efe1db5b5657210c80f3acf66fcbace0cab419b11afff14c9c\": rpc error: code = NotFound desc = could not find container \"4b7584e1266442efe1db5b5657210c80f3acf66fcbace0cab419b11afff14c9c\": container with ID starting with 
4b7584e1266442efe1db5b5657210c80f3acf66fcbace0cab419b11afff14c9c not found: ID does not exist" Mar 13 01:44:18.975214 master-0 kubenswrapper[19170]: I0313 01:44:18.975171 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 13 01:44:19.056285 master-0 kubenswrapper[19170]: I0313 01:44:19.056214 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6d54c519-9366-45f7-86e2-ebcdd8bb2477-etc-swift\") pod \"swift-storage-0\" (UID: \"6d54c519-9366-45f7-86e2-ebcdd8bb2477\") " pod="openstack/swift-storage-0" Mar 13 01:44:19.056525 master-0 kubenswrapper[19170]: E0313 01:44:19.056468 19170 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 01:44:19.056525 master-0 kubenswrapper[19170]: E0313 01:44:19.056520 19170 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 01:44:19.056624 master-0 kubenswrapper[19170]: E0313 01:44:19.056602 19170 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6d54c519-9366-45f7-86e2-ebcdd8bb2477-etc-swift podName:6d54c519-9366-45f7-86e2-ebcdd8bb2477 nodeName:}" failed. No retries permitted until 2026-03-13 01:44:27.056575241 +0000 UTC m=+1527.864696191 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6d54c519-9366-45f7-86e2-ebcdd8bb2477-etc-swift") pod "swift-storage-0" (UID: "6d54c519-9366-45f7-86e2-ebcdd8bb2477") : configmap "swift-ring-files" not found Mar 13 01:44:19.210070 master-0 kubenswrapper[19170]: I0313 01:44:19.209943 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"1a735da0-4dae-4c25-9dd0-a277c75fc1b8","Type":"ContainerStarted","Data":"d2d1bd6f414624ecced022543abd87e85cc62de8f0e94a7641ce0dcfa699ec32"} Mar 13 01:44:19.212984 master-0 kubenswrapper[19170]: I0313 01:44:19.212949 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-n9bgl" event={"ID":"291c0e40-0b9b-4bd3-b8d6-1983878467ae","Type":"ContainerStarted","Data":"6f32b947e19f480059a85cd7f15b5eab2ba57f4183ed5db626e686a7344bc716"} Mar 13 01:44:19.242020 master-0 kubenswrapper[19170]: I0313 01:44:19.241935 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-n9bgl" podStartSLOduration=1.877428611 podStartE2EDuration="6.241914165s" podCreationTimestamp="2026-03-13 01:44:13 +0000 UTC" firstStartedPulling="2026-03-13 01:44:14.142346651 +0000 UTC m=+1514.950467611" lastFinishedPulling="2026-03-13 01:44:18.506832165 +0000 UTC m=+1519.314953165" observedRunningTime="2026-03-13 01:44:19.236973806 +0000 UTC m=+1520.045094776" watchObservedRunningTime="2026-03-13 01:44:19.241914165 +0000 UTC m=+1520.050035135" Mar 13 01:44:19.828716 master-0 kubenswrapper[19170]: I0313 01:44:19.828599 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 13 01:44:19.828716 master-0 kubenswrapper[19170]: I0313 01:44:19.828715 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 13 01:44:19.958609 master-0 kubenswrapper[19170]: I0313 01:44:19.958188 19170 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 13 01:44:20.307937 master-0 kubenswrapper[19170]: I0313 01:44:20.307894 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 13 01:44:20.800616 master-0 kubenswrapper[19170]: I0313 01:44:20.800472 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 13 01:44:20.800616 master-0 kubenswrapper[19170]: I0313 01:44:20.800534 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 13 01:44:20.902667 master-0 kubenswrapper[19170]: I0313 01:44:20.902575 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 13 01:44:21.247699 master-0 kubenswrapper[19170]: I0313 01:44:21.247660 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"1a735da0-4dae-4c25-9dd0-a277c75fc1b8","Type":"ContainerStarted","Data":"d86df209dc0b5fcd2b043633d9ab4404a56de141abb7c9df16c0861087869eb7"} Mar 13 01:44:21.247989 master-0 kubenswrapper[19170]: I0313 01:44:21.247969 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"1a735da0-4dae-4c25-9dd0-a277c75fc1b8","Type":"ContainerStarted","Data":"5355b3724fc1c7c3e260cac1ea1620d7a38be73486781ae635a6e30b5123f177"} Mar 13 01:44:21.248101 master-0 kubenswrapper[19170]: I0313 01:44:21.248087 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 13 01:44:21.273283 master-0 kubenswrapper[19170]: I0313 01:44:21.273201 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=5.111805545 podStartE2EDuration="6.273175041s" podCreationTimestamp="2026-03-13 01:44:15 +0000 UTC" firstStartedPulling="2026-03-13 01:44:18.975844926 
+0000 UTC m=+1519.783965886" lastFinishedPulling="2026-03-13 01:44:20.137214422 +0000 UTC m=+1520.945335382" observedRunningTime="2026-03-13 01:44:21.269701873 +0000 UTC m=+1522.077822863" watchObservedRunningTime="2026-03-13 01:44:21.273175041 +0000 UTC m=+1522.081296001" Mar 13 01:44:21.330231 master-0 kubenswrapper[19170]: I0313 01:44:21.330160 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 13 01:44:22.809695 master-0 kubenswrapper[19170]: I0313 01:44:22.809581 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-z9fml"] Mar 13 01:44:22.810874 master-0 kubenswrapper[19170]: E0313 01:44:22.810766 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cf27610-c39b-47d8-ae28-e5691381bfbb" containerName="init" Mar 13 01:44:22.810874 master-0 kubenswrapper[19170]: I0313 01:44:22.810785 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cf27610-c39b-47d8-ae28-e5691381bfbb" containerName="init" Mar 13 01:44:22.810874 master-0 kubenswrapper[19170]: E0313 01:44:22.810816 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bed0665-465b-42b5-8413-767afd28078c" containerName="init" Mar 13 01:44:22.810874 master-0 kubenswrapper[19170]: I0313 01:44:22.810824 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bed0665-465b-42b5-8413-767afd28078c" containerName="init" Mar 13 01:44:22.810874 master-0 kubenswrapper[19170]: E0313 01:44:22.810859 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cf27610-c39b-47d8-ae28-e5691381bfbb" containerName="dnsmasq-dns" Mar 13 01:44:22.810874 master-0 kubenswrapper[19170]: I0313 01:44:22.810869 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cf27610-c39b-47d8-ae28-e5691381bfbb" containerName="dnsmasq-dns" Mar 13 01:44:22.811307 master-0 kubenswrapper[19170]: I0313 01:44:22.811146 19170 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="4cf27610-c39b-47d8-ae28-e5691381bfbb" containerName="dnsmasq-dns" Mar 13 01:44:22.811307 master-0 kubenswrapper[19170]: I0313 01:44:22.811167 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bed0665-465b-42b5-8413-767afd28078c" containerName="init" Mar 13 01:44:22.811908 master-0 kubenswrapper[19170]: I0313 01:44:22.811865 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-z9fml" Mar 13 01:44:22.814172 master-0 kubenswrapper[19170]: I0313 01:44:22.814125 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 13 01:44:22.827757 master-0 kubenswrapper[19170]: I0313 01:44:22.827695 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-z9fml"] Mar 13 01:44:22.868865 master-0 kubenswrapper[19170]: I0313 01:44:22.868820 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/536a6b3f-315f-4544-b5fb-1fbb1c2b778d-operator-scripts\") pod \"root-account-create-update-z9fml\" (UID: \"536a6b3f-315f-4544-b5fb-1fbb1c2b778d\") " pod="openstack/root-account-create-update-z9fml" Mar 13 01:44:22.869159 master-0 kubenswrapper[19170]: I0313 01:44:22.868933 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld7vm\" (UniqueName: \"kubernetes.io/projected/536a6b3f-315f-4544-b5fb-1fbb1c2b778d-kube-api-access-ld7vm\") pod \"root-account-create-update-z9fml\" (UID: \"536a6b3f-315f-4544-b5fb-1fbb1c2b778d\") " pod="openstack/root-account-create-update-z9fml" Mar 13 01:44:22.970761 master-0 kubenswrapper[19170]: I0313 01:44:22.970677 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld7vm\" (UniqueName: 
\"kubernetes.io/projected/536a6b3f-315f-4544-b5fb-1fbb1c2b778d-kube-api-access-ld7vm\") pod \"root-account-create-update-z9fml\" (UID: \"536a6b3f-315f-4544-b5fb-1fbb1c2b778d\") " pod="openstack/root-account-create-update-z9fml" Mar 13 01:44:22.971017 master-0 kubenswrapper[19170]: I0313 01:44:22.970843 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/536a6b3f-315f-4544-b5fb-1fbb1c2b778d-operator-scripts\") pod \"root-account-create-update-z9fml\" (UID: \"536a6b3f-315f-4544-b5fb-1fbb1c2b778d\") " pod="openstack/root-account-create-update-z9fml" Mar 13 01:44:22.972495 master-0 kubenswrapper[19170]: I0313 01:44:22.971663 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/536a6b3f-315f-4544-b5fb-1fbb1c2b778d-operator-scripts\") pod \"root-account-create-update-z9fml\" (UID: \"536a6b3f-315f-4544-b5fb-1fbb1c2b778d\") " pod="openstack/root-account-create-update-z9fml" Mar 13 01:44:22.993267 master-0 kubenswrapper[19170]: I0313 01:44:22.993198 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld7vm\" (UniqueName: \"kubernetes.io/projected/536a6b3f-315f-4544-b5fb-1fbb1c2b778d-kube-api-access-ld7vm\") pod \"root-account-create-update-z9fml\" (UID: \"536a6b3f-315f-4544-b5fb-1fbb1c2b778d\") " pod="openstack/root-account-create-update-z9fml" Mar 13 01:44:23.201501 master-0 kubenswrapper[19170]: I0313 01:44:23.201431 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-z9fml" Mar 13 01:44:23.738240 master-0 kubenswrapper[19170]: I0313 01:44:23.737047 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-z9fml"] Mar 13 01:44:23.760074 master-0 kubenswrapper[19170]: W0313 01:44:23.760003 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod536a6b3f_315f_4544_b5fb_1fbb1c2b778d.slice/crio-1555a4a04a4d39910d12a9e1983e461af4f20dd0191c207ac814a49a25c1295a WatchSource:0}: Error finding container 1555a4a04a4d39910d12a9e1983e461af4f20dd0191c207ac814a49a25c1295a: Status 404 returned error can't find the container with id 1555a4a04a4d39910d12a9e1983e461af4f20dd0191c207ac814a49a25c1295a Mar 13 01:44:24.075314 master-0 kubenswrapper[19170]: I0313 01:44:24.075207 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5bf8b865dc-ljb9g" Mar 13 01:44:24.224673 master-0 kubenswrapper[19170]: I0313 01:44:24.223226 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6ff8fd9d5c-bpv5h"] Mar 13 01:44:24.224673 master-0 kubenswrapper[19170]: I0313 01:44:24.223451 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6ff8fd9d5c-bpv5h" podUID="689b3b6a-c7b8-4600-9b61-bfb9ff1eba33" containerName="dnsmasq-dns" containerID="cri-o://b44fa36482453123a462b48d1b01202ea060ff040aa652aa9e5aedde3cbaa155" gracePeriod=10 Mar 13 01:44:24.305947 master-0 kubenswrapper[19170]: I0313 01:44:24.305882 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-z9fml" event={"ID":"536a6b3f-315f-4544-b5fb-1fbb1c2b778d","Type":"ContainerStarted","Data":"085e90ae6d978797e245149b313e40d43e85b75b81158858d76603f04cefddab"} Mar 13 01:44:24.306102 master-0 kubenswrapper[19170]: I0313 01:44:24.305950 19170 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/root-account-create-update-z9fml" event={"ID":"536a6b3f-315f-4544-b5fb-1fbb1c2b778d","Type":"ContainerStarted","Data":"1555a4a04a4d39910d12a9e1983e461af4f20dd0191c207ac814a49a25c1295a"} Mar 13 01:44:24.341012 master-0 kubenswrapper[19170]: I0313 01:44:24.340805 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-z9fml" podStartSLOduration=2.340779768 podStartE2EDuration="2.340779768s" podCreationTimestamp="2026-03-13 01:44:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:44:24.329083219 +0000 UTC m=+1525.137204179" watchObservedRunningTime="2026-03-13 01:44:24.340779768 +0000 UTC m=+1525.148900738" Mar 13 01:44:24.879236 master-0 kubenswrapper[19170]: I0313 01:44:24.879186 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ff8fd9d5c-bpv5h" Mar 13 01:44:24.932840 master-0 kubenswrapper[19170]: I0313 01:44:24.931632 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ds5k\" (UniqueName: \"kubernetes.io/projected/689b3b6a-c7b8-4600-9b61-bfb9ff1eba33-kube-api-access-2ds5k\") pod \"689b3b6a-c7b8-4600-9b61-bfb9ff1eba33\" (UID: \"689b3b6a-c7b8-4600-9b61-bfb9ff1eba33\") " Mar 13 01:44:24.932840 master-0 kubenswrapper[19170]: I0313 01:44:24.931897 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/689b3b6a-c7b8-4600-9b61-bfb9ff1eba33-config\") pod \"689b3b6a-c7b8-4600-9b61-bfb9ff1eba33\" (UID: \"689b3b6a-c7b8-4600-9b61-bfb9ff1eba33\") " Mar 13 01:44:24.932840 master-0 kubenswrapper[19170]: I0313 01:44:24.931936 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/689b3b6a-c7b8-4600-9b61-bfb9ff1eba33-dns-svc\") pod \"689b3b6a-c7b8-4600-9b61-bfb9ff1eba33\" (UID: \"689b3b6a-c7b8-4600-9b61-bfb9ff1eba33\") " Mar 13 01:44:24.937392 master-0 kubenswrapper[19170]: I0313 01:44:24.937329 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/689b3b6a-c7b8-4600-9b61-bfb9ff1eba33-kube-api-access-2ds5k" (OuterVolumeSpecName: "kube-api-access-2ds5k") pod "689b3b6a-c7b8-4600-9b61-bfb9ff1eba33" (UID: "689b3b6a-c7b8-4600-9b61-bfb9ff1eba33"). InnerVolumeSpecName "kube-api-access-2ds5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:44:24.984662 master-0 kubenswrapper[19170]: I0313 01:44:24.984597 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/689b3b6a-c7b8-4600-9b61-bfb9ff1eba33-config" (OuterVolumeSpecName: "config") pod "689b3b6a-c7b8-4600-9b61-bfb9ff1eba33" (UID: "689b3b6a-c7b8-4600-9b61-bfb9ff1eba33"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:44:25.027142 master-0 kubenswrapper[19170]: I0313 01:44:25.027071 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/689b3b6a-c7b8-4600-9b61-bfb9ff1eba33-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "689b3b6a-c7b8-4600-9b61-bfb9ff1eba33" (UID: "689b3b6a-c7b8-4600-9b61-bfb9ff1eba33"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:44:25.035270 master-0 kubenswrapper[19170]: I0313 01:44:25.035222 19170 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/689b3b6a-c7b8-4600-9b61-bfb9ff1eba33-config\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:25.035270 master-0 kubenswrapper[19170]: I0313 01:44:25.035270 19170 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/689b3b6a-c7b8-4600-9b61-bfb9ff1eba33-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:25.035453 master-0 kubenswrapper[19170]: I0313 01:44:25.035288 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ds5k\" (UniqueName: \"kubernetes.io/projected/689b3b6a-c7b8-4600-9b61-bfb9ff1eba33-kube-api-access-2ds5k\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:25.315461 master-0 kubenswrapper[19170]: I0313 01:44:25.315410 19170 generic.go:334] "Generic (PLEG): container finished" podID="689b3b6a-c7b8-4600-9b61-bfb9ff1eba33" containerID="b44fa36482453123a462b48d1b01202ea060ff040aa652aa9e5aedde3cbaa155" exitCode=0 Mar 13 01:44:25.315860 master-0 kubenswrapper[19170]: I0313 01:44:25.315475 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ff8fd9d5c-bpv5h" event={"ID":"689b3b6a-c7b8-4600-9b61-bfb9ff1eba33","Type":"ContainerDied","Data":"b44fa36482453123a462b48d1b01202ea060ff040aa652aa9e5aedde3cbaa155"} Mar 13 01:44:25.315860 master-0 kubenswrapper[19170]: I0313 01:44:25.315501 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ff8fd9d5c-bpv5h" event={"ID":"689b3b6a-c7b8-4600-9b61-bfb9ff1eba33","Type":"ContainerDied","Data":"ec03b38fd13e59ba75c91d3857f06091791bc53ca1c0413d0e0e5cf6cbdb4fa5"} Mar 13 01:44:25.315860 master-0 kubenswrapper[19170]: I0313 01:44:25.315516 19170 scope.go:117] "RemoveContainer" 
containerID="b44fa36482453123a462b48d1b01202ea060ff040aa652aa9e5aedde3cbaa155" Mar 13 01:44:25.315860 master-0 kubenswrapper[19170]: I0313 01:44:25.315617 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ff8fd9d5c-bpv5h" Mar 13 01:44:25.317911 master-0 kubenswrapper[19170]: I0313 01:44:25.317886 19170 generic.go:334] "Generic (PLEG): container finished" podID="536a6b3f-315f-4544-b5fb-1fbb1c2b778d" containerID="085e90ae6d978797e245149b313e40d43e85b75b81158858d76603f04cefddab" exitCode=0 Mar 13 01:44:25.317968 master-0 kubenswrapper[19170]: I0313 01:44:25.317933 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-z9fml" event={"ID":"536a6b3f-315f-4544-b5fb-1fbb1c2b778d","Type":"ContainerDied","Data":"085e90ae6d978797e245149b313e40d43e85b75b81158858d76603f04cefddab"} Mar 13 01:44:25.411845 master-0 kubenswrapper[19170]: I0313 01:44:25.411668 19170 scope.go:117] "RemoveContainer" containerID="918ae0793867409d25dd7c950908c952d2b9a8e6aa731cfb131a8fe9ea587b7e" Mar 13 01:44:25.437299 master-0 kubenswrapper[19170]: I0313 01:44:25.437242 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6ff8fd9d5c-bpv5h"] Mar 13 01:44:25.437522 master-0 kubenswrapper[19170]: I0313 01:44:25.437304 19170 scope.go:117] "RemoveContainer" containerID="b44fa36482453123a462b48d1b01202ea060ff040aa652aa9e5aedde3cbaa155" Mar 13 01:44:25.437829 master-0 kubenswrapper[19170]: E0313 01:44:25.437778 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b44fa36482453123a462b48d1b01202ea060ff040aa652aa9e5aedde3cbaa155\": container with ID starting with b44fa36482453123a462b48d1b01202ea060ff040aa652aa9e5aedde3cbaa155 not found: ID does not exist" containerID="b44fa36482453123a462b48d1b01202ea060ff040aa652aa9e5aedde3cbaa155" Mar 13 01:44:25.437906 master-0 kubenswrapper[19170]: I0313 
01:44:25.437829 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b44fa36482453123a462b48d1b01202ea060ff040aa652aa9e5aedde3cbaa155"} err="failed to get container status \"b44fa36482453123a462b48d1b01202ea060ff040aa652aa9e5aedde3cbaa155\": rpc error: code = NotFound desc = could not find container \"b44fa36482453123a462b48d1b01202ea060ff040aa652aa9e5aedde3cbaa155\": container with ID starting with b44fa36482453123a462b48d1b01202ea060ff040aa652aa9e5aedde3cbaa155 not found: ID does not exist" Mar 13 01:44:25.437906 master-0 kubenswrapper[19170]: I0313 01:44:25.437854 19170 scope.go:117] "RemoveContainer" containerID="918ae0793867409d25dd7c950908c952d2b9a8e6aa731cfb131a8fe9ea587b7e" Mar 13 01:44:25.438209 master-0 kubenswrapper[19170]: E0313 01:44:25.438165 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"918ae0793867409d25dd7c950908c952d2b9a8e6aa731cfb131a8fe9ea587b7e\": container with ID starting with 918ae0793867409d25dd7c950908c952d2b9a8e6aa731cfb131a8fe9ea587b7e not found: ID does not exist" containerID="918ae0793867409d25dd7c950908c952d2b9a8e6aa731cfb131a8fe9ea587b7e" Mar 13 01:44:25.438277 master-0 kubenswrapper[19170]: I0313 01:44:25.438205 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"918ae0793867409d25dd7c950908c952d2b9a8e6aa731cfb131a8fe9ea587b7e"} err="failed to get container status \"918ae0793867409d25dd7c950908c952d2b9a8e6aa731cfb131a8fe9ea587b7e\": rpc error: code = NotFound desc = could not find container \"918ae0793867409d25dd7c950908c952d2b9a8e6aa731cfb131a8fe9ea587b7e\": container with ID starting with 918ae0793867409d25dd7c950908c952d2b9a8e6aa731cfb131a8fe9ea587b7e not found: ID does not exist" Mar 13 01:44:25.439898 master-0 kubenswrapper[19170]: I0313 01:44:25.439855 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-6ff8fd9d5c-bpv5h"] Mar 13 01:44:25.777271 master-0 kubenswrapper[19170]: I0313 01:44:25.777204 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-bk4qg"] Mar 13 01:44:25.779240 master-0 kubenswrapper[19170]: E0313 01:44:25.779149 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="689b3b6a-c7b8-4600-9b61-bfb9ff1eba33" containerName="init" Mar 13 01:44:25.779310 master-0 kubenswrapper[19170]: I0313 01:44:25.779242 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="689b3b6a-c7b8-4600-9b61-bfb9ff1eba33" containerName="init" Mar 13 01:44:25.779430 master-0 kubenswrapper[19170]: E0313 01:44:25.779396 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="689b3b6a-c7b8-4600-9b61-bfb9ff1eba33" containerName="dnsmasq-dns" Mar 13 01:44:25.779430 master-0 kubenswrapper[19170]: I0313 01:44:25.779423 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="689b3b6a-c7b8-4600-9b61-bfb9ff1eba33" containerName="dnsmasq-dns" Mar 13 01:44:25.780166 master-0 kubenswrapper[19170]: I0313 01:44:25.780119 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="689b3b6a-c7b8-4600-9b61-bfb9ff1eba33" containerName="dnsmasq-dns" Mar 13 01:44:25.788242 master-0 kubenswrapper[19170]: I0313 01:44:25.788189 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-bk4qg"] Mar 13 01:44:25.788343 master-0 kubenswrapper[19170]: I0313 01:44:25.788298 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-bk4qg" Mar 13 01:44:25.852554 master-0 kubenswrapper[19170]: I0313 01:44:25.850534 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg6cl\" (UniqueName: \"kubernetes.io/projected/a9f33b5a-2935-456d-9895-8fd0285664a5-kube-api-access-bg6cl\") pod \"glance-db-create-bk4qg\" (UID: \"a9f33b5a-2935-456d-9895-8fd0285664a5\") " pod="openstack/glance-db-create-bk4qg" Mar 13 01:44:25.852554 master-0 kubenswrapper[19170]: I0313 01:44:25.850619 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9f33b5a-2935-456d-9895-8fd0285664a5-operator-scripts\") pod \"glance-db-create-bk4qg\" (UID: \"a9f33b5a-2935-456d-9895-8fd0285664a5\") " pod="openstack/glance-db-create-bk4qg" Mar 13 01:44:25.924915 master-0 kubenswrapper[19170]: I0313 01:44:25.921506 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-d458-account-create-update-g46pf"] Mar 13 01:44:25.926908 master-0 kubenswrapper[19170]: I0313 01:44:25.926861 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-d458-account-create-update-g46pf" Mar 13 01:44:25.929882 master-0 kubenswrapper[19170]: I0313 01:44:25.929827 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 13 01:44:25.956273 master-0 kubenswrapper[19170]: I0313 01:44:25.956208 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-d458-account-create-update-g46pf"] Mar 13 01:44:25.957022 master-0 kubenswrapper[19170]: I0313 01:44:25.956980 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg6cl\" (UniqueName: \"kubernetes.io/projected/a9f33b5a-2935-456d-9895-8fd0285664a5-kube-api-access-bg6cl\") pod \"glance-db-create-bk4qg\" (UID: \"a9f33b5a-2935-456d-9895-8fd0285664a5\") " pod="openstack/glance-db-create-bk4qg" Mar 13 01:44:25.957181 master-0 kubenswrapper[19170]: I0313 01:44:25.957162 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9f33b5a-2935-456d-9895-8fd0285664a5-operator-scripts\") pod \"glance-db-create-bk4qg\" (UID: \"a9f33b5a-2935-456d-9895-8fd0285664a5\") " pod="openstack/glance-db-create-bk4qg" Mar 13 01:44:25.959028 master-0 kubenswrapper[19170]: I0313 01:44:25.959002 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9f33b5a-2935-456d-9895-8fd0285664a5-operator-scripts\") pod \"glance-db-create-bk4qg\" (UID: \"a9f33b5a-2935-456d-9895-8fd0285664a5\") " pod="openstack/glance-db-create-bk4qg" Mar 13 01:44:25.978791 master-0 kubenswrapper[19170]: I0313 01:44:25.978698 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg6cl\" (UniqueName: \"kubernetes.io/projected/a9f33b5a-2935-456d-9895-8fd0285664a5-kube-api-access-bg6cl\") pod \"glance-db-create-bk4qg\" (UID: \"a9f33b5a-2935-456d-9895-8fd0285664a5\") " 
pod="openstack/glance-db-create-bk4qg" Mar 13 01:44:26.060137 master-0 kubenswrapper[19170]: I0313 01:44:26.059900 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbsh4\" (UniqueName: \"kubernetes.io/projected/95fca356-91a3-4e52-aa79-57ad8399523e-kube-api-access-bbsh4\") pod \"glance-d458-account-create-update-g46pf\" (UID: \"95fca356-91a3-4e52-aa79-57ad8399523e\") " pod="openstack/glance-d458-account-create-update-g46pf" Mar 13 01:44:26.060137 master-0 kubenswrapper[19170]: I0313 01:44:26.060025 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95fca356-91a3-4e52-aa79-57ad8399523e-operator-scripts\") pod \"glance-d458-account-create-update-g46pf\" (UID: \"95fca356-91a3-4e52-aa79-57ad8399523e\") " pod="openstack/glance-d458-account-create-update-g46pf" Mar 13 01:44:26.106581 master-0 kubenswrapper[19170]: I0313 01:44:26.106469 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-bk4qg" Mar 13 01:44:26.162751 master-0 kubenswrapper[19170]: I0313 01:44:26.161768 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbsh4\" (UniqueName: \"kubernetes.io/projected/95fca356-91a3-4e52-aa79-57ad8399523e-kube-api-access-bbsh4\") pod \"glance-d458-account-create-update-g46pf\" (UID: \"95fca356-91a3-4e52-aa79-57ad8399523e\") " pod="openstack/glance-d458-account-create-update-g46pf" Mar 13 01:44:26.162751 master-0 kubenswrapper[19170]: I0313 01:44:26.161859 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95fca356-91a3-4e52-aa79-57ad8399523e-operator-scripts\") pod \"glance-d458-account-create-update-g46pf\" (UID: \"95fca356-91a3-4e52-aa79-57ad8399523e\") " pod="openstack/glance-d458-account-create-update-g46pf" Mar 13 01:44:26.163027 master-0 kubenswrapper[19170]: I0313 01:44:26.162937 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95fca356-91a3-4e52-aa79-57ad8399523e-operator-scripts\") pod \"glance-d458-account-create-update-g46pf\" (UID: \"95fca356-91a3-4e52-aa79-57ad8399523e\") " pod="openstack/glance-d458-account-create-update-g46pf" Mar 13 01:44:26.178970 master-0 kubenswrapper[19170]: I0313 01:44:26.178912 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbsh4\" (UniqueName: \"kubernetes.io/projected/95fca356-91a3-4e52-aa79-57ad8399523e-kube-api-access-bbsh4\") pod \"glance-d458-account-create-update-g46pf\" (UID: \"95fca356-91a3-4e52-aa79-57ad8399523e\") " pod="openstack/glance-d458-account-create-update-g46pf" Mar 13 01:44:26.254878 master-0 kubenswrapper[19170]: I0313 01:44:26.254785 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-d458-account-create-update-g46pf" Mar 13 01:44:26.353990 master-0 kubenswrapper[19170]: I0313 01:44:26.352942 19170 generic.go:334] "Generic (PLEG): container finished" podID="291c0e40-0b9b-4bd3-b8d6-1983878467ae" containerID="6f32b947e19f480059a85cd7f15b5eab2ba57f4183ed5db626e686a7344bc716" exitCode=0 Mar 13 01:44:26.353990 master-0 kubenswrapper[19170]: I0313 01:44:26.353045 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-n9bgl" event={"ID":"291c0e40-0b9b-4bd3-b8d6-1983878467ae","Type":"ContainerDied","Data":"6f32b947e19f480059a85cd7f15b5eab2ba57f4183ed5db626e686a7344bc716"} Mar 13 01:44:26.747040 master-0 kubenswrapper[19170]: I0313 01:44:26.746965 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-jwmhl"] Mar 13 01:44:26.750725 master-0 kubenswrapper[19170]: I0313 01:44:26.749177 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jwmhl" Mar 13 01:44:26.785211 master-0 kubenswrapper[19170]: I0313 01:44:26.785170 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68585267-5574-4c61-98c1-9ddeb2015743-operator-scripts\") pod \"keystone-db-create-jwmhl\" (UID: \"68585267-5574-4c61-98c1-9ddeb2015743\") " pod="openstack/keystone-db-create-jwmhl" Mar 13 01:44:26.785767 master-0 kubenswrapper[19170]: I0313 01:44:26.785749 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpwpd\" (UniqueName: \"kubernetes.io/projected/68585267-5574-4c61-98c1-9ddeb2015743-kube-api-access-fpwpd\") pod \"keystone-db-create-jwmhl\" (UID: \"68585267-5574-4c61-98c1-9ddeb2015743\") " pod="openstack/keystone-db-create-jwmhl" Mar 13 01:44:26.855519 master-0 kubenswrapper[19170]: I0313 01:44:26.855417 19170 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/keystone-db-create-jwmhl"] Mar 13 01:44:26.877674 master-0 kubenswrapper[19170]: I0313 01:44:26.874742 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-bk4qg"] Mar 13 01:44:26.892695 master-0 kubenswrapper[19170]: I0313 01:44:26.887141 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpwpd\" (UniqueName: \"kubernetes.io/projected/68585267-5574-4c61-98c1-9ddeb2015743-kube-api-access-fpwpd\") pod \"keystone-db-create-jwmhl\" (UID: \"68585267-5574-4c61-98c1-9ddeb2015743\") " pod="openstack/keystone-db-create-jwmhl" Mar 13 01:44:26.892695 master-0 kubenswrapper[19170]: I0313 01:44:26.887277 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68585267-5574-4c61-98c1-9ddeb2015743-operator-scripts\") pod \"keystone-db-create-jwmhl\" (UID: \"68585267-5574-4c61-98c1-9ddeb2015743\") " pod="openstack/keystone-db-create-jwmhl" Mar 13 01:44:26.892695 master-0 kubenswrapper[19170]: I0313 01:44:26.888352 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68585267-5574-4c61-98c1-9ddeb2015743-operator-scripts\") pod \"keystone-db-create-jwmhl\" (UID: \"68585267-5574-4c61-98c1-9ddeb2015743\") " pod="openstack/keystone-db-create-jwmhl" Mar 13 01:44:26.921038 master-0 kubenswrapper[19170]: I0313 01:44:26.920989 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpwpd\" (UniqueName: \"kubernetes.io/projected/68585267-5574-4c61-98c1-9ddeb2015743-kube-api-access-fpwpd\") pod \"keystone-db-create-jwmhl\" (UID: \"68585267-5574-4c61-98c1-9ddeb2015743\") " pod="openstack/keystone-db-create-jwmhl" Mar 13 01:44:26.933435 master-0 kubenswrapper[19170]: W0313 01:44:26.933338 19170 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95fca356_91a3_4e52_aa79_57ad8399523e.slice/crio-a8a0b584b04210045a55a73f098fddbce42e668b3d843511908d33494fc1bf28 WatchSource:0}: Error finding container a8a0b584b04210045a55a73f098fddbce42e668b3d843511908d33494fc1bf28: Status 404 returned error can't find the container with id a8a0b584b04210045a55a73f098fddbce42e668b3d843511908d33494fc1bf28 Mar 13 01:44:26.999793 master-0 kubenswrapper[19170]: I0313 01:44:26.999734 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-d458-account-create-update-g46pf"] Mar 13 01:44:27.005428 master-0 kubenswrapper[19170]: I0313 01:44:27.005380 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-z9fml" Mar 13 01:44:27.018507 master-0 kubenswrapper[19170]: I0313 01:44:27.015672 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-9356-account-create-update-vvlhq"] Mar 13 01:44:27.018507 master-0 kubenswrapper[19170]: E0313 01:44:27.016325 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="536a6b3f-315f-4544-b5fb-1fbb1c2b778d" containerName="mariadb-account-create-update" Mar 13 01:44:27.018507 master-0 kubenswrapper[19170]: I0313 01:44:27.016341 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="536a6b3f-315f-4544-b5fb-1fbb1c2b778d" containerName="mariadb-account-create-update" Mar 13 01:44:27.018507 master-0 kubenswrapper[19170]: I0313 01:44:27.016541 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="536a6b3f-315f-4544-b5fb-1fbb1c2b778d" containerName="mariadb-account-create-update" Mar 13 01:44:27.018507 master-0 kubenswrapper[19170]: I0313 01:44:27.017335 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-9356-account-create-update-vvlhq" Mar 13 01:44:27.019991 master-0 kubenswrapper[19170]: I0313 01:44:27.019302 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 13 01:44:27.038616 master-0 kubenswrapper[19170]: I0313 01:44:27.037285 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-9356-account-create-update-vvlhq"] Mar 13 01:44:27.051263 master-0 kubenswrapper[19170]: I0313 01:44:27.050994 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-msnk6"] Mar 13 01:44:27.052943 master-0 kubenswrapper[19170]: I0313 01:44:27.052910 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-msnk6" Mar 13 01:44:27.067472 master-0 kubenswrapper[19170]: I0313 01:44:27.065753 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-4e02-account-create-update-rvgbp"] Mar 13 01:44:27.068160 master-0 kubenswrapper[19170]: I0313 01:44:27.067751 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-4e02-account-create-update-rvgbp" Mar 13 01:44:27.077760 master-0 kubenswrapper[19170]: I0313 01:44:27.068688 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-jwmhl" Mar 13 01:44:27.077760 master-0 kubenswrapper[19170]: I0313 01:44:27.069381 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 13 01:44:27.091301 master-0 kubenswrapper[19170]: I0313 01:44:27.091249 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ld7vm\" (UniqueName: \"kubernetes.io/projected/536a6b3f-315f-4544-b5fb-1fbb1c2b778d-kube-api-access-ld7vm\") pod \"536a6b3f-315f-4544-b5fb-1fbb1c2b778d\" (UID: \"536a6b3f-315f-4544-b5fb-1fbb1c2b778d\") " Mar 13 01:44:27.091518 master-0 kubenswrapper[19170]: I0313 01:44:27.091356 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/536a6b3f-315f-4544-b5fb-1fbb1c2b778d-operator-scripts\") pod \"536a6b3f-315f-4544-b5fb-1fbb1c2b778d\" (UID: \"536a6b3f-315f-4544-b5fb-1fbb1c2b778d\") " Mar 13 01:44:27.091817 master-0 kubenswrapper[19170]: I0313 01:44:27.091789 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86eb32e2-6a29-42bf-9b09-2f6cbe41a0eb-operator-scripts\") pod \"placement-db-create-msnk6\" (UID: \"86eb32e2-6a29-42bf-9b09-2f6cbe41a0eb\") " pod="openstack/placement-db-create-msnk6" Mar 13 01:44:27.091865 master-0 kubenswrapper[19170]: I0313 01:44:27.091837 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9svl\" (UniqueName: \"kubernetes.io/projected/20df490e-cdef-41c3-b9c2-b9724a4d5ac9-kube-api-access-k9svl\") pod \"keystone-9356-account-create-update-vvlhq\" (UID: \"20df490e-cdef-41c3-b9c2-b9724a4d5ac9\") " pod="openstack/keystone-9356-account-create-update-vvlhq" Mar 13 01:44:27.091905 master-0 kubenswrapper[19170]: I0313 01:44:27.091890 19170 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6d54c519-9366-45f7-86e2-ebcdd8bb2477-etc-swift\") pod \"swift-storage-0\" (UID: \"6d54c519-9366-45f7-86e2-ebcdd8bb2477\") " pod="openstack/swift-storage-0" Mar 13 01:44:27.092564 master-0 kubenswrapper[19170]: I0313 01:44:27.092049 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20df490e-cdef-41c3-b9c2-b9724a4d5ac9-operator-scripts\") pod \"keystone-9356-account-create-update-vvlhq\" (UID: \"20df490e-cdef-41c3-b9c2-b9724a4d5ac9\") " pod="openstack/keystone-9356-account-create-update-vvlhq" Mar 13 01:44:27.092564 master-0 kubenswrapper[19170]: I0313 01:44:27.092115 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/536a6b3f-315f-4544-b5fb-1fbb1c2b778d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "536a6b3f-315f-4544-b5fb-1fbb1c2b778d" (UID: "536a6b3f-315f-4544-b5fb-1fbb1c2b778d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:44:27.092564 master-0 kubenswrapper[19170]: I0313 01:44:27.092514 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px4jn\" (UniqueName: \"kubernetes.io/projected/86eb32e2-6a29-42bf-9b09-2f6cbe41a0eb-kube-api-access-px4jn\") pod \"placement-db-create-msnk6\" (UID: \"86eb32e2-6a29-42bf-9b09-2f6cbe41a0eb\") " pod="openstack/placement-db-create-msnk6" Mar 13 01:44:27.093209 master-0 kubenswrapper[19170]: I0313 01:44:27.093167 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-msnk6"] Mar 13 01:44:27.093418 master-0 kubenswrapper[19170]: I0313 01:44:27.093383 19170 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/536a6b3f-315f-4544-b5fb-1fbb1c2b778d-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:27.096579 master-0 kubenswrapper[19170]: I0313 01:44:27.095060 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/536a6b3f-315f-4544-b5fb-1fbb1c2b778d-kube-api-access-ld7vm" (OuterVolumeSpecName: "kube-api-access-ld7vm") pod "536a6b3f-315f-4544-b5fb-1fbb1c2b778d" (UID: "536a6b3f-315f-4544-b5fb-1fbb1c2b778d"). InnerVolumeSpecName "kube-api-access-ld7vm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:44:27.096579 master-0 kubenswrapper[19170]: I0313 01:44:27.096526 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6d54c519-9366-45f7-86e2-ebcdd8bb2477-etc-swift\") pod \"swift-storage-0\" (UID: \"6d54c519-9366-45f7-86e2-ebcdd8bb2477\") " pod="openstack/swift-storage-0" Mar 13 01:44:27.103808 master-0 kubenswrapper[19170]: I0313 01:44:27.103741 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-4e02-account-create-update-rvgbp"] Mar 13 01:44:27.197838 master-0 kubenswrapper[19170]: I0313 01:44:27.197771 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86eb32e2-6a29-42bf-9b09-2f6cbe41a0eb-operator-scripts\") pod \"placement-db-create-msnk6\" (UID: \"86eb32e2-6a29-42bf-9b09-2f6cbe41a0eb\") " pod="openstack/placement-db-create-msnk6" Mar 13 01:44:27.198857 master-0 kubenswrapper[19170]: I0313 01:44:27.198771 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9svl\" (UniqueName: \"kubernetes.io/projected/20df490e-cdef-41c3-b9c2-b9724a4d5ac9-kube-api-access-k9svl\") pod \"keystone-9356-account-create-update-vvlhq\" (UID: \"20df490e-cdef-41c3-b9c2-b9724a4d5ac9\") " pod="openstack/keystone-9356-account-create-update-vvlhq" Mar 13 01:44:27.199023 master-0 kubenswrapper[19170]: I0313 01:44:27.198790 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86eb32e2-6a29-42bf-9b09-2f6cbe41a0eb-operator-scripts\") pod \"placement-db-create-msnk6\" (UID: \"86eb32e2-6a29-42bf-9b09-2f6cbe41a0eb\") " pod="openstack/placement-db-create-msnk6" Mar 13 01:44:27.199241 master-0 kubenswrapper[19170]: I0313 01:44:27.199194 19170 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqktl\" (UniqueName: \"kubernetes.io/projected/c35d0626-7620-40f1-b7ca-d6eef1cc775e-kube-api-access-tqktl\") pod \"placement-4e02-account-create-update-rvgbp\" (UID: \"c35d0626-7620-40f1-b7ca-d6eef1cc775e\") " pod="openstack/placement-4e02-account-create-update-rvgbp" Mar 13 01:44:27.199285 master-0 kubenswrapper[19170]: I0313 01:44:27.199265 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c35d0626-7620-40f1-b7ca-d6eef1cc775e-operator-scripts\") pod \"placement-4e02-account-create-update-rvgbp\" (UID: \"c35d0626-7620-40f1-b7ca-d6eef1cc775e\") " pod="openstack/placement-4e02-account-create-update-rvgbp" Mar 13 01:44:27.199578 master-0 kubenswrapper[19170]: I0313 01:44:27.199522 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20df490e-cdef-41c3-b9c2-b9724a4d5ac9-operator-scripts\") pod \"keystone-9356-account-create-update-vvlhq\" (UID: \"20df490e-cdef-41c3-b9c2-b9724a4d5ac9\") " pod="openstack/keystone-9356-account-create-update-vvlhq" Mar 13 01:44:27.199780 master-0 kubenswrapper[19170]: I0313 01:44:27.199734 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px4jn\" (UniqueName: \"kubernetes.io/projected/86eb32e2-6a29-42bf-9b09-2f6cbe41a0eb-kube-api-access-px4jn\") pod \"placement-db-create-msnk6\" (UID: \"86eb32e2-6a29-42bf-9b09-2f6cbe41a0eb\") " pod="openstack/placement-db-create-msnk6" Mar 13 01:44:27.199977 master-0 kubenswrapper[19170]: I0313 01:44:27.199952 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ld7vm\" (UniqueName: \"kubernetes.io/projected/536a6b3f-315f-4544-b5fb-1fbb1c2b778d-kube-api-access-ld7vm\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:27.200462 master-0 
kubenswrapper[19170]: I0313 01:44:27.200426 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20df490e-cdef-41c3-b9c2-b9724a4d5ac9-operator-scripts\") pod \"keystone-9356-account-create-update-vvlhq\" (UID: \"20df490e-cdef-41c3-b9c2-b9724a4d5ac9\") " pod="openstack/keystone-9356-account-create-update-vvlhq" Mar 13 01:44:27.216079 master-0 kubenswrapper[19170]: I0313 01:44:27.216039 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px4jn\" (UniqueName: \"kubernetes.io/projected/86eb32e2-6a29-42bf-9b09-2f6cbe41a0eb-kube-api-access-px4jn\") pod \"placement-db-create-msnk6\" (UID: \"86eb32e2-6a29-42bf-9b09-2f6cbe41a0eb\") " pod="openstack/placement-db-create-msnk6" Mar 13 01:44:27.219968 master-0 kubenswrapper[19170]: I0313 01:44:27.219911 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9svl\" (UniqueName: \"kubernetes.io/projected/20df490e-cdef-41c3-b9c2-b9724a4d5ac9-kube-api-access-k9svl\") pod \"keystone-9356-account-create-update-vvlhq\" (UID: \"20df490e-cdef-41c3-b9c2-b9724a4d5ac9\") " pod="openstack/keystone-9356-account-create-update-vvlhq" Mar 13 01:44:27.302100 master-0 kubenswrapper[19170]: I0313 01:44:27.302035 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c35d0626-7620-40f1-b7ca-d6eef1cc775e-operator-scripts\") pod \"placement-4e02-account-create-update-rvgbp\" (UID: \"c35d0626-7620-40f1-b7ca-d6eef1cc775e\") " pod="openstack/placement-4e02-account-create-update-rvgbp" Mar 13 01:44:27.302100 master-0 kubenswrapper[19170]: I0313 01:44:27.302102 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqktl\" (UniqueName: \"kubernetes.io/projected/c35d0626-7620-40f1-b7ca-d6eef1cc775e-kube-api-access-tqktl\") pod 
\"placement-4e02-account-create-update-rvgbp\" (UID: \"c35d0626-7620-40f1-b7ca-d6eef1cc775e\") " pod="openstack/placement-4e02-account-create-update-rvgbp" Mar 13 01:44:27.303885 master-0 kubenswrapper[19170]: I0313 01:44:27.302898 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c35d0626-7620-40f1-b7ca-d6eef1cc775e-operator-scripts\") pod \"placement-4e02-account-create-update-rvgbp\" (UID: \"c35d0626-7620-40f1-b7ca-d6eef1cc775e\") " pod="openstack/placement-4e02-account-create-update-rvgbp" Mar 13 01:44:27.334556 master-0 kubenswrapper[19170]: I0313 01:44:27.333945 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 13 01:44:27.343024 master-0 kubenswrapper[19170]: I0313 01:44:27.342977 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqktl\" (UniqueName: \"kubernetes.io/projected/c35d0626-7620-40f1-b7ca-d6eef1cc775e-kube-api-access-tqktl\") pod \"placement-4e02-account-create-update-rvgbp\" (UID: \"c35d0626-7620-40f1-b7ca-d6eef1cc775e\") " pod="openstack/placement-4e02-account-create-update-rvgbp" Mar 13 01:44:27.352701 master-0 kubenswrapper[19170]: I0313 01:44:27.352547 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-9356-account-create-update-vvlhq" Mar 13 01:44:27.400548 master-0 kubenswrapper[19170]: I0313 01:44:27.400482 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d458-account-create-update-g46pf" event={"ID":"95fca356-91a3-4e52-aa79-57ad8399523e","Type":"ContainerStarted","Data":"b6b9883b8fe8e3baab1b6c20e74e15d62c77b4d50607b48e54ad6bdbd80ebe46"} Mar 13 01:44:27.400548 master-0 kubenswrapper[19170]: I0313 01:44:27.400551 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d458-account-create-update-g46pf" event={"ID":"95fca356-91a3-4e52-aa79-57ad8399523e","Type":"ContainerStarted","Data":"a8a0b584b04210045a55a73f098fddbce42e668b3d843511908d33494fc1bf28"} Mar 13 01:44:27.403737 master-0 kubenswrapper[19170]: I0313 01:44:27.403653 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-z9fml" event={"ID":"536a6b3f-315f-4544-b5fb-1fbb1c2b778d","Type":"ContainerDied","Data":"1555a4a04a4d39910d12a9e1983e461af4f20dd0191c207ac814a49a25c1295a"} Mar 13 01:44:27.403737 master-0 kubenswrapper[19170]: I0313 01:44:27.403700 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1555a4a04a4d39910d12a9e1983e461af4f20dd0191c207ac814a49a25c1295a" Mar 13 01:44:27.403860 master-0 kubenswrapper[19170]: I0313 01:44:27.403804 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-z9fml" Mar 13 01:44:27.410401 master-0 kubenswrapper[19170]: I0313 01:44:27.410354 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-bk4qg" event={"ID":"a9f33b5a-2935-456d-9895-8fd0285664a5","Type":"ContainerStarted","Data":"002abbc3dc5a162f738c3056d3ac824556be01ea96298260c2f8409bff5b9464"} Mar 13 01:44:27.410606 master-0 kubenswrapper[19170]: I0313 01:44:27.410409 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-bk4qg" event={"ID":"a9f33b5a-2935-456d-9895-8fd0285664a5","Type":"ContainerStarted","Data":"96efb1ea97847f8da142274308dd677b2da77adbab4547318418b6313000064a"} Mar 13 01:44:27.424356 master-0 kubenswrapper[19170]: I0313 01:44:27.424280 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-d458-account-create-update-g46pf" podStartSLOduration=2.424260875 podStartE2EDuration="2.424260875s" podCreationTimestamp="2026-03-13 01:44:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:44:27.417813133 +0000 UTC m=+1528.225934103" watchObservedRunningTime="2026-03-13 01:44:27.424260875 +0000 UTC m=+1528.232381825" Mar 13 01:44:27.430304 master-0 kubenswrapper[19170]: I0313 01:44:27.430246 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="689b3b6a-c7b8-4600-9b61-bfb9ff1eba33" path="/var/lib/kubelet/pods/689b3b6a-c7b8-4600-9b61-bfb9ff1eba33/volumes" Mar 13 01:44:27.450952 master-0 kubenswrapper[19170]: I0313 01:44:27.450655 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-bk4qg" podStartSLOduration=2.450620388 podStartE2EDuration="2.450620388s" podCreationTimestamp="2026-03-13 01:44:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-13 01:44:27.442334574 +0000 UTC m=+1528.250455534" watchObservedRunningTime="2026-03-13 01:44:27.450620388 +0000 UTC m=+1528.258741348" Mar 13 01:44:27.461029 master-0 kubenswrapper[19170]: I0313 01:44:27.460982 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-msnk6" Mar 13 01:44:27.478068 master-0 kubenswrapper[19170]: I0313 01:44:27.478003 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-4e02-account-create-update-rvgbp" Mar 13 01:44:27.492933 master-0 kubenswrapper[19170]: I0313 01:44:27.492860 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-jwmhl"] Mar 13 01:44:27.558751 master-0 kubenswrapper[19170]: W0313 01:44:27.556518 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68585267_5574_4c61_98c1_9ddeb2015743.slice/crio-d598f3941d1ff85b08971c8a93cc31672d4b252a2f6377d5ad1ddbab601a5172 WatchSource:0}: Error finding container d598f3941d1ff85b08971c8a93cc31672d4b252a2f6377d5ad1ddbab601a5172: Status 404 returned error can't find the container with id d598f3941d1ff85b08971c8a93cc31672d4b252a2f6377d5ad1ddbab601a5172 Mar 13 01:44:27.920126 master-0 kubenswrapper[19170]: I0313 01:44:27.919831 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-9356-account-create-update-vvlhq"] Mar 13 01:44:28.131322 master-0 kubenswrapper[19170]: I0313 01:44:28.131286 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-n9bgl" Mar 13 01:44:28.205817 master-0 kubenswrapper[19170]: I0313 01:44:28.205743 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 13 01:44:28.270489 master-0 kubenswrapper[19170]: I0313 01:44:28.270449 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/291c0e40-0b9b-4bd3-b8d6-1983878467ae-combined-ca-bundle\") pod \"291c0e40-0b9b-4bd3-b8d6-1983878467ae\" (UID: \"291c0e40-0b9b-4bd3-b8d6-1983878467ae\") " Mar 13 01:44:28.270814 master-0 kubenswrapper[19170]: I0313 01:44:28.270791 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/291c0e40-0b9b-4bd3-b8d6-1983878467ae-dispersionconf\") pod \"291c0e40-0b9b-4bd3-b8d6-1983878467ae\" (UID: \"291c0e40-0b9b-4bd3-b8d6-1983878467ae\") " Mar 13 01:44:28.270919 master-0 kubenswrapper[19170]: I0313 01:44:28.270906 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/291c0e40-0b9b-4bd3-b8d6-1983878467ae-scripts\") pod \"291c0e40-0b9b-4bd3-b8d6-1983878467ae\" (UID: \"291c0e40-0b9b-4bd3-b8d6-1983878467ae\") " Mar 13 01:44:28.271030 master-0 kubenswrapper[19170]: I0313 01:44:28.271018 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/291c0e40-0b9b-4bd3-b8d6-1983878467ae-ring-data-devices\") pod \"291c0e40-0b9b-4bd3-b8d6-1983878467ae\" (UID: \"291c0e40-0b9b-4bd3-b8d6-1983878467ae\") " Mar 13 01:44:28.271180 master-0 kubenswrapper[19170]: I0313 01:44:28.271165 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66pw5\" (UniqueName: \"kubernetes.io/projected/291c0e40-0b9b-4bd3-b8d6-1983878467ae-kube-api-access-66pw5\") pod 
\"291c0e40-0b9b-4bd3-b8d6-1983878467ae\" (UID: \"291c0e40-0b9b-4bd3-b8d6-1983878467ae\") " Mar 13 01:44:28.271302 master-0 kubenswrapper[19170]: I0313 01:44:28.271289 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/291c0e40-0b9b-4bd3-b8d6-1983878467ae-swiftconf\") pod \"291c0e40-0b9b-4bd3-b8d6-1983878467ae\" (UID: \"291c0e40-0b9b-4bd3-b8d6-1983878467ae\") " Mar 13 01:44:28.272092 master-0 kubenswrapper[19170]: I0313 01:44:28.272075 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/291c0e40-0b9b-4bd3-b8d6-1983878467ae-etc-swift\") pod \"291c0e40-0b9b-4bd3-b8d6-1983878467ae\" (UID: \"291c0e40-0b9b-4bd3-b8d6-1983878467ae\") " Mar 13 01:44:28.274586 master-0 kubenswrapper[19170]: I0313 01:44:28.274544 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/291c0e40-0b9b-4bd3-b8d6-1983878467ae-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "291c0e40-0b9b-4bd3-b8d6-1983878467ae" (UID: "291c0e40-0b9b-4bd3-b8d6-1983878467ae"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:44:28.275305 master-0 kubenswrapper[19170]: I0313 01:44:28.275274 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/291c0e40-0b9b-4bd3-b8d6-1983878467ae-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "291c0e40-0b9b-4bd3-b8d6-1983878467ae" (UID: "291c0e40-0b9b-4bd3-b8d6-1983878467ae"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 01:44:28.277191 master-0 kubenswrapper[19170]: I0313 01:44:28.277162 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/291c0e40-0b9b-4bd3-b8d6-1983878467ae-kube-api-access-66pw5" (OuterVolumeSpecName: "kube-api-access-66pw5") pod "291c0e40-0b9b-4bd3-b8d6-1983878467ae" (UID: "291c0e40-0b9b-4bd3-b8d6-1983878467ae"). InnerVolumeSpecName "kube-api-access-66pw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:44:28.279905 master-0 kubenswrapper[19170]: I0313 01:44:28.279851 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/291c0e40-0b9b-4bd3-b8d6-1983878467ae-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "291c0e40-0b9b-4bd3-b8d6-1983878467ae" (UID: "291c0e40-0b9b-4bd3-b8d6-1983878467ae"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:44:28.298863 master-0 kubenswrapper[19170]: I0313 01:44:28.298817 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/291c0e40-0b9b-4bd3-b8d6-1983878467ae-scripts" (OuterVolumeSpecName: "scripts") pod "291c0e40-0b9b-4bd3-b8d6-1983878467ae" (UID: "291c0e40-0b9b-4bd3-b8d6-1983878467ae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:44:28.315543 master-0 kubenswrapper[19170]: I0313 01:44:28.313820 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/291c0e40-0b9b-4bd3-b8d6-1983878467ae-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "291c0e40-0b9b-4bd3-b8d6-1983878467ae" (UID: "291c0e40-0b9b-4bd3-b8d6-1983878467ae"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:44:28.318116 master-0 kubenswrapper[19170]: I0313 01:44:28.318054 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-4e02-account-create-update-rvgbp"] Mar 13 01:44:28.319762 master-0 kubenswrapper[19170]: I0313 01:44:28.319597 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/291c0e40-0b9b-4bd3-b8d6-1983878467ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "291c0e40-0b9b-4bd3-b8d6-1983878467ae" (UID: "291c0e40-0b9b-4bd3-b8d6-1983878467ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:44:28.328052 master-0 kubenswrapper[19170]: I0313 01:44:28.327764 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-msnk6"] Mar 13 01:44:28.374900 master-0 kubenswrapper[19170]: I0313 01:44:28.374793 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66pw5\" (UniqueName: \"kubernetes.io/projected/291c0e40-0b9b-4bd3-b8d6-1983878467ae-kube-api-access-66pw5\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:28.374900 master-0 kubenswrapper[19170]: I0313 01:44:28.374836 19170 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/291c0e40-0b9b-4bd3-b8d6-1983878467ae-swiftconf\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:28.374900 master-0 kubenswrapper[19170]: I0313 01:44:28.374847 19170 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/291c0e40-0b9b-4bd3-b8d6-1983878467ae-etc-swift\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:28.374900 master-0 kubenswrapper[19170]: I0313 01:44:28.374857 19170 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/291c0e40-0b9b-4bd3-b8d6-1983878467ae-combined-ca-bundle\") on node 
\"master-0\" DevicePath \"\"" Mar 13 01:44:28.374900 master-0 kubenswrapper[19170]: I0313 01:44:28.374865 19170 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/291c0e40-0b9b-4bd3-b8d6-1983878467ae-dispersionconf\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:28.374900 master-0 kubenswrapper[19170]: I0313 01:44:28.374873 19170 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/291c0e40-0b9b-4bd3-b8d6-1983878467ae-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:28.374900 master-0 kubenswrapper[19170]: I0313 01:44:28.374882 19170 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/291c0e40-0b9b-4bd3-b8d6-1983878467ae-ring-data-devices\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:28.428474 master-0 kubenswrapper[19170]: I0313 01:44:28.428407 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9356-account-create-update-vvlhq" event={"ID":"20df490e-cdef-41c3-b9c2-b9724a4d5ac9","Type":"ContainerStarted","Data":"c38dd269a8e6f012e23e880761e9f17ee3d40d6a8cbdff4d9d7ece1dc67b2672"} Mar 13 01:44:28.428474 master-0 kubenswrapper[19170]: I0313 01:44:28.428471 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9356-account-create-update-vvlhq" event={"ID":"20df490e-cdef-41c3-b9c2-b9724a4d5ac9","Type":"ContainerStarted","Data":"73d387b8c9da2f80f0766bf87ba3bc93af79e9ffcbb491dc9913c33b87c186b4"} Mar 13 01:44:28.431588 master-0 kubenswrapper[19170]: I0313 01:44:28.431471 19170 generic.go:334] "Generic (PLEG): container finished" podID="68585267-5574-4c61-98c1-9ddeb2015743" containerID="ff34afbc847f6d96b0e2fc02d647e01be4a8d9289ccf14689635dd04ad0d4f50" exitCode=0 Mar 13 01:44:28.431820 master-0 kubenswrapper[19170]: I0313 01:44:28.431760 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-db-create-jwmhl" event={"ID":"68585267-5574-4c61-98c1-9ddeb2015743","Type":"ContainerDied","Data":"ff34afbc847f6d96b0e2fc02d647e01be4a8d9289ccf14689635dd04ad0d4f50"} Mar 13 01:44:28.431820 master-0 kubenswrapper[19170]: I0313 01:44:28.431795 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jwmhl" event={"ID":"68585267-5574-4c61-98c1-9ddeb2015743","Type":"ContainerStarted","Data":"d598f3941d1ff85b08971c8a93cc31672d4b252a2f6377d5ad1ddbab601a5172"} Mar 13 01:44:28.435006 master-0 kubenswrapper[19170]: I0313 01:44:28.434965 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-4e02-account-create-update-rvgbp" event={"ID":"c35d0626-7620-40f1-b7ca-d6eef1cc775e","Type":"ContainerStarted","Data":"5ee5b7caf54270d0bcc0118c4c8a3b22c6b9a6260667d33c94b37bf5b3302e8d"} Mar 13 01:44:28.437364 master-0 kubenswrapper[19170]: I0313 01:44:28.437319 19170 generic.go:334] "Generic (PLEG): container finished" podID="a9f33b5a-2935-456d-9895-8fd0285664a5" containerID="002abbc3dc5a162f738c3056d3ac824556be01ea96298260c2f8409bff5b9464" exitCode=0 Mar 13 01:44:28.437435 master-0 kubenswrapper[19170]: I0313 01:44:28.437382 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-bk4qg" event={"ID":"a9f33b5a-2935-456d-9895-8fd0285664a5","Type":"ContainerDied","Data":"002abbc3dc5a162f738c3056d3ac824556be01ea96298260c2f8409bff5b9464"} Mar 13 01:44:28.439328 master-0 kubenswrapper[19170]: I0313 01:44:28.439288 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-n9bgl" event={"ID":"291c0e40-0b9b-4bd3-b8d6-1983878467ae","Type":"ContainerDied","Data":"3d9a15510e1f852b6520b9cb6cdd65c36bd69ecb1629fde588e31a15c5d91bf0"} Mar 13 01:44:28.439328 master-0 kubenswrapper[19170]: I0313 01:44:28.439319 19170 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="3d9a15510e1f852b6520b9cb6cdd65c36bd69ecb1629fde588e31a15c5d91bf0" Mar 13 01:44:28.439458 master-0 kubenswrapper[19170]: I0313 01:44:28.439364 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-n9bgl" Mar 13 01:44:28.445848 master-0 kubenswrapper[19170]: I0313 01:44:28.445667 19170 generic.go:334] "Generic (PLEG): container finished" podID="95fca356-91a3-4e52-aa79-57ad8399523e" containerID="b6b9883b8fe8e3baab1b6c20e74e15d62c77b4d50607b48e54ad6bdbd80ebe46" exitCode=0 Mar 13 01:44:28.446198 master-0 kubenswrapper[19170]: I0313 01:44:28.446082 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d458-account-create-update-g46pf" event={"ID":"95fca356-91a3-4e52-aa79-57ad8399523e","Type":"ContainerDied","Data":"b6b9883b8fe8e3baab1b6c20e74e15d62c77b4d50607b48e54ad6bdbd80ebe46"} Mar 13 01:44:28.453667 master-0 kubenswrapper[19170]: I0313 01:44:28.453566 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-9356-account-create-update-vvlhq" podStartSLOduration=2.4535440570000002 podStartE2EDuration="2.453544057s" podCreationTimestamp="2026-03-13 01:44:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:44:28.446670784 +0000 UTC m=+1529.254791764" watchObservedRunningTime="2026-03-13 01:44:28.453544057 +0000 UTC m=+1529.261665017" Mar 13 01:44:28.483265 master-0 kubenswrapper[19170]: I0313 01:44:28.482723 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-msnk6" event={"ID":"86eb32e2-6a29-42bf-9b09-2f6cbe41a0eb","Type":"ContainerStarted","Data":"c167cc2c589f81e3579b64d493a67624c0e82ca9f82925eaab1c504f35766851"} Mar 13 01:44:28.486878 master-0 kubenswrapper[19170]: I0313 01:44:28.486529 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"6d54c519-9366-45f7-86e2-ebcdd8bb2477","Type":"ContainerStarted","Data":"53736c6eb85c5f071d1ddb254a1a4ad117524a590717128ce4c75590ddce1eb2"} Mar 13 01:44:29.017978 master-0 kubenswrapper[19170]: I0313 01:44:29.017908 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-z9fml"] Mar 13 01:44:29.027909 master-0 kubenswrapper[19170]: I0313 01:44:29.027850 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-z9fml"] Mar 13 01:44:29.438769 master-0 kubenswrapper[19170]: I0313 01:44:29.438454 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="536a6b3f-315f-4544-b5fb-1fbb1c2b778d" path="/var/lib/kubelet/pods/536a6b3f-315f-4544-b5fb-1fbb1c2b778d/volumes" Mar 13 01:44:29.514983 master-0 kubenswrapper[19170]: I0313 01:44:29.514916 19170 generic.go:334] "Generic (PLEG): container finished" podID="86eb32e2-6a29-42bf-9b09-2f6cbe41a0eb" containerID="b57312d3c768d8eb9d34c792df07d4a2da85c1d5eec889cfd6a726e54b5a5f28" exitCode=0 Mar 13 01:44:29.515172 master-0 kubenswrapper[19170]: I0313 01:44:29.515054 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-msnk6" event={"ID":"86eb32e2-6a29-42bf-9b09-2f6cbe41a0eb","Type":"ContainerDied","Data":"b57312d3c768d8eb9d34c792df07d4a2da85c1d5eec889cfd6a726e54b5a5f28"} Mar 13 01:44:29.517920 master-0 kubenswrapper[19170]: I0313 01:44:29.517868 19170 generic.go:334] "Generic (PLEG): container finished" podID="20df490e-cdef-41c3-b9c2-b9724a4d5ac9" containerID="c38dd269a8e6f012e23e880761e9f17ee3d40d6a8cbdff4d9d7ece1dc67b2672" exitCode=0 Mar 13 01:44:29.517994 master-0 kubenswrapper[19170]: I0313 01:44:29.517958 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9356-account-create-update-vvlhq" event={"ID":"20df490e-cdef-41c3-b9c2-b9724a4d5ac9","Type":"ContainerDied","Data":"c38dd269a8e6f012e23e880761e9f17ee3d40d6a8cbdff4d9d7ece1dc67b2672"} Mar 13 
01:44:29.520670 master-0 kubenswrapper[19170]: I0313 01:44:29.520620 19170 generic.go:334] "Generic (PLEG): container finished" podID="c35d0626-7620-40f1-b7ca-d6eef1cc775e" containerID="6d79364d9b459805a7ffb5f97119be585d7688dc75827c0badbe274109c73222" exitCode=0 Mar 13 01:44:29.520727 master-0 kubenswrapper[19170]: I0313 01:44:29.520674 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-4e02-account-create-update-rvgbp" event={"ID":"c35d0626-7620-40f1-b7ca-d6eef1cc775e","Type":"ContainerDied","Data":"6d79364d9b459805a7ffb5f97119be585d7688dc75827c0badbe274109c73222"} Mar 13 01:44:30.099820 master-0 kubenswrapper[19170]: I0313 01:44:30.099778 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jwmhl" Mar 13 01:44:30.192011 master-0 kubenswrapper[19170]: I0313 01:44:30.189620 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-d458-account-create-update-g46pf" Mar 13 01:44:30.271759 master-0 kubenswrapper[19170]: I0313 01:44:30.269674 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-bk4qg" Mar 13 01:44:30.271759 master-0 kubenswrapper[19170]: I0313 01:44:30.269848 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68585267-5574-4c61-98c1-9ddeb2015743-operator-scripts\") pod \"68585267-5574-4c61-98c1-9ddeb2015743\" (UID: \"68585267-5574-4c61-98c1-9ddeb2015743\") " Mar 13 01:44:30.271759 master-0 kubenswrapper[19170]: I0313 01:44:30.269946 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpwpd\" (UniqueName: \"kubernetes.io/projected/68585267-5574-4c61-98c1-9ddeb2015743-kube-api-access-fpwpd\") pod \"68585267-5574-4c61-98c1-9ddeb2015743\" (UID: \"68585267-5574-4c61-98c1-9ddeb2015743\") " Mar 13 01:44:30.275878 master-0 kubenswrapper[19170]: I0313 01:44:30.272061 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68585267-5574-4c61-98c1-9ddeb2015743-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "68585267-5574-4c61-98c1-9ddeb2015743" (UID: "68585267-5574-4c61-98c1-9ddeb2015743"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:44:30.275878 master-0 kubenswrapper[19170]: I0313 01:44:30.275828 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68585267-5574-4c61-98c1-9ddeb2015743-kube-api-access-fpwpd" (OuterVolumeSpecName: "kube-api-access-fpwpd") pod "68585267-5574-4c61-98c1-9ddeb2015743" (UID: "68585267-5574-4c61-98c1-9ddeb2015743"). InnerVolumeSpecName "kube-api-access-fpwpd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:44:30.372279 master-0 kubenswrapper[19170]: I0313 01:44:30.372149 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbsh4\" (UniqueName: \"kubernetes.io/projected/95fca356-91a3-4e52-aa79-57ad8399523e-kube-api-access-bbsh4\") pod \"95fca356-91a3-4e52-aa79-57ad8399523e\" (UID: \"95fca356-91a3-4e52-aa79-57ad8399523e\") " Mar 13 01:44:30.372784 master-0 kubenswrapper[19170]: I0313 01:44:30.372751 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9f33b5a-2935-456d-9895-8fd0285664a5-operator-scripts\") pod \"a9f33b5a-2935-456d-9895-8fd0285664a5\" (UID: \"a9f33b5a-2935-456d-9895-8fd0285664a5\") " Mar 13 01:44:30.372890 master-0 kubenswrapper[19170]: I0313 01:44:30.372867 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95fca356-91a3-4e52-aa79-57ad8399523e-operator-scripts\") pod \"95fca356-91a3-4e52-aa79-57ad8399523e\" (UID: \"95fca356-91a3-4e52-aa79-57ad8399523e\") " Mar 13 01:44:30.373011 master-0 kubenswrapper[19170]: I0313 01:44:30.372991 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bg6cl\" (UniqueName: \"kubernetes.io/projected/a9f33b5a-2935-456d-9895-8fd0285664a5-kube-api-access-bg6cl\") pod \"a9f33b5a-2935-456d-9895-8fd0285664a5\" (UID: \"a9f33b5a-2935-456d-9895-8fd0285664a5\") " Mar 13 01:44:30.374031 master-0 kubenswrapper[19170]: I0313 01:44:30.373260 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9f33b5a-2935-456d-9895-8fd0285664a5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a9f33b5a-2935-456d-9895-8fd0285664a5" (UID: "a9f33b5a-2935-456d-9895-8fd0285664a5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:44:30.374031 master-0 kubenswrapper[19170]: I0313 01:44:30.373555 19170 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68585267-5574-4c61-98c1-9ddeb2015743-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:30.374031 master-0 kubenswrapper[19170]: I0313 01:44:30.373935 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpwpd\" (UniqueName: \"kubernetes.io/projected/68585267-5574-4c61-98c1-9ddeb2015743-kube-api-access-fpwpd\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:30.374031 master-0 kubenswrapper[19170]: I0313 01:44:30.374002 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95fca356-91a3-4e52-aa79-57ad8399523e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "95fca356-91a3-4e52-aa79-57ad8399523e" (UID: "95fca356-91a3-4e52-aa79-57ad8399523e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:44:30.377492 master-0 kubenswrapper[19170]: I0313 01:44:30.377455 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95fca356-91a3-4e52-aa79-57ad8399523e-kube-api-access-bbsh4" (OuterVolumeSpecName: "kube-api-access-bbsh4") pod "95fca356-91a3-4e52-aa79-57ad8399523e" (UID: "95fca356-91a3-4e52-aa79-57ad8399523e"). InnerVolumeSpecName "kube-api-access-bbsh4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:44:30.378402 master-0 kubenswrapper[19170]: I0313 01:44:30.378368 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9f33b5a-2935-456d-9895-8fd0285664a5-kube-api-access-bg6cl" (OuterVolumeSpecName: "kube-api-access-bg6cl") pod "a9f33b5a-2935-456d-9895-8fd0285664a5" (UID: "a9f33b5a-2935-456d-9895-8fd0285664a5"). 
InnerVolumeSpecName "kube-api-access-bg6cl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:44:30.490773 master-0 kubenswrapper[19170]: I0313 01:44:30.490690 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bg6cl\" (UniqueName: \"kubernetes.io/projected/a9f33b5a-2935-456d-9895-8fd0285664a5-kube-api-access-bg6cl\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:30.490773 master-0 kubenswrapper[19170]: I0313 01:44:30.490736 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbsh4\" (UniqueName: \"kubernetes.io/projected/95fca356-91a3-4e52-aa79-57ad8399523e-kube-api-access-bbsh4\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:30.490773 master-0 kubenswrapper[19170]: I0313 01:44:30.490751 19170 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9f33b5a-2935-456d-9895-8fd0285664a5-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:30.490773 master-0 kubenswrapper[19170]: I0313 01:44:30.490764 19170 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95fca356-91a3-4e52-aa79-57ad8399523e-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:30.533432 master-0 kubenswrapper[19170]: I0313 01:44:30.533345 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-bk4qg" Mar 13 01:44:30.533814 master-0 kubenswrapper[19170]: I0313 01:44:30.533766 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-bk4qg" event={"ID":"a9f33b5a-2935-456d-9895-8fd0285664a5","Type":"ContainerDied","Data":"96efb1ea97847f8da142274308dd677b2da77adbab4547318418b6313000064a"} Mar 13 01:44:30.533896 master-0 kubenswrapper[19170]: I0313 01:44:30.533813 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96efb1ea97847f8da142274308dd677b2da77adbab4547318418b6313000064a" Mar 13 01:44:30.535459 master-0 kubenswrapper[19170]: I0313 01:44:30.535407 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d458-account-create-update-g46pf" event={"ID":"95fca356-91a3-4e52-aa79-57ad8399523e","Type":"ContainerDied","Data":"a8a0b584b04210045a55a73f098fddbce42e668b3d843511908d33494fc1bf28"} Mar 13 01:44:30.535459 master-0 kubenswrapper[19170]: I0313 01:44:30.535458 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8a0b584b04210045a55a73f098fddbce42e668b3d843511908d33494fc1bf28" Mar 13 01:44:30.535592 master-0 kubenswrapper[19170]: I0313 01:44:30.535425 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-d458-account-create-update-g46pf" Mar 13 01:44:30.539796 master-0 kubenswrapper[19170]: I0313 01:44:30.539753 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6d54c519-9366-45f7-86e2-ebcdd8bb2477","Type":"ContainerStarted","Data":"e1927efae40776c89d8a2a5ca57e64e347c9ac4d7e7b507cf9bdb68f6c247c3d"} Mar 13 01:44:30.539932 master-0 kubenswrapper[19170]: I0313 01:44:30.539912 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6d54c519-9366-45f7-86e2-ebcdd8bb2477","Type":"ContainerStarted","Data":"ab28edd51377a2c21160463b1bec2118d6ba9b868c589f8a3d73c5606f82f039"} Mar 13 01:44:30.540021 master-0 kubenswrapper[19170]: I0313 01:44:30.540003 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6d54c519-9366-45f7-86e2-ebcdd8bb2477","Type":"ContainerStarted","Data":"2a309c821e7667c843a73939adb459fd27986e8324b289fabc5df86961b7c3db"} Mar 13 01:44:30.540291 master-0 kubenswrapper[19170]: I0313 01:44:30.540265 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6d54c519-9366-45f7-86e2-ebcdd8bb2477","Type":"ContainerStarted","Data":"7e364bf454a1c0dd6368da6302dedcd173decf215a4b7ecfc98b42383c8c45c8"} Mar 13 01:44:30.543127 master-0 kubenswrapper[19170]: I0313 01:44:30.543097 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jwmhl" event={"ID":"68585267-5574-4c61-98c1-9ddeb2015743","Type":"ContainerDied","Data":"d598f3941d1ff85b08971c8a93cc31672d4b252a2f6377d5ad1ddbab601a5172"} Mar 13 01:44:30.543127 master-0 kubenswrapper[19170]: I0313 01:44:30.543129 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d598f3941d1ff85b08971c8a93cc31672d4b252a2f6377d5ad1ddbab601a5172" Mar 13 01:44:30.544855 master-0 kubenswrapper[19170]: I0313 01:44:30.544829 19170 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jwmhl" Mar 13 01:44:31.094444 master-0 kubenswrapper[19170]: I0313 01:44:31.094341 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-msnk6" Mar 13 01:44:31.203606 master-0 kubenswrapper[19170]: I0313 01:44:31.203573 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9356-account-create-update-vvlhq" Mar 13 01:44:31.209353 master-0 kubenswrapper[19170]: I0313 01:44:31.209320 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20df490e-cdef-41c3-b9c2-b9724a4d5ac9-operator-scripts\") pod \"20df490e-cdef-41c3-b9c2-b9724a4d5ac9\" (UID: \"20df490e-cdef-41c3-b9c2-b9724a4d5ac9\") " Mar 13 01:44:31.209459 master-0 kubenswrapper[19170]: I0313 01:44:31.209399 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86eb32e2-6a29-42bf-9b09-2f6cbe41a0eb-operator-scripts\") pod \"86eb32e2-6a29-42bf-9b09-2f6cbe41a0eb\" (UID: \"86eb32e2-6a29-42bf-9b09-2f6cbe41a0eb\") " Mar 13 01:44:31.209459 master-0 kubenswrapper[19170]: I0313 01:44:31.209421 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-px4jn\" (UniqueName: \"kubernetes.io/projected/86eb32e2-6a29-42bf-9b09-2f6cbe41a0eb-kube-api-access-px4jn\") pod \"86eb32e2-6a29-42bf-9b09-2f6cbe41a0eb\" (UID: \"86eb32e2-6a29-42bf-9b09-2f6cbe41a0eb\") " Mar 13 01:44:31.209556 master-0 kubenswrapper[19170]: I0313 01:44:31.209502 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9svl\" (UniqueName: \"kubernetes.io/projected/20df490e-cdef-41c3-b9c2-b9724a4d5ac9-kube-api-access-k9svl\") pod \"20df490e-cdef-41c3-b9c2-b9724a4d5ac9\" (UID: 
\"20df490e-cdef-41c3-b9c2-b9724a4d5ac9\") " Mar 13 01:44:31.209849 master-0 kubenswrapper[19170]: I0313 01:44:31.209806 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20df490e-cdef-41c3-b9c2-b9724a4d5ac9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "20df490e-cdef-41c3-b9c2-b9724a4d5ac9" (UID: "20df490e-cdef-41c3-b9c2-b9724a4d5ac9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:44:31.210033 master-0 kubenswrapper[19170]: I0313 01:44:31.210011 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-4e02-account-create-update-rvgbp" Mar 13 01:44:31.210561 master-0 kubenswrapper[19170]: I0313 01:44:31.210512 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86eb32e2-6a29-42bf-9b09-2f6cbe41a0eb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "86eb32e2-6a29-42bf-9b09-2f6cbe41a0eb" (UID: "86eb32e2-6a29-42bf-9b09-2f6cbe41a0eb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:44:31.212532 master-0 kubenswrapper[19170]: I0313 01:44:31.212500 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20df490e-cdef-41c3-b9c2-b9724a4d5ac9-kube-api-access-k9svl" (OuterVolumeSpecName: "kube-api-access-k9svl") pod "20df490e-cdef-41c3-b9c2-b9724a4d5ac9" (UID: "20df490e-cdef-41c3-b9c2-b9724a4d5ac9"). InnerVolumeSpecName "kube-api-access-k9svl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:44:31.215313 master-0 kubenswrapper[19170]: I0313 01:44:31.215078 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86eb32e2-6a29-42bf-9b09-2f6cbe41a0eb-kube-api-access-px4jn" (OuterVolumeSpecName: "kube-api-access-px4jn") pod "86eb32e2-6a29-42bf-9b09-2f6cbe41a0eb" (UID: "86eb32e2-6a29-42bf-9b09-2f6cbe41a0eb"). InnerVolumeSpecName "kube-api-access-px4jn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:44:31.332277 master-0 kubenswrapper[19170]: I0313 01:44:31.317810 19170 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20df490e-cdef-41c3-b9c2-b9724a4d5ac9-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:31.332277 master-0 kubenswrapper[19170]: I0313 01:44:31.317844 19170 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86eb32e2-6a29-42bf-9b09-2f6cbe41a0eb-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:31.332277 master-0 kubenswrapper[19170]: I0313 01:44:31.317854 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-px4jn\" (UniqueName: \"kubernetes.io/projected/86eb32e2-6a29-42bf-9b09-2f6cbe41a0eb-kube-api-access-px4jn\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:31.332277 master-0 kubenswrapper[19170]: I0313 01:44:31.317863 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9svl\" (UniqueName: \"kubernetes.io/projected/20df490e-cdef-41c3-b9c2-b9724a4d5ac9-kube-api-access-k9svl\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:31.419560 master-0 kubenswrapper[19170]: I0313 01:44:31.419505 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c35d0626-7620-40f1-b7ca-d6eef1cc775e-operator-scripts\") pod 
\"c35d0626-7620-40f1-b7ca-d6eef1cc775e\" (UID: \"c35d0626-7620-40f1-b7ca-d6eef1cc775e\") " Mar 13 01:44:31.419776 master-0 kubenswrapper[19170]: I0313 01:44:31.419705 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqktl\" (UniqueName: \"kubernetes.io/projected/c35d0626-7620-40f1-b7ca-d6eef1cc775e-kube-api-access-tqktl\") pod \"c35d0626-7620-40f1-b7ca-d6eef1cc775e\" (UID: \"c35d0626-7620-40f1-b7ca-d6eef1cc775e\") " Mar 13 01:44:31.420590 master-0 kubenswrapper[19170]: I0313 01:44:31.420550 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c35d0626-7620-40f1-b7ca-d6eef1cc775e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c35d0626-7620-40f1-b7ca-d6eef1cc775e" (UID: "c35d0626-7620-40f1-b7ca-d6eef1cc775e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:44:31.420982 master-0 kubenswrapper[19170]: I0313 01:44:31.420955 19170 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c35d0626-7620-40f1-b7ca-d6eef1cc775e-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:31.429437 master-0 kubenswrapper[19170]: I0313 01:44:31.429402 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c35d0626-7620-40f1-b7ca-d6eef1cc775e-kube-api-access-tqktl" (OuterVolumeSpecName: "kube-api-access-tqktl") pod "c35d0626-7620-40f1-b7ca-d6eef1cc775e" (UID: "c35d0626-7620-40f1-b7ca-d6eef1cc775e"). InnerVolumeSpecName "kube-api-access-tqktl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:44:31.522759 master-0 kubenswrapper[19170]: I0313 01:44:31.522717 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqktl\" (UniqueName: \"kubernetes.io/projected/c35d0626-7620-40f1-b7ca-d6eef1cc775e-kube-api-access-tqktl\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:31.559457 master-0 kubenswrapper[19170]: I0313 01:44:31.559391 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-msnk6" event={"ID":"86eb32e2-6a29-42bf-9b09-2f6cbe41a0eb","Type":"ContainerDied","Data":"c167cc2c589f81e3579b64d493a67624c0e82ca9f82925eaab1c504f35766851"} Mar 13 01:44:31.559457 master-0 kubenswrapper[19170]: I0313 01:44:31.559419 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-msnk6" Mar 13 01:44:31.559856 master-0 kubenswrapper[19170]: I0313 01:44:31.559434 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c167cc2c589f81e3579b64d493a67624c0e82ca9f82925eaab1c504f35766851" Mar 13 01:44:31.561430 master-0 kubenswrapper[19170]: I0313 01:44:31.561394 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9356-account-create-update-vvlhq" event={"ID":"20df490e-cdef-41c3-b9c2-b9724a4d5ac9","Type":"ContainerDied","Data":"73d387b8c9da2f80f0766bf87ba3bc93af79e9ffcbb491dc9913c33b87c186b4"} Mar 13 01:44:31.561430 master-0 kubenswrapper[19170]: I0313 01:44:31.561417 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73d387b8c9da2f80f0766bf87ba3bc93af79e9ffcbb491dc9913c33b87c186b4" Mar 13 01:44:31.561672 master-0 kubenswrapper[19170]: I0313 01:44:31.561471 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-9356-account-create-update-vvlhq" Mar 13 01:44:31.565337 master-0 kubenswrapper[19170]: I0313 01:44:31.565209 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-4e02-account-create-update-rvgbp" event={"ID":"c35d0626-7620-40f1-b7ca-d6eef1cc775e","Type":"ContainerDied","Data":"5ee5b7caf54270d0bcc0118c4c8a3b22c6b9a6260667d33c94b37bf5b3302e8d"} Mar 13 01:44:31.565938 master-0 kubenswrapper[19170]: I0313 01:44:31.565872 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ee5b7caf54270d0bcc0118c4c8a3b22c6b9a6260667d33c94b37bf5b3302e8d" Mar 13 01:44:31.566113 master-0 kubenswrapper[19170]: I0313 01:44:31.565316 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-4e02-account-create-update-rvgbp" Mar 13 01:44:32.579992 master-0 kubenswrapper[19170]: I0313 01:44:32.579860 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6d54c519-9366-45f7-86e2-ebcdd8bb2477","Type":"ContainerStarted","Data":"917b461368acbc2a77181f26493723c7e4b7e84331c9acf0deb18a6c6984d820"} Mar 13 01:44:32.579992 master-0 kubenswrapper[19170]: I0313 01:44:32.579912 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6d54c519-9366-45f7-86e2-ebcdd8bb2477","Type":"ContainerStarted","Data":"aeb5c2f3de983dd6d5ecb18c704b148979f96a5acc518e98a7b1cb50104ee981"} Mar 13 01:44:32.579992 master-0 kubenswrapper[19170]: I0313 01:44:32.579924 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6d54c519-9366-45f7-86e2-ebcdd8bb2477","Type":"ContainerStarted","Data":"cad343dbc3f9405cdb4b960f127d37676e33ca3af96f2dc6ae8d48c377a5f850"} Mar 13 01:44:32.579992 master-0 kubenswrapper[19170]: I0313 01:44:32.579932 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"6d54c519-9366-45f7-86e2-ebcdd8bb2477","Type":"ContainerStarted","Data":"190bab0d02e7af47c1473e5cd0e4b4293177cc7e79abd4708a299ffae00758c5"} Mar 13 01:44:34.040556 master-0 kubenswrapper[19170]: I0313 01:44:34.040476 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-spk4p"] Mar 13 01:44:34.041522 master-0 kubenswrapper[19170]: E0313 01:44:34.041469 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9f33b5a-2935-456d-9895-8fd0285664a5" containerName="mariadb-database-create" Mar 13 01:44:34.041522 master-0 kubenswrapper[19170]: I0313 01:44:34.041517 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9f33b5a-2935-456d-9895-8fd0285664a5" containerName="mariadb-database-create" Mar 13 01:44:34.041765 master-0 kubenswrapper[19170]: E0313 01:44:34.041539 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86eb32e2-6a29-42bf-9b09-2f6cbe41a0eb" containerName="mariadb-database-create" Mar 13 01:44:34.041765 master-0 kubenswrapper[19170]: I0313 01:44:34.041556 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="86eb32e2-6a29-42bf-9b09-2f6cbe41a0eb" containerName="mariadb-database-create" Mar 13 01:44:34.041765 master-0 kubenswrapper[19170]: E0313 01:44:34.041597 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="291c0e40-0b9b-4bd3-b8d6-1983878467ae" containerName="swift-ring-rebalance" Mar 13 01:44:34.041765 master-0 kubenswrapper[19170]: I0313 01:44:34.041611 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="291c0e40-0b9b-4bd3-b8d6-1983878467ae" containerName="swift-ring-rebalance" Mar 13 01:44:34.041765 master-0 kubenswrapper[19170]: E0313 01:44:34.041678 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95fca356-91a3-4e52-aa79-57ad8399523e" containerName="mariadb-account-create-update" Mar 13 01:44:34.041765 master-0 kubenswrapper[19170]: I0313 01:44:34.041692 19170 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="95fca356-91a3-4e52-aa79-57ad8399523e" containerName="mariadb-account-create-update" Mar 13 01:44:34.041765 master-0 kubenswrapper[19170]: E0313 01:44:34.041738 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68585267-5574-4c61-98c1-9ddeb2015743" containerName="mariadb-database-create" Mar 13 01:44:34.041765 master-0 kubenswrapper[19170]: I0313 01:44:34.041751 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="68585267-5574-4c61-98c1-9ddeb2015743" containerName="mariadb-database-create" Mar 13 01:44:34.041765 master-0 kubenswrapper[19170]: E0313 01:44:34.041777 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20df490e-cdef-41c3-b9c2-b9724a4d5ac9" containerName="mariadb-account-create-update" Mar 13 01:44:34.042263 master-0 kubenswrapper[19170]: I0313 01:44:34.041790 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="20df490e-cdef-41c3-b9c2-b9724a4d5ac9" containerName="mariadb-account-create-update" Mar 13 01:44:34.042263 master-0 kubenswrapper[19170]: E0313 01:44:34.041834 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c35d0626-7620-40f1-b7ca-d6eef1cc775e" containerName="mariadb-account-create-update" Mar 13 01:44:34.042263 master-0 kubenswrapper[19170]: I0313 01:44:34.041847 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="c35d0626-7620-40f1-b7ca-d6eef1cc775e" containerName="mariadb-account-create-update" Mar 13 01:44:34.042263 master-0 kubenswrapper[19170]: I0313 01:44:34.042232 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="291c0e40-0b9b-4bd3-b8d6-1983878467ae" containerName="swift-ring-rebalance" Mar 13 01:44:34.042439 master-0 kubenswrapper[19170]: I0313 01:44:34.042280 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="c35d0626-7620-40f1-b7ca-d6eef1cc775e" containerName="mariadb-account-create-update" Mar 13 01:44:34.042439 master-0 kubenswrapper[19170]: I0313 01:44:34.042306 19170 
memory_manager.go:354] "RemoveStaleState removing state" podUID="20df490e-cdef-41c3-b9c2-b9724a4d5ac9" containerName="mariadb-account-create-update" Mar 13 01:44:34.042439 master-0 kubenswrapper[19170]: I0313 01:44:34.042334 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9f33b5a-2935-456d-9895-8fd0285664a5" containerName="mariadb-database-create" Mar 13 01:44:34.042439 master-0 kubenswrapper[19170]: I0313 01:44:34.042365 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="68585267-5574-4c61-98c1-9ddeb2015743" containerName="mariadb-database-create" Mar 13 01:44:34.042439 master-0 kubenswrapper[19170]: I0313 01:44:34.042390 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="95fca356-91a3-4e52-aa79-57ad8399523e" containerName="mariadb-account-create-update" Mar 13 01:44:34.042439 master-0 kubenswrapper[19170]: I0313 01:44:34.042405 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="86eb32e2-6a29-42bf-9b09-2f6cbe41a0eb" containerName="mariadb-database-create" Mar 13 01:44:34.043673 master-0 kubenswrapper[19170]: I0313 01:44:34.043574 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-spk4p" Mar 13 01:44:34.052500 master-0 kubenswrapper[19170]: I0313 01:44:34.048105 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 13 01:44:34.078415 master-0 kubenswrapper[19170]: I0313 01:44:34.057577 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-spk4p"] Mar 13 01:44:34.184559 master-0 kubenswrapper[19170]: I0313 01:44:34.184506 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63edfcff-8055-449f-9dd1-3d210972a805-operator-scripts\") pod \"root-account-create-update-spk4p\" (UID: \"63edfcff-8055-449f-9dd1-3d210972a805\") " pod="openstack/root-account-create-update-spk4p" Mar 13 01:44:34.184761 master-0 kubenswrapper[19170]: I0313 01:44:34.184607 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5nkq\" (UniqueName: \"kubernetes.io/projected/63edfcff-8055-449f-9dd1-3d210972a805-kube-api-access-p5nkq\") pod \"root-account-create-update-spk4p\" (UID: \"63edfcff-8055-449f-9dd1-3d210972a805\") " pod="openstack/root-account-create-update-spk4p" Mar 13 01:44:34.286940 master-0 kubenswrapper[19170]: I0313 01:44:34.286871 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63edfcff-8055-449f-9dd1-3d210972a805-operator-scripts\") pod \"root-account-create-update-spk4p\" (UID: \"63edfcff-8055-449f-9dd1-3d210972a805\") " pod="openstack/root-account-create-update-spk4p" Mar 13 01:44:34.287077 master-0 kubenswrapper[19170]: I0313 01:44:34.287054 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5nkq\" (UniqueName: 
\"kubernetes.io/projected/63edfcff-8055-449f-9dd1-3d210972a805-kube-api-access-p5nkq\") pod \"root-account-create-update-spk4p\" (UID: \"63edfcff-8055-449f-9dd1-3d210972a805\") " pod="openstack/root-account-create-update-spk4p" Mar 13 01:44:34.287673 master-0 kubenswrapper[19170]: I0313 01:44:34.287609 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63edfcff-8055-449f-9dd1-3d210972a805-operator-scripts\") pod \"root-account-create-update-spk4p\" (UID: \"63edfcff-8055-449f-9dd1-3d210972a805\") " pod="openstack/root-account-create-update-spk4p" Mar 13 01:44:34.303489 master-0 kubenswrapper[19170]: I0313 01:44:34.303433 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5nkq\" (UniqueName: \"kubernetes.io/projected/63edfcff-8055-449f-9dd1-3d210972a805-kube-api-access-p5nkq\") pod \"root-account-create-update-spk4p\" (UID: \"63edfcff-8055-449f-9dd1-3d210972a805\") " pod="openstack/root-account-create-update-spk4p" Mar 13 01:44:34.441329 master-0 kubenswrapper[19170]: I0313 01:44:34.441249 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-spk4p" Mar 13 01:44:34.636758 master-0 kubenswrapper[19170]: I0313 01:44:34.628955 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6d54c519-9366-45f7-86e2-ebcdd8bb2477","Type":"ContainerStarted","Data":"4815cbf49cc8df5ac6c4063157b24de122b01f1059933a7a1936110c582a5601"} Mar 13 01:44:34.636758 master-0 kubenswrapper[19170]: I0313 01:44:34.629013 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6d54c519-9366-45f7-86e2-ebcdd8bb2477","Type":"ContainerStarted","Data":"5fa89db4ad81bc842cffa15b66038b552079ca0133a5d62cb97e4c7d7c09003b"} Mar 13 01:44:34.636758 master-0 kubenswrapper[19170]: I0313 01:44:34.629027 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6d54c519-9366-45f7-86e2-ebcdd8bb2477","Type":"ContainerStarted","Data":"1ef6a7db2c0adc689b536b369facbfcf28f73926c944628aaeef5d48b8c73ac7"} Mar 13 01:44:34.636758 master-0 kubenswrapper[19170]: I0313 01:44:34.629038 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6d54c519-9366-45f7-86e2-ebcdd8bb2477","Type":"ContainerStarted","Data":"ac2370ba56837edf8b5958eb40d666baee22731ce7eba4e1c8fddaef77b7d492"} Mar 13 01:44:34.943456 master-0 kubenswrapper[19170]: W0313 01:44:34.943369 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63edfcff_8055_449f_9dd1_3d210972a805.slice/crio-a8f811441552ef3dcfdfb1fc4acd38801f0a70dfbd38270e512e91a73e5773ad WatchSource:0}: Error finding container a8f811441552ef3dcfdfb1fc4acd38801f0a70dfbd38270e512e91a73e5773ad: Status 404 returned error can't find the container with id a8f811441552ef3dcfdfb1fc4acd38801f0a70dfbd38270e512e91a73e5773ad Mar 13 01:44:34.969072 master-0 kubenswrapper[19170]: I0313 01:44:34.968988 19170 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-spk4p"] Mar 13 01:44:35.643480 master-0 kubenswrapper[19170]: I0313 01:44:35.643325 19170 generic.go:334] "Generic (PLEG): container finished" podID="63edfcff-8055-449f-9dd1-3d210972a805" containerID="eb06e65eaa26cf301e50db054a3bbd3911d80cf69148e95f0f395dd50f2b407a" exitCode=0 Mar 13 01:44:35.643480 master-0 kubenswrapper[19170]: I0313 01:44:35.643421 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-spk4p" event={"ID":"63edfcff-8055-449f-9dd1-3d210972a805","Type":"ContainerDied","Data":"eb06e65eaa26cf301e50db054a3bbd3911d80cf69148e95f0f395dd50f2b407a"} Mar 13 01:44:35.643480 master-0 kubenswrapper[19170]: I0313 01:44:35.643462 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-spk4p" event={"ID":"63edfcff-8055-449f-9dd1-3d210972a805","Type":"ContainerStarted","Data":"a8f811441552ef3dcfdfb1fc4acd38801f0a70dfbd38270e512e91a73e5773ad"} Mar 13 01:44:35.653193 master-0 kubenswrapper[19170]: I0313 01:44:35.653111 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6d54c519-9366-45f7-86e2-ebcdd8bb2477","Type":"ContainerStarted","Data":"1a732a7f7fc0d88998e86891a63300f78b70947a56bfc2b0180af7fbe22a9bce"} Mar 13 01:44:35.653193 master-0 kubenswrapper[19170]: I0313 01:44:35.653177 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6d54c519-9366-45f7-86e2-ebcdd8bb2477","Type":"ContainerStarted","Data":"6d012e836441c16d04b8ab6c6d7d593aebc78ae3606c49dda09775485932f976"} Mar 13 01:44:35.653193 master-0 kubenswrapper[19170]: I0313 01:44:35.653188 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6d54c519-9366-45f7-86e2-ebcdd8bb2477","Type":"ContainerStarted","Data":"5faf01b044e986b86315d01783f3f91c65e233d178503648adc0d419e017d0c9"} Mar 13 01:44:35.726826 master-0 
kubenswrapper[19170]: I0313 01:44:35.726688 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=22.359841211 podStartE2EDuration="27.726659148s" podCreationTimestamp="2026-03-13 01:44:08 +0000 UTC" firstStartedPulling="2026-03-13 01:44:28.207567994 +0000 UTC m=+1529.015688974" lastFinishedPulling="2026-03-13 01:44:33.574385931 +0000 UTC m=+1534.382506911" observedRunningTime="2026-03-13 01:44:35.70653682 +0000 UTC m=+1536.514657780" watchObservedRunningTime="2026-03-13 01:44:35.726659148 +0000 UTC m=+1536.534780138" Mar 13 01:44:35.955677 master-0 kubenswrapper[19170]: I0313 01:44:35.953666 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-bvpk6"] Mar 13 01:44:35.955677 master-0 kubenswrapper[19170]: I0313 01:44:35.955281 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-bvpk6" Mar 13 01:44:35.962219 master-0 kubenswrapper[19170]: I0313 01:44:35.962054 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-b9844-config-data" Mar 13 01:44:35.985717 master-0 kubenswrapper[19170]: I0313 01:44:35.982538 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-bvpk6"] Mar 13 01:44:36.037792 master-0 kubenswrapper[19170]: I0313 01:44:36.037582 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 13 01:44:36.116575 master-0 kubenswrapper[19170]: I0313 01:44:36.114824 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76986c7db5-cp72m"] Mar 13 01:44:36.117237 master-0 kubenswrapper[19170]: I0313 01:44:36.117203 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76986c7db5-cp72m" Mar 13 01:44:36.129280 master-0 kubenswrapper[19170]: I0313 01:44:36.129088 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 13 01:44:36.129703 master-0 kubenswrapper[19170]: I0313 01:44:36.129529 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76986c7db5-cp72m"] Mar 13 01:44:36.134455 master-0 kubenswrapper[19170]: I0313 01:44:36.134393 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534d7c6b-ec6b-48e0-8ace-0d893ae9da6a-combined-ca-bundle\") pod \"glance-db-sync-bvpk6\" (UID: \"534d7c6b-ec6b-48e0-8ace-0d893ae9da6a\") " pod="openstack/glance-db-sync-bvpk6" Mar 13 01:44:36.134780 master-0 kubenswrapper[19170]: I0313 01:44:36.134742 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/534d7c6b-ec6b-48e0-8ace-0d893ae9da6a-db-sync-config-data\") pod \"glance-db-sync-bvpk6\" (UID: \"534d7c6b-ec6b-48e0-8ace-0d893ae9da6a\") " pod="openstack/glance-db-sync-bvpk6" Mar 13 01:44:36.135000 master-0 kubenswrapper[19170]: I0313 01:44:36.134970 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/534d7c6b-ec6b-48e0-8ace-0d893ae9da6a-config-data\") pod \"glance-db-sync-bvpk6\" (UID: \"534d7c6b-ec6b-48e0-8ace-0d893ae9da6a\") " pod="openstack/glance-db-sync-bvpk6" Mar 13 01:44:36.135047 master-0 kubenswrapper[19170]: I0313 01:44:36.135002 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjr7x\" (UniqueName: \"kubernetes.io/projected/534d7c6b-ec6b-48e0-8ace-0d893ae9da6a-kube-api-access-mjr7x\") pod \"glance-db-sync-bvpk6\" (UID: 
\"534d7c6b-ec6b-48e0-8ace-0d893ae9da6a\") " pod="openstack/glance-db-sync-bvpk6" Mar 13 01:44:36.236724 master-0 kubenswrapper[19170]: I0313 01:44:36.236569 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8679fcd5-4aee-4742-8f0f-fa761f7f5b88-dns-svc\") pod \"dnsmasq-dns-76986c7db5-cp72m\" (UID: \"8679fcd5-4aee-4742-8f0f-fa761f7f5b88\") " pod="openstack/dnsmasq-dns-76986c7db5-cp72m" Mar 13 01:44:36.236724 master-0 kubenswrapper[19170]: I0313 01:44:36.236705 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/534d7c6b-ec6b-48e0-8ace-0d893ae9da6a-db-sync-config-data\") pod \"glance-db-sync-bvpk6\" (UID: \"534d7c6b-ec6b-48e0-8ace-0d893ae9da6a\") " pod="openstack/glance-db-sync-bvpk6" Mar 13 01:44:36.236976 master-0 kubenswrapper[19170]: I0313 01:44:36.236778 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzwt7\" (UniqueName: \"kubernetes.io/projected/8679fcd5-4aee-4742-8f0f-fa761f7f5b88-kube-api-access-jzwt7\") pod \"dnsmasq-dns-76986c7db5-cp72m\" (UID: \"8679fcd5-4aee-4742-8f0f-fa761f7f5b88\") " pod="openstack/dnsmasq-dns-76986c7db5-cp72m" Mar 13 01:44:36.236976 master-0 kubenswrapper[19170]: I0313 01:44:36.236836 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8679fcd5-4aee-4742-8f0f-fa761f7f5b88-dns-swift-storage-0\") pod \"dnsmasq-dns-76986c7db5-cp72m\" (UID: \"8679fcd5-4aee-4742-8f0f-fa761f7f5b88\") " pod="openstack/dnsmasq-dns-76986c7db5-cp72m" Mar 13 01:44:36.236976 master-0 kubenswrapper[19170]: I0313 01:44:36.236929 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjr7x\" (UniqueName: 
\"kubernetes.io/projected/534d7c6b-ec6b-48e0-8ace-0d893ae9da6a-kube-api-access-mjr7x\") pod \"glance-db-sync-bvpk6\" (UID: \"534d7c6b-ec6b-48e0-8ace-0d893ae9da6a\") " pod="openstack/glance-db-sync-bvpk6" Mar 13 01:44:36.236976 master-0 kubenswrapper[19170]: I0313 01:44:36.236957 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/534d7c6b-ec6b-48e0-8ace-0d893ae9da6a-config-data\") pod \"glance-db-sync-bvpk6\" (UID: \"534d7c6b-ec6b-48e0-8ace-0d893ae9da6a\") " pod="openstack/glance-db-sync-bvpk6" Mar 13 01:44:36.237149 master-0 kubenswrapper[19170]: I0313 01:44:36.236986 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8679fcd5-4aee-4742-8f0f-fa761f7f5b88-ovsdbserver-sb\") pod \"dnsmasq-dns-76986c7db5-cp72m\" (UID: \"8679fcd5-4aee-4742-8f0f-fa761f7f5b88\") " pod="openstack/dnsmasq-dns-76986c7db5-cp72m" Mar 13 01:44:36.237149 master-0 kubenswrapper[19170]: I0313 01:44:36.237027 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8679fcd5-4aee-4742-8f0f-fa761f7f5b88-config\") pod \"dnsmasq-dns-76986c7db5-cp72m\" (UID: \"8679fcd5-4aee-4742-8f0f-fa761f7f5b88\") " pod="openstack/dnsmasq-dns-76986c7db5-cp72m" Mar 13 01:44:36.237149 master-0 kubenswrapper[19170]: I0313 01:44:36.237079 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8679fcd5-4aee-4742-8f0f-fa761f7f5b88-ovsdbserver-nb\") pod \"dnsmasq-dns-76986c7db5-cp72m\" (UID: \"8679fcd5-4aee-4742-8f0f-fa761f7f5b88\") " pod="openstack/dnsmasq-dns-76986c7db5-cp72m" Mar 13 01:44:36.237149 master-0 kubenswrapper[19170]: I0313 01:44:36.237109 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534d7c6b-ec6b-48e0-8ace-0d893ae9da6a-combined-ca-bundle\") pod \"glance-db-sync-bvpk6\" (UID: \"534d7c6b-ec6b-48e0-8ace-0d893ae9da6a\") " pod="openstack/glance-db-sync-bvpk6" Mar 13 01:44:36.241390 master-0 kubenswrapper[19170]: I0313 01:44:36.241352 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534d7c6b-ec6b-48e0-8ace-0d893ae9da6a-combined-ca-bundle\") pod \"glance-db-sync-bvpk6\" (UID: \"534d7c6b-ec6b-48e0-8ace-0d893ae9da6a\") " pod="openstack/glance-db-sync-bvpk6" Mar 13 01:44:36.242362 master-0 kubenswrapper[19170]: I0313 01:44:36.242261 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/534d7c6b-ec6b-48e0-8ace-0d893ae9da6a-config-data\") pod \"glance-db-sync-bvpk6\" (UID: \"534d7c6b-ec6b-48e0-8ace-0d893ae9da6a\") " pod="openstack/glance-db-sync-bvpk6" Mar 13 01:44:36.245404 master-0 kubenswrapper[19170]: I0313 01:44:36.245203 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/534d7c6b-ec6b-48e0-8ace-0d893ae9da6a-db-sync-config-data\") pod \"glance-db-sync-bvpk6\" (UID: \"534d7c6b-ec6b-48e0-8ace-0d893ae9da6a\") " pod="openstack/glance-db-sync-bvpk6" Mar 13 01:44:36.259496 master-0 kubenswrapper[19170]: I0313 01:44:36.259432 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjr7x\" (UniqueName: \"kubernetes.io/projected/534d7c6b-ec6b-48e0-8ace-0d893ae9da6a-kube-api-access-mjr7x\") pod \"glance-db-sync-bvpk6\" (UID: \"534d7c6b-ec6b-48e0-8ace-0d893ae9da6a\") " pod="openstack/glance-db-sync-bvpk6" Mar 13 01:44:36.312398 master-0 kubenswrapper[19170]: I0313 01:44:36.312334 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-bvpk6" Mar 13 01:44:36.339726 master-0 kubenswrapper[19170]: I0313 01:44:36.339678 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8679fcd5-4aee-4742-8f0f-fa761f7f5b88-ovsdbserver-sb\") pod \"dnsmasq-dns-76986c7db5-cp72m\" (UID: \"8679fcd5-4aee-4742-8f0f-fa761f7f5b88\") " pod="openstack/dnsmasq-dns-76986c7db5-cp72m" Mar 13 01:44:36.340015 master-0 kubenswrapper[19170]: I0313 01:44:36.339996 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8679fcd5-4aee-4742-8f0f-fa761f7f5b88-config\") pod \"dnsmasq-dns-76986c7db5-cp72m\" (UID: \"8679fcd5-4aee-4742-8f0f-fa761f7f5b88\") " pod="openstack/dnsmasq-dns-76986c7db5-cp72m" Mar 13 01:44:36.340171 master-0 kubenswrapper[19170]: I0313 01:44:36.340152 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8679fcd5-4aee-4742-8f0f-fa761f7f5b88-ovsdbserver-nb\") pod \"dnsmasq-dns-76986c7db5-cp72m\" (UID: \"8679fcd5-4aee-4742-8f0f-fa761f7f5b88\") " pod="openstack/dnsmasq-dns-76986c7db5-cp72m" Mar 13 01:44:36.340317 master-0 kubenswrapper[19170]: I0313 01:44:36.340301 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8679fcd5-4aee-4742-8f0f-fa761f7f5b88-dns-svc\") pod \"dnsmasq-dns-76986c7db5-cp72m\" (UID: \"8679fcd5-4aee-4742-8f0f-fa761f7f5b88\") " pod="openstack/dnsmasq-dns-76986c7db5-cp72m" Mar 13 01:44:36.340470 master-0 kubenswrapper[19170]: I0313 01:44:36.340452 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzwt7\" (UniqueName: \"kubernetes.io/projected/8679fcd5-4aee-4742-8f0f-fa761f7f5b88-kube-api-access-jzwt7\") pod \"dnsmasq-dns-76986c7db5-cp72m\" (UID: 
\"8679fcd5-4aee-4742-8f0f-fa761f7f5b88\") " pod="openstack/dnsmasq-dns-76986c7db5-cp72m" Mar 13 01:44:36.340623 master-0 kubenswrapper[19170]: I0313 01:44:36.340605 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8679fcd5-4aee-4742-8f0f-fa761f7f5b88-dns-swift-storage-0\") pod \"dnsmasq-dns-76986c7db5-cp72m\" (UID: \"8679fcd5-4aee-4742-8f0f-fa761f7f5b88\") " pod="openstack/dnsmasq-dns-76986c7db5-cp72m" Mar 13 01:44:36.340778 master-0 kubenswrapper[19170]: I0313 01:44:36.340749 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8679fcd5-4aee-4742-8f0f-fa761f7f5b88-ovsdbserver-sb\") pod \"dnsmasq-dns-76986c7db5-cp72m\" (UID: \"8679fcd5-4aee-4742-8f0f-fa761f7f5b88\") " pod="openstack/dnsmasq-dns-76986c7db5-cp72m" Mar 13 01:44:36.341125 master-0 kubenswrapper[19170]: I0313 01:44:36.341084 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8679fcd5-4aee-4742-8f0f-fa761f7f5b88-ovsdbserver-nb\") pod \"dnsmasq-dns-76986c7db5-cp72m\" (UID: \"8679fcd5-4aee-4742-8f0f-fa761f7f5b88\") " pod="openstack/dnsmasq-dns-76986c7db5-cp72m" Mar 13 01:44:36.341257 master-0 kubenswrapper[19170]: I0313 01:44:36.341209 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8679fcd5-4aee-4742-8f0f-fa761f7f5b88-dns-svc\") pod \"dnsmasq-dns-76986c7db5-cp72m\" (UID: \"8679fcd5-4aee-4742-8f0f-fa761f7f5b88\") " pod="openstack/dnsmasq-dns-76986c7db5-cp72m" Mar 13 01:44:36.341519 master-0 kubenswrapper[19170]: I0313 01:44:36.341470 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8679fcd5-4aee-4742-8f0f-fa761f7f5b88-dns-swift-storage-0\") pod \"dnsmasq-dns-76986c7db5-cp72m\" (UID: 
\"8679fcd5-4aee-4742-8f0f-fa761f7f5b88\") " pod="openstack/dnsmasq-dns-76986c7db5-cp72m" Mar 13 01:44:36.341519 master-0 kubenswrapper[19170]: I0313 01:44:36.341497 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8679fcd5-4aee-4742-8f0f-fa761f7f5b88-config\") pod \"dnsmasq-dns-76986c7db5-cp72m\" (UID: \"8679fcd5-4aee-4742-8f0f-fa761f7f5b88\") " pod="openstack/dnsmasq-dns-76986c7db5-cp72m" Mar 13 01:44:36.360831 master-0 kubenswrapper[19170]: I0313 01:44:36.360600 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzwt7\" (UniqueName: \"kubernetes.io/projected/8679fcd5-4aee-4742-8f0f-fa761f7f5b88-kube-api-access-jzwt7\") pod \"dnsmasq-dns-76986c7db5-cp72m\" (UID: \"8679fcd5-4aee-4742-8f0f-fa761f7f5b88\") " pod="openstack/dnsmasq-dns-76986c7db5-cp72m" Mar 13 01:44:36.442934 master-0 kubenswrapper[19170]: I0313 01:44:36.442759 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76986c7db5-cp72m" Mar 13 01:44:36.528313 master-0 kubenswrapper[19170]: I0313 01:44:36.528232 19170 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-gvjkt" podUID="db2c7f91-4b25-4e56-9d4a-ce6a885121f9" containerName="ovn-controller" probeResult="failure" output=< Mar 13 01:44:36.528313 master-0 kubenswrapper[19170]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 13 01:44:36.528313 master-0 kubenswrapper[19170]: > Mar 13 01:44:36.946204 master-0 kubenswrapper[19170]: I0313 01:44:36.946033 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-bvpk6"] Mar 13 01:44:36.983038 master-0 kubenswrapper[19170]: I0313 01:44:36.982796 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76986c7db5-cp72m"] Mar 13 01:44:37.000613 master-0 kubenswrapper[19170]: W0313 01:44:37.000557 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8679fcd5_4aee_4742_8f0f_fa761f7f5b88.slice/crio-620aeb8240c3c5fd35392c7249faef1b781c228bde7aacd7f3a86e3b3f6af909 WatchSource:0}: Error finding container 620aeb8240c3c5fd35392c7249faef1b781c228bde7aacd7f3a86e3b3f6af909: Status 404 returned error can't find the container with id 620aeb8240c3c5fd35392c7249faef1b781c228bde7aacd7f3a86e3b3f6af909 Mar 13 01:44:37.148937 master-0 kubenswrapper[19170]: I0313 01:44:37.148790 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-spk4p" Mar 13 01:44:37.271712 master-0 kubenswrapper[19170]: I0313 01:44:37.270478 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63edfcff-8055-449f-9dd1-3d210972a805-operator-scripts\") pod \"63edfcff-8055-449f-9dd1-3d210972a805\" (UID: \"63edfcff-8055-449f-9dd1-3d210972a805\") " Mar 13 01:44:37.271712 master-0 kubenswrapper[19170]: I0313 01:44:37.270551 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5nkq\" (UniqueName: \"kubernetes.io/projected/63edfcff-8055-449f-9dd1-3d210972a805-kube-api-access-p5nkq\") pod \"63edfcff-8055-449f-9dd1-3d210972a805\" (UID: \"63edfcff-8055-449f-9dd1-3d210972a805\") " Mar 13 01:44:37.271712 master-0 kubenswrapper[19170]: I0313 01:44:37.271049 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63edfcff-8055-449f-9dd1-3d210972a805-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "63edfcff-8055-449f-9dd1-3d210972a805" (UID: "63edfcff-8055-449f-9dd1-3d210972a805"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:44:37.271712 master-0 kubenswrapper[19170]: I0313 01:44:37.271307 19170 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63edfcff-8055-449f-9dd1-3d210972a805-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:37.274226 master-0 kubenswrapper[19170]: I0313 01:44:37.274171 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63edfcff-8055-449f-9dd1-3d210972a805-kube-api-access-p5nkq" (OuterVolumeSpecName: "kube-api-access-p5nkq") pod "63edfcff-8055-449f-9dd1-3d210972a805" (UID: "63edfcff-8055-449f-9dd1-3d210972a805"). 
InnerVolumeSpecName "kube-api-access-p5nkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:44:37.374256 master-0 kubenswrapper[19170]: I0313 01:44:37.374110 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5nkq\" (UniqueName: \"kubernetes.io/projected/63edfcff-8055-449f-9dd1-3d210972a805-kube-api-access-p5nkq\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:37.686557 master-0 kubenswrapper[19170]: I0313 01:44:37.686488 19170 generic.go:334] "Generic (PLEG): container finished" podID="8679fcd5-4aee-4742-8f0f-fa761f7f5b88" containerID="8d6c2c705ad73536235b05189413975bfd39f4ea5d81ee51093ad695ce664c84" exitCode=0 Mar 13 01:44:37.686891 master-0 kubenswrapper[19170]: I0313 01:44:37.686593 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76986c7db5-cp72m" event={"ID":"8679fcd5-4aee-4742-8f0f-fa761f7f5b88","Type":"ContainerDied","Data":"8d6c2c705ad73536235b05189413975bfd39f4ea5d81ee51093ad695ce664c84"} Mar 13 01:44:37.686891 master-0 kubenswrapper[19170]: I0313 01:44:37.686648 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76986c7db5-cp72m" event={"ID":"8679fcd5-4aee-4742-8f0f-fa761f7f5b88","Type":"ContainerStarted","Data":"620aeb8240c3c5fd35392c7249faef1b781c228bde7aacd7f3a86e3b3f6af909"} Mar 13 01:44:37.693457 master-0 kubenswrapper[19170]: I0313 01:44:37.693371 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-bvpk6" event={"ID":"534d7c6b-ec6b-48e0-8ace-0d893ae9da6a","Type":"ContainerStarted","Data":"04c2b01ab38012d6fd32707d0e449e3b8bd4c9ebf44b79faf7eabc0bcdecc26c"} Mar 13 01:44:37.698247 master-0 kubenswrapper[19170]: I0313 01:44:37.698184 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-spk4p" event={"ID":"63edfcff-8055-449f-9dd1-3d210972a805","Type":"ContainerDied","Data":"a8f811441552ef3dcfdfb1fc4acd38801f0a70dfbd38270e512e91a73e5773ad"} Mar 13 
01:44:37.698247 master-0 kubenswrapper[19170]: I0313 01:44:37.698243 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8f811441552ef3dcfdfb1fc4acd38801f0a70dfbd38270e512e91a73e5773ad" Mar 13 01:44:37.698486 master-0 kubenswrapper[19170]: I0313 01:44:37.698308 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-spk4p" Mar 13 01:44:38.711035 master-0 kubenswrapper[19170]: I0313 01:44:38.710914 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76986c7db5-cp72m" event={"ID":"8679fcd5-4aee-4742-8f0f-fa761f7f5b88","Type":"ContainerStarted","Data":"f72d6958b3b63b4c7165dcb5dd5530fc00e3702e0113f143add503cb55582c35"} Mar 13 01:44:38.711604 master-0 kubenswrapper[19170]: I0313 01:44:38.711367 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76986c7db5-cp72m" Mar 13 01:44:38.714710 master-0 kubenswrapper[19170]: I0313 01:44:38.714689 19170 generic.go:334] "Generic (PLEG): container finished" podID="0707bd16-09db-4d83-af8a-f8e7b78fad40" containerID="bb6bffbc30d351de963311b4228c6f118e59823900a8efd9445d02d2fe1a59f0" exitCode=0 Mar 13 01:44:38.714829 master-0 kubenswrapper[19170]: I0313 01:44:38.714815 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0707bd16-09db-4d83-af8a-f8e7b78fad40","Type":"ContainerDied","Data":"bb6bffbc30d351de963311b4228c6f118e59823900a8efd9445d02d2fe1a59f0"} Mar 13 01:44:38.720055 master-0 kubenswrapper[19170]: I0313 01:44:38.719472 19170 generic.go:334] "Generic (PLEG): container finished" podID="e9844d28-b0df-4563-a746-76fde502f19a" containerID="457f1a43de4ea348a7aa7ead6c14fa234cefc27e0071dda81f902416e3163f34" exitCode=0 Mar 13 01:44:38.720055 master-0 kubenswrapper[19170]: I0313 01:44:38.719509 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"e9844d28-b0df-4563-a746-76fde502f19a","Type":"ContainerDied","Data":"457f1a43de4ea348a7aa7ead6c14fa234cefc27e0071dda81f902416e3163f34"} Mar 13 01:44:38.730569 master-0 kubenswrapper[19170]: I0313 01:44:38.730519 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76986c7db5-cp72m" podStartSLOduration=2.7305045679999997 podStartE2EDuration="2.730504568s" podCreationTimestamp="2026-03-13 01:44:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:44:38.728703617 +0000 UTC m=+1539.536824577" watchObservedRunningTime="2026-03-13 01:44:38.730504568 +0000 UTC m=+1539.538625528" Mar 13 01:44:39.732499 master-0 kubenswrapper[19170]: I0313 01:44:39.731081 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0707bd16-09db-4d83-af8a-f8e7b78fad40","Type":"ContainerStarted","Data":"64f90c922910ddc8392a1d8bed71679e049ae93fa70b306ae77737b02b87fcd7"} Mar 13 01:44:39.733481 master-0 kubenswrapper[19170]: I0313 01:44:39.733173 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 13 01:44:39.740648 master-0 kubenswrapper[19170]: I0313 01:44:39.736810 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e9844d28-b0df-4563-a746-76fde502f19a","Type":"ContainerStarted","Data":"3f4a3882a6e6480d93b0fe4994f5190ff35fa7659e8ea5e03d8e59e1cd075437"} Mar 13 01:44:39.740648 master-0 kubenswrapper[19170]: I0313 01:44:39.737672 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 13 01:44:39.770473 master-0 kubenswrapper[19170]: I0313 01:44:39.770417 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=51.407216101 podStartE2EDuration="58.77040189s" 
podCreationTimestamp="2026-03-13 01:43:41 +0000 UTC" firstStartedPulling="2026-03-13 01:43:57.700656993 +0000 UTC m=+1498.508777953" lastFinishedPulling="2026-03-13 01:44:05.063842782 +0000 UTC m=+1505.871963742" observedRunningTime="2026-03-13 01:44:39.768746053 +0000 UTC m=+1540.576867013" watchObservedRunningTime="2026-03-13 01:44:39.77040189 +0000 UTC m=+1540.578522850" Mar 13 01:44:39.792969 master-0 kubenswrapper[19170]: I0313 01:44:39.792900 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=51.516903933 podStartE2EDuration="58.792881474s" podCreationTimestamp="2026-03-13 01:43:41 +0000 UTC" firstStartedPulling="2026-03-13 01:43:57.551613522 +0000 UTC m=+1498.359734482" lastFinishedPulling="2026-03-13 01:44:04.827591063 +0000 UTC m=+1505.635712023" observedRunningTime="2026-03-13 01:44:39.789964381 +0000 UTC m=+1540.598085341" watchObservedRunningTime="2026-03-13 01:44:39.792881474 +0000 UTC m=+1540.601002434" Mar 13 01:44:41.499903 master-0 kubenswrapper[19170]: I0313 01:44:41.499218 19170 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-gvjkt" podUID="db2c7f91-4b25-4e56-9d4a-ce6a885121f9" containerName="ovn-controller" probeResult="failure" output=< Mar 13 01:44:41.499903 master-0 kubenswrapper[19170]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 13 01:44:41.499903 master-0 kubenswrapper[19170]: > Mar 13 01:44:41.508116 master-0 kubenswrapper[19170]: I0313 01:44:41.507946 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-p9j9l" Mar 13 01:44:41.525746 master-0 kubenswrapper[19170]: I0313 01:44:41.525692 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-p9j9l" Mar 13 01:44:41.782004 master-0 kubenswrapper[19170]: I0313 01:44:41.781882 19170 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/ovn-controller-gvjkt-config-rgk9d"] Mar 13 01:44:41.782383 master-0 kubenswrapper[19170]: E0313 01:44:41.782366 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63edfcff-8055-449f-9dd1-3d210972a805" containerName="mariadb-account-create-update" Mar 13 01:44:41.782383 master-0 kubenswrapper[19170]: I0313 01:44:41.782381 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="63edfcff-8055-449f-9dd1-3d210972a805" containerName="mariadb-account-create-update" Mar 13 01:44:41.782701 master-0 kubenswrapper[19170]: I0313 01:44:41.782674 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="63edfcff-8055-449f-9dd1-3d210972a805" containerName="mariadb-account-create-update" Mar 13 01:44:41.783485 master-0 kubenswrapper[19170]: I0313 01:44:41.783432 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-gvjkt-config-rgk9d" Mar 13 01:44:41.795536 master-0 kubenswrapper[19170]: I0313 01:44:41.795373 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-gvjkt-config-rgk9d"] Mar 13 01:44:41.828963 master-0 kubenswrapper[19170]: I0313 01:44:41.827655 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 13 01:44:41.930335 master-0 kubenswrapper[19170]: I0313 01:44:41.930260 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6dc119d6-19fa-4bb5-9989-8f759c6daefc-additional-scripts\") pod \"ovn-controller-gvjkt-config-rgk9d\" (UID: \"6dc119d6-19fa-4bb5-9989-8f759c6daefc\") " pod="openstack/ovn-controller-gvjkt-config-rgk9d" Mar 13 01:44:41.930552 master-0 kubenswrapper[19170]: I0313 01:44:41.930457 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/6dc119d6-19fa-4bb5-9989-8f759c6daefc-scripts\") pod \"ovn-controller-gvjkt-config-rgk9d\" (UID: \"6dc119d6-19fa-4bb5-9989-8f759c6daefc\") " pod="openstack/ovn-controller-gvjkt-config-rgk9d" Mar 13 01:44:41.930652 master-0 kubenswrapper[19170]: I0313 01:44:41.930586 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6dc119d6-19fa-4bb5-9989-8f759c6daefc-var-run\") pod \"ovn-controller-gvjkt-config-rgk9d\" (UID: \"6dc119d6-19fa-4bb5-9989-8f759c6daefc\") " pod="openstack/ovn-controller-gvjkt-config-rgk9d" Mar 13 01:44:41.930720 master-0 kubenswrapper[19170]: I0313 01:44:41.930697 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6dc119d6-19fa-4bb5-9989-8f759c6daefc-var-run-ovn\") pod \"ovn-controller-gvjkt-config-rgk9d\" (UID: \"6dc119d6-19fa-4bb5-9989-8f759c6daefc\") " pod="openstack/ovn-controller-gvjkt-config-rgk9d" Mar 13 01:44:41.931026 master-0 kubenswrapper[19170]: I0313 01:44:41.930853 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6dc119d6-19fa-4bb5-9989-8f759c6daefc-var-log-ovn\") pod \"ovn-controller-gvjkt-config-rgk9d\" (UID: \"6dc119d6-19fa-4bb5-9989-8f759c6daefc\") " pod="openstack/ovn-controller-gvjkt-config-rgk9d" Mar 13 01:44:41.931026 master-0 kubenswrapper[19170]: I0313 01:44:41.930881 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk6pv\" (UniqueName: \"kubernetes.io/projected/6dc119d6-19fa-4bb5-9989-8f759c6daefc-kube-api-access-qk6pv\") pod \"ovn-controller-gvjkt-config-rgk9d\" (UID: \"6dc119d6-19fa-4bb5-9989-8f759c6daefc\") " pod="openstack/ovn-controller-gvjkt-config-rgk9d" Mar 13 01:44:42.033924 master-0 kubenswrapper[19170]: I0313 
01:44:42.033089 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6dc119d6-19fa-4bb5-9989-8f759c6daefc-var-log-ovn\") pod \"ovn-controller-gvjkt-config-rgk9d\" (UID: \"6dc119d6-19fa-4bb5-9989-8f759c6daefc\") " pod="openstack/ovn-controller-gvjkt-config-rgk9d" Mar 13 01:44:42.033924 master-0 kubenswrapper[19170]: I0313 01:44:42.033161 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk6pv\" (UniqueName: \"kubernetes.io/projected/6dc119d6-19fa-4bb5-9989-8f759c6daefc-kube-api-access-qk6pv\") pod \"ovn-controller-gvjkt-config-rgk9d\" (UID: \"6dc119d6-19fa-4bb5-9989-8f759c6daefc\") " pod="openstack/ovn-controller-gvjkt-config-rgk9d" Mar 13 01:44:42.033924 master-0 kubenswrapper[19170]: I0313 01:44:42.033242 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6dc119d6-19fa-4bb5-9989-8f759c6daefc-var-log-ovn\") pod \"ovn-controller-gvjkt-config-rgk9d\" (UID: \"6dc119d6-19fa-4bb5-9989-8f759c6daefc\") " pod="openstack/ovn-controller-gvjkt-config-rgk9d" Mar 13 01:44:42.033924 master-0 kubenswrapper[19170]: I0313 01:44:42.033290 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6dc119d6-19fa-4bb5-9989-8f759c6daefc-additional-scripts\") pod \"ovn-controller-gvjkt-config-rgk9d\" (UID: \"6dc119d6-19fa-4bb5-9989-8f759c6daefc\") " pod="openstack/ovn-controller-gvjkt-config-rgk9d" Mar 13 01:44:42.033924 master-0 kubenswrapper[19170]: I0313 01:44:42.033352 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6dc119d6-19fa-4bb5-9989-8f759c6daefc-scripts\") pod \"ovn-controller-gvjkt-config-rgk9d\" (UID: \"6dc119d6-19fa-4bb5-9989-8f759c6daefc\") " pod="openstack/ovn-controller-gvjkt-config-rgk9d" Mar 13 
01:44:42.033924 master-0 kubenswrapper[19170]: I0313 01:44:42.033410 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6dc119d6-19fa-4bb5-9989-8f759c6daefc-var-run\") pod \"ovn-controller-gvjkt-config-rgk9d\" (UID: \"6dc119d6-19fa-4bb5-9989-8f759c6daefc\") " pod="openstack/ovn-controller-gvjkt-config-rgk9d" Mar 13 01:44:42.033924 master-0 kubenswrapper[19170]: I0313 01:44:42.033619 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6dc119d6-19fa-4bb5-9989-8f759c6daefc-var-run-ovn\") pod \"ovn-controller-gvjkt-config-rgk9d\" (UID: \"6dc119d6-19fa-4bb5-9989-8f759c6daefc\") " pod="openstack/ovn-controller-gvjkt-config-rgk9d" Mar 13 01:44:42.033924 master-0 kubenswrapper[19170]: I0313 01:44:42.033839 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6dc119d6-19fa-4bb5-9989-8f759c6daefc-var-run\") pod \"ovn-controller-gvjkt-config-rgk9d\" (UID: \"6dc119d6-19fa-4bb5-9989-8f759c6daefc\") " pod="openstack/ovn-controller-gvjkt-config-rgk9d" Mar 13 01:44:42.033924 master-0 kubenswrapper[19170]: I0313 01:44:42.033849 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6dc119d6-19fa-4bb5-9989-8f759c6daefc-var-run-ovn\") pod \"ovn-controller-gvjkt-config-rgk9d\" (UID: \"6dc119d6-19fa-4bb5-9989-8f759c6daefc\") " pod="openstack/ovn-controller-gvjkt-config-rgk9d" Mar 13 01:44:42.034468 master-0 kubenswrapper[19170]: I0313 01:44:42.033964 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6dc119d6-19fa-4bb5-9989-8f759c6daefc-additional-scripts\") pod \"ovn-controller-gvjkt-config-rgk9d\" (UID: \"6dc119d6-19fa-4bb5-9989-8f759c6daefc\") " 
pod="openstack/ovn-controller-gvjkt-config-rgk9d" Mar 13 01:44:42.039788 master-0 kubenswrapper[19170]: I0313 01:44:42.039711 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6dc119d6-19fa-4bb5-9989-8f759c6daefc-scripts\") pod \"ovn-controller-gvjkt-config-rgk9d\" (UID: \"6dc119d6-19fa-4bb5-9989-8f759c6daefc\") " pod="openstack/ovn-controller-gvjkt-config-rgk9d" Mar 13 01:44:42.055460 master-0 kubenswrapper[19170]: I0313 01:44:42.053421 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk6pv\" (UniqueName: \"kubernetes.io/projected/6dc119d6-19fa-4bb5-9989-8f759c6daefc-kube-api-access-qk6pv\") pod \"ovn-controller-gvjkt-config-rgk9d\" (UID: \"6dc119d6-19fa-4bb5-9989-8f759c6daefc\") " pod="openstack/ovn-controller-gvjkt-config-rgk9d" Mar 13 01:44:42.191984 master-0 kubenswrapper[19170]: I0313 01:44:42.191916 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-gvjkt-config-rgk9d" Mar 13 01:44:42.680727 master-0 kubenswrapper[19170]: I0313 01:44:42.679384 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-gvjkt-config-rgk9d"] Mar 13 01:44:42.773509 master-0 kubenswrapper[19170]: I0313 01:44:42.773455 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gvjkt-config-rgk9d" event={"ID":"6dc119d6-19fa-4bb5-9989-8f759c6daefc","Type":"ContainerStarted","Data":"cfe6a9bd2da9ce56199dbd8fc0e9318098aeb707fd2b5ce7e7af65bcb9f534ed"} Mar 13 01:44:43.785581 master-0 kubenswrapper[19170]: I0313 01:44:43.785510 19170 generic.go:334] "Generic (PLEG): container finished" podID="6dc119d6-19fa-4bb5-9989-8f759c6daefc" containerID="d5fa74686248cc290070e6e4e8e692e240e2b869fad245862ba0fa15e0ba6388" exitCode=0 Mar 13 01:44:43.785581 master-0 kubenswrapper[19170]: I0313 01:44:43.785567 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-gvjkt-config-rgk9d" event={"ID":"6dc119d6-19fa-4bb5-9989-8f759c6daefc","Type":"ContainerDied","Data":"d5fa74686248cc290070e6e4e8e692e240e2b869fad245862ba0fa15e0ba6388"} Mar 13 01:44:46.444900 master-0 kubenswrapper[19170]: I0313 01:44:46.444839 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76986c7db5-cp72m" Mar 13 01:44:46.506872 master-0 kubenswrapper[19170]: I0313 01:44:46.506817 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-gvjkt" Mar 13 01:44:46.995545 master-0 kubenswrapper[19170]: I0313 01:44:46.995216 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf8b865dc-ljb9g"] Mar 13 01:44:46.995545 master-0 kubenswrapper[19170]: I0313 01:44:46.995534 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5bf8b865dc-ljb9g" podUID="55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b" containerName="dnsmasq-dns" containerID="cri-o://0d82351f63d0ab803727023253fabb13522a1338dabec64bb79add14abe96261" gracePeriod=10 Mar 13 01:44:48.780926 master-0 kubenswrapper[19170]: I0313 01:44:48.780854 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 13 01:44:49.076959 master-0 kubenswrapper[19170]: I0313 01:44:49.076742 19170 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5bf8b865dc-ljb9g" podUID="55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.189:5353: connect: connection refused" Mar 13 01:44:49.107952 master-0 kubenswrapper[19170]: I0313 01:44:49.105102 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-ckzdn"] Mar 13 01:44:49.107952 master-0 kubenswrapper[19170]: I0313 01:44:49.106350 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-ckzdn" Mar 13 01:44:49.129682 master-0 kubenswrapper[19170]: I0313 01:44:49.126594 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-ckzdn"] Mar 13 01:44:49.214162 master-0 kubenswrapper[19170]: I0313 01:44:49.210782 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bf5z\" (UniqueName: \"kubernetes.io/projected/1cbf04b2-2903-4475-aaf3-74cb9d85e3bd-kube-api-access-5bf5z\") pod \"cinder-db-create-ckzdn\" (UID: \"1cbf04b2-2903-4475-aaf3-74cb9d85e3bd\") " pod="openstack/cinder-db-create-ckzdn" Mar 13 01:44:49.214162 master-0 kubenswrapper[19170]: I0313 01:44:49.213766 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cbf04b2-2903-4475-aaf3-74cb9d85e3bd-operator-scripts\") pod \"cinder-db-create-ckzdn\" (UID: \"1cbf04b2-2903-4475-aaf3-74cb9d85e3bd\") " pod="openstack/cinder-db-create-ckzdn" Mar 13 01:44:49.254475 master-0 kubenswrapper[19170]: I0313 01:44:49.254207 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-7877-account-create-update-9ffg6"] Mar 13 01:44:49.260606 master-0 kubenswrapper[19170]: I0313 01:44:49.258990 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-7877-account-create-update-9ffg6" Mar 13 01:44:49.264714 master-0 kubenswrapper[19170]: I0313 01:44:49.264015 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 13 01:44:49.272341 master-0 kubenswrapper[19170]: I0313 01:44:49.269710 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7877-account-create-update-9ffg6"] Mar 13 01:44:49.316742 master-0 kubenswrapper[19170]: I0313 01:44:49.316572 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cbf04b2-2903-4475-aaf3-74cb9d85e3bd-operator-scripts\") pod \"cinder-db-create-ckzdn\" (UID: \"1cbf04b2-2903-4475-aaf3-74cb9d85e3bd\") " pod="openstack/cinder-db-create-ckzdn" Mar 13 01:44:49.317427 master-0 kubenswrapper[19170]: I0313 01:44:49.317372 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bf5z\" (UniqueName: \"kubernetes.io/projected/1cbf04b2-2903-4475-aaf3-74cb9d85e3bd-kube-api-access-5bf5z\") pod \"cinder-db-create-ckzdn\" (UID: \"1cbf04b2-2903-4475-aaf3-74cb9d85e3bd\") " pod="openstack/cinder-db-create-ckzdn" Mar 13 01:44:49.317799 master-0 kubenswrapper[19170]: I0313 01:44:49.317683 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cbf04b2-2903-4475-aaf3-74cb9d85e3bd-operator-scripts\") pod \"cinder-db-create-ckzdn\" (UID: \"1cbf04b2-2903-4475-aaf3-74cb9d85e3bd\") " pod="openstack/cinder-db-create-ckzdn" Mar 13 01:44:49.338548 master-0 kubenswrapper[19170]: I0313 01:44:49.338510 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bf5z\" (UniqueName: \"kubernetes.io/projected/1cbf04b2-2903-4475-aaf3-74cb9d85e3bd-kube-api-access-5bf5z\") pod \"cinder-db-create-ckzdn\" (UID: \"1cbf04b2-2903-4475-aaf3-74cb9d85e3bd\") " 
pod="openstack/cinder-db-create-ckzdn" Mar 13 01:44:49.424945 master-0 kubenswrapper[19170]: I0313 01:44:49.424882 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4516736b-101c-43e8-8492-bda9281e78f9-operator-scripts\") pod \"cinder-7877-account-create-update-9ffg6\" (UID: \"4516736b-101c-43e8-8492-bda9281e78f9\") " pod="openstack/cinder-7877-account-create-update-9ffg6" Mar 13 01:44:49.425176 master-0 kubenswrapper[19170]: I0313 01:44:49.425154 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6brlg\" (UniqueName: \"kubernetes.io/projected/4516736b-101c-43e8-8492-bda9281e78f9-kube-api-access-6brlg\") pod \"cinder-7877-account-create-update-9ffg6\" (UID: \"4516736b-101c-43e8-8492-bda9281e78f9\") " pod="openstack/cinder-7877-account-create-update-9ffg6" Mar 13 01:44:49.508672 master-0 kubenswrapper[19170]: I0313 01:44:49.505773 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-klmnb"] Mar 13 01:44:49.508672 master-0 kubenswrapper[19170]: I0313 01:44:49.507105 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-klmnb" Mar 13 01:44:49.535747 master-0 kubenswrapper[19170]: I0313 01:44:49.528781 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4516736b-101c-43e8-8492-bda9281e78f9-operator-scripts\") pod \"cinder-7877-account-create-update-9ffg6\" (UID: \"4516736b-101c-43e8-8492-bda9281e78f9\") " pod="openstack/cinder-7877-account-create-update-9ffg6" Mar 13 01:44:49.535747 master-0 kubenswrapper[19170]: I0313 01:44:49.528905 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6brlg\" (UniqueName: \"kubernetes.io/projected/4516736b-101c-43e8-8492-bda9281e78f9-kube-api-access-6brlg\") pod \"cinder-7877-account-create-update-9ffg6\" (UID: \"4516736b-101c-43e8-8492-bda9281e78f9\") " pod="openstack/cinder-7877-account-create-update-9ffg6" Mar 13 01:44:49.535747 master-0 kubenswrapper[19170]: I0313 01:44:49.528966 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74skk\" (UniqueName: \"kubernetes.io/projected/48f21b3f-8e3e-4f83-8262-ff7a97273897-kube-api-access-74skk\") pod \"neutron-db-create-klmnb\" (UID: \"48f21b3f-8e3e-4f83-8262-ff7a97273897\") " pod="openstack/neutron-db-create-klmnb" Mar 13 01:44:49.535747 master-0 kubenswrapper[19170]: I0313 01:44:49.529007 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48f21b3f-8e3e-4f83-8262-ff7a97273897-operator-scripts\") pod \"neutron-db-create-klmnb\" (UID: \"48f21b3f-8e3e-4f83-8262-ff7a97273897\") " pod="openstack/neutron-db-create-klmnb" Mar 13 01:44:49.535747 master-0 kubenswrapper[19170]: I0313 01:44:49.530063 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-klmnb"] Mar 13 01:44:49.535747 master-0 
kubenswrapper[19170]: I0313 01:44:49.531212 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4516736b-101c-43e8-8492-bda9281e78f9-operator-scripts\") pod \"cinder-7877-account-create-update-9ffg6\" (UID: \"4516736b-101c-43e8-8492-bda9281e78f9\") " pod="openstack/cinder-7877-account-create-update-9ffg6" Mar 13 01:44:49.536117 master-0 kubenswrapper[19170]: I0313 01:44:49.536070 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-ckzdn" Mar 13 01:44:49.569804 master-0 kubenswrapper[19170]: I0313 01:44:49.568967 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6brlg\" (UniqueName: \"kubernetes.io/projected/4516736b-101c-43e8-8492-bda9281e78f9-kube-api-access-6brlg\") pod \"cinder-7877-account-create-update-9ffg6\" (UID: \"4516736b-101c-43e8-8492-bda9281e78f9\") " pod="openstack/cinder-7877-account-create-update-9ffg6" Mar 13 01:44:49.578280 master-0 kubenswrapper[19170]: I0313 01:44:49.578221 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7877-account-create-update-9ffg6" Mar 13 01:44:49.620722 master-0 kubenswrapper[19170]: I0313 01:44:49.618921 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-snmt2"] Mar 13 01:44:49.620722 master-0 kubenswrapper[19170]: I0313 01:44:49.620582 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-snmt2" Mar 13 01:44:49.625701 master-0 kubenswrapper[19170]: I0313 01:44:49.624587 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-snmt2"] Mar 13 01:44:49.628226 master-0 kubenswrapper[19170]: I0313 01:44:49.628106 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 13 01:44:49.632704 master-0 kubenswrapper[19170]: I0313 01:44:49.628316 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 13 01:44:49.632704 master-0 kubenswrapper[19170]: I0313 01:44:49.628442 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 13 01:44:49.636731 master-0 kubenswrapper[19170]: I0313 01:44:49.636126 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d6788e1-16bb-4b4a-bb65-9e53915b9e74-combined-ca-bundle\") pod \"keystone-db-sync-snmt2\" (UID: \"1d6788e1-16bb-4b4a-bb65-9e53915b9e74\") " pod="openstack/keystone-db-sync-snmt2" Mar 13 01:44:49.636731 master-0 kubenswrapper[19170]: I0313 01:44:49.636185 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d6788e1-16bb-4b4a-bb65-9e53915b9e74-config-data\") pod \"keystone-db-sync-snmt2\" (UID: \"1d6788e1-16bb-4b4a-bb65-9e53915b9e74\") " pod="openstack/keystone-db-sync-snmt2" Mar 13 01:44:49.636731 master-0 kubenswrapper[19170]: I0313 01:44:49.636245 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd8vc\" (UniqueName: \"kubernetes.io/projected/1d6788e1-16bb-4b4a-bb65-9e53915b9e74-kube-api-access-zd8vc\") pod \"keystone-db-sync-snmt2\" (UID: \"1d6788e1-16bb-4b4a-bb65-9e53915b9e74\") " pod="openstack/keystone-db-sync-snmt2" 
Mar 13 01:44:49.636731 master-0 kubenswrapper[19170]: I0313 01:44:49.636286 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74skk\" (UniqueName: \"kubernetes.io/projected/48f21b3f-8e3e-4f83-8262-ff7a97273897-kube-api-access-74skk\") pod \"neutron-db-create-klmnb\" (UID: \"48f21b3f-8e3e-4f83-8262-ff7a97273897\") " pod="openstack/neutron-db-create-klmnb" Mar 13 01:44:49.636731 master-0 kubenswrapper[19170]: I0313 01:44:49.636318 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48f21b3f-8e3e-4f83-8262-ff7a97273897-operator-scripts\") pod \"neutron-db-create-klmnb\" (UID: \"48f21b3f-8e3e-4f83-8262-ff7a97273897\") " pod="openstack/neutron-db-create-klmnb" Mar 13 01:44:49.638341 master-0 kubenswrapper[19170]: I0313 01:44:49.637005 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48f21b3f-8e3e-4f83-8262-ff7a97273897-operator-scripts\") pod \"neutron-db-create-klmnb\" (UID: \"48f21b3f-8e3e-4f83-8262-ff7a97273897\") " pod="openstack/neutron-db-create-klmnb" Mar 13 01:44:49.640001 master-0 kubenswrapper[19170]: I0313 01:44:49.639850 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-0ee1-account-create-update-pr2qk"] Mar 13 01:44:49.641252 master-0 kubenswrapper[19170]: I0313 01:44:49.641231 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-0ee1-account-create-update-pr2qk" Mar 13 01:44:49.657736 master-0 kubenswrapper[19170]: I0313 01:44:49.654217 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-0ee1-account-create-update-pr2qk"] Mar 13 01:44:49.665724 master-0 kubenswrapper[19170]: I0313 01:44:49.662707 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 13 01:44:49.684870 master-0 kubenswrapper[19170]: I0313 01:44:49.684796 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74skk\" (UniqueName: \"kubernetes.io/projected/48f21b3f-8e3e-4f83-8262-ff7a97273897-kube-api-access-74skk\") pod \"neutron-db-create-klmnb\" (UID: \"48f21b3f-8e3e-4f83-8262-ff7a97273897\") " pod="openstack/neutron-db-create-klmnb" Mar 13 01:44:49.739184 master-0 kubenswrapper[19170]: I0313 01:44:49.739145 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d6788e1-16bb-4b4a-bb65-9e53915b9e74-combined-ca-bundle\") pod \"keystone-db-sync-snmt2\" (UID: \"1d6788e1-16bb-4b4a-bb65-9e53915b9e74\") " pod="openstack/keystone-db-sync-snmt2" Mar 13 01:44:49.739297 master-0 kubenswrapper[19170]: I0313 01:44:49.739204 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d6788e1-16bb-4b4a-bb65-9e53915b9e74-config-data\") pod \"keystone-db-sync-snmt2\" (UID: \"1d6788e1-16bb-4b4a-bb65-9e53915b9e74\") " pod="openstack/keystone-db-sync-snmt2" Mar 13 01:44:49.739336 master-0 kubenswrapper[19170]: I0313 01:44:49.739307 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd8vc\" (UniqueName: \"kubernetes.io/projected/1d6788e1-16bb-4b4a-bb65-9e53915b9e74-kube-api-access-zd8vc\") pod \"keystone-db-sync-snmt2\" (UID: \"1d6788e1-16bb-4b4a-bb65-9e53915b9e74\") " 
pod="openstack/keystone-db-sync-snmt2" Mar 13 01:44:49.744830 master-0 kubenswrapper[19170]: I0313 01:44:49.744779 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d6788e1-16bb-4b4a-bb65-9e53915b9e74-config-data\") pod \"keystone-db-sync-snmt2\" (UID: \"1d6788e1-16bb-4b4a-bb65-9e53915b9e74\") " pod="openstack/keystone-db-sync-snmt2" Mar 13 01:44:49.757766 master-0 kubenswrapper[19170]: I0313 01:44:49.755461 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d6788e1-16bb-4b4a-bb65-9e53915b9e74-combined-ca-bundle\") pod \"keystone-db-sync-snmt2\" (UID: \"1d6788e1-16bb-4b4a-bb65-9e53915b9e74\") " pod="openstack/keystone-db-sync-snmt2" Mar 13 01:44:49.757766 master-0 kubenswrapper[19170]: I0313 01:44:49.756608 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd8vc\" (UniqueName: \"kubernetes.io/projected/1d6788e1-16bb-4b4a-bb65-9e53915b9e74-kube-api-access-zd8vc\") pod \"keystone-db-sync-snmt2\" (UID: \"1d6788e1-16bb-4b4a-bb65-9e53915b9e74\") " pod="openstack/keystone-db-sync-snmt2" Mar 13 01:44:49.841788 master-0 kubenswrapper[19170]: I0313 01:44:49.841686 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4042a1c0-8ce2-4e1d-b116-eef4e4ebb4c8-operator-scripts\") pod \"neutron-0ee1-account-create-update-pr2qk\" (UID: \"4042a1c0-8ce2-4e1d-b116-eef4e4ebb4c8\") " pod="openstack/neutron-0ee1-account-create-update-pr2qk" Mar 13 01:44:49.841788 master-0 kubenswrapper[19170]: I0313 01:44:49.841777 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q57g8\" (UniqueName: \"kubernetes.io/projected/4042a1c0-8ce2-4e1d-b116-eef4e4ebb4c8-kube-api-access-q57g8\") pod 
\"neutron-0ee1-account-create-update-pr2qk\" (UID: \"4042a1c0-8ce2-4e1d-b116-eef4e4ebb4c8\") " pod="openstack/neutron-0ee1-account-create-update-pr2qk" Mar 13 01:44:49.930834 master-0 kubenswrapper[19170]: I0313 01:44:49.930738 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-klmnb" Mar 13 01:44:49.944370 master-0 kubenswrapper[19170]: I0313 01:44:49.944292 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4042a1c0-8ce2-4e1d-b116-eef4e4ebb4c8-operator-scripts\") pod \"neutron-0ee1-account-create-update-pr2qk\" (UID: \"4042a1c0-8ce2-4e1d-b116-eef4e4ebb4c8\") " pod="openstack/neutron-0ee1-account-create-update-pr2qk" Mar 13 01:44:49.944370 master-0 kubenswrapper[19170]: I0313 01:44:49.944385 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q57g8\" (UniqueName: \"kubernetes.io/projected/4042a1c0-8ce2-4e1d-b116-eef4e4ebb4c8-kube-api-access-q57g8\") pod \"neutron-0ee1-account-create-update-pr2qk\" (UID: \"4042a1c0-8ce2-4e1d-b116-eef4e4ebb4c8\") " pod="openstack/neutron-0ee1-account-create-update-pr2qk" Mar 13 01:44:49.945389 master-0 kubenswrapper[19170]: I0313 01:44:49.945357 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4042a1c0-8ce2-4e1d-b116-eef4e4ebb4c8-operator-scripts\") pod \"neutron-0ee1-account-create-update-pr2qk\" (UID: \"4042a1c0-8ce2-4e1d-b116-eef4e4ebb4c8\") " pod="openstack/neutron-0ee1-account-create-update-pr2qk" Mar 13 01:44:49.963299 master-0 kubenswrapper[19170]: I0313 01:44:49.963260 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-snmt2" Mar 13 01:44:50.038322 master-0 kubenswrapper[19170]: I0313 01:44:50.038280 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q57g8\" (UniqueName: \"kubernetes.io/projected/4042a1c0-8ce2-4e1d-b116-eef4e4ebb4c8-kube-api-access-q57g8\") pod \"neutron-0ee1-account-create-update-pr2qk\" (UID: \"4042a1c0-8ce2-4e1d-b116-eef4e4ebb4c8\") " pod="openstack/neutron-0ee1-account-create-update-pr2qk" Mar 13 01:44:50.328541 master-0 kubenswrapper[19170]: I0313 01:44:50.328021 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0ee1-account-create-update-pr2qk" Mar 13 01:44:52.417266 master-0 kubenswrapper[19170]: I0313 01:44:52.417206 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-gvjkt-config-rgk9d" Mar 13 01:44:52.536512 master-0 kubenswrapper[19170]: I0313 01:44:52.536424 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6dc119d6-19fa-4bb5-9989-8f759c6daefc-additional-scripts\") pod \"6dc119d6-19fa-4bb5-9989-8f759c6daefc\" (UID: \"6dc119d6-19fa-4bb5-9989-8f759c6daefc\") " Mar 13 01:44:52.536512 master-0 kubenswrapper[19170]: I0313 01:44:52.536484 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6dc119d6-19fa-4bb5-9989-8f759c6daefc-var-log-ovn\") pod \"6dc119d6-19fa-4bb5-9989-8f759c6daefc\" (UID: \"6dc119d6-19fa-4bb5-9989-8f759c6daefc\") " Mar 13 01:44:52.536821 master-0 kubenswrapper[19170]: I0313 01:44:52.536564 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6dc119d6-19fa-4bb5-9989-8f759c6daefc-scripts\") pod \"6dc119d6-19fa-4bb5-9989-8f759c6daefc\" (UID: 
\"6dc119d6-19fa-4bb5-9989-8f759c6daefc\") " Mar 13 01:44:52.536821 master-0 kubenswrapper[19170]: I0313 01:44:52.536615 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qk6pv\" (UniqueName: \"kubernetes.io/projected/6dc119d6-19fa-4bb5-9989-8f759c6daefc-kube-api-access-qk6pv\") pod \"6dc119d6-19fa-4bb5-9989-8f759c6daefc\" (UID: \"6dc119d6-19fa-4bb5-9989-8f759c6daefc\") " Mar 13 01:44:52.536821 master-0 kubenswrapper[19170]: I0313 01:44:52.536639 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6dc119d6-19fa-4bb5-9989-8f759c6daefc-var-run\") pod \"6dc119d6-19fa-4bb5-9989-8f759c6daefc\" (UID: \"6dc119d6-19fa-4bb5-9989-8f759c6daefc\") " Mar 13 01:44:52.536821 master-0 kubenswrapper[19170]: I0313 01:44:52.536705 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6dc119d6-19fa-4bb5-9989-8f759c6daefc-var-run-ovn\") pod \"6dc119d6-19fa-4bb5-9989-8f759c6daefc\" (UID: \"6dc119d6-19fa-4bb5-9989-8f759c6daefc\") " Mar 13 01:44:52.539865 master-0 kubenswrapper[19170]: I0313 01:44:52.539814 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dc119d6-19fa-4bb5-9989-8f759c6daefc-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "6dc119d6-19fa-4bb5-9989-8f759c6daefc" (UID: "6dc119d6-19fa-4bb5-9989-8f759c6daefc"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:44:52.539865 master-0 kubenswrapper[19170]: I0313 01:44:52.539868 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6dc119d6-19fa-4bb5-9989-8f759c6daefc-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "6dc119d6-19fa-4bb5-9989-8f759c6daefc" (UID: "6dc119d6-19fa-4bb5-9989-8f759c6daefc"). 
InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:44:52.545465 master-0 kubenswrapper[19170]: I0313 01:44:52.543739 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dc119d6-19fa-4bb5-9989-8f759c6daefc-scripts" (OuterVolumeSpecName: "scripts") pod "6dc119d6-19fa-4bb5-9989-8f759c6daefc" (UID: "6dc119d6-19fa-4bb5-9989-8f759c6daefc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:44:52.545465 master-0 kubenswrapper[19170]: I0313 01:44:52.543788 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6dc119d6-19fa-4bb5-9989-8f759c6daefc-var-run" (OuterVolumeSpecName: "var-run") pod "6dc119d6-19fa-4bb5-9989-8f759c6daefc" (UID: "6dc119d6-19fa-4bb5-9989-8f759c6daefc"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:44:52.545465 master-0 kubenswrapper[19170]: I0313 01:44:52.543809 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6dc119d6-19fa-4bb5-9989-8f759c6daefc-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "6dc119d6-19fa-4bb5-9989-8f759c6daefc" (UID: "6dc119d6-19fa-4bb5-9989-8f759c6daefc"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:44:52.601056 master-0 kubenswrapper[19170]: I0313 01:44:52.600915 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dc119d6-19fa-4bb5-9989-8f759c6daefc-kube-api-access-qk6pv" (OuterVolumeSpecName: "kube-api-access-qk6pv") pod "6dc119d6-19fa-4bb5-9989-8f759c6daefc" (UID: "6dc119d6-19fa-4bb5-9989-8f759c6daefc"). InnerVolumeSpecName "kube-api-access-qk6pv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:44:52.643723 master-0 kubenswrapper[19170]: I0313 01:44:52.643152 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qk6pv\" (UniqueName: \"kubernetes.io/projected/6dc119d6-19fa-4bb5-9989-8f759c6daefc-kube-api-access-qk6pv\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:52.643723 master-0 kubenswrapper[19170]: I0313 01:44:52.643200 19170 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6dc119d6-19fa-4bb5-9989-8f759c6daefc-var-run\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:52.643723 master-0 kubenswrapper[19170]: I0313 01:44:52.643212 19170 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6dc119d6-19fa-4bb5-9989-8f759c6daefc-var-run-ovn\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:52.643723 master-0 kubenswrapper[19170]: I0313 01:44:52.643222 19170 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6dc119d6-19fa-4bb5-9989-8f759c6daefc-additional-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:52.643723 master-0 kubenswrapper[19170]: I0313 01:44:52.643230 19170 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6dc119d6-19fa-4bb5-9989-8f759c6daefc-var-log-ovn\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:52.643723 master-0 kubenswrapper[19170]: I0313 01:44:52.643239 19170 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6dc119d6-19fa-4bb5-9989-8f759c6daefc-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:52.821096 master-0 kubenswrapper[19170]: I0313 01:44:52.821052 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf8b865dc-ljb9g" Mar 13 01:44:52.940271 master-0 kubenswrapper[19170]: I0313 01:44:52.939399 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gvjkt-config-rgk9d" event={"ID":"6dc119d6-19fa-4bb5-9989-8f759c6daefc","Type":"ContainerDied","Data":"cfe6a9bd2da9ce56199dbd8fc0e9318098aeb707fd2b5ce7e7af65bcb9f534ed"} Mar 13 01:44:52.940271 master-0 kubenswrapper[19170]: I0313 01:44:52.939438 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfe6a9bd2da9ce56199dbd8fc0e9318098aeb707fd2b5ce7e7af65bcb9f534ed" Mar 13 01:44:52.940271 master-0 kubenswrapper[19170]: I0313 01:44:52.939494 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-gvjkt-config-rgk9d" Mar 13 01:44:52.945944 master-0 kubenswrapper[19170]: I0313 01:44:52.944803 19170 generic.go:334] "Generic (PLEG): container finished" podID="55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b" containerID="0d82351f63d0ab803727023253fabb13522a1338dabec64bb79add14abe96261" exitCode=0 Mar 13 01:44:52.945944 master-0 kubenswrapper[19170]: I0313 01:44:52.944850 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf8b865dc-ljb9g" event={"ID":"55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b","Type":"ContainerDied","Data":"0d82351f63d0ab803727023253fabb13522a1338dabec64bb79add14abe96261"} Mar 13 01:44:52.945944 master-0 kubenswrapper[19170]: I0313 01:44:52.944879 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf8b865dc-ljb9g" event={"ID":"55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b","Type":"ContainerDied","Data":"b08d16da1d51cb153bf053222a1e4a880477c9c24197ddc6f6bc51a85aeb45d4"} Mar 13 01:44:52.945944 master-0 kubenswrapper[19170]: I0313 01:44:52.944896 19170 scope.go:117] "RemoveContainer" containerID="0d82351f63d0ab803727023253fabb13522a1338dabec64bb79add14abe96261" Mar 13 01:44:52.945944 master-0 
kubenswrapper[19170]: I0313 01:44:52.945035 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf8b865dc-ljb9g" Mar 13 01:44:52.954494 master-0 kubenswrapper[19170]: I0313 01:44:52.951987 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b-config\") pod \"55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b\" (UID: \"55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b\") " Mar 13 01:44:52.954494 master-0 kubenswrapper[19170]: I0313 01:44:52.952128 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b-ovsdbserver-sb\") pod \"55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b\" (UID: \"55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b\") " Mar 13 01:44:52.954494 master-0 kubenswrapper[19170]: I0313 01:44:52.952219 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b-dns-svc\") pod \"55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b\" (UID: \"55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b\") " Mar 13 01:44:52.954494 master-0 kubenswrapper[19170]: I0313 01:44:52.952310 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b-ovsdbserver-nb\") pod \"55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b\" (UID: \"55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b\") " Mar 13 01:44:52.954494 master-0 kubenswrapper[19170]: I0313 01:44:52.952346 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qj6x8\" (UniqueName: \"kubernetes.io/projected/55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b-kube-api-access-qj6x8\") pod \"55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b\" (UID: \"55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b\") " Mar 
13 01:44:52.990172 master-0 kubenswrapper[19170]: I0313 01:44:52.983484 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b-kube-api-access-qj6x8" (OuterVolumeSpecName: "kube-api-access-qj6x8") pod "55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b" (UID: "55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b"). InnerVolumeSpecName "kube-api-access-qj6x8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:44:53.006566 master-0 kubenswrapper[19170]: I0313 01:44:53.006527 19170 scope.go:117] "RemoveContainer" containerID="02cdd6f32ee65067323abe3e1e90e4c8c07158bba64fabd407ef891ee5771394" Mar 13 01:44:53.050404 master-0 kubenswrapper[19170]: I0313 01:44:53.050364 19170 scope.go:117] "RemoveContainer" containerID="0d82351f63d0ab803727023253fabb13522a1338dabec64bb79add14abe96261" Mar 13 01:44:53.051630 master-0 kubenswrapper[19170]: E0313 01:44:53.050899 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d82351f63d0ab803727023253fabb13522a1338dabec64bb79add14abe96261\": container with ID starting with 0d82351f63d0ab803727023253fabb13522a1338dabec64bb79add14abe96261 not found: ID does not exist" containerID="0d82351f63d0ab803727023253fabb13522a1338dabec64bb79add14abe96261" Mar 13 01:44:53.051630 master-0 kubenswrapper[19170]: I0313 01:44:53.050930 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d82351f63d0ab803727023253fabb13522a1338dabec64bb79add14abe96261"} err="failed to get container status \"0d82351f63d0ab803727023253fabb13522a1338dabec64bb79add14abe96261\": rpc error: code = NotFound desc = could not find container \"0d82351f63d0ab803727023253fabb13522a1338dabec64bb79add14abe96261\": container with ID starting with 0d82351f63d0ab803727023253fabb13522a1338dabec64bb79add14abe96261 not found: ID does not exist" Mar 13 01:44:53.051630 master-0 kubenswrapper[19170]: 
I0313 01:44:53.050951 19170 scope.go:117] "RemoveContainer" containerID="02cdd6f32ee65067323abe3e1e90e4c8c07158bba64fabd407ef891ee5771394" Mar 13 01:44:53.051630 master-0 kubenswrapper[19170]: E0313 01:44:53.051590 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02cdd6f32ee65067323abe3e1e90e4c8c07158bba64fabd407ef891ee5771394\": container with ID starting with 02cdd6f32ee65067323abe3e1e90e4c8c07158bba64fabd407ef891ee5771394 not found: ID does not exist" containerID="02cdd6f32ee65067323abe3e1e90e4c8c07158bba64fabd407ef891ee5771394" Mar 13 01:44:53.051630 master-0 kubenswrapper[19170]: I0313 01:44:53.051623 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02cdd6f32ee65067323abe3e1e90e4c8c07158bba64fabd407ef891ee5771394"} err="failed to get container status \"02cdd6f32ee65067323abe3e1e90e4c8c07158bba64fabd407ef891ee5771394\": rpc error: code = NotFound desc = could not find container \"02cdd6f32ee65067323abe3e1e90e4c8c07158bba64fabd407ef891ee5771394\": container with ID starting with 02cdd6f32ee65067323abe3e1e90e4c8c07158bba64fabd407ef891ee5771394 not found: ID does not exist" Mar 13 01:44:53.065754 master-0 kubenswrapper[19170]: I0313 01:44:53.065519 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qj6x8\" (UniqueName: \"kubernetes.io/projected/55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b-kube-api-access-qj6x8\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:53.098586 master-0 kubenswrapper[19170]: I0313 01:44:53.095240 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b" (UID: "55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:44:53.098586 master-0 kubenswrapper[19170]: I0313 01:44:53.097415 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b" (UID: "55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:44:53.121274 master-0 kubenswrapper[19170]: I0313 01:44:53.099970 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b" (UID: "55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:44:53.157793 master-0 kubenswrapper[19170]: W0313 01:44:53.157662 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cbf04b2_2903_4475_aaf3_74cb9d85e3bd.slice/crio-167e743ccede7f09c5bd22ab1ed22b18803843f674e7b19fe64b7bddfdb9446e WatchSource:0}: Error finding container 167e743ccede7f09c5bd22ab1ed22b18803843f674e7b19fe64b7bddfdb9446e: Status 404 returned error can't find the container with id 167e743ccede7f09c5bd22ab1ed22b18803843f674e7b19fe64b7bddfdb9446e Mar 13 01:44:53.167306 master-0 kubenswrapper[19170]: I0313 01:44:53.164737 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-ckzdn"] Mar 13 01:44:53.168178 master-0 kubenswrapper[19170]: I0313 01:44:53.168099 19170 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:53.168178 master-0 
kubenswrapper[19170]: I0313 01:44:53.168142 19170 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:53.168178 master-0 kubenswrapper[19170]: I0313 01:44:53.168155 19170 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:53.194595 master-0 kubenswrapper[19170]: I0313 01:44:53.194500 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b-config" (OuterVolumeSpecName: "config") pod "55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b" (UID: "55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:44:53.269812 master-0 kubenswrapper[19170]: I0313 01:44:53.269702 19170 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b-config\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:53.358591 master-0 kubenswrapper[19170]: I0313 01:44:53.358024 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf8b865dc-ljb9g"] Mar 13 01:44:53.364745 master-0 kubenswrapper[19170]: I0313 01:44:53.364548 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf8b865dc-ljb9g"] Mar 13 01:44:53.499813 master-0 kubenswrapper[19170]: I0313 01:44:53.498777 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b" path="/var/lib/kubelet/pods/55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b/volumes" Mar 13 01:44:53.499813 master-0 kubenswrapper[19170]: I0313 01:44:53.499422 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-db-create-klmnb"] Mar 13 01:44:53.578759 master-0 kubenswrapper[19170]: I0313 01:44:53.578687 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-0ee1-account-create-update-pr2qk"] Mar 13 01:44:53.599029 master-0 kubenswrapper[19170]: W0313 01:44:53.589258 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4516736b_101c_43e8_8492_bda9281e78f9.slice/crio-f855fe2fc4bf7c326ba4e9802587259435529238229180623f8d32fa4f836ffd WatchSource:0}: Error finding container f855fe2fc4bf7c326ba4e9802587259435529238229180623f8d32fa4f836ffd: Status 404 returned error can't find the container with id f855fe2fc4bf7c326ba4e9802587259435529238229180623f8d32fa4f836ffd Mar 13 01:44:53.599029 master-0 kubenswrapper[19170]: I0313 01:44:53.591871 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7877-account-create-update-9ffg6"] Mar 13 01:44:53.745968 master-0 kubenswrapper[19170]: I0313 01:44:53.745922 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-gvjkt-config-rgk9d"] Mar 13 01:44:53.759438 master-0 kubenswrapper[19170]: I0313 01:44:53.759372 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-gvjkt-config-rgk9d"] Mar 13 01:44:53.770834 master-0 kubenswrapper[19170]: I0313 01:44:53.770759 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-snmt2"] Mar 13 01:44:53.787949 master-0 kubenswrapper[19170]: I0313 01:44:53.787905 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-gvjkt-config-gdsw7"] Mar 13 01:44:53.788367 master-0 kubenswrapper[19170]: E0313 01:44:53.788344 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b" containerName="init" Mar 13 01:44:53.788367 master-0 kubenswrapper[19170]: I0313 01:44:53.788364 19170 
state_mem.go:107] "Deleted CPUSet assignment" podUID="55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b" containerName="init" Mar 13 01:44:53.788452 master-0 kubenswrapper[19170]: E0313 01:44:53.788404 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dc119d6-19fa-4bb5-9989-8f759c6daefc" containerName="ovn-config" Mar 13 01:44:53.788452 master-0 kubenswrapper[19170]: I0313 01:44:53.788411 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dc119d6-19fa-4bb5-9989-8f759c6daefc" containerName="ovn-config" Mar 13 01:44:53.788452 master-0 kubenswrapper[19170]: E0313 01:44:53.788429 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b" containerName="dnsmasq-dns" Mar 13 01:44:53.788452 master-0 kubenswrapper[19170]: I0313 01:44:53.788435 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b" containerName="dnsmasq-dns" Mar 13 01:44:53.788696 master-0 kubenswrapper[19170]: I0313 01:44:53.788677 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dc119d6-19fa-4bb5-9989-8f759c6daefc" containerName="ovn-config" Mar 13 01:44:53.788749 master-0 kubenswrapper[19170]: I0313 01:44:53.788699 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="55e0a8e0-bf5b-4aaa-bff7-de32dc6b325b" containerName="dnsmasq-dns" Mar 13 01:44:53.789343 master-0 kubenswrapper[19170]: I0313 01:44:53.789322 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-gvjkt-config-gdsw7" Mar 13 01:44:53.792402 master-0 kubenswrapper[19170]: I0313 01:44:53.792370 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 13 01:44:53.797366 master-0 kubenswrapper[19170]: I0313 01:44:53.797330 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-gvjkt-config-gdsw7"] Mar 13 01:44:53.884207 master-0 kubenswrapper[19170]: I0313 01:44:53.884131 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn8cz\" (UniqueName: \"kubernetes.io/projected/ebe3c7b6-cb9f-426e-b709-4f76c915bd04-kube-api-access-nn8cz\") pod \"ovn-controller-gvjkt-config-gdsw7\" (UID: \"ebe3c7b6-cb9f-426e-b709-4f76c915bd04\") " pod="openstack/ovn-controller-gvjkt-config-gdsw7" Mar 13 01:44:53.884305 master-0 kubenswrapper[19170]: I0313 01:44:53.884223 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ebe3c7b6-cb9f-426e-b709-4f76c915bd04-scripts\") pod \"ovn-controller-gvjkt-config-gdsw7\" (UID: \"ebe3c7b6-cb9f-426e-b709-4f76c915bd04\") " pod="openstack/ovn-controller-gvjkt-config-gdsw7" Mar 13 01:44:53.884305 master-0 kubenswrapper[19170]: I0313 01:44:53.884263 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ebe3c7b6-cb9f-426e-b709-4f76c915bd04-var-log-ovn\") pod \"ovn-controller-gvjkt-config-gdsw7\" (UID: \"ebe3c7b6-cb9f-426e-b709-4f76c915bd04\") " pod="openstack/ovn-controller-gvjkt-config-gdsw7" Mar 13 01:44:53.884371 master-0 kubenswrapper[19170]: I0313 01:44:53.884306 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/ebe3c7b6-cb9f-426e-b709-4f76c915bd04-var-run-ovn\") pod \"ovn-controller-gvjkt-config-gdsw7\" (UID: \"ebe3c7b6-cb9f-426e-b709-4f76c915bd04\") " pod="openstack/ovn-controller-gvjkt-config-gdsw7" Mar 13 01:44:53.884371 master-0 kubenswrapper[19170]: I0313 01:44:53.884325 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ebe3c7b6-cb9f-426e-b709-4f76c915bd04-var-run\") pod \"ovn-controller-gvjkt-config-gdsw7\" (UID: \"ebe3c7b6-cb9f-426e-b709-4f76c915bd04\") " pod="openstack/ovn-controller-gvjkt-config-gdsw7" Mar 13 01:44:53.884434 master-0 kubenswrapper[19170]: I0313 01:44:53.884414 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ebe3c7b6-cb9f-426e-b709-4f76c915bd04-additional-scripts\") pod \"ovn-controller-gvjkt-config-gdsw7\" (UID: \"ebe3c7b6-cb9f-426e-b709-4f76c915bd04\") " pod="openstack/ovn-controller-gvjkt-config-gdsw7" Mar 13 01:44:53.991860 master-0 kubenswrapper[19170]: I0313 01:44:53.986680 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn8cz\" (UniqueName: \"kubernetes.io/projected/ebe3c7b6-cb9f-426e-b709-4f76c915bd04-kube-api-access-nn8cz\") pod \"ovn-controller-gvjkt-config-gdsw7\" (UID: \"ebe3c7b6-cb9f-426e-b709-4f76c915bd04\") " pod="openstack/ovn-controller-gvjkt-config-gdsw7" Mar 13 01:44:53.991860 master-0 kubenswrapper[19170]: I0313 01:44:53.986829 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ebe3c7b6-cb9f-426e-b709-4f76c915bd04-scripts\") pod \"ovn-controller-gvjkt-config-gdsw7\" (UID: \"ebe3c7b6-cb9f-426e-b709-4f76c915bd04\") " pod="openstack/ovn-controller-gvjkt-config-gdsw7" Mar 13 01:44:53.991860 master-0 kubenswrapper[19170]: I0313 01:44:53.986909 19170 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ebe3c7b6-cb9f-426e-b709-4f76c915bd04-var-log-ovn\") pod \"ovn-controller-gvjkt-config-gdsw7\" (UID: \"ebe3c7b6-cb9f-426e-b709-4f76c915bd04\") " pod="openstack/ovn-controller-gvjkt-config-gdsw7" Mar 13 01:44:53.991860 master-0 kubenswrapper[19170]: I0313 01:44:53.986996 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ebe3c7b6-cb9f-426e-b709-4f76c915bd04-var-run-ovn\") pod \"ovn-controller-gvjkt-config-gdsw7\" (UID: \"ebe3c7b6-cb9f-426e-b709-4f76c915bd04\") " pod="openstack/ovn-controller-gvjkt-config-gdsw7" Mar 13 01:44:53.991860 master-0 kubenswrapper[19170]: I0313 01:44:53.987023 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ebe3c7b6-cb9f-426e-b709-4f76c915bd04-var-run\") pod \"ovn-controller-gvjkt-config-gdsw7\" (UID: \"ebe3c7b6-cb9f-426e-b709-4f76c915bd04\") " pod="openstack/ovn-controller-gvjkt-config-gdsw7" Mar 13 01:44:53.991860 master-0 kubenswrapper[19170]: I0313 01:44:53.987228 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ebe3c7b6-cb9f-426e-b709-4f76c915bd04-var-log-ovn\") pod \"ovn-controller-gvjkt-config-gdsw7\" (UID: \"ebe3c7b6-cb9f-426e-b709-4f76c915bd04\") " pod="openstack/ovn-controller-gvjkt-config-gdsw7" Mar 13 01:44:53.991860 master-0 kubenswrapper[19170]: I0313 01:44:53.989293 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ebe3c7b6-cb9f-426e-b709-4f76c915bd04-var-run-ovn\") pod \"ovn-controller-gvjkt-config-gdsw7\" (UID: \"ebe3c7b6-cb9f-426e-b709-4f76c915bd04\") " pod="openstack/ovn-controller-gvjkt-config-gdsw7" Mar 13 01:44:53.991860 master-0 kubenswrapper[19170]: I0313 
01:44:53.989357 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ebe3c7b6-cb9f-426e-b709-4f76c915bd04-additional-scripts\") pod \"ovn-controller-gvjkt-config-gdsw7\" (UID: \"ebe3c7b6-cb9f-426e-b709-4f76c915bd04\") " pod="openstack/ovn-controller-gvjkt-config-gdsw7" Mar 13 01:44:53.991860 master-0 kubenswrapper[19170]: I0313 01:44:53.991019 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ebe3c7b6-cb9f-426e-b709-4f76c915bd04-scripts\") pod \"ovn-controller-gvjkt-config-gdsw7\" (UID: \"ebe3c7b6-cb9f-426e-b709-4f76c915bd04\") " pod="openstack/ovn-controller-gvjkt-config-gdsw7" Mar 13 01:44:53.991860 master-0 kubenswrapper[19170]: I0313 01:44:53.991070 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ebe3c7b6-cb9f-426e-b709-4f76c915bd04-var-run\") pod \"ovn-controller-gvjkt-config-gdsw7\" (UID: \"ebe3c7b6-cb9f-426e-b709-4f76c915bd04\") " pod="openstack/ovn-controller-gvjkt-config-gdsw7" Mar 13 01:44:53.995600 master-0 kubenswrapper[19170]: I0313 01:44:53.995544 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7877-account-create-update-9ffg6" event={"ID":"4516736b-101c-43e8-8492-bda9281e78f9","Type":"ContainerStarted","Data":"f855fe2fc4bf7c326ba4e9802587259435529238229180623f8d32fa4f836ffd"} Mar 13 01:44:54.004654 master-0 kubenswrapper[19170]: I0313 01:44:53.998246 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-bvpk6" event={"ID":"534d7c6b-ec6b-48e0-8ace-0d893ae9da6a","Type":"ContainerStarted","Data":"0e0f1c1b9a2e40bbc6b0ab512acedb270840f7d4aa0034ff33c77d8b0162c860"} Mar 13 01:44:54.004654 master-0 kubenswrapper[19170]: I0313 01:44:53.999332 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/ebe3c7b6-cb9f-426e-b709-4f76c915bd04-additional-scripts\") pod \"ovn-controller-gvjkt-config-gdsw7\" (UID: \"ebe3c7b6-cb9f-426e-b709-4f76c915bd04\") " pod="openstack/ovn-controller-gvjkt-config-gdsw7" Mar 13 01:44:54.004654 master-0 kubenswrapper[19170]: I0313 01:44:54.003448 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-snmt2" event={"ID":"1d6788e1-16bb-4b4a-bb65-9e53915b9e74","Type":"ContainerStarted","Data":"5ccb6c9e39b5dfe14096dda4e0e9b78f7d08c69a31d0fcf82759a02704b8c46f"} Mar 13 01:44:54.019173 master-0 kubenswrapper[19170]: I0313 01:44:54.018369 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-klmnb" event={"ID":"48f21b3f-8e3e-4f83-8262-ff7a97273897","Type":"ContainerStarted","Data":"8b7652cd02c3b227b0ec60387fb04a000423d0c5e46396197f0f375a7afc6cf0"} Mar 13 01:44:54.019173 master-0 kubenswrapper[19170]: I0313 01:44:54.018442 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-klmnb" event={"ID":"48f21b3f-8e3e-4f83-8262-ff7a97273897","Type":"ContainerStarted","Data":"feb9bd0ca69b0ec172a9ddb5b522a858972b5296d06743cb250a042b530d6679"} Mar 13 01:44:54.019173 master-0 kubenswrapper[19170]: I0313 01:44:54.019104 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn8cz\" (UniqueName: \"kubernetes.io/projected/ebe3c7b6-cb9f-426e-b709-4f76c915bd04-kube-api-access-nn8cz\") pod \"ovn-controller-gvjkt-config-gdsw7\" (UID: \"ebe3c7b6-cb9f-426e-b709-4f76c915bd04\") " pod="openstack/ovn-controller-gvjkt-config-gdsw7" Mar 13 01:44:54.033850 master-0 kubenswrapper[19170]: I0313 01:44:54.032574 19170 generic.go:334] "Generic (PLEG): container finished" podID="1cbf04b2-2903-4475-aaf3-74cb9d85e3bd" containerID="512d5d72468e036f9f2e8f8d20562bf6aea54ba7d93639a99ef2a46cceead96c" exitCode=0 Mar 13 01:44:54.033850 master-0 kubenswrapper[19170]: I0313 01:44:54.032646 19170 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/cinder-db-create-ckzdn" event={"ID":"1cbf04b2-2903-4475-aaf3-74cb9d85e3bd","Type":"ContainerDied","Data":"512d5d72468e036f9f2e8f8d20562bf6aea54ba7d93639a99ef2a46cceead96c"} Mar 13 01:44:54.033850 master-0 kubenswrapper[19170]: I0313 01:44:54.032674 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-ckzdn" event={"ID":"1cbf04b2-2903-4475-aaf3-74cb9d85e3bd","Type":"ContainerStarted","Data":"167e743ccede7f09c5bd22ab1ed22b18803843f674e7b19fe64b7bddfdb9446e"} Mar 13 01:44:54.036876 master-0 kubenswrapper[19170]: I0313 01:44:54.034142 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0ee1-account-create-update-pr2qk" event={"ID":"4042a1c0-8ce2-4e1d-b116-eef4e4ebb4c8","Type":"ContainerStarted","Data":"7e994b272070e67c5a0c12dc17ef3fdc575165c61ca7d837d9eee9eb264cd81d"} Mar 13 01:44:54.036876 master-0 kubenswrapper[19170]: I0313 01:44:54.034188 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0ee1-account-create-update-pr2qk" event={"ID":"4042a1c0-8ce2-4e1d-b116-eef4e4ebb4c8","Type":"ContainerStarted","Data":"0fa4122b05bf009918ce99aa7683fd4abfd07dbd0e0ea48a52a0a1e2ef8a54de"} Mar 13 01:44:54.061117 master-0 kubenswrapper[19170]: I0313 01:44:54.058703 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-bvpk6" podStartSLOduration=3.048221267 podStartE2EDuration="19.058681699s" podCreationTimestamp="2026-03-13 01:44:35 +0000 UTC" firstStartedPulling="2026-03-13 01:44:36.946526923 +0000 UTC m=+1537.754647883" lastFinishedPulling="2026-03-13 01:44:52.956987355 +0000 UTC m=+1553.765108315" observedRunningTime="2026-03-13 01:44:54.027760868 +0000 UTC m=+1554.835881918" watchObservedRunningTime="2026-03-13 01:44:54.058681699 +0000 UTC m=+1554.866802659" Mar 13 01:44:54.070194 master-0 kubenswrapper[19170]: I0313 01:44:54.067179 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/neutron-db-create-klmnb" podStartSLOduration=5.067162389 podStartE2EDuration="5.067162389s" podCreationTimestamp="2026-03-13 01:44:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:44:54.051525658 +0000 UTC m=+1554.859646618" watchObservedRunningTime="2026-03-13 01:44:54.067162389 +0000 UTC m=+1554.875283349" Mar 13 01:44:54.082962 master-0 kubenswrapper[19170]: I0313 01:44:54.080968 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-0ee1-account-create-update-pr2qk" podStartSLOduration=5.080948297 podStartE2EDuration="5.080948297s" podCreationTimestamp="2026-03-13 01:44:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:44:54.071139621 +0000 UTC m=+1554.879260581" watchObservedRunningTime="2026-03-13 01:44:54.080948297 +0000 UTC m=+1554.889069257" Mar 13 01:44:54.131665 master-0 kubenswrapper[19170]: I0313 01:44:54.131204 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-gvjkt-config-gdsw7" Mar 13 01:44:54.717406 master-0 kubenswrapper[19170]: I0313 01:44:54.717342 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-gvjkt-config-gdsw7"] Mar 13 01:44:55.056834 master-0 kubenswrapper[19170]: I0313 01:44:55.056775 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gvjkt-config-gdsw7" event={"ID":"ebe3c7b6-cb9f-426e-b709-4f76c915bd04","Type":"ContainerStarted","Data":"f30e075fdc163bf702368c63564fd8f9b7a7728988ba44dc27a1e1e7797a8b4b"} Mar 13 01:44:55.056834 master-0 kubenswrapper[19170]: I0313 01:44:55.056825 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gvjkt-config-gdsw7" event={"ID":"ebe3c7b6-cb9f-426e-b709-4f76c915bd04","Type":"ContainerStarted","Data":"d04359dbf88d3f3e298ce96d632e46b9f83fe676ac1626e5802d04fa68a48dfc"} Mar 13 01:44:55.060837 master-0 kubenswrapper[19170]: I0313 01:44:55.060810 19170 generic.go:334] "Generic (PLEG): container finished" podID="48f21b3f-8e3e-4f83-8262-ff7a97273897" containerID="8b7652cd02c3b227b0ec60387fb04a000423d0c5e46396197f0f375a7afc6cf0" exitCode=0 Mar 13 01:44:55.060912 master-0 kubenswrapper[19170]: I0313 01:44:55.060860 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-klmnb" event={"ID":"48f21b3f-8e3e-4f83-8262-ff7a97273897","Type":"ContainerDied","Data":"8b7652cd02c3b227b0ec60387fb04a000423d0c5e46396197f0f375a7afc6cf0"} Mar 13 01:44:55.067319 master-0 kubenswrapper[19170]: I0313 01:44:55.067277 19170 generic.go:334] "Generic (PLEG): container finished" podID="4042a1c0-8ce2-4e1d-b116-eef4e4ebb4c8" containerID="7e994b272070e67c5a0c12dc17ef3fdc575165c61ca7d837d9eee9eb264cd81d" exitCode=0 Mar 13 01:44:55.067436 master-0 kubenswrapper[19170]: I0313 01:44:55.067349 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0ee1-account-create-update-pr2qk" 
event={"ID":"4042a1c0-8ce2-4e1d-b116-eef4e4ebb4c8","Type":"ContainerDied","Data":"7e994b272070e67c5a0c12dc17ef3fdc575165c61ca7d837d9eee9eb264cd81d"} Mar 13 01:44:55.072984 master-0 kubenswrapper[19170]: I0313 01:44:55.072944 19170 generic.go:334] "Generic (PLEG): container finished" podID="4516736b-101c-43e8-8492-bda9281e78f9" containerID="b93c304df77e4e0e4ff03828983564899840d2d31708c5197418b66327f25d49" exitCode=0 Mar 13 01:44:55.073448 master-0 kubenswrapper[19170]: I0313 01:44:55.073227 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7877-account-create-update-9ffg6" event={"ID":"4516736b-101c-43e8-8492-bda9281e78f9","Type":"ContainerDied","Data":"b93c304df77e4e0e4ff03828983564899840d2d31708c5197418b66327f25d49"} Mar 13 01:44:55.099946 master-0 kubenswrapper[19170]: I0313 01:44:55.094381 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-gvjkt-config-gdsw7" podStartSLOduration=2.094361183 podStartE2EDuration="2.094361183s" podCreationTimestamp="2026-03-13 01:44:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:44:55.076292864 +0000 UTC m=+1555.884413824" watchObservedRunningTime="2026-03-13 01:44:55.094361183 +0000 UTC m=+1555.902482143" Mar 13 01:44:55.431756 master-0 kubenswrapper[19170]: I0313 01:44:55.431705 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dc119d6-19fa-4bb5-9989-8f759c6daefc" path="/var/lib/kubelet/pods/6dc119d6-19fa-4bb5-9989-8f759c6daefc/volumes" Mar 13 01:44:55.531565 master-0 kubenswrapper[19170]: I0313 01:44:55.531527 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-ckzdn" Mar 13 01:44:55.657914 master-0 kubenswrapper[19170]: I0313 01:44:55.657607 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bf5z\" (UniqueName: \"kubernetes.io/projected/1cbf04b2-2903-4475-aaf3-74cb9d85e3bd-kube-api-access-5bf5z\") pod \"1cbf04b2-2903-4475-aaf3-74cb9d85e3bd\" (UID: \"1cbf04b2-2903-4475-aaf3-74cb9d85e3bd\") " Mar 13 01:44:55.657914 master-0 kubenswrapper[19170]: I0313 01:44:55.657720 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cbf04b2-2903-4475-aaf3-74cb9d85e3bd-operator-scripts\") pod \"1cbf04b2-2903-4475-aaf3-74cb9d85e3bd\" (UID: \"1cbf04b2-2903-4475-aaf3-74cb9d85e3bd\") " Mar 13 01:44:55.658390 master-0 kubenswrapper[19170]: I0313 01:44:55.658269 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cbf04b2-2903-4475-aaf3-74cb9d85e3bd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1cbf04b2-2903-4475-aaf3-74cb9d85e3bd" (UID: "1cbf04b2-2903-4475-aaf3-74cb9d85e3bd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:44:55.658607 master-0 kubenswrapper[19170]: I0313 01:44:55.658585 19170 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cbf04b2-2903-4475-aaf3-74cb9d85e3bd-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:55.660818 master-0 kubenswrapper[19170]: I0313 01:44:55.660751 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cbf04b2-2903-4475-aaf3-74cb9d85e3bd-kube-api-access-5bf5z" (OuterVolumeSpecName: "kube-api-access-5bf5z") pod "1cbf04b2-2903-4475-aaf3-74cb9d85e3bd" (UID: "1cbf04b2-2903-4475-aaf3-74cb9d85e3bd"). InnerVolumeSpecName "kube-api-access-5bf5z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:44:55.760865 master-0 kubenswrapper[19170]: I0313 01:44:55.760628 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bf5z\" (UniqueName: \"kubernetes.io/projected/1cbf04b2-2903-4475-aaf3-74cb9d85e3bd-kube-api-access-5bf5z\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:56.085692 master-0 kubenswrapper[19170]: I0313 01:44:56.085619 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-ckzdn" event={"ID":"1cbf04b2-2903-4475-aaf3-74cb9d85e3bd","Type":"ContainerDied","Data":"167e743ccede7f09c5bd22ab1ed22b18803843f674e7b19fe64b7bddfdb9446e"} Mar 13 01:44:56.085692 master-0 kubenswrapper[19170]: I0313 01:44:56.085684 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="167e743ccede7f09c5bd22ab1ed22b18803843f674e7b19fe64b7bddfdb9446e" Mar 13 01:44:56.085692 master-0 kubenswrapper[19170]: I0313 01:44:56.085651 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-ckzdn" Mar 13 01:44:56.087646 master-0 kubenswrapper[19170]: I0313 01:44:56.087544 19170 generic.go:334] "Generic (PLEG): container finished" podID="ebe3c7b6-cb9f-426e-b709-4f76c915bd04" containerID="f30e075fdc163bf702368c63564fd8f9b7a7728988ba44dc27a1e1e7797a8b4b" exitCode=0 Mar 13 01:44:56.088207 master-0 kubenswrapper[19170]: I0313 01:44:56.088172 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gvjkt-config-gdsw7" event={"ID":"ebe3c7b6-cb9f-426e-b709-4f76c915bd04","Type":"ContainerDied","Data":"f30e075fdc163bf702368c63564fd8f9b7a7728988ba44dc27a1e1e7797a8b4b"} Mar 13 01:44:57.547797 master-0 kubenswrapper[19170]: I0313 01:44:57.546915 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 13 01:44:59.327754 master-0 kubenswrapper[19170]: I0313 01:44:59.327687 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0ee1-account-create-update-pr2qk" Mar 13 01:44:59.340163 master-0 kubenswrapper[19170]: I0313 01:44:59.340102 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-klmnb" Mar 13 01:44:59.349202 master-0 kubenswrapper[19170]: I0313 01:44:59.349134 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7877-account-create-update-9ffg6" Mar 13 01:44:59.356781 master-0 kubenswrapper[19170]: I0313 01:44:59.356737 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-gvjkt-config-gdsw7" Mar 13 01:44:59.465601 master-0 kubenswrapper[19170]: I0313 01:44:59.465424 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74skk\" (UniqueName: \"kubernetes.io/projected/48f21b3f-8e3e-4f83-8262-ff7a97273897-kube-api-access-74skk\") pod \"48f21b3f-8e3e-4f83-8262-ff7a97273897\" (UID: \"48f21b3f-8e3e-4f83-8262-ff7a97273897\") " Mar 13 01:44:59.465601 master-0 kubenswrapper[19170]: I0313 01:44:59.465543 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q57g8\" (UniqueName: \"kubernetes.io/projected/4042a1c0-8ce2-4e1d-b116-eef4e4ebb4c8-kube-api-access-q57g8\") pod \"4042a1c0-8ce2-4e1d-b116-eef4e4ebb4c8\" (UID: \"4042a1c0-8ce2-4e1d-b116-eef4e4ebb4c8\") " Mar 13 01:44:59.465601 master-0 kubenswrapper[19170]: I0313 01:44:59.465585 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ebe3c7b6-cb9f-426e-b709-4f76c915bd04-var-run-ovn\") pod \"ebe3c7b6-cb9f-426e-b709-4f76c915bd04\" (UID: \"ebe3c7b6-cb9f-426e-b709-4f76c915bd04\") " Mar 13 01:44:59.465931 master-0 kubenswrapper[19170]: I0313 01:44:59.465763 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nn8cz\" (UniqueName: \"kubernetes.io/projected/ebe3c7b6-cb9f-426e-b709-4f76c915bd04-kube-api-access-nn8cz\") pod \"ebe3c7b6-cb9f-426e-b709-4f76c915bd04\" (UID: \"ebe3c7b6-cb9f-426e-b709-4f76c915bd04\") " Mar 13 01:44:59.465931 master-0 kubenswrapper[19170]: I0313 01:44:59.465796 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ebe3c7b6-cb9f-426e-b709-4f76c915bd04-additional-scripts\") pod \"ebe3c7b6-cb9f-426e-b709-4f76c915bd04\" (UID: \"ebe3c7b6-cb9f-426e-b709-4f76c915bd04\") " Mar 13 01:44:59.465931 master-0 
kubenswrapper[19170]: I0313 01:44:59.465822 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ebe3c7b6-cb9f-426e-b709-4f76c915bd04-var-log-ovn\") pod \"ebe3c7b6-cb9f-426e-b709-4f76c915bd04\" (UID: \"ebe3c7b6-cb9f-426e-b709-4f76c915bd04\") " Mar 13 01:44:59.465931 master-0 kubenswrapper[19170]: I0313 01:44:59.465895 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4516736b-101c-43e8-8492-bda9281e78f9-operator-scripts\") pod \"4516736b-101c-43e8-8492-bda9281e78f9\" (UID: \"4516736b-101c-43e8-8492-bda9281e78f9\") " Mar 13 01:44:59.465931 master-0 kubenswrapper[19170]: I0313 01:44:59.465923 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ebe3c7b6-cb9f-426e-b709-4f76c915bd04-var-run\") pod \"ebe3c7b6-cb9f-426e-b709-4f76c915bd04\" (UID: \"ebe3c7b6-cb9f-426e-b709-4f76c915bd04\") " Mar 13 01:44:59.466160 master-0 kubenswrapper[19170]: I0313 01:44:59.465997 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4042a1c0-8ce2-4e1d-b116-eef4e4ebb4c8-operator-scripts\") pod \"4042a1c0-8ce2-4e1d-b116-eef4e4ebb4c8\" (UID: \"4042a1c0-8ce2-4e1d-b116-eef4e4ebb4c8\") " Mar 13 01:44:59.466160 master-0 kubenswrapper[19170]: I0313 01:44:59.466058 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48f21b3f-8e3e-4f83-8262-ff7a97273897-operator-scripts\") pod \"48f21b3f-8e3e-4f83-8262-ff7a97273897\" (UID: \"48f21b3f-8e3e-4f83-8262-ff7a97273897\") " Mar 13 01:44:59.466160 master-0 kubenswrapper[19170]: I0313 01:44:59.466081 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6brlg\" (UniqueName: 
\"kubernetes.io/projected/4516736b-101c-43e8-8492-bda9281e78f9-kube-api-access-6brlg\") pod \"4516736b-101c-43e8-8492-bda9281e78f9\" (UID: \"4516736b-101c-43e8-8492-bda9281e78f9\") " Mar 13 01:44:59.466160 master-0 kubenswrapper[19170]: I0313 01:44:59.466113 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ebe3c7b6-cb9f-426e-b709-4f76c915bd04-scripts\") pod \"ebe3c7b6-cb9f-426e-b709-4f76c915bd04\" (UID: \"ebe3c7b6-cb9f-426e-b709-4f76c915bd04\") " Mar 13 01:44:59.468564 master-0 kubenswrapper[19170]: I0313 01:44:59.468528 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ebe3c7b6-cb9f-426e-b709-4f76c915bd04-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "ebe3c7b6-cb9f-426e-b709-4f76c915bd04" (UID: "ebe3c7b6-cb9f-426e-b709-4f76c915bd04"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:44:59.469576 master-0 kubenswrapper[19170]: I0313 01:44:59.468801 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ebe3c7b6-cb9f-426e-b709-4f76c915bd04-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "ebe3c7b6-cb9f-426e-b709-4f76c915bd04" (UID: "ebe3c7b6-cb9f-426e-b709-4f76c915bd04"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:44:59.471704 master-0 kubenswrapper[19170]: I0313 01:44:59.471481 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebe3c7b6-cb9f-426e-b709-4f76c915bd04-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "ebe3c7b6-cb9f-426e-b709-4f76c915bd04" (UID: "ebe3c7b6-cb9f-426e-b709-4f76c915bd04"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:44:59.471704 master-0 kubenswrapper[19170]: I0313 01:44:59.471494 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4516736b-101c-43e8-8492-bda9281e78f9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4516736b-101c-43e8-8492-bda9281e78f9" (UID: "4516736b-101c-43e8-8492-bda9281e78f9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:44:59.471704 master-0 kubenswrapper[19170]: I0313 01:44:59.471485 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4042a1c0-8ce2-4e1d-b116-eef4e4ebb4c8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4042a1c0-8ce2-4e1d-b116-eef4e4ebb4c8" (UID: "4042a1c0-8ce2-4e1d-b116-eef4e4ebb4c8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:44:59.471879 master-0 kubenswrapper[19170]: I0313 01:44:59.471688 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ebe3c7b6-cb9f-426e-b709-4f76c915bd04-var-run" (OuterVolumeSpecName: "var-run") pod "ebe3c7b6-cb9f-426e-b709-4f76c915bd04" (UID: "ebe3c7b6-cb9f-426e-b709-4f76c915bd04"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:44:59.471879 master-0 kubenswrapper[19170]: I0313 01:44:59.471737 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48f21b3f-8e3e-4f83-8262-ff7a97273897-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "48f21b3f-8e3e-4f83-8262-ff7a97273897" (UID: "48f21b3f-8e3e-4f83-8262-ff7a97273897"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:44:59.471968 master-0 kubenswrapper[19170]: I0313 01:44:59.471937 19170 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ebe3c7b6-cb9f-426e-b709-4f76c915bd04-var-run-ovn\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:59.472013 master-0 kubenswrapper[19170]: I0313 01:44:59.471961 19170 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ebe3c7b6-cb9f-426e-b709-4f76c915bd04-additional-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:59.472013 master-0 kubenswrapper[19170]: I0313 01:44:59.471980 19170 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ebe3c7b6-cb9f-426e-b709-4f76c915bd04-var-log-ovn\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:59.472013 master-0 kubenswrapper[19170]: I0313 01:44:59.471994 19170 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4516736b-101c-43e8-8492-bda9281e78f9-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:59.472013 master-0 kubenswrapper[19170]: I0313 01:44:59.472008 19170 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ebe3c7b6-cb9f-426e-b709-4f76c915bd04-var-run\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:59.472508 master-0 kubenswrapper[19170]: I0313 01:44:59.472020 19170 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4042a1c0-8ce2-4e1d-b116-eef4e4ebb4c8-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:59.472508 master-0 kubenswrapper[19170]: I0313 01:44:59.472033 19170 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/48f21b3f-8e3e-4f83-8262-ff7a97273897-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:59.472508 master-0 kubenswrapper[19170]: I0313 01:44:59.471958 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4042a1c0-8ce2-4e1d-b116-eef4e4ebb4c8-kube-api-access-q57g8" (OuterVolumeSpecName: "kube-api-access-q57g8") pod "4042a1c0-8ce2-4e1d-b116-eef4e4ebb4c8" (UID: "4042a1c0-8ce2-4e1d-b116-eef4e4ebb4c8"). InnerVolumeSpecName "kube-api-access-q57g8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:44:59.472880 master-0 kubenswrapper[19170]: I0313 01:44:59.472834 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebe3c7b6-cb9f-426e-b709-4f76c915bd04-scripts" (OuterVolumeSpecName: "scripts") pod "ebe3c7b6-cb9f-426e-b709-4f76c915bd04" (UID: "ebe3c7b6-cb9f-426e-b709-4f76c915bd04"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:44:59.473088 master-0 kubenswrapper[19170]: I0313 01:44:59.473051 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48f21b3f-8e3e-4f83-8262-ff7a97273897-kube-api-access-74skk" (OuterVolumeSpecName: "kube-api-access-74skk") pod "48f21b3f-8e3e-4f83-8262-ff7a97273897" (UID: "48f21b3f-8e3e-4f83-8262-ff7a97273897"). InnerVolumeSpecName "kube-api-access-74skk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:44:59.474693 master-0 kubenswrapper[19170]: I0313 01:44:59.474656 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4516736b-101c-43e8-8492-bda9281e78f9-kube-api-access-6brlg" (OuterVolumeSpecName: "kube-api-access-6brlg") pod "4516736b-101c-43e8-8492-bda9281e78f9" (UID: "4516736b-101c-43e8-8492-bda9281e78f9"). InnerVolumeSpecName "kube-api-access-6brlg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:44:59.476273 master-0 kubenswrapper[19170]: I0313 01:44:59.476220 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebe3c7b6-cb9f-426e-b709-4f76c915bd04-kube-api-access-nn8cz" (OuterVolumeSpecName: "kube-api-access-nn8cz") pod "ebe3c7b6-cb9f-426e-b709-4f76c915bd04" (UID: "ebe3c7b6-cb9f-426e-b709-4f76c915bd04"). InnerVolumeSpecName "kube-api-access-nn8cz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:44:59.574762 master-0 kubenswrapper[19170]: I0313 01:44:59.574713 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6brlg\" (UniqueName: \"kubernetes.io/projected/4516736b-101c-43e8-8492-bda9281e78f9-kube-api-access-6brlg\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:59.574762 master-0 kubenswrapper[19170]: I0313 01:44:59.574754 19170 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ebe3c7b6-cb9f-426e-b709-4f76c915bd04-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:59.574762 master-0 kubenswrapper[19170]: I0313 01:44:59.574766 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74skk\" (UniqueName: \"kubernetes.io/projected/48f21b3f-8e3e-4f83-8262-ff7a97273897-kube-api-access-74skk\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:59.574986 master-0 kubenswrapper[19170]: I0313 01:44:59.574777 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q57g8\" (UniqueName: \"kubernetes.io/projected/4042a1c0-8ce2-4e1d-b116-eef4e4ebb4c8-kube-api-access-q57g8\") on node \"master-0\" DevicePath \"\"" Mar 13 01:44:59.574986 master-0 kubenswrapper[19170]: I0313 01:44:59.574787 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nn8cz\" (UniqueName: \"kubernetes.io/projected/ebe3c7b6-cb9f-426e-b709-4f76c915bd04-kube-api-access-nn8cz\") on node \"master-0\" 
DevicePath \"\"" Mar 13 01:45:00.145736 master-0 kubenswrapper[19170]: I0313 01:45:00.145368 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0ee1-account-create-update-pr2qk" event={"ID":"4042a1c0-8ce2-4e1d-b116-eef4e4ebb4c8","Type":"ContainerDied","Data":"0fa4122b05bf009918ce99aa7683fd4abfd07dbd0e0ea48a52a0a1e2ef8a54de"} Mar 13 01:45:00.145736 master-0 kubenswrapper[19170]: I0313 01:45:00.145480 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0fa4122b05bf009918ce99aa7683fd4abfd07dbd0e0ea48a52a0a1e2ef8a54de" Mar 13 01:45:00.145736 master-0 kubenswrapper[19170]: I0313 01:45:00.145411 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0ee1-account-create-update-pr2qk" Mar 13 01:45:00.148797 master-0 kubenswrapper[19170]: I0313 01:45:00.147966 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7877-account-create-update-9ffg6" event={"ID":"4516736b-101c-43e8-8492-bda9281e78f9","Type":"ContainerDied","Data":"f855fe2fc4bf7c326ba4e9802587259435529238229180623f8d32fa4f836ffd"} Mar 13 01:45:00.148797 master-0 kubenswrapper[19170]: I0313 01:45:00.148013 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f855fe2fc4bf7c326ba4e9802587259435529238229180623f8d32fa4f836ffd" Mar 13 01:45:00.148797 master-0 kubenswrapper[19170]: I0313 01:45:00.148027 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-7877-account-create-update-9ffg6"
Mar 13 01:45:00.149492 master-0 kubenswrapper[19170]: I0313 01:45:00.149416 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-snmt2" event={"ID":"1d6788e1-16bb-4b4a-bb65-9e53915b9e74","Type":"ContainerStarted","Data":"1177b6010a5ae975fe92e72a6611c240255fda2ed05486427e58716bec304676"}
Mar 13 01:45:00.154829 master-0 kubenswrapper[19170]: I0313 01:45:00.154784 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-gvjkt-config-gdsw7"
Mar 13 01:45:00.155232 master-0 kubenswrapper[19170]: I0313 01:45:00.155152 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gvjkt-config-gdsw7" event={"ID":"ebe3c7b6-cb9f-426e-b709-4f76c915bd04","Type":"ContainerDied","Data":"d04359dbf88d3f3e298ce96d632e46b9f83fe676ac1626e5802d04fa68a48dfc"}
Mar 13 01:45:00.155232 master-0 kubenswrapper[19170]: I0313 01:45:00.155227 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d04359dbf88d3f3e298ce96d632e46b9f83fe676ac1626e5802d04fa68a48dfc"
Mar 13 01:45:00.159021 master-0 kubenswrapper[19170]: I0313 01:45:00.158954 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-klmnb" event={"ID":"48f21b3f-8e3e-4f83-8262-ff7a97273897","Type":"ContainerDied","Data":"feb9bd0ca69b0ec172a9ddb5b522a858972b5296d06743cb250a042b530d6679"}
Mar 13 01:45:00.159021 master-0 kubenswrapper[19170]: I0313 01:45:00.159011 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="feb9bd0ca69b0ec172a9ddb5b522a858972b5296d06743cb250a042b530d6679"
Mar 13 01:45:00.159219 master-0 kubenswrapper[19170]: I0313 01:45:00.159109 19170 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/neutron-db-create-klmnb"
Mar 13 01:45:00.175986 master-0 kubenswrapper[19170]: I0313 01:45:00.175846 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-snmt2" podStartSLOduration=5.219975467 podStartE2EDuration="11.175821496s" podCreationTimestamp="2026-03-13 01:44:49 +0000 UTC" firstStartedPulling="2026-03-13 01:44:53.844163103 +0000 UTC m=+1554.652284063" lastFinishedPulling="2026-03-13 01:44:59.800009132 +0000 UTC m=+1560.608130092" observedRunningTime="2026-03-13 01:45:00.170691461 +0000 UTC m=+1560.978812441" watchObservedRunningTime="2026-03-13 01:45:00.175821496 +0000 UTC m=+1560.983942476"
Mar 13 01:45:00.812984 master-0 kubenswrapper[19170]: I0313 01:45:00.812933 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-gvjkt-config-gdsw7"]
Mar 13 01:45:00.825432 master-0 kubenswrapper[19170]: I0313 01:45:00.825251 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-gvjkt-config-gdsw7"]
Mar 13 01:45:01.434287 master-0 kubenswrapper[19170]: I0313 01:45:01.434233 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebe3c7b6-cb9f-426e-b709-4f76c915bd04" path="/var/lib/kubelet/pods/ebe3c7b6-cb9f-426e-b709-4f76c915bd04/volumes"
Mar 13 01:45:04.212006 master-0 kubenswrapper[19170]: I0313 01:45:04.211940 19170 generic.go:334] "Generic (PLEG): container finished" podID="1d6788e1-16bb-4b4a-bb65-9e53915b9e74" containerID="1177b6010a5ae975fe92e72a6611c240255fda2ed05486427e58716bec304676" exitCode=0
Mar 13 01:45:04.212499 master-0 kubenswrapper[19170]: I0313 01:45:04.212011 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-snmt2" event={"ID":"1d6788e1-16bb-4b4a-bb65-9e53915b9e74","Type":"ContainerDied","Data":"1177b6010a5ae975fe92e72a6611c240255fda2ed05486427e58716bec304676"}
Mar 13 01:45:05.226208 master-0 kubenswrapper[19170]: I0313 01:45:05.226029
19170 generic.go:334] "Generic (PLEG): container finished" podID="534d7c6b-ec6b-48e0-8ace-0d893ae9da6a" containerID="0e0f1c1b9a2e40bbc6b0ab512acedb270840f7d4aa0034ff33c77d8b0162c860" exitCode=0
Mar 13 01:45:05.226208 master-0 kubenswrapper[19170]: I0313 01:45:05.226118 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-bvpk6" event={"ID":"534d7c6b-ec6b-48e0-8ace-0d893ae9da6a","Type":"ContainerDied","Data":"0e0f1c1b9a2e40bbc6b0ab512acedb270840f7d4aa0034ff33c77d8b0162c860"}
Mar 13 01:45:05.647317 master-0 kubenswrapper[19170]: I0313 01:45:05.647275 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-snmt2"
Mar 13 01:45:05.750189 master-0 kubenswrapper[19170]: I0313 01:45:05.749391 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d6788e1-16bb-4b4a-bb65-9e53915b9e74-combined-ca-bundle\") pod \"1d6788e1-16bb-4b4a-bb65-9e53915b9e74\" (UID: \"1d6788e1-16bb-4b4a-bb65-9e53915b9e74\") "
Mar 13 01:45:05.750189 master-0 kubenswrapper[19170]: I0313 01:45:05.749474 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d6788e1-16bb-4b4a-bb65-9e53915b9e74-config-data\") pod \"1d6788e1-16bb-4b4a-bb65-9e53915b9e74\" (UID: \"1d6788e1-16bb-4b4a-bb65-9e53915b9e74\") "
Mar 13 01:45:05.750189 master-0 kubenswrapper[19170]: I0313 01:45:05.749623 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zd8vc\" (UniqueName: \"kubernetes.io/projected/1d6788e1-16bb-4b4a-bb65-9e53915b9e74-kube-api-access-zd8vc\") pod \"1d6788e1-16bb-4b4a-bb65-9e53915b9e74\" (UID: \"1d6788e1-16bb-4b4a-bb65-9e53915b9e74\") "
Mar 13 01:45:05.752924 master-0 kubenswrapper[19170]: I0313 01:45:05.752872 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume
"kubernetes.io/projected/1d6788e1-16bb-4b4a-bb65-9e53915b9e74-kube-api-access-zd8vc" (OuterVolumeSpecName: "kube-api-access-zd8vc") pod "1d6788e1-16bb-4b4a-bb65-9e53915b9e74" (UID: "1d6788e1-16bb-4b4a-bb65-9e53915b9e74"). InnerVolumeSpecName "kube-api-access-zd8vc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 01:45:05.789343 master-0 kubenswrapper[19170]: I0313 01:45:05.789234 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d6788e1-16bb-4b4a-bb65-9e53915b9e74-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d6788e1-16bb-4b4a-bb65-9e53915b9e74" (UID: "1d6788e1-16bb-4b4a-bb65-9e53915b9e74"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 01:45:05.834367 master-0 kubenswrapper[19170]: I0313 01:45:05.826758 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d6788e1-16bb-4b4a-bb65-9e53915b9e74-config-data" (OuterVolumeSpecName: "config-data") pod "1d6788e1-16bb-4b4a-bb65-9e53915b9e74" (UID: "1d6788e1-16bb-4b4a-bb65-9e53915b9e74"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 01:45:05.856658 master-0 kubenswrapper[19170]: I0313 01:45:05.855982 19170 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d6788e1-16bb-4b4a-bb65-9e53915b9e74-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 13 01:45:05.856658 master-0 kubenswrapper[19170]: I0313 01:45:05.856034 19170 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d6788e1-16bb-4b4a-bb65-9e53915b9e74-config-data\") on node \"master-0\" DevicePath \"\""
Mar 13 01:45:05.856658 master-0 kubenswrapper[19170]: I0313 01:45:05.856045 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zd8vc\" (UniqueName: \"kubernetes.io/projected/1d6788e1-16bb-4b4a-bb65-9e53915b9e74-kube-api-access-zd8vc\") on node \"master-0\" DevicePath \"\""
Mar 13 01:45:06.253875 master-0 kubenswrapper[19170]: I0313 01:45:06.251813 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-snmt2" event={"ID":"1d6788e1-16bb-4b4a-bb65-9e53915b9e74","Type":"ContainerDied","Data":"5ccb6c9e39b5dfe14096dda4e0e9b78f7d08c69a31d0fcf82759a02704b8c46f"}
Mar 13 01:45:06.253875 master-0 kubenswrapper[19170]: I0313 01:45:06.251885 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ccb6c9e39b5dfe14096dda4e0e9b78f7d08c69a31d0fcf82759a02704b8c46f"
Mar 13 01:45:06.253875 master-0 kubenswrapper[19170]: I0313 01:45:06.251850 19170 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/keystone-db-sync-snmt2"
Mar 13 01:45:06.619735 master-0 kubenswrapper[19170]: I0313 01:45:06.616460 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56b5f74d97-8m7sq"]
Mar 13 01:45:06.619735 master-0 kubenswrapper[19170]: E0313 01:45:06.617464 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cbf04b2-2903-4475-aaf3-74cb9d85e3bd" containerName="mariadb-database-create"
Mar 13 01:45:06.619735 master-0 kubenswrapper[19170]: I0313 01:45:06.617481 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cbf04b2-2903-4475-aaf3-74cb9d85e3bd" containerName="mariadb-database-create"
Mar 13 01:45:06.619735 master-0 kubenswrapper[19170]: E0313 01:45:06.617515 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d6788e1-16bb-4b4a-bb65-9e53915b9e74" containerName="keystone-db-sync"
Mar 13 01:45:06.619735 master-0 kubenswrapper[19170]: I0313 01:45:06.617521 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d6788e1-16bb-4b4a-bb65-9e53915b9e74" containerName="keystone-db-sync"
Mar 13 01:45:06.619735 master-0 kubenswrapper[19170]: E0313 01:45:06.617547 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4042a1c0-8ce2-4e1d-b116-eef4e4ebb4c8" containerName="mariadb-account-create-update"
Mar 13 01:45:06.619735 master-0 kubenswrapper[19170]: I0313 01:45:06.617554 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="4042a1c0-8ce2-4e1d-b116-eef4e4ebb4c8" containerName="mariadb-account-create-update"
Mar 13 01:45:06.619735 master-0 kubenswrapper[19170]: E0313 01:45:06.617571 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48f21b3f-8e3e-4f83-8262-ff7a97273897" containerName="mariadb-database-create"
Mar 13 01:45:06.619735 master-0 kubenswrapper[19170]: I0313 01:45:06.617577 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="48f21b3f-8e3e-4f83-8262-ff7a97273897"
containerName="mariadb-database-create"
Mar 13 01:45:06.619735 master-0 kubenswrapper[19170]: E0313 01:45:06.617591 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4516736b-101c-43e8-8492-bda9281e78f9" containerName="mariadb-account-create-update"
Mar 13 01:45:06.619735 master-0 kubenswrapper[19170]: I0313 01:45:06.617598 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="4516736b-101c-43e8-8492-bda9281e78f9" containerName="mariadb-account-create-update"
Mar 13 01:45:06.619735 master-0 kubenswrapper[19170]: E0313 01:45:06.617631 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebe3c7b6-cb9f-426e-b709-4f76c915bd04" containerName="ovn-config"
Mar 13 01:45:06.619735 master-0 kubenswrapper[19170]: I0313 01:45:06.617651 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebe3c7b6-cb9f-426e-b709-4f76c915bd04" containerName="ovn-config"
Mar 13 01:45:06.619735 master-0 kubenswrapper[19170]: I0313 01:45:06.618035 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="48f21b3f-8e3e-4f83-8262-ff7a97273897" containerName="mariadb-database-create"
Mar 13 01:45:06.619735 master-0 kubenswrapper[19170]: I0313 01:45:06.618082 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cbf04b2-2903-4475-aaf3-74cb9d85e3bd" containerName="mariadb-database-create"
Mar 13 01:45:06.619735 master-0 kubenswrapper[19170]: I0313 01:45:06.618100 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d6788e1-16bb-4b4a-bb65-9e53915b9e74" containerName="keystone-db-sync"
Mar 13 01:45:06.619735 master-0 kubenswrapper[19170]: I0313 01:45:06.618111 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="4516736b-101c-43e8-8492-bda9281e78f9" containerName="mariadb-account-create-update"
Mar 13 01:45:06.619735 master-0 kubenswrapper[19170]: I0313 01:45:06.618129 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="4042a1c0-8ce2-4e1d-b116-eef4e4ebb4c8"
containerName="mariadb-account-create-update"
Mar 13 01:45:06.619735 master-0 kubenswrapper[19170]: I0313 01:45:06.618149 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebe3c7b6-cb9f-426e-b709-4f76c915bd04" containerName="ovn-config"
Mar 13 01:45:06.630790 master-0 kubenswrapper[19170]: I0313 01:45:06.627142 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56b5f74d97-8m7sq"
Mar 13 01:45:06.713983 master-0 kubenswrapper[19170]: I0313 01:45:06.713816 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-hq7k2"]
Mar 13 01:45:06.716925 master-0 kubenswrapper[19170]: I0313 01:45:06.716894 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-hq7k2"
Mar 13 01:45:06.721881 master-0 kubenswrapper[19170]: I0313 01:45:06.721845 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 13 01:45:06.728699 master-0 kubenswrapper[19170]: I0313 01:45:06.728661 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Mar 13 01:45:06.728899 master-0 kubenswrapper[19170]: I0313 01:45:06.728883 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 13 01:45:06.729015 master-0 kubenswrapper[19170]: I0313 01:45:06.728999 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 13 01:45:06.758209 master-0 kubenswrapper[19170]: I0313 01:45:06.758141 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56b5f74d97-8m7sq"]
Mar 13 01:45:06.758209 master-0 kubenswrapper[19170]: I0313 01:45:06.758212 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-hq7k2"]
Mar 13 01:45:06.791727 master-0 kubenswrapper[19170]: I0313 01:45:06.789795 19170 kubelet.go:2421] "SyncLoop ADD" source="api"
pods=["openstack/ironic-db-create-wmh5p"]
Mar 13 01:45:06.791727 master-0 kubenswrapper[19170]: I0313 01:45:06.791239 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-create-wmh5p"
Mar 13 01:45:06.801667 master-0 kubenswrapper[19170]: I0313 01:45:06.800712 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-create-wmh5p"]
Mar 13 01:45:06.813404 master-0 kubenswrapper[19170]: I0313 01:45:06.813360 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26e2c508-1828-4edc-b60e-eea9598f292e-config\") pod \"dnsmasq-dns-56b5f74d97-8m7sq\" (UID: \"26e2c508-1828-4edc-b60e-eea9598f292e\") " pod="openstack/dnsmasq-dns-56b5f74d97-8m7sq"
Mar 13 01:45:06.813762 master-0 kubenswrapper[19170]: I0313 01:45:06.813743 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/26e2c508-1828-4edc-b60e-eea9598f292e-ovsdbserver-nb\") pod \"dnsmasq-dns-56b5f74d97-8m7sq\" (UID: \"26e2c508-1828-4edc-b60e-eea9598f292e\") " pod="openstack/dnsmasq-dns-56b5f74d97-8m7sq"
Mar 13 01:45:06.814816 master-0 kubenswrapper[19170]: I0313 01:45:06.814799 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/26e2c508-1828-4edc-b60e-eea9598f292e-ovsdbserver-sb\") pod \"dnsmasq-dns-56b5f74d97-8m7sq\" (UID: \"26e2c508-1828-4edc-b60e-eea9598f292e\") " pod="openstack/dnsmasq-dns-56b5f74d97-8m7sq"
Mar 13 01:45:06.814928 master-0 kubenswrapper[19170]: I0313 01:45:06.814914 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26e2c508-1828-4edc-b60e-eea9598f292e-dns-svc\") pod \"dnsmasq-dns-56b5f74d97-8m7sq\" (UID:
\"26e2c508-1828-4edc-b60e-eea9598f292e\") " pod="openstack/dnsmasq-dns-56b5f74d97-8m7sq"
Mar 13 01:45:06.815018 master-0 kubenswrapper[19170]: I0313 01:45:06.815004 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/26e2c508-1828-4edc-b60e-eea9598f292e-dns-swift-storage-0\") pod \"dnsmasq-dns-56b5f74d97-8m7sq\" (UID: \"26e2c508-1828-4edc-b60e-eea9598f292e\") " pod="openstack/dnsmasq-dns-56b5f74d97-8m7sq"
Mar 13 01:45:06.815141 master-0 kubenswrapper[19170]: I0313 01:45:06.815128 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dsjj\" (UniqueName: \"kubernetes.io/projected/26e2c508-1828-4edc-b60e-eea9598f292e-kube-api-access-2dsjj\") pod \"dnsmasq-dns-56b5f74d97-8m7sq\" (UID: \"26e2c508-1828-4edc-b60e-eea9598f292e\") " pod="openstack/dnsmasq-dns-56b5f74d97-8m7sq"
Mar 13 01:45:06.881701 master-0 kubenswrapper[19170]: I0313 01:45:06.881532 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-hmwsf"]
Mar 13 01:45:06.883409 master-0 kubenswrapper[19170]: I0313 01:45:06.882909 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-hmwsf"
Mar 13 01:45:06.894144 master-0 kubenswrapper[19170]: I0313 01:45:06.894092 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Mar 13 01:45:06.894376 master-0 kubenswrapper[19170]: I0313 01:45:06.894300 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Mar 13 01:45:06.894757 master-0 kubenswrapper[19170]: I0313 01:45:06.894731 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-7dab-account-create-update-w8wtd"]
Mar 13 01:45:06.896300 master-0 kubenswrapper[19170]: I0313 01:45:06.896273 19170 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/ironic-7dab-account-create-update-w8wtd"
Mar 13 01:45:06.907198 master-0 kubenswrapper[19170]: I0313 01:45:06.907156 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-db-secret"
Mar 13 01:45:06.941159 master-0 kubenswrapper[19170]: I0313 01:45:06.926323 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b60380b-c057-44ba-a4fa-e88d67bde60a-combined-ca-bundle\") pod \"keystone-bootstrap-hq7k2\" (UID: \"8b60380b-c057-44ba-a4fa-e88d67bde60a\") " pod="openstack/keystone-bootstrap-hq7k2"
Mar 13 01:45:06.941159 master-0 kubenswrapper[19170]: I0313 01:45:06.926373 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fe1a1216-7989-411d-8b62-6d12fefcc8ae-config\") pod \"neutron-db-sync-hmwsf\" (UID: \"fe1a1216-7989-411d-8b62-6d12fefcc8ae\") " pod="openstack/neutron-db-sync-hmwsf"
Mar 13 01:45:06.941159 master-0 kubenswrapper[19170]: I0313 01:45:06.926421 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b60380b-c057-44ba-a4fa-e88d67bde60a-scripts\") pod \"keystone-bootstrap-hq7k2\" (UID: \"8b60380b-c057-44ba-a4fa-e88d67bde60a\") " pod="openstack/keystone-bootstrap-hq7k2"
Mar 13 01:45:06.941159 master-0 kubenswrapper[19170]: I0313 01:45:06.926500 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjtvs\" (UniqueName: \"kubernetes.io/projected/35017229-acfd-498d-82eb-8fc288e299b4-kube-api-access-cjtvs\") pod \"ironic-db-create-wmh5p\" (UID: \"35017229-acfd-498d-82eb-8fc288e299b4\") " pod="openstack/ironic-db-create-wmh5p"
Mar 13 01:45:06.941159 master-0 kubenswrapper[19170]: I0313 01:45:06.926539 19170 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/26e2c508-1828-4edc-b60e-eea9598f292e-ovsdbserver-sb\") pod \"dnsmasq-dns-56b5f74d97-8m7sq\" (UID: \"26e2c508-1828-4edc-b60e-eea9598f292e\") " pod="openstack/dnsmasq-dns-56b5f74d97-8m7sq"
Mar 13 01:45:06.941159 master-0 kubenswrapper[19170]: I0313 01:45:06.926582 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35017229-acfd-498d-82eb-8fc288e299b4-operator-scripts\") pod \"ironic-db-create-wmh5p\" (UID: \"35017229-acfd-498d-82eb-8fc288e299b4\") " pod="openstack/ironic-db-create-wmh5p"
Mar 13 01:45:06.941159 master-0 kubenswrapper[19170]: I0313 01:45:06.926607 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26e2c508-1828-4edc-b60e-eea9598f292e-dns-svc\") pod \"dnsmasq-dns-56b5f74d97-8m7sq\" (UID: \"26e2c508-1828-4edc-b60e-eea9598f292e\") " pod="openstack/dnsmasq-dns-56b5f74d97-8m7sq"
Mar 13 01:45:06.941159 master-0 kubenswrapper[19170]: I0313 01:45:06.926627 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7lj9\" (UniqueName: \"kubernetes.io/projected/2b2398a0-76d0-4e91-abf0-47f6d43feb12-kube-api-access-q7lj9\") pod \"ironic-7dab-account-create-update-w8wtd\" (UID: \"2b2398a0-76d0-4e91-abf0-47f6d43feb12\") " pod="openstack/ironic-7dab-account-create-update-w8wtd"
Mar 13 01:45:06.941159 master-0 kubenswrapper[19170]: I0313 01:45:06.926679 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/26e2c508-1828-4edc-b60e-eea9598f292e-dns-swift-storage-0\") pod \"dnsmasq-dns-56b5f74d97-8m7sq\" (UID: \"26e2c508-1828-4edc-b60e-eea9598f292e\") " pod="openstack/dnsmasq-dns-56b5f74d97-8m7sq"
Mar 13 01:45:06.941159
master-0 kubenswrapper[19170]: I0313 01:45:06.926707 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx595\" (UniqueName: \"kubernetes.io/projected/8b60380b-c057-44ba-a4fa-e88d67bde60a-kube-api-access-sx595\") pod \"keystone-bootstrap-hq7k2\" (UID: \"8b60380b-c057-44ba-a4fa-e88d67bde60a\") " pod="openstack/keystone-bootstrap-hq7k2"
Mar 13 01:45:06.941159 master-0 kubenswrapper[19170]: I0313 01:45:06.926750 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8b60380b-c057-44ba-a4fa-e88d67bde60a-fernet-keys\") pod \"keystone-bootstrap-hq7k2\" (UID: \"8b60380b-c057-44ba-a4fa-e88d67bde60a\") " pod="openstack/keystone-bootstrap-hq7k2"
Mar 13 01:45:06.941159 master-0 kubenswrapper[19170]: I0313 01:45:06.926785 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dsjj\" (UniqueName: \"kubernetes.io/projected/26e2c508-1828-4edc-b60e-eea9598f292e-kube-api-access-2dsjj\") pod \"dnsmasq-dns-56b5f74d97-8m7sq\" (UID: \"26e2c508-1828-4edc-b60e-eea9598f292e\") " pod="openstack/dnsmasq-dns-56b5f74d97-8m7sq"
Mar 13 01:45:06.941159 master-0 kubenswrapper[19170]: I0313 01:45:06.926827 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8b60380b-c057-44ba-a4fa-e88d67bde60a-credential-keys\") pod \"keystone-bootstrap-hq7k2\" (UID: \"8b60380b-c057-44ba-a4fa-e88d67bde60a\") " pod="openstack/keystone-bootstrap-hq7k2"
Mar 13 01:45:06.941159 master-0 kubenswrapper[19170]: I0313 01:45:06.926849 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26e2c508-1828-4edc-b60e-eea9598f292e-config\") pod \"dnsmasq-dns-56b5f74d97-8m7sq\" (UID: \"26e2c508-1828-4edc-b60e-eea9598f292e\") "
pod="openstack/dnsmasq-dns-56b5f74d97-8m7sq"
Mar 13 01:45:06.941159 master-0 kubenswrapper[19170]: I0313 01:45:06.926872 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcb4q\" (UniqueName: \"kubernetes.io/projected/fe1a1216-7989-411d-8b62-6d12fefcc8ae-kube-api-access-pcb4q\") pod \"neutron-db-sync-hmwsf\" (UID: \"fe1a1216-7989-411d-8b62-6d12fefcc8ae\") " pod="openstack/neutron-db-sync-hmwsf"
Mar 13 01:45:06.941159 master-0 kubenswrapper[19170]: I0313 01:45:06.928259 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b2398a0-76d0-4e91-abf0-47f6d43feb12-operator-scripts\") pod \"ironic-7dab-account-create-update-w8wtd\" (UID: \"2b2398a0-76d0-4e91-abf0-47f6d43feb12\") " pod="openstack/ironic-7dab-account-create-update-w8wtd"
Mar 13 01:45:06.941159 master-0 kubenswrapper[19170]: I0313 01:45:06.928292 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe1a1216-7989-411d-8b62-6d12fefcc8ae-combined-ca-bundle\") pod \"neutron-db-sync-hmwsf\" (UID: \"fe1a1216-7989-411d-8b62-6d12fefcc8ae\") " pod="openstack/neutron-db-sync-hmwsf"
Mar 13 01:45:06.941159 master-0 kubenswrapper[19170]: I0313 01:45:06.928312 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b60380b-c057-44ba-a4fa-e88d67bde60a-config-data\") pod \"keystone-bootstrap-hq7k2\" (UID: \"8b60380b-c057-44ba-a4fa-e88d67bde60a\") " pod="openstack/keystone-bootstrap-hq7k2"
Mar 13 01:45:06.941159 master-0 kubenswrapper[19170]: I0313 01:45:06.928333 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName:
\"kubernetes.io/configmap/26e2c508-1828-4edc-b60e-eea9598f292e-ovsdbserver-nb\") pod \"dnsmasq-dns-56b5f74d97-8m7sq\" (UID: \"26e2c508-1828-4edc-b60e-eea9598f292e\") " pod="openstack/dnsmasq-dns-56b5f74d97-8m7sq"
Mar 13 01:45:06.941159 master-0 kubenswrapper[19170]: I0313 01:45:06.931538 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26e2c508-1828-4edc-b60e-eea9598f292e-config\") pod \"dnsmasq-dns-56b5f74d97-8m7sq\" (UID: \"26e2c508-1828-4edc-b60e-eea9598f292e\") " pod="openstack/dnsmasq-dns-56b5f74d97-8m7sq"
Mar 13 01:45:06.941159 master-0 kubenswrapper[19170]: I0313 01:45:06.932907 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/26e2c508-1828-4edc-b60e-eea9598f292e-dns-swift-storage-0\") pod \"dnsmasq-dns-56b5f74d97-8m7sq\" (UID: \"26e2c508-1828-4edc-b60e-eea9598f292e\") " pod="openstack/dnsmasq-dns-56b5f74d97-8m7sq"
Mar 13 01:45:06.941159 master-0 kubenswrapper[19170]: I0313 01:45:06.933009 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/26e2c508-1828-4edc-b60e-eea9598f292e-ovsdbserver-sb\") pod \"dnsmasq-dns-56b5f74d97-8m7sq\" (UID: \"26e2c508-1828-4edc-b60e-eea9598f292e\") " pod="openstack/dnsmasq-dns-56b5f74d97-8m7sq"
Mar 13 01:45:06.941159 master-0 kubenswrapper[19170]: I0313 01:45:06.933454 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/26e2c508-1828-4edc-b60e-eea9598f292e-ovsdbserver-nb\") pod \"dnsmasq-dns-56b5f74d97-8m7sq\" (UID: \"26e2c508-1828-4edc-b60e-eea9598f292e\") " pod="openstack/dnsmasq-dns-56b5f74d97-8m7sq"
Mar 13 01:45:06.941159 master-0 kubenswrapper[19170]: I0313 01:45:06.933469 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName:
\"kubernetes.io/configmap/26e2c508-1828-4edc-b60e-eea9598f292e-dns-svc\") pod \"dnsmasq-dns-56b5f74d97-8m7sq\" (UID: \"26e2c508-1828-4edc-b60e-eea9598f292e\") " pod="openstack/dnsmasq-dns-56b5f74d97-8m7sq"
Mar 13 01:45:06.941159 master-0 kubenswrapper[19170]: I0313 01:45:06.936437 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-051b7-db-sync-5fc74"]
Mar 13 01:45:06.941159 master-0 kubenswrapper[19170]: I0313 01:45:06.938334 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-051b7-db-sync-5fc74"
Mar 13 01:45:06.944281 master-0 kubenswrapper[19170]: I0313 01:45:06.941781 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-051b7-config-data"
Mar 13 01:45:06.944281 master-0 kubenswrapper[19170]: I0313 01:45:06.941960 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-051b7-scripts"
Mar 13 01:45:06.961997 master-0 kubenswrapper[19170]: I0313 01:45:06.961449 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dsjj\" (UniqueName: \"kubernetes.io/projected/26e2c508-1828-4edc-b60e-eea9598f292e-kube-api-access-2dsjj\") pod \"dnsmasq-dns-56b5f74d97-8m7sq\" (UID: \"26e2c508-1828-4edc-b60e-eea9598f292e\") " pod="openstack/dnsmasq-dns-56b5f74d97-8m7sq"
Mar 13 01:45:06.978077 master-0 kubenswrapper[19170]: I0313 01:45:06.975810 19170 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-56b5f74d97-8m7sq"
Mar 13 01:45:07.006516 master-0 kubenswrapper[19170]: I0313 01:45:07.005919 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-7dab-account-create-update-w8wtd"]
Mar 13 01:45:07.060465 master-0 kubenswrapper[19170]: I0313 01:45:07.058173 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8b60380b-c057-44ba-a4fa-e88d67bde60a-credential-keys\") pod \"keystone-bootstrap-hq7k2\" (UID: \"8b60380b-c057-44ba-a4fa-e88d67bde60a\") " pod="openstack/keystone-bootstrap-hq7k2"
Mar 13 01:45:07.060465 master-0 kubenswrapper[19170]: I0313 01:45:07.058241 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dbc43a0-77ee-41aa-87b2-730586a6fae4-config-data\") pod \"cinder-051b7-db-sync-5fc74\" (UID: \"7dbc43a0-77ee-41aa-87b2-730586a6fae4\") " pod="openstack/cinder-051b7-db-sync-5fc74"
Mar 13 01:45:07.060465 master-0 kubenswrapper[19170]: I0313 01:45:07.058271 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dbc43a0-77ee-41aa-87b2-730586a6fae4-scripts\") pod \"cinder-051b7-db-sync-5fc74\" (UID: \"7dbc43a0-77ee-41aa-87b2-730586a6fae4\") " pod="openstack/cinder-051b7-db-sync-5fc74"
Mar 13 01:45:07.060465 master-0 kubenswrapper[19170]: I0313 01:45:07.058304 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcb4q\" (UniqueName: \"kubernetes.io/projected/fe1a1216-7989-411d-8b62-6d12fefcc8ae-kube-api-access-pcb4q\") pod \"neutron-db-sync-hmwsf\" (UID: \"fe1a1216-7989-411d-8b62-6d12fefcc8ae\") " pod="openstack/neutron-db-sync-hmwsf"
Mar 13 01:45:07.060465 master-0 kubenswrapper[19170]: I0313 01:45:07.058428 19170 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7dbc43a0-77ee-41aa-87b2-730586a6fae4-db-sync-config-data\") pod \"cinder-051b7-db-sync-5fc74\" (UID: \"7dbc43a0-77ee-41aa-87b2-730586a6fae4\") " pod="openstack/cinder-051b7-db-sync-5fc74"
Mar 13 01:45:07.060465 master-0 kubenswrapper[19170]: I0313 01:45:07.058452 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b2398a0-76d0-4e91-abf0-47f6d43feb12-operator-scripts\") pod \"ironic-7dab-account-create-update-w8wtd\" (UID: \"2b2398a0-76d0-4e91-abf0-47f6d43feb12\") " pod="openstack/ironic-7dab-account-create-update-w8wtd"
Mar 13 01:45:07.060465 master-0 kubenswrapper[19170]: I0313 01:45:07.058485 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7dbc43a0-77ee-41aa-87b2-730586a6fae4-etc-machine-id\") pod \"cinder-051b7-db-sync-5fc74\" (UID: \"7dbc43a0-77ee-41aa-87b2-730586a6fae4\") " pod="openstack/cinder-051b7-db-sync-5fc74"
Mar 13 01:45:07.060465 master-0 kubenswrapper[19170]: I0313 01:45:07.058513 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe1a1216-7989-411d-8b62-6d12fefcc8ae-combined-ca-bundle\") pod \"neutron-db-sync-hmwsf\" (UID: \"fe1a1216-7989-411d-8b62-6d12fefcc8ae\") " pod="openstack/neutron-db-sync-hmwsf"
Mar 13 01:45:07.060465 master-0 kubenswrapper[19170]: I0313 01:45:07.058535 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b60380b-c057-44ba-a4fa-e88d67bde60a-config-data\") pod \"keystone-bootstrap-hq7k2\" (UID: \"8b60380b-c057-44ba-a4fa-e88d67bde60a\") " pod="openstack/keystone-bootstrap-hq7k2"
Mar 13 01:45:07.060465 master-0 kubenswrapper[19170]:
I0313 01:45:07.058589 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgd56\" (UniqueName: \"kubernetes.io/projected/7dbc43a0-77ee-41aa-87b2-730586a6fae4-kube-api-access-pgd56\") pod \"cinder-051b7-db-sync-5fc74\" (UID: \"7dbc43a0-77ee-41aa-87b2-730586a6fae4\") " pod="openstack/cinder-051b7-db-sync-5fc74" Mar 13 01:45:07.060465 master-0 kubenswrapper[19170]: I0313 01:45:07.058663 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b60380b-c057-44ba-a4fa-e88d67bde60a-combined-ca-bundle\") pod \"keystone-bootstrap-hq7k2\" (UID: \"8b60380b-c057-44ba-a4fa-e88d67bde60a\") " pod="openstack/keystone-bootstrap-hq7k2" Mar 13 01:45:07.060465 master-0 kubenswrapper[19170]: I0313 01:45:07.058694 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fe1a1216-7989-411d-8b62-6d12fefcc8ae-config\") pod \"neutron-db-sync-hmwsf\" (UID: \"fe1a1216-7989-411d-8b62-6d12fefcc8ae\") " pod="openstack/neutron-db-sync-hmwsf" Mar 13 01:45:07.060465 master-0 kubenswrapper[19170]: I0313 01:45:07.058719 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b60380b-c057-44ba-a4fa-e88d67bde60a-scripts\") pod \"keystone-bootstrap-hq7k2\" (UID: \"8b60380b-c057-44ba-a4fa-e88d67bde60a\") " pod="openstack/keystone-bootstrap-hq7k2" Mar 13 01:45:07.060465 master-0 kubenswrapper[19170]: I0313 01:45:07.058789 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjtvs\" (UniqueName: \"kubernetes.io/projected/35017229-acfd-498d-82eb-8fc288e299b4-kube-api-access-cjtvs\") pod \"ironic-db-create-wmh5p\" (UID: \"35017229-acfd-498d-82eb-8fc288e299b4\") " pod="openstack/ironic-db-create-wmh5p" Mar 13 01:45:07.060465 master-0 kubenswrapper[19170]: I0313 
01:45:07.058826 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35017229-acfd-498d-82eb-8fc288e299b4-operator-scripts\") pod \"ironic-db-create-wmh5p\" (UID: \"35017229-acfd-498d-82eb-8fc288e299b4\") " pod="openstack/ironic-db-create-wmh5p" Mar 13 01:45:07.060465 master-0 kubenswrapper[19170]: I0313 01:45:07.058859 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7lj9\" (UniqueName: \"kubernetes.io/projected/2b2398a0-76d0-4e91-abf0-47f6d43feb12-kube-api-access-q7lj9\") pod \"ironic-7dab-account-create-update-w8wtd\" (UID: \"2b2398a0-76d0-4e91-abf0-47f6d43feb12\") " pod="openstack/ironic-7dab-account-create-update-w8wtd" Mar 13 01:45:07.060465 master-0 kubenswrapper[19170]: I0313 01:45:07.058891 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dbc43a0-77ee-41aa-87b2-730586a6fae4-combined-ca-bundle\") pod \"cinder-051b7-db-sync-5fc74\" (UID: \"7dbc43a0-77ee-41aa-87b2-730586a6fae4\") " pod="openstack/cinder-051b7-db-sync-5fc74" Mar 13 01:45:07.060465 master-0 kubenswrapper[19170]: I0313 01:45:07.058941 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx595\" (UniqueName: \"kubernetes.io/projected/8b60380b-c057-44ba-a4fa-e88d67bde60a-kube-api-access-sx595\") pod \"keystone-bootstrap-hq7k2\" (UID: \"8b60380b-c057-44ba-a4fa-e88d67bde60a\") " pod="openstack/keystone-bootstrap-hq7k2" Mar 13 01:45:07.060465 master-0 kubenswrapper[19170]: I0313 01:45:07.058967 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8b60380b-c057-44ba-a4fa-e88d67bde60a-fernet-keys\") pod \"keystone-bootstrap-hq7k2\" (UID: \"8b60380b-c057-44ba-a4fa-e88d67bde60a\") " pod="openstack/keystone-bootstrap-hq7k2" Mar 13 
01:45:07.078785 master-0 kubenswrapper[19170]: I0313 01:45:07.078708 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-hmwsf"] Mar 13 01:45:07.084166 master-0 kubenswrapper[19170]: I0313 01:45:07.083389 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35017229-acfd-498d-82eb-8fc288e299b4-operator-scripts\") pod \"ironic-db-create-wmh5p\" (UID: \"35017229-acfd-498d-82eb-8fc288e299b4\") " pod="openstack/ironic-db-create-wmh5p" Mar 13 01:45:07.084166 master-0 kubenswrapper[19170]: I0313 01:45:07.083932 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b2398a0-76d0-4e91-abf0-47f6d43feb12-operator-scripts\") pod \"ironic-7dab-account-create-update-w8wtd\" (UID: \"2b2398a0-76d0-4e91-abf0-47f6d43feb12\") " pod="openstack/ironic-7dab-account-create-update-w8wtd" Mar 13 01:45:07.084166 master-0 kubenswrapper[19170]: I0313 01:45:07.083966 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8b60380b-c057-44ba-a4fa-e88d67bde60a-credential-keys\") pod \"keystone-bootstrap-hq7k2\" (UID: \"8b60380b-c057-44ba-a4fa-e88d67bde60a\") " pod="openstack/keystone-bootstrap-hq7k2" Mar 13 01:45:07.084399 master-0 kubenswrapper[19170]: I0313 01:45:07.084203 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8b60380b-c057-44ba-a4fa-e88d67bde60a-fernet-keys\") pod \"keystone-bootstrap-hq7k2\" (UID: \"8b60380b-c057-44ba-a4fa-e88d67bde60a\") " pod="openstack/keystone-bootstrap-hq7k2" Mar 13 01:45:07.085032 master-0 kubenswrapper[19170]: I0313 01:45:07.084576 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b60380b-c057-44ba-a4fa-e88d67bde60a-config-data\") pod 
\"keystone-bootstrap-hq7k2\" (UID: \"8b60380b-c057-44ba-a4fa-e88d67bde60a\") " pod="openstack/keystone-bootstrap-hq7k2" Mar 13 01:45:07.091271 master-0 kubenswrapper[19170]: I0313 01:45:07.091233 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-051b7-db-sync-5fc74"] Mar 13 01:45:07.105747 master-0 kubenswrapper[19170]: I0313 01:45:07.093038 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b60380b-c057-44ba-a4fa-e88d67bde60a-scripts\") pod \"keystone-bootstrap-hq7k2\" (UID: \"8b60380b-c057-44ba-a4fa-e88d67bde60a\") " pod="openstack/keystone-bootstrap-hq7k2" Mar 13 01:45:07.105747 master-0 kubenswrapper[19170]: I0313 01:45:07.093332 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fe1a1216-7989-411d-8b62-6d12fefcc8ae-config\") pod \"neutron-db-sync-hmwsf\" (UID: \"fe1a1216-7989-411d-8b62-6d12fefcc8ae\") " pod="openstack/neutron-db-sync-hmwsf" Mar 13 01:45:07.105747 master-0 kubenswrapper[19170]: I0313 01:45:07.093530 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe1a1216-7989-411d-8b62-6d12fefcc8ae-combined-ca-bundle\") pod \"neutron-db-sync-hmwsf\" (UID: \"fe1a1216-7989-411d-8b62-6d12fefcc8ae\") " pod="openstack/neutron-db-sync-hmwsf" Mar 13 01:45:07.105747 master-0 kubenswrapper[19170]: I0313 01:45:07.094040 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b60380b-c057-44ba-a4fa-e88d67bde60a-combined-ca-bundle\") pod \"keystone-bootstrap-hq7k2\" (UID: \"8b60380b-c057-44ba-a4fa-e88d67bde60a\") " pod="openstack/keystone-bootstrap-hq7k2" Mar 13 01:45:07.130712 master-0 kubenswrapper[19170]: I0313 01:45:07.129791 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcb4q\" (UniqueName: 
\"kubernetes.io/projected/fe1a1216-7989-411d-8b62-6d12fefcc8ae-kube-api-access-pcb4q\") pod \"neutron-db-sync-hmwsf\" (UID: \"fe1a1216-7989-411d-8b62-6d12fefcc8ae\") " pod="openstack/neutron-db-sync-hmwsf" Mar 13 01:45:07.136468 master-0 kubenswrapper[19170]: I0313 01:45:07.136081 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7lj9\" (UniqueName: \"kubernetes.io/projected/2b2398a0-76d0-4e91-abf0-47f6d43feb12-kube-api-access-q7lj9\") pod \"ironic-7dab-account-create-update-w8wtd\" (UID: \"2b2398a0-76d0-4e91-abf0-47f6d43feb12\") " pod="openstack/ironic-7dab-account-create-update-w8wtd" Mar 13 01:45:07.138237 master-0 kubenswrapper[19170]: I0313 01:45:07.138198 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx595\" (UniqueName: \"kubernetes.io/projected/8b60380b-c057-44ba-a4fa-e88d67bde60a-kube-api-access-sx595\") pod \"keystone-bootstrap-hq7k2\" (UID: \"8b60380b-c057-44ba-a4fa-e88d67bde60a\") " pod="openstack/keystone-bootstrap-hq7k2" Mar 13 01:45:07.139197 master-0 kubenswrapper[19170]: I0313 01:45:07.139021 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-mmsc6"] Mar 13 01:45:07.139670 master-0 kubenswrapper[19170]: I0313 01:45:07.139644 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjtvs\" (UniqueName: \"kubernetes.io/projected/35017229-acfd-498d-82eb-8fc288e299b4-kube-api-access-cjtvs\") pod \"ironic-db-create-wmh5p\" (UID: \"35017229-acfd-498d-82eb-8fc288e299b4\") " pod="openstack/ironic-db-create-wmh5p" Mar 13 01:45:07.142273 master-0 kubenswrapper[19170]: I0313 01:45:07.142161 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-mmsc6" Mar 13 01:45:07.144084 master-0 kubenswrapper[19170]: I0313 01:45:07.144066 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 13 01:45:07.144396 master-0 kubenswrapper[19170]: I0313 01:45:07.144379 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 13 01:45:07.151182 master-0 kubenswrapper[19170]: I0313 01:45:07.149477 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-7dab-account-create-update-w8wtd" Mar 13 01:45:07.176687 master-0 kubenswrapper[19170]: I0313 01:45:07.153854 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-bvpk6" Mar 13 01:45:07.176687 master-0 kubenswrapper[19170]: I0313 01:45:07.162192 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dbc43a0-77ee-41aa-87b2-730586a6fae4-combined-ca-bundle\") pod \"cinder-051b7-db-sync-5fc74\" (UID: \"7dbc43a0-77ee-41aa-87b2-730586a6fae4\") " pod="openstack/cinder-051b7-db-sync-5fc74" Mar 13 01:45:07.176687 master-0 kubenswrapper[19170]: I0313 01:45:07.162271 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dbc43a0-77ee-41aa-87b2-730586a6fae4-config-data\") pod \"cinder-051b7-db-sync-5fc74\" (UID: \"7dbc43a0-77ee-41aa-87b2-730586a6fae4\") " pod="openstack/cinder-051b7-db-sync-5fc74" Mar 13 01:45:07.176687 master-0 kubenswrapper[19170]: I0313 01:45:07.162297 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dbc43a0-77ee-41aa-87b2-730586a6fae4-scripts\") pod \"cinder-051b7-db-sync-5fc74\" (UID: \"7dbc43a0-77ee-41aa-87b2-730586a6fae4\") " 
pod="openstack/cinder-051b7-db-sync-5fc74" Mar 13 01:45:07.176687 master-0 kubenswrapper[19170]: I0313 01:45:07.162356 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7dbc43a0-77ee-41aa-87b2-730586a6fae4-db-sync-config-data\") pod \"cinder-051b7-db-sync-5fc74\" (UID: \"7dbc43a0-77ee-41aa-87b2-730586a6fae4\") " pod="openstack/cinder-051b7-db-sync-5fc74" Mar 13 01:45:07.176687 master-0 kubenswrapper[19170]: I0313 01:45:07.162373 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7dbc43a0-77ee-41aa-87b2-730586a6fae4-etc-machine-id\") pod \"cinder-051b7-db-sync-5fc74\" (UID: \"7dbc43a0-77ee-41aa-87b2-730586a6fae4\") " pod="openstack/cinder-051b7-db-sync-5fc74" Mar 13 01:45:07.176687 master-0 kubenswrapper[19170]: I0313 01:45:07.162424 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgd56\" (UniqueName: \"kubernetes.io/projected/7dbc43a0-77ee-41aa-87b2-730586a6fae4-kube-api-access-pgd56\") pod \"cinder-051b7-db-sync-5fc74\" (UID: \"7dbc43a0-77ee-41aa-87b2-730586a6fae4\") " pod="openstack/cinder-051b7-db-sync-5fc74" Mar 13 01:45:07.176687 master-0 kubenswrapper[19170]: I0313 01:45:07.164241 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7dbc43a0-77ee-41aa-87b2-730586a6fae4-etc-machine-id\") pod \"cinder-051b7-db-sync-5fc74\" (UID: \"7dbc43a0-77ee-41aa-87b2-730586a6fae4\") " pod="openstack/cinder-051b7-db-sync-5fc74" Mar 13 01:45:07.176687 master-0 kubenswrapper[19170]: I0313 01:45:07.171952 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dbc43a0-77ee-41aa-87b2-730586a6fae4-scripts\") pod \"cinder-051b7-db-sync-5fc74\" (UID: \"7dbc43a0-77ee-41aa-87b2-730586a6fae4\") " 
pod="openstack/cinder-051b7-db-sync-5fc74" Mar 13 01:45:07.176687 master-0 kubenswrapper[19170]: I0313 01:45:07.174137 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-mmsc6"] Mar 13 01:45:07.180355 master-0 kubenswrapper[19170]: I0313 01:45:07.179832 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7dbc43a0-77ee-41aa-87b2-730586a6fae4-db-sync-config-data\") pod \"cinder-051b7-db-sync-5fc74\" (UID: \"7dbc43a0-77ee-41aa-87b2-730586a6fae4\") " pod="openstack/cinder-051b7-db-sync-5fc74" Mar 13 01:45:07.191120 master-0 kubenswrapper[19170]: I0313 01:45:07.189890 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56b5f74d97-8m7sq"] Mar 13 01:45:07.191567 master-0 kubenswrapper[19170]: I0313 01:45:07.191522 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dbc43a0-77ee-41aa-87b2-730586a6fae4-combined-ca-bundle\") pod \"cinder-051b7-db-sync-5fc74\" (UID: \"7dbc43a0-77ee-41aa-87b2-730586a6fae4\") " pod="openstack/cinder-051b7-db-sync-5fc74" Mar 13 01:45:07.196701 master-0 kubenswrapper[19170]: I0313 01:45:07.196663 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgd56\" (UniqueName: \"kubernetes.io/projected/7dbc43a0-77ee-41aa-87b2-730586a6fae4-kube-api-access-pgd56\") pod \"cinder-051b7-db-sync-5fc74\" (UID: \"7dbc43a0-77ee-41aa-87b2-730586a6fae4\") " pod="openstack/cinder-051b7-db-sync-5fc74" Mar 13 01:45:07.201410 master-0 kubenswrapper[19170]: I0313 01:45:07.198896 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dbc43a0-77ee-41aa-87b2-730586a6fae4-config-data\") pod \"cinder-051b7-db-sync-5fc74\" (UID: \"7dbc43a0-77ee-41aa-87b2-730586a6fae4\") " pod="openstack/cinder-051b7-db-sync-5fc74" Mar 13 01:45:07.201410 
master-0 kubenswrapper[19170]: I0313 01:45:07.200502 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fbdc4df5f-jkj9p"] Mar 13 01:45:07.210267 master-0 kubenswrapper[19170]: E0313 01:45:07.202766 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="534d7c6b-ec6b-48e0-8ace-0d893ae9da6a" containerName="glance-db-sync" Mar 13 01:45:07.210267 master-0 kubenswrapper[19170]: I0313 01:45:07.202789 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="534d7c6b-ec6b-48e0-8ace-0d893ae9da6a" containerName="glance-db-sync" Mar 13 01:45:07.210267 master-0 kubenswrapper[19170]: I0313 01:45:07.203013 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="534d7c6b-ec6b-48e0-8ace-0d893ae9da6a" containerName="glance-db-sync" Mar 13 01:45:07.210267 master-0 kubenswrapper[19170]: I0313 01:45:07.204114 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fbdc4df5f-jkj9p" Mar 13 01:45:07.228577 master-0 kubenswrapper[19170]: I0313 01:45:07.221879 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fbdc4df5f-jkj9p"] Mar 13 01:45:07.266178 master-0 kubenswrapper[19170]: I0313 01:45:07.266109 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534d7c6b-ec6b-48e0-8ace-0d893ae9da6a-combined-ca-bundle\") pod \"534d7c6b-ec6b-48e0-8ace-0d893ae9da6a\" (UID: \"534d7c6b-ec6b-48e0-8ace-0d893ae9da6a\") " Mar 13 01:45:07.266178 master-0 kubenswrapper[19170]: I0313 01:45:07.266148 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/534d7c6b-ec6b-48e0-8ace-0d893ae9da6a-db-sync-config-data\") pod \"534d7c6b-ec6b-48e0-8ace-0d893ae9da6a\" (UID: \"534d7c6b-ec6b-48e0-8ace-0d893ae9da6a\") " Mar 13 01:45:07.266624 master-0 kubenswrapper[19170]: I0313 01:45:07.266371 19170 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/534d7c6b-ec6b-48e0-8ace-0d893ae9da6a-config-data\") pod \"534d7c6b-ec6b-48e0-8ace-0d893ae9da6a\" (UID: \"534d7c6b-ec6b-48e0-8ace-0d893ae9da6a\") " Mar 13 01:45:07.266624 master-0 kubenswrapper[19170]: I0313 01:45:07.266421 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjr7x\" (UniqueName: \"kubernetes.io/projected/534d7c6b-ec6b-48e0-8ace-0d893ae9da6a-kube-api-access-mjr7x\") pod \"534d7c6b-ec6b-48e0-8ace-0d893ae9da6a\" (UID: \"534d7c6b-ec6b-48e0-8ace-0d893ae9da6a\") " Mar 13 01:45:07.267726 master-0 kubenswrapper[19170]: I0313 01:45:07.267348 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e445697b-863d-4e96-8ccb-581dacecddcd-scripts\") pod \"placement-db-sync-mmsc6\" (UID: \"e445697b-863d-4e96-8ccb-581dacecddcd\") " pod="openstack/placement-db-sync-mmsc6" Mar 13 01:45:07.267726 master-0 kubenswrapper[19170]: I0313 01:45:07.267406 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/881f7f20-50c9-4b96-b0a9-a14578bd3b6a-dns-svc\") pod \"dnsmasq-dns-5fbdc4df5f-jkj9p\" (UID: \"881f7f20-50c9-4b96-b0a9-a14578bd3b6a\") " pod="openstack/dnsmasq-dns-5fbdc4df5f-jkj9p" Mar 13 01:45:07.267726 master-0 kubenswrapper[19170]: I0313 01:45:07.267460 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e445697b-863d-4e96-8ccb-581dacecddcd-combined-ca-bundle\") pod \"placement-db-sync-mmsc6\" (UID: \"e445697b-863d-4e96-8ccb-581dacecddcd\") " pod="openstack/placement-db-sync-mmsc6" Mar 13 01:45:07.267726 master-0 kubenswrapper[19170]: I0313 01:45:07.267495 19170 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/881f7f20-50c9-4b96-b0a9-a14578bd3b6a-ovsdbserver-nb\") pod \"dnsmasq-dns-5fbdc4df5f-jkj9p\" (UID: \"881f7f20-50c9-4b96-b0a9-a14578bd3b6a\") " pod="openstack/dnsmasq-dns-5fbdc4df5f-jkj9p" Mar 13 01:45:07.267726 master-0 kubenswrapper[19170]: I0313 01:45:07.267541 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/881f7f20-50c9-4b96-b0a9-a14578bd3b6a-dns-swift-storage-0\") pod \"dnsmasq-dns-5fbdc4df5f-jkj9p\" (UID: \"881f7f20-50c9-4b96-b0a9-a14578bd3b6a\") " pod="openstack/dnsmasq-dns-5fbdc4df5f-jkj9p" Mar 13 01:45:07.267726 master-0 kubenswrapper[19170]: I0313 01:45:07.267577 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2kqs\" (UniqueName: \"kubernetes.io/projected/881f7f20-50c9-4b96-b0a9-a14578bd3b6a-kube-api-access-p2kqs\") pod \"dnsmasq-dns-5fbdc4df5f-jkj9p\" (UID: \"881f7f20-50c9-4b96-b0a9-a14578bd3b6a\") " pod="openstack/dnsmasq-dns-5fbdc4df5f-jkj9p" Mar 13 01:45:07.269814 master-0 kubenswrapper[19170]: I0313 01:45:07.268042 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e445697b-863d-4e96-8ccb-581dacecddcd-config-data\") pod \"placement-db-sync-mmsc6\" (UID: \"e445697b-863d-4e96-8ccb-581dacecddcd\") " pod="openstack/placement-db-sync-mmsc6" Mar 13 01:45:07.269814 master-0 kubenswrapper[19170]: I0313 01:45:07.268117 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e445697b-863d-4e96-8ccb-581dacecddcd-logs\") pod \"placement-db-sync-mmsc6\" (UID: \"e445697b-863d-4e96-8ccb-581dacecddcd\") " pod="openstack/placement-db-sync-mmsc6" Mar 13 01:45:07.269814 
master-0 kubenswrapper[19170]: I0313 01:45:07.268148 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwz2g\" (UniqueName: \"kubernetes.io/projected/e445697b-863d-4e96-8ccb-581dacecddcd-kube-api-access-zwz2g\") pod \"placement-db-sync-mmsc6\" (UID: \"e445697b-863d-4e96-8ccb-581dacecddcd\") " pod="openstack/placement-db-sync-mmsc6" Mar 13 01:45:07.269814 master-0 kubenswrapper[19170]: I0313 01:45:07.268166 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/881f7f20-50c9-4b96-b0a9-a14578bd3b6a-config\") pod \"dnsmasq-dns-5fbdc4df5f-jkj9p\" (UID: \"881f7f20-50c9-4b96-b0a9-a14578bd3b6a\") " pod="openstack/dnsmasq-dns-5fbdc4df5f-jkj9p" Mar 13 01:45:07.269814 master-0 kubenswrapper[19170]: I0313 01:45:07.268247 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/881f7f20-50c9-4b96-b0a9-a14578bd3b6a-ovsdbserver-sb\") pod \"dnsmasq-dns-5fbdc4df5f-jkj9p\" (UID: \"881f7f20-50c9-4b96-b0a9-a14578bd3b6a\") " pod="openstack/dnsmasq-dns-5fbdc4df5f-jkj9p" Mar 13 01:45:07.274188 master-0 kubenswrapper[19170]: I0313 01:45:07.274142 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-bvpk6" event={"ID":"534d7c6b-ec6b-48e0-8ace-0d893ae9da6a","Type":"ContainerDied","Data":"04c2b01ab38012d6fd32707d0e449e3b8bd4c9ebf44b79faf7eabc0bcdecc26c"} Mar 13 01:45:07.274188 master-0 kubenswrapper[19170]: I0313 01:45:07.274180 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04c2b01ab38012d6fd32707d0e449e3b8bd4c9ebf44b79faf7eabc0bcdecc26c" Mar 13 01:45:07.274291 master-0 kubenswrapper[19170]: I0313 01:45:07.274231 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-bvpk6" Mar 13 01:45:07.285557 master-0 kubenswrapper[19170]: I0313 01:45:07.284936 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/534d7c6b-ec6b-48e0-8ace-0d893ae9da6a-kube-api-access-mjr7x" (OuterVolumeSpecName: "kube-api-access-mjr7x") pod "534d7c6b-ec6b-48e0-8ace-0d893ae9da6a" (UID: "534d7c6b-ec6b-48e0-8ace-0d893ae9da6a"). InnerVolumeSpecName "kube-api-access-mjr7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:45:07.335137 master-0 kubenswrapper[19170]: I0313 01:45:07.330421 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-hmwsf" Mar 13 01:45:07.413040 master-0 kubenswrapper[19170]: I0313 01:45:07.410566 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/534d7c6b-ec6b-48e0-8ace-0d893ae9da6a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "534d7c6b-ec6b-48e0-8ace-0d893ae9da6a" (UID: "534d7c6b-ec6b-48e0-8ace-0d893ae9da6a"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:45:07.413040 master-0 kubenswrapper[19170]: I0313 01:45:07.412088 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e445697b-863d-4e96-8ccb-581dacecddcd-logs\") pod \"placement-db-sync-mmsc6\" (UID: \"e445697b-863d-4e96-8ccb-581dacecddcd\") " pod="openstack/placement-db-sync-mmsc6" Mar 13 01:45:07.413040 master-0 kubenswrapper[19170]: I0313 01:45:07.412179 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwz2g\" (UniqueName: \"kubernetes.io/projected/e445697b-863d-4e96-8ccb-581dacecddcd-kube-api-access-zwz2g\") pod \"placement-db-sync-mmsc6\" (UID: \"e445697b-863d-4e96-8ccb-581dacecddcd\") " pod="openstack/placement-db-sync-mmsc6" Mar 13 01:45:07.413040 master-0 kubenswrapper[19170]: I0313 01:45:07.412216 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/881f7f20-50c9-4b96-b0a9-a14578bd3b6a-config\") pod \"dnsmasq-dns-5fbdc4df5f-jkj9p\" (UID: \"881f7f20-50c9-4b96-b0a9-a14578bd3b6a\") " pod="openstack/dnsmasq-dns-5fbdc4df5f-jkj9p" Mar 13 01:45:07.413040 master-0 kubenswrapper[19170]: I0313 01:45:07.412384 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/881f7f20-50c9-4b96-b0a9-a14578bd3b6a-ovsdbserver-sb\") pod \"dnsmasq-dns-5fbdc4df5f-jkj9p\" (UID: \"881f7f20-50c9-4b96-b0a9-a14578bd3b6a\") " pod="openstack/dnsmasq-dns-5fbdc4df5f-jkj9p" Mar 13 01:45:07.413040 master-0 kubenswrapper[19170]: I0313 01:45:07.412464 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e445697b-863d-4e96-8ccb-581dacecddcd-scripts\") pod \"placement-db-sync-mmsc6\" (UID: \"e445697b-863d-4e96-8ccb-581dacecddcd\") " pod="openstack/placement-db-sync-mmsc6" 
Mar 13 01:45:07.413040 master-0 kubenswrapper[19170]: I0313 01:45:07.412497 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/881f7f20-50c9-4b96-b0a9-a14578bd3b6a-dns-svc\") pod \"dnsmasq-dns-5fbdc4df5f-jkj9p\" (UID: \"881f7f20-50c9-4b96-b0a9-a14578bd3b6a\") " pod="openstack/dnsmasq-dns-5fbdc4df5f-jkj9p" Mar 13 01:45:07.413040 master-0 kubenswrapper[19170]: I0313 01:45:07.412547 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e445697b-863d-4e96-8ccb-581dacecddcd-combined-ca-bundle\") pod \"placement-db-sync-mmsc6\" (UID: \"e445697b-863d-4e96-8ccb-581dacecddcd\") " pod="openstack/placement-db-sync-mmsc6" Mar 13 01:45:07.413040 master-0 kubenswrapper[19170]: I0313 01:45:07.412577 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/881f7f20-50c9-4b96-b0a9-a14578bd3b6a-ovsdbserver-nb\") pod \"dnsmasq-dns-5fbdc4df5f-jkj9p\" (UID: \"881f7f20-50c9-4b96-b0a9-a14578bd3b6a\") " pod="openstack/dnsmasq-dns-5fbdc4df5f-jkj9p" Mar 13 01:45:07.413040 master-0 kubenswrapper[19170]: I0313 01:45:07.412651 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/881f7f20-50c9-4b96-b0a9-a14578bd3b6a-dns-swift-storage-0\") pod \"dnsmasq-dns-5fbdc4df5f-jkj9p\" (UID: \"881f7f20-50c9-4b96-b0a9-a14578bd3b6a\") " pod="openstack/dnsmasq-dns-5fbdc4df5f-jkj9p" Mar 13 01:45:07.413040 master-0 kubenswrapper[19170]: I0313 01:45:07.412691 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2kqs\" (UniqueName: \"kubernetes.io/projected/881f7f20-50c9-4b96-b0a9-a14578bd3b6a-kube-api-access-p2kqs\") pod \"dnsmasq-dns-5fbdc4df5f-jkj9p\" (UID: \"881f7f20-50c9-4b96-b0a9-a14578bd3b6a\") " 
pod="openstack/dnsmasq-dns-5fbdc4df5f-jkj9p" Mar 13 01:45:07.413040 master-0 kubenswrapper[19170]: I0313 01:45:07.412757 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e445697b-863d-4e96-8ccb-581dacecddcd-logs\") pod \"placement-db-sync-mmsc6\" (UID: \"e445697b-863d-4e96-8ccb-581dacecddcd\") " pod="openstack/placement-db-sync-mmsc6" Mar 13 01:45:07.413040 master-0 kubenswrapper[19170]: I0313 01:45:07.412807 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e445697b-863d-4e96-8ccb-581dacecddcd-config-data\") pod \"placement-db-sync-mmsc6\" (UID: \"e445697b-863d-4e96-8ccb-581dacecddcd\") " pod="openstack/placement-db-sync-mmsc6" Mar 13 01:45:07.413040 master-0 kubenswrapper[19170]: I0313 01:45:07.412960 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjr7x\" (UniqueName: \"kubernetes.io/projected/534d7c6b-ec6b-48e0-8ace-0d893ae9da6a-kube-api-access-mjr7x\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:07.413040 master-0 kubenswrapper[19170]: I0313 01:45:07.412980 19170 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/534d7c6b-ec6b-48e0-8ace-0d893ae9da6a-db-sync-config-data\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:07.416302 master-0 kubenswrapper[19170]: I0313 01:45:07.413706 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/534d7c6b-ec6b-48e0-8ace-0d893ae9da6a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "534d7c6b-ec6b-48e0-8ace-0d893ae9da6a" (UID: "534d7c6b-ec6b-48e0-8ace-0d893ae9da6a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:45:07.416302 master-0 kubenswrapper[19170]: I0313 01:45:07.414136 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/534d7c6b-ec6b-48e0-8ace-0d893ae9da6a-config-data" (OuterVolumeSpecName: "config-data") pod "534d7c6b-ec6b-48e0-8ace-0d893ae9da6a" (UID: "534d7c6b-ec6b-48e0-8ace-0d893ae9da6a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:45:07.416302 master-0 kubenswrapper[19170]: I0313 01:45:07.414370 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/881f7f20-50c9-4b96-b0a9-a14578bd3b6a-config\") pod \"dnsmasq-dns-5fbdc4df5f-jkj9p\" (UID: \"881f7f20-50c9-4b96-b0a9-a14578bd3b6a\") " pod="openstack/dnsmasq-dns-5fbdc4df5f-jkj9p" Mar 13 01:45:07.416302 master-0 kubenswrapper[19170]: I0313 01:45:07.415081 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/881f7f20-50c9-4b96-b0a9-a14578bd3b6a-dns-swift-storage-0\") pod \"dnsmasq-dns-5fbdc4df5f-jkj9p\" (UID: \"881f7f20-50c9-4b96-b0a9-a14578bd3b6a\") " pod="openstack/dnsmasq-dns-5fbdc4df5f-jkj9p" Mar 13 01:45:07.416302 master-0 kubenswrapper[19170]: I0313 01:45:07.415745 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/881f7f20-50c9-4b96-b0a9-a14578bd3b6a-ovsdbserver-sb\") pod \"dnsmasq-dns-5fbdc4df5f-jkj9p\" (UID: \"881f7f20-50c9-4b96-b0a9-a14578bd3b6a\") " pod="openstack/dnsmasq-dns-5fbdc4df5f-jkj9p" Mar 13 01:45:07.422510 master-0 kubenswrapper[19170]: I0313 01:45:07.418231 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e445697b-863d-4e96-8ccb-581dacecddcd-config-data\") pod \"placement-db-sync-mmsc6\" (UID: 
\"e445697b-863d-4e96-8ccb-581dacecddcd\") " pod="openstack/placement-db-sync-mmsc6" Mar 13 01:45:07.427665 master-0 kubenswrapper[19170]: I0313 01:45:07.427599 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-hq7k2" Mar 13 01:45:07.429901 master-0 kubenswrapper[19170]: I0313 01:45:07.428323 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/881f7f20-50c9-4b96-b0a9-a14578bd3b6a-ovsdbserver-nb\") pod \"dnsmasq-dns-5fbdc4df5f-jkj9p\" (UID: \"881f7f20-50c9-4b96-b0a9-a14578bd3b6a\") " pod="openstack/dnsmasq-dns-5fbdc4df5f-jkj9p" Mar 13 01:45:07.429901 master-0 kubenswrapper[19170]: I0313 01:45:07.428545 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e445697b-863d-4e96-8ccb-581dacecddcd-combined-ca-bundle\") pod \"placement-db-sync-mmsc6\" (UID: \"e445697b-863d-4e96-8ccb-581dacecddcd\") " pod="openstack/placement-db-sync-mmsc6" Mar 13 01:45:07.435936 master-0 kubenswrapper[19170]: I0313 01:45:07.435614 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwz2g\" (UniqueName: \"kubernetes.io/projected/e445697b-863d-4e96-8ccb-581dacecddcd-kube-api-access-zwz2g\") pod \"placement-db-sync-mmsc6\" (UID: \"e445697b-863d-4e96-8ccb-581dacecddcd\") " pod="openstack/placement-db-sync-mmsc6" Mar 13 01:45:07.436211 master-0 kubenswrapper[19170]: I0313 01:45:07.436189 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e445697b-863d-4e96-8ccb-581dacecddcd-scripts\") pod \"placement-db-sync-mmsc6\" (UID: \"e445697b-863d-4e96-8ccb-581dacecddcd\") " pod="openstack/placement-db-sync-mmsc6" Mar 13 01:45:07.438447 master-0 kubenswrapper[19170]: I0313 01:45:07.438399 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-p2kqs\" (UniqueName: \"kubernetes.io/projected/881f7f20-50c9-4b96-b0a9-a14578bd3b6a-kube-api-access-p2kqs\") pod \"dnsmasq-dns-5fbdc4df5f-jkj9p\" (UID: \"881f7f20-50c9-4b96-b0a9-a14578bd3b6a\") " pod="openstack/dnsmasq-dns-5fbdc4df5f-jkj9p" Mar 13 01:45:07.454706 master-0 kubenswrapper[19170]: I0313 01:45:07.454581 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/881f7f20-50c9-4b96-b0a9-a14578bd3b6a-dns-svc\") pod \"dnsmasq-dns-5fbdc4df5f-jkj9p\" (UID: \"881f7f20-50c9-4b96-b0a9-a14578bd3b6a\") " pod="openstack/dnsmasq-dns-5fbdc4df5f-jkj9p" Mar 13 01:45:07.456033 master-0 kubenswrapper[19170]: I0313 01:45:07.455714 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-create-wmh5p" Mar 13 01:45:07.467665 master-0 kubenswrapper[19170]: I0313 01:45:07.466675 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-051b7-db-sync-5fc74" Mar 13 01:45:07.496725 master-0 kubenswrapper[19170]: I0313 01:45:07.495169 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-mmsc6" Mar 13 01:45:07.541811 master-0 kubenswrapper[19170]: I0313 01:45:07.541766 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fbdc4df5f-jkj9p" Mar 13 01:45:07.544197 master-0 kubenswrapper[19170]: I0313 01:45:07.544106 19170 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/534d7c6b-ec6b-48e0-8ace-0d893ae9da6a-config-data\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:07.544197 master-0 kubenswrapper[19170]: I0313 01:45:07.544144 19170 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/534d7c6b-ec6b-48e0-8ace-0d893ae9da6a-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:07.635657 master-0 kubenswrapper[19170]: E0313 01:45:07.632907 19170 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod534d7c6b_ec6b_48e0_8ace_0d893ae9da6a.slice\": RecentStats: unable to find data in memory cache]" Mar 13 01:45:07.669359 master-0 kubenswrapper[19170]: I0313 01:45:07.669304 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56b5f74d97-8m7sq"] Mar 13 01:45:07.795445 master-0 kubenswrapper[19170]: I0313 01:45:07.794909 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-7dab-account-create-update-w8wtd"] Mar 13 01:45:07.871412 master-0 kubenswrapper[19170]: I0313 01:45:07.870461 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fbdc4df5f-jkj9p"] Mar 13 01:45:07.897807 master-0 kubenswrapper[19170]: I0313 01:45:07.897746 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56c5578c7c-7p22w"] Mar 13 01:45:07.900152 master-0 kubenswrapper[19170]: I0313 01:45:07.900024 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56c5578c7c-7p22w" Mar 13 01:45:07.941928 master-0 kubenswrapper[19170]: I0313 01:45:07.941339 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56c5578c7c-7p22w"] Mar 13 01:45:08.083539 master-0 kubenswrapper[19170]: I0313 01:45:08.078006 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a89c46c-68a8-4baa-b926-3cbd5b85161c-ovsdbserver-sb\") pod \"dnsmasq-dns-56c5578c7c-7p22w\" (UID: \"9a89c46c-68a8-4baa-b926-3cbd5b85161c\") " pod="openstack/dnsmasq-dns-56c5578c7c-7p22w" Mar 13 01:45:08.083539 master-0 kubenswrapper[19170]: I0313 01:45:08.078101 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a89c46c-68a8-4baa-b926-3cbd5b85161c-dns-swift-storage-0\") pod \"dnsmasq-dns-56c5578c7c-7p22w\" (UID: \"9a89c46c-68a8-4baa-b926-3cbd5b85161c\") " pod="openstack/dnsmasq-dns-56c5578c7c-7p22w" Mar 13 01:45:08.083539 master-0 kubenswrapper[19170]: I0313 01:45:08.078125 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a89c46c-68a8-4baa-b926-3cbd5b85161c-ovsdbserver-nb\") pod \"dnsmasq-dns-56c5578c7c-7p22w\" (UID: \"9a89c46c-68a8-4baa-b926-3cbd5b85161c\") " pod="openstack/dnsmasq-dns-56c5578c7c-7p22w" Mar 13 01:45:08.083539 master-0 kubenswrapper[19170]: I0313 01:45:08.078247 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a89c46c-68a8-4baa-b926-3cbd5b85161c-dns-svc\") pod \"dnsmasq-dns-56c5578c7c-7p22w\" (UID: \"9a89c46c-68a8-4baa-b926-3cbd5b85161c\") " pod="openstack/dnsmasq-dns-56c5578c7c-7p22w" Mar 13 01:45:08.083539 master-0 kubenswrapper[19170]: I0313 
01:45:08.078298 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d45br\" (UniqueName: \"kubernetes.io/projected/9a89c46c-68a8-4baa-b926-3cbd5b85161c-kube-api-access-d45br\") pod \"dnsmasq-dns-56c5578c7c-7p22w\" (UID: \"9a89c46c-68a8-4baa-b926-3cbd5b85161c\") " pod="openstack/dnsmasq-dns-56c5578c7c-7p22w" Mar 13 01:45:08.083539 master-0 kubenswrapper[19170]: I0313 01:45:08.078324 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a89c46c-68a8-4baa-b926-3cbd5b85161c-config\") pod \"dnsmasq-dns-56c5578c7c-7p22w\" (UID: \"9a89c46c-68a8-4baa-b926-3cbd5b85161c\") " pod="openstack/dnsmasq-dns-56c5578c7c-7p22w" Mar 13 01:45:08.180259 master-0 kubenswrapper[19170]: I0313 01:45:08.180188 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a89c46c-68a8-4baa-b926-3cbd5b85161c-dns-swift-storage-0\") pod \"dnsmasq-dns-56c5578c7c-7p22w\" (UID: \"9a89c46c-68a8-4baa-b926-3cbd5b85161c\") " pod="openstack/dnsmasq-dns-56c5578c7c-7p22w" Mar 13 01:45:08.180259 master-0 kubenswrapper[19170]: I0313 01:45:08.180257 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a89c46c-68a8-4baa-b926-3cbd5b85161c-ovsdbserver-nb\") pod \"dnsmasq-dns-56c5578c7c-7p22w\" (UID: \"9a89c46c-68a8-4baa-b926-3cbd5b85161c\") " pod="openstack/dnsmasq-dns-56c5578c7c-7p22w" Mar 13 01:45:08.181096 master-0 kubenswrapper[19170]: I0313 01:45:08.180357 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a89c46c-68a8-4baa-b926-3cbd5b85161c-dns-svc\") pod \"dnsmasq-dns-56c5578c7c-7p22w\" (UID: \"9a89c46c-68a8-4baa-b926-3cbd5b85161c\") " pod="openstack/dnsmasq-dns-56c5578c7c-7p22w" Mar 13 
01:45:08.181096 master-0 kubenswrapper[19170]: I0313 01:45:08.180419 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d45br\" (UniqueName: \"kubernetes.io/projected/9a89c46c-68a8-4baa-b926-3cbd5b85161c-kube-api-access-d45br\") pod \"dnsmasq-dns-56c5578c7c-7p22w\" (UID: \"9a89c46c-68a8-4baa-b926-3cbd5b85161c\") " pod="openstack/dnsmasq-dns-56c5578c7c-7p22w" Mar 13 01:45:08.181096 master-0 kubenswrapper[19170]: I0313 01:45:08.180443 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a89c46c-68a8-4baa-b926-3cbd5b85161c-config\") pod \"dnsmasq-dns-56c5578c7c-7p22w\" (UID: \"9a89c46c-68a8-4baa-b926-3cbd5b85161c\") " pod="openstack/dnsmasq-dns-56c5578c7c-7p22w" Mar 13 01:45:08.181940 master-0 kubenswrapper[19170]: I0313 01:45:08.181613 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a89c46c-68a8-4baa-b926-3cbd5b85161c-dns-svc\") pod \"dnsmasq-dns-56c5578c7c-7p22w\" (UID: \"9a89c46c-68a8-4baa-b926-3cbd5b85161c\") " pod="openstack/dnsmasq-dns-56c5578c7c-7p22w" Mar 13 01:45:08.182283 master-0 kubenswrapper[19170]: I0313 01:45:08.180879 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a89c46c-68a8-4baa-b926-3cbd5b85161c-ovsdbserver-sb\") pod \"dnsmasq-dns-56c5578c7c-7p22w\" (UID: \"9a89c46c-68a8-4baa-b926-3cbd5b85161c\") " pod="openstack/dnsmasq-dns-56c5578c7c-7p22w" Mar 13 01:45:08.182462 master-0 kubenswrapper[19170]: I0313 01:45:08.182394 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a89c46c-68a8-4baa-b926-3cbd5b85161c-dns-swift-storage-0\") pod \"dnsmasq-dns-56c5578c7c-7p22w\" (UID: \"9a89c46c-68a8-4baa-b926-3cbd5b85161c\") " pod="openstack/dnsmasq-dns-56c5578c7c-7p22w" Mar 13 
01:45:08.182864 master-0 kubenswrapper[19170]: I0313 01:45:08.182768 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a89c46c-68a8-4baa-b926-3cbd5b85161c-ovsdbserver-sb\") pod \"dnsmasq-dns-56c5578c7c-7p22w\" (UID: \"9a89c46c-68a8-4baa-b926-3cbd5b85161c\") " pod="openstack/dnsmasq-dns-56c5578c7c-7p22w" Mar 13 01:45:08.182923 master-0 kubenswrapper[19170]: I0313 01:45:08.182896 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a89c46c-68a8-4baa-b926-3cbd5b85161c-config\") pod \"dnsmasq-dns-56c5578c7c-7p22w\" (UID: \"9a89c46c-68a8-4baa-b926-3cbd5b85161c\") " pod="openstack/dnsmasq-dns-56c5578c7c-7p22w" Mar 13 01:45:08.183120 master-0 kubenswrapper[19170]: I0313 01:45:08.183052 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a89c46c-68a8-4baa-b926-3cbd5b85161c-ovsdbserver-nb\") pod \"dnsmasq-dns-56c5578c7c-7p22w\" (UID: \"9a89c46c-68a8-4baa-b926-3cbd5b85161c\") " pod="openstack/dnsmasq-dns-56c5578c7c-7p22w" Mar 13 01:45:08.310252 master-0 kubenswrapper[19170]: I0313 01:45:08.310198 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56b5f74d97-8m7sq" event={"ID":"26e2c508-1828-4edc-b60e-eea9598f292e","Type":"ContainerStarted","Data":"e77800cde0b38d338b79526b48f64dbd8c3571019a5d96708b627fa0487b1616"} Mar 13 01:45:08.310252 master-0 kubenswrapper[19170]: I0313 01:45:08.310248 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56b5f74d97-8m7sq" event={"ID":"26e2c508-1828-4edc-b60e-eea9598f292e","Type":"ContainerStarted","Data":"d750ef6752a1c178e3c374cc53b34c2505649e39c5ea8d4a43b5fc1bf319170b"} Mar 13 01:45:08.312659 master-0 kubenswrapper[19170]: I0313 01:45:08.312620 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ironic-7dab-account-create-update-w8wtd" event={"ID":"2b2398a0-76d0-4e91-abf0-47f6d43feb12","Type":"ContainerStarted","Data":"ce65246a40ac561f8cc14e142f9082c0faa6caef54523b5749b53610785bbd4c"} Mar 13 01:45:08.312705 master-0 kubenswrapper[19170]: I0313 01:45:08.312665 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-7dab-account-create-update-w8wtd" event={"ID":"2b2398a0-76d0-4e91-abf0-47f6d43feb12","Type":"ContainerStarted","Data":"8cfb1f79f6b3d33e3bffb7e1707dfe6f92544578db5f43b445388519919d9e4e"} Mar 13 01:45:08.314268 master-0 kubenswrapper[19170]: W0313 01:45:08.314115 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe1a1216_7989_411d_8b62_6d12fefcc8ae.slice/crio-d1b61caa55cdd875c3acc60560406d760cb6a3e28ccf87c877019df03450b942 WatchSource:0}: Error finding container d1b61caa55cdd875c3acc60560406d760cb6a3e28ccf87c877019df03450b942: Status 404 returned error can't find the container with id d1b61caa55cdd875c3acc60560406d760cb6a3e28ccf87c877019df03450b942 Mar 13 01:45:08.321700 master-0 kubenswrapper[19170]: I0313 01:45:08.318149 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-hmwsf"] Mar 13 01:45:08.404472 master-0 kubenswrapper[19170]: I0313 01:45:08.404361 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d45br\" (UniqueName: \"kubernetes.io/projected/9a89c46c-68a8-4baa-b926-3cbd5b85161c-kube-api-access-d45br\") pod \"dnsmasq-dns-56c5578c7c-7p22w\" (UID: \"9a89c46c-68a8-4baa-b926-3cbd5b85161c\") " pod="openstack/dnsmasq-dns-56c5578c7c-7p22w" Mar 13 01:45:08.431169 master-0 kubenswrapper[19170]: I0313 01:45:08.430406 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56c5578c7c-7p22w" Mar 13 01:45:08.606477 master-0 kubenswrapper[19170]: I0313 01:45:08.606440 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-hq7k2"] Mar 13 01:45:08.664962 master-0 kubenswrapper[19170]: I0313 01:45:08.664827 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-mmsc6"] Mar 13 01:45:08.676910 master-0 kubenswrapper[19170]: I0313 01:45:08.676845 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-create-wmh5p"] Mar 13 01:45:08.688976 master-0 kubenswrapper[19170]: I0313 01:45:08.688086 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fbdc4df5f-jkj9p"] Mar 13 01:45:08.699684 master-0 kubenswrapper[19170]: W0313 01:45:08.689974 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod881f7f20_50c9_4b96_b0a9_a14578bd3b6a.slice/crio-5691717671dfd567adad8a4888951adf10a26b714dcde264d081299cc7542f4c WatchSource:0}: Error finding container 5691717671dfd567adad8a4888951adf10a26b714dcde264d081299cc7542f4c: Status 404 returned error can't find the container with id 5691717671dfd567adad8a4888951adf10a26b714dcde264d081299cc7542f4c Mar 13 01:45:08.857414 master-0 kubenswrapper[19170]: I0313 01:45:08.857312 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-051b7-db-sync-5fc74"] Mar 13 01:45:08.866141 master-0 kubenswrapper[19170]: W0313 01:45:08.866103 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7dbc43a0_77ee_41aa_87b2_730586a6fae4.slice/crio-bceadd6d63e1b08751e663cb6d9879f48fe664d64c06c529e19222b39f52fd11 WatchSource:0}: Error finding container bceadd6d63e1b08751e663cb6d9879f48fe664d64c06c529e19222b39f52fd11: Status 404 returned error can't find the container with id 
bceadd6d63e1b08751e663cb6d9879f48fe664d64c06c529e19222b39f52fd11 Mar 13 01:45:09.044049 master-0 kubenswrapper[19170]: I0313 01:45:09.039293 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56c5578c7c-7p22w"] Mar 13 01:45:09.339750 master-0 kubenswrapper[19170]: I0313 01:45:09.338343 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mmsc6" event={"ID":"e445697b-863d-4e96-8ccb-581dacecddcd","Type":"ContainerStarted","Data":"fffe11b304c370c5ab28ef903f2fa890b532d8e5e9a3e334c80a139a6a1b3721"} Mar 13 01:45:09.342357 master-0 kubenswrapper[19170]: I0313 01:45:09.341757 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hq7k2" event={"ID":"8b60380b-c057-44ba-a4fa-e88d67bde60a","Type":"ContainerStarted","Data":"a64028f80ad60b57b1351892013ed1c1fbf2f482c5391221d690745b5adcc42a"} Mar 13 01:45:09.342357 master-0 kubenswrapper[19170]: I0313 01:45:09.341787 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hq7k2" event={"ID":"8b60380b-c057-44ba-a4fa-e88d67bde60a","Type":"ContainerStarted","Data":"4f64665a54c6c8854f17354d839e7f739edb028c539a11ce295616c703f7c01f"} Mar 13 01:45:09.346117 master-0 kubenswrapper[19170]: I0313 01:45:09.345319 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-wmh5p" event={"ID":"35017229-acfd-498d-82eb-8fc288e299b4","Type":"ContainerStarted","Data":"84d6e060ff0a75c1303fd04e28d780e19f848577a77b31bf4f57b7e03d5bfa22"} Mar 13 01:45:09.346117 master-0 kubenswrapper[19170]: I0313 01:45:09.345347 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-wmh5p" event={"ID":"35017229-acfd-498d-82eb-8fc288e299b4","Type":"ContainerStarted","Data":"3a4d120f5b4eff6a6f3e60893f985fbc6ff6baf1c9c3057afd29fae04bf9eb86"} Mar 13 01:45:09.349248 master-0 kubenswrapper[19170]: I0313 01:45:09.349135 19170 generic.go:334] "Generic (PLEG): container finished" 
podID="2b2398a0-76d0-4e91-abf0-47f6d43feb12" containerID="ce65246a40ac561f8cc14e142f9082c0faa6caef54523b5749b53610785bbd4c" exitCode=0 Mar 13 01:45:09.349248 master-0 kubenswrapper[19170]: I0313 01:45:09.349197 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-7dab-account-create-update-w8wtd" event={"ID":"2b2398a0-76d0-4e91-abf0-47f6d43feb12","Type":"ContainerDied","Data":"ce65246a40ac561f8cc14e142f9082c0faa6caef54523b5749b53610785bbd4c"} Mar 13 01:45:09.350715 master-0 kubenswrapper[19170]: I0313 01:45:09.350683 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56c5578c7c-7p22w" event={"ID":"9a89c46c-68a8-4baa-b926-3cbd5b85161c","Type":"ContainerStarted","Data":"b54403c6cc2ccfad0abafdd9af5764fbac9e86f86dc37927a2f50a2936a8ca54"} Mar 13 01:45:09.351922 master-0 kubenswrapper[19170]: I0313 01:45:09.351899 19170 generic.go:334] "Generic (PLEG): container finished" podID="26e2c508-1828-4edc-b60e-eea9598f292e" containerID="e77800cde0b38d338b79526b48f64dbd8c3571019a5d96708b627fa0487b1616" exitCode=0 Mar 13 01:45:09.351983 master-0 kubenswrapper[19170]: I0313 01:45:09.351940 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56b5f74d97-8m7sq" event={"ID":"26e2c508-1828-4edc-b60e-eea9598f292e","Type":"ContainerDied","Data":"e77800cde0b38d338b79526b48f64dbd8c3571019a5d96708b627fa0487b1616"} Mar 13 01:45:09.364536 master-0 kubenswrapper[19170]: I0313 01:45:09.364297 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hmwsf" event={"ID":"fe1a1216-7989-411d-8b62-6d12fefcc8ae","Type":"ContainerStarted","Data":"7c9a685dd5818a5aa39096c57a07cd4601ed40fd49b319969ad79d4826f591b7"} Mar 13 01:45:09.364536 master-0 kubenswrapper[19170]: I0313 01:45:09.364326 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hmwsf" 
event={"ID":"fe1a1216-7989-411d-8b62-6d12fefcc8ae","Type":"ContainerStarted","Data":"d1b61caa55cdd875c3acc60560406d760cb6a3e28ccf87c877019df03450b942"} Mar 13 01:45:09.375665 master-0 kubenswrapper[19170]: I0313 01:45:09.375593 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-hq7k2" podStartSLOduration=3.375579854 podStartE2EDuration="3.375579854s" podCreationTimestamp="2026-03-13 01:45:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:45:09.370562332 +0000 UTC m=+1570.178683352" watchObservedRunningTime="2026-03-13 01:45:09.375579854 +0000 UTC m=+1570.183700814" Mar 13 01:45:09.382336 master-0 kubenswrapper[19170]: I0313 01:45:09.382252 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fbdc4df5f-jkj9p" event={"ID":"881f7f20-50c9-4b96-b0a9-a14578bd3b6a","Type":"ContainerStarted","Data":"5691717671dfd567adad8a4888951adf10a26b714dcde264d081299cc7542f4c"} Mar 13 01:45:09.400263 master-0 kubenswrapper[19170]: I0313 01:45:09.400160 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-051b7-db-sync-5fc74" event={"ID":"7dbc43a0-77ee-41aa-87b2-730586a6fae4","Type":"ContainerStarted","Data":"bceadd6d63e1b08751e663cb6d9879f48fe664d64c06c529e19222b39f52fd11"} Mar 13 01:45:09.429809 master-0 kubenswrapper[19170]: I0313 01:45:09.426862 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-db-create-wmh5p" podStartSLOduration=3.426844519 podStartE2EDuration="3.426844519s" podCreationTimestamp="2026-03-13 01:45:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:45:09.406158116 +0000 UTC m=+1570.214279076" watchObservedRunningTime="2026-03-13 01:45:09.426844519 +0000 UTC m=+1570.234965479" Mar 13 01:45:09.483732 master-0 
kubenswrapper[19170]: I0313 01:45:09.483599 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-hmwsf" podStartSLOduration=3.483574228 podStartE2EDuration="3.483574228s" podCreationTimestamp="2026-03-13 01:45:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:45:09.445924227 +0000 UTC m=+1570.254045187" watchObservedRunningTime="2026-03-13 01:45:09.483574228 +0000 UTC m=+1570.291695188" Mar 13 01:45:09.665815 master-0 kubenswrapper[19170]: I0313 01:45:09.665221 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-b9844-default-external-api-0"] Mar 13 01:45:09.675892 master-0 kubenswrapper[19170]: I0313 01:45:09.669116 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:09.675892 master-0 kubenswrapper[19170]: I0313 01:45:09.671064 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-b9844-default-external-config-data" Mar 13 01:45:09.675892 master-0 kubenswrapper[19170]: I0313 01:45:09.671416 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 13 01:45:09.817656 master-0 kubenswrapper[19170]: I0313 01:45:09.815696 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b9844-default-external-api-0"] Mar 13 01:45:09.831650 master-0 kubenswrapper[19170]: I0313 01:45:09.827326 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/162f6d26-33ce-450b-9f01-ace6b922c69b-logs\") pod \"glance-b9844-default-external-api-0\" (UID: \"162f6d26-33ce-450b-9f01-ace6b922c69b\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:09.831650 master-0 kubenswrapper[19170]: I0313 01:45:09.827614 19170 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-60c39735-cd4e-4baf-8a95-3babad891e79\" (UniqueName: \"kubernetes.io/csi/topolvm.io^deb39bd4-fea6-43ec-aeb4-a117c91529eb\") pod \"glance-b9844-default-external-api-0\" (UID: \"162f6d26-33ce-450b-9f01-ace6b922c69b\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:09.831650 master-0 kubenswrapper[19170]: I0313 01:45:09.827709 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/162f6d26-33ce-450b-9f01-ace6b922c69b-httpd-run\") pod \"glance-b9844-default-external-api-0\" (UID: \"162f6d26-33ce-450b-9f01-ace6b922c69b\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:09.831650 master-0 kubenswrapper[19170]: I0313 01:45:09.827751 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/162f6d26-33ce-450b-9f01-ace6b922c69b-combined-ca-bundle\") pod \"glance-b9844-default-external-api-0\" (UID: \"162f6d26-33ce-450b-9f01-ace6b922c69b\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:09.831650 master-0 kubenswrapper[19170]: I0313 01:45:09.827811 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/162f6d26-33ce-450b-9f01-ace6b922c69b-scripts\") pod \"glance-b9844-default-external-api-0\" (UID: \"162f6d26-33ce-450b-9f01-ace6b922c69b\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:09.831650 master-0 kubenswrapper[19170]: I0313 01:45:09.827836 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/162f6d26-33ce-450b-9f01-ace6b922c69b-config-data\") pod \"glance-b9844-default-external-api-0\" (UID: 
\"162f6d26-33ce-450b-9f01-ace6b922c69b\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:09.831650 master-0 kubenswrapper[19170]: I0313 01:45:09.827855 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46zhq\" (UniqueName: \"kubernetes.io/projected/162f6d26-33ce-450b-9f01-ace6b922c69b-kube-api-access-46zhq\") pod \"glance-b9844-default-external-api-0\" (UID: \"162f6d26-33ce-450b-9f01-ace6b922c69b\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:09.921849 master-0 kubenswrapper[19170]: I0313 01:45:09.921724 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56b5f74d97-8m7sq" Mar 13 01:45:09.930127 master-0 kubenswrapper[19170]: I0313 01:45:09.930034 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/162f6d26-33ce-450b-9f01-ace6b922c69b-httpd-run\") pod \"glance-b9844-default-external-api-0\" (UID: \"162f6d26-33ce-450b-9f01-ace6b922c69b\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:09.930127 master-0 kubenswrapper[19170]: I0313 01:45:09.930112 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/162f6d26-33ce-450b-9f01-ace6b922c69b-combined-ca-bundle\") pod \"glance-b9844-default-external-api-0\" (UID: \"162f6d26-33ce-450b-9f01-ace6b922c69b\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:09.930839 master-0 kubenswrapper[19170]: I0313 01:45:09.930174 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/162f6d26-33ce-450b-9f01-ace6b922c69b-scripts\") pod \"glance-b9844-default-external-api-0\" (UID: \"162f6d26-33ce-450b-9f01-ace6b922c69b\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 
01:45:09.930839 master-0 kubenswrapper[19170]: I0313 01:45:09.930571 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/162f6d26-33ce-450b-9f01-ace6b922c69b-config-data\") pod \"glance-b9844-default-external-api-0\" (UID: \"162f6d26-33ce-450b-9f01-ace6b922c69b\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:09.930839 master-0 kubenswrapper[19170]: I0313 01:45:09.930797 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46zhq\" (UniqueName: \"kubernetes.io/projected/162f6d26-33ce-450b-9f01-ace6b922c69b-kube-api-access-46zhq\") pod \"glance-b9844-default-external-api-0\" (UID: \"162f6d26-33ce-450b-9f01-ace6b922c69b\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:09.931029 master-0 kubenswrapper[19170]: I0313 01:45:09.930851 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/162f6d26-33ce-450b-9f01-ace6b922c69b-logs\") pod \"glance-b9844-default-external-api-0\" (UID: \"162f6d26-33ce-450b-9f01-ace6b922c69b\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:09.931029 master-0 kubenswrapper[19170]: I0313 01:45:09.930891 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-60c39735-cd4e-4baf-8a95-3babad891e79\" (UniqueName: \"kubernetes.io/csi/topolvm.io^deb39bd4-fea6-43ec-aeb4-a117c91529eb\") pod \"glance-b9844-default-external-api-0\" (UID: \"162f6d26-33ce-450b-9f01-ace6b922c69b\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:09.936284 master-0 kubenswrapper[19170]: I0313 01:45:09.933387 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/162f6d26-33ce-450b-9f01-ace6b922c69b-httpd-run\") pod \"glance-b9844-default-external-api-0\" (UID: 
\"162f6d26-33ce-450b-9f01-ace6b922c69b\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:09.936284 master-0 kubenswrapper[19170]: I0313 01:45:09.933972 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/162f6d26-33ce-450b-9f01-ace6b922c69b-logs\") pod \"glance-b9844-default-external-api-0\" (UID: \"162f6d26-33ce-450b-9f01-ace6b922c69b\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:09.936284 master-0 kubenswrapper[19170]: I0313 01:45:09.936205 19170 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 01:45:09.936284 master-0 kubenswrapper[19170]: I0313 01:45:09.936231 19170 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-60c39735-cd4e-4baf-8a95-3babad891e79\" (UniqueName: \"kubernetes.io/csi/topolvm.io^deb39bd4-fea6-43ec-aeb4-a117c91529eb\") pod \"glance-b9844-default-external-api-0\" (UID: \"162f6d26-33ce-450b-9f01-ace6b922c69b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/dc901c00d5023825c7bc5ea32a35825a63b53b24b9433470d442fe02f9c29cd0/globalmount\"" pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:09.940175 master-0 kubenswrapper[19170]: I0313 01:45:09.938807 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/162f6d26-33ce-450b-9f01-ace6b922c69b-combined-ca-bundle\") pod \"glance-b9844-default-external-api-0\" (UID: \"162f6d26-33ce-450b-9f01-ace6b922c69b\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:09.940980 master-0 kubenswrapper[19170]: I0313 01:45:09.940944 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/162f6d26-33ce-450b-9f01-ace6b922c69b-scripts\") pod \"glance-b9844-default-external-api-0\" (UID: 
\"162f6d26-33ce-450b-9f01-ace6b922c69b\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:09.954801 master-0 kubenswrapper[19170]: I0313 01:45:09.954756 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/162f6d26-33ce-450b-9f01-ace6b922c69b-config-data\") pod \"glance-b9844-default-external-api-0\" (UID: \"162f6d26-33ce-450b-9f01-ace6b922c69b\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:09.962264 master-0 kubenswrapper[19170]: I0313 01:45:09.961983 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46zhq\" (UniqueName: \"kubernetes.io/projected/162f6d26-33ce-450b-9f01-ace6b922c69b-kube-api-access-46zhq\") pod \"glance-b9844-default-external-api-0\" (UID: \"162f6d26-33ce-450b-9f01-ace6b922c69b\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:10.032960 master-0 kubenswrapper[19170]: I0313 01:45:10.032921 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/26e2c508-1828-4edc-b60e-eea9598f292e-ovsdbserver-sb\") pod \"26e2c508-1828-4edc-b60e-eea9598f292e\" (UID: \"26e2c508-1828-4edc-b60e-eea9598f292e\") " Mar 13 01:45:10.032960 master-0 kubenswrapper[19170]: I0313 01:45:10.032970 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26e2c508-1828-4edc-b60e-eea9598f292e-dns-svc\") pod \"26e2c508-1828-4edc-b60e-eea9598f292e\" (UID: \"26e2c508-1828-4edc-b60e-eea9598f292e\") " Mar 13 01:45:10.033381 master-0 kubenswrapper[19170]: I0313 01:45:10.032995 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/26e2c508-1828-4edc-b60e-eea9598f292e-ovsdbserver-nb\") pod \"26e2c508-1828-4edc-b60e-eea9598f292e\" (UID: 
\"26e2c508-1828-4edc-b60e-eea9598f292e\") " Mar 13 01:45:10.033381 master-0 kubenswrapper[19170]: I0313 01:45:10.033056 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/26e2c508-1828-4edc-b60e-eea9598f292e-dns-swift-storage-0\") pod \"26e2c508-1828-4edc-b60e-eea9598f292e\" (UID: \"26e2c508-1828-4edc-b60e-eea9598f292e\") " Mar 13 01:45:10.033381 master-0 kubenswrapper[19170]: I0313 01:45:10.033093 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26e2c508-1828-4edc-b60e-eea9598f292e-config\") pod \"26e2c508-1828-4edc-b60e-eea9598f292e\" (UID: \"26e2c508-1828-4edc-b60e-eea9598f292e\") " Mar 13 01:45:10.033381 master-0 kubenswrapper[19170]: I0313 01:45:10.033111 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dsjj\" (UniqueName: \"kubernetes.io/projected/26e2c508-1828-4edc-b60e-eea9598f292e-kube-api-access-2dsjj\") pod \"26e2c508-1828-4edc-b60e-eea9598f292e\" (UID: \"26e2c508-1828-4edc-b60e-eea9598f292e\") " Mar 13 01:45:10.046661 master-0 kubenswrapper[19170]: I0313 01:45:10.036512 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26e2c508-1828-4edc-b60e-eea9598f292e-kube-api-access-2dsjj" (OuterVolumeSpecName: "kube-api-access-2dsjj") pod "26e2c508-1828-4edc-b60e-eea9598f292e" (UID: "26e2c508-1828-4edc-b60e-eea9598f292e"). InnerVolumeSpecName "kube-api-access-2dsjj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:45:10.046661 master-0 kubenswrapper[19170]: I0313 01:45:10.037181 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dsjj\" (UniqueName: \"kubernetes.io/projected/26e2c508-1828-4edc-b60e-eea9598f292e-kube-api-access-2dsjj\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:10.076964 master-0 kubenswrapper[19170]: I0313 01:45:10.076922 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26e2c508-1828-4edc-b60e-eea9598f292e-config" (OuterVolumeSpecName: "config") pod "26e2c508-1828-4edc-b60e-eea9598f292e" (UID: "26e2c508-1828-4edc-b60e-eea9598f292e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:45:10.084061 master-0 kubenswrapper[19170]: I0313 01:45:10.084010 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26e2c508-1828-4edc-b60e-eea9598f292e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "26e2c508-1828-4edc-b60e-eea9598f292e" (UID: "26e2c508-1828-4edc-b60e-eea9598f292e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:45:10.084300 master-0 kubenswrapper[19170]: I0313 01:45:10.084272 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26e2c508-1828-4edc-b60e-eea9598f292e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "26e2c508-1828-4edc-b60e-eea9598f292e" (UID: "26e2c508-1828-4edc-b60e-eea9598f292e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:45:10.086214 master-0 kubenswrapper[19170]: I0313 01:45:10.086166 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26e2c508-1828-4edc-b60e-eea9598f292e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "26e2c508-1828-4edc-b60e-eea9598f292e" (UID: "26e2c508-1828-4edc-b60e-eea9598f292e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:45:10.102752 master-0 kubenswrapper[19170]: I0313 01:45:10.101850 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26e2c508-1828-4edc-b60e-eea9598f292e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "26e2c508-1828-4edc-b60e-eea9598f292e" (UID: "26e2c508-1828-4edc-b60e-eea9598f292e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:45:10.139474 master-0 kubenswrapper[19170]: I0313 01:45:10.138342 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-b9844-default-internal-api-0"] Mar 13 01:45:10.139474 master-0 kubenswrapper[19170]: I0313 01:45:10.139379 19170 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/26e2c508-1828-4edc-b60e-eea9598f292e-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:10.139474 master-0 kubenswrapper[19170]: I0313 01:45:10.139412 19170 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26e2c508-1828-4edc-b60e-eea9598f292e-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:10.139474 master-0 kubenswrapper[19170]: I0313 01:45:10.139421 19170 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/26e2c508-1828-4edc-b60e-eea9598f292e-ovsdbserver-nb\") on node \"master-0\" 
DevicePath \"\"" Mar 13 01:45:10.139474 master-0 kubenswrapper[19170]: I0313 01:45:10.139430 19170 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/26e2c508-1828-4edc-b60e-eea9598f292e-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:10.139474 master-0 kubenswrapper[19170]: I0313 01:45:10.139441 19170 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26e2c508-1828-4edc-b60e-eea9598f292e-config\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:10.152607 master-0 kubenswrapper[19170]: E0313 01:45:10.150031 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26e2c508-1828-4edc-b60e-eea9598f292e" containerName="init" Mar 13 01:45:10.152607 master-0 kubenswrapper[19170]: I0313 01:45:10.150065 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="26e2c508-1828-4edc-b60e-eea9598f292e" containerName="init" Mar 13 01:45:10.152607 master-0 kubenswrapper[19170]: I0313 01:45:10.150364 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="26e2c508-1828-4edc-b60e-eea9598f292e" containerName="init" Mar 13 01:45:10.152607 master-0 kubenswrapper[19170]: I0313 01:45:10.151880 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:10.161710 master-0 kubenswrapper[19170]: I0313 01:45:10.161672 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-b9844-default-internal-config-data" Mar 13 01:45:10.181268 master-0 kubenswrapper[19170]: I0313 01:45:10.181223 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b9844-default-internal-api-0"] Mar 13 01:45:10.241312 master-0 kubenswrapper[19170]: I0313 01:45:10.241274 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256-config-data\") pod \"glance-b9844-default-internal-api-0\" (UID: \"8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:10.241689 master-0 kubenswrapper[19170]: I0313 01:45:10.241669 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256-httpd-run\") pod \"glance-b9844-default-internal-api-0\" (UID: \"8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:10.241793 master-0 kubenswrapper[19170]: I0313 01:45:10.241780 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256-combined-ca-bundle\") pod \"glance-b9844-default-internal-api-0\" (UID: \"8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:10.241993 master-0 kubenswrapper[19170]: I0313 01:45:10.241977 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0423f75b-fda4-4619-84ea-2ae13248e4cb\" (UniqueName: 
\"kubernetes.io/csi/topolvm.io^82ad30d6-a170-4ee2-bd43-8bf3a311ef5c\") pod \"glance-b9844-default-internal-api-0\" (UID: \"8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:10.242117 master-0 kubenswrapper[19170]: I0313 01:45:10.242094 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc5fk\" (UniqueName: \"kubernetes.io/projected/8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256-kube-api-access-gc5fk\") pod \"glance-b9844-default-internal-api-0\" (UID: \"8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:10.242448 master-0 kubenswrapper[19170]: I0313 01:45:10.242427 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256-logs\") pod \"glance-b9844-default-internal-api-0\" (UID: \"8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:10.246544 master-0 kubenswrapper[19170]: I0313 01:45:10.242762 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256-scripts\") pod \"glance-b9844-default-internal-api-0\" (UID: \"8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:10.262921 master-0 kubenswrapper[19170]: I0313 01:45:10.261911 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b9844-default-external-api-0"] Mar 13 01:45:10.262921 master-0 kubenswrapper[19170]: E0313 01:45:10.262663 19170 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[glance], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-b9844-default-external-api-0" 
podUID="162f6d26-33ce-450b-9f01-ace6b922c69b" Mar 13 01:45:10.340364 master-0 kubenswrapper[19170]: I0313 01:45:10.340227 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b9844-default-internal-api-0"] Mar 13 01:45:10.344579 master-0 kubenswrapper[19170]: I0313 01:45:10.344532 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256-scripts\") pod \"glance-b9844-default-internal-api-0\" (UID: \"8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:10.344668 master-0 kubenswrapper[19170]: I0313 01:45:10.344644 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256-config-data\") pod \"glance-b9844-default-internal-api-0\" (UID: \"8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:10.344710 master-0 kubenswrapper[19170]: I0313 01:45:10.344673 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256-httpd-run\") pod \"glance-b9844-default-internal-api-0\" (UID: \"8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:10.345273 master-0 kubenswrapper[19170]: I0313 01:45:10.345243 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256-combined-ca-bundle\") pod \"glance-b9844-default-internal-api-0\" (UID: \"8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:10.345689 master-0 kubenswrapper[19170]: I0313 01:45:10.345648 19170 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-0423f75b-fda4-4619-84ea-2ae13248e4cb\" (UniqueName: \"kubernetes.io/csi/topolvm.io^82ad30d6-a170-4ee2-bd43-8bf3a311ef5c\") pod \"glance-b9844-default-internal-api-0\" (UID: \"8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:10.345832 master-0 kubenswrapper[19170]: I0313 01:45:10.345793 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256-httpd-run\") pod \"glance-b9844-default-internal-api-0\" (UID: \"8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:10.346080 master-0 kubenswrapper[19170]: I0313 01:45:10.346055 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gc5fk\" (UniqueName: \"kubernetes.io/projected/8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256-kube-api-access-gc5fk\") pod \"glance-b9844-default-internal-api-0\" (UID: \"8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:10.346230 master-0 kubenswrapper[19170]: I0313 01:45:10.346210 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256-logs\") pod \"glance-b9844-default-internal-api-0\" (UID: \"8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:10.347988 master-0 kubenswrapper[19170]: I0313 01:45:10.347970 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256-logs\") pod \"glance-b9844-default-internal-api-0\" (UID: \"8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:10.348626 master-0 kubenswrapper[19170]: 
E0313 01:45:10.348571 19170 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data glance httpd-run kube-api-access-gc5fk logs scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-b9844-default-internal-api-0" podUID="8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256" Mar 13 01:45:10.349022 master-0 kubenswrapper[19170]: I0313 01:45:10.348992 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256-scripts\") pod \"glance-b9844-default-internal-api-0\" (UID: \"8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:10.349613 master-0 kubenswrapper[19170]: I0313 01:45:10.349594 19170 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 01:45:10.349757 master-0 kubenswrapper[19170]: I0313 01:45:10.349741 19170 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0423f75b-fda4-4619-84ea-2ae13248e4cb\" (UniqueName: \"kubernetes.io/csi/topolvm.io^82ad30d6-a170-4ee2-bd43-8bf3a311ef5c\") pod \"glance-b9844-default-internal-api-0\" (UID: \"8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/34cbbf4fc438e2d42966b02d86ef558a7de843a3eec70be4281d747be1ba2c15/globalmount\"" pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:10.356080 master-0 kubenswrapper[19170]: I0313 01:45:10.351439 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256-config-data\") pod \"glance-b9844-default-internal-api-0\" (UID: \"8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:10.380698 master-0 kubenswrapper[19170]: I0313 
01:45:10.380647 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256-combined-ca-bundle\") pod \"glance-b9844-default-internal-api-0\" (UID: \"8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:10.390774 master-0 kubenswrapper[19170]: I0313 01:45:10.386278 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gc5fk\" (UniqueName: \"kubernetes.io/projected/8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256-kube-api-access-gc5fk\") pod \"glance-b9844-default-internal-api-0\" (UID: \"8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:10.432727 master-0 kubenswrapper[19170]: I0313 01:45:10.430570 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56b5f74d97-8m7sq" event={"ID":"26e2c508-1828-4edc-b60e-eea9598f292e","Type":"ContainerDied","Data":"d750ef6752a1c178e3c374cc53b34c2505649e39c5ea8d4a43b5fc1bf319170b"} Mar 13 01:45:10.432727 master-0 kubenswrapper[19170]: I0313 01:45:10.430701 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56b5f74d97-8m7sq" Mar 13 01:45:10.432727 master-0 kubenswrapper[19170]: I0313 01:45:10.431339 19170 scope.go:117] "RemoveContainer" containerID="e77800cde0b38d338b79526b48f64dbd8c3571019a5d96708b627fa0487b1616" Mar 13 01:45:10.443534 master-0 kubenswrapper[19170]: I0313 01:45:10.443488 19170 generic.go:334] "Generic (PLEG): container finished" podID="35017229-acfd-498d-82eb-8fc288e299b4" containerID="84d6e060ff0a75c1303fd04e28d780e19f848577a77b31bf4f57b7e03d5bfa22" exitCode=0 Mar 13 01:45:10.443710 master-0 kubenswrapper[19170]: I0313 01:45:10.443559 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-wmh5p" event={"ID":"35017229-acfd-498d-82eb-8fc288e299b4","Type":"ContainerDied","Data":"84d6e060ff0a75c1303fd04e28d780e19f848577a77b31bf4f57b7e03d5bfa22"} Mar 13 01:45:10.457495 master-0 kubenswrapper[19170]: I0313 01:45:10.456886 19170 generic.go:334] "Generic (PLEG): container finished" podID="881f7f20-50c9-4b96-b0a9-a14578bd3b6a" containerID="faf79cbab7d1a38961c0e09626ecc0a7c900d7d688084fb93dc693b8b6718300" exitCode=0 Mar 13 01:45:10.457495 master-0 kubenswrapper[19170]: I0313 01:45:10.456955 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fbdc4df5f-jkj9p" event={"ID":"881f7f20-50c9-4b96-b0a9-a14578bd3b6a","Type":"ContainerDied","Data":"faf79cbab7d1a38961c0e09626ecc0a7c900d7d688084fb93dc693b8b6718300"} Mar 13 01:45:10.461775 master-0 kubenswrapper[19170]: I0313 01:45:10.461717 19170 generic.go:334] "Generic (PLEG): container finished" podID="9a89c46c-68a8-4baa-b926-3cbd5b85161c" containerID="03565ffe5a54813f38103fbd3d31b4a3d82f022c36b06782a4fa869a37775872" exitCode=0 Mar 13 01:45:10.461968 master-0 kubenswrapper[19170]: I0313 01:45:10.461807 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:10.463266 master-0 kubenswrapper[19170]: I0313 01:45:10.463234 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56c5578c7c-7p22w" event={"ID":"9a89c46c-68a8-4baa-b926-3cbd5b85161c","Type":"ContainerDied","Data":"03565ffe5a54813f38103fbd3d31b4a3d82f022c36b06782a4fa869a37775872"} Mar 13 01:45:10.464080 master-0 kubenswrapper[19170]: I0313 01:45:10.464058 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:10.486769 master-0 kubenswrapper[19170]: I0313 01:45:10.484397 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:10.533087 master-0 kubenswrapper[19170]: I0313 01:45:10.533049 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:10.553711 master-0 kubenswrapper[19170]: I0313 01:45:10.549252 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gc5fk\" (UniqueName: \"kubernetes.io/projected/8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256-kube-api-access-gc5fk\") pod \"8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256\" (UID: \"8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256\") " Mar 13 01:45:10.553711 master-0 kubenswrapper[19170]: I0313 01:45:10.549302 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256-httpd-run\") pod \"8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256\" (UID: \"8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256\") " Mar 13 01:45:10.553711 master-0 kubenswrapper[19170]: I0313 01:45:10.549326 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/162f6d26-33ce-450b-9f01-ace6b922c69b-logs\") pod \"162f6d26-33ce-450b-9f01-ace6b922c69b\" (UID: \"162f6d26-33ce-450b-9f01-ace6b922c69b\") " Mar 13 01:45:10.553711 master-0 kubenswrapper[19170]: I0313 01:45:10.549347 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256-logs\") pod \"8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256\" (UID: \"8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256\") " Mar 13 01:45:10.553711 master-0 kubenswrapper[19170]: I0313 01:45:10.549383 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46zhq\" (UniqueName: \"kubernetes.io/projected/162f6d26-33ce-450b-9f01-ace6b922c69b-kube-api-access-46zhq\") pod \"162f6d26-33ce-450b-9f01-ace6b922c69b\" (UID: \"162f6d26-33ce-450b-9f01-ace6b922c69b\") " Mar 13 01:45:10.553711 master-0 kubenswrapper[19170]: I0313 01:45:10.549411 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/162f6d26-33ce-450b-9f01-ace6b922c69b-config-data\") pod \"162f6d26-33ce-450b-9f01-ace6b922c69b\" (UID: \"162f6d26-33ce-450b-9f01-ace6b922c69b\") " Mar 13 01:45:10.553711 master-0 kubenswrapper[19170]: I0313 01:45:10.549432 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/162f6d26-33ce-450b-9f01-ace6b922c69b-scripts\") pod \"162f6d26-33ce-450b-9f01-ace6b922c69b\" (UID: \"162f6d26-33ce-450b-9f01-ace6b922c69b\") " Mar 13 01:45:10.553711 master-0 kubenswrapper[19170]: I0313 01:45:10.549485 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/162f6d26-33ce-450b-9f01-ace6b922c69b-combined-ca-bundle\") pod \"162f6d26-33ce-450b-9f01-ace6b922c69b\" (UID: \"162f6d26-33ce-450b-9f01-ace6b922c69b\") " Mar 13 
01:45:10.553711 master-0 kubenswrapper[19170]: I0313 01:45:10.549530 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256-combined-ca-bundle\") pod \"8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256\" (UID: \"8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256\") " Mar 13 01:45:10.553711 master-0 kubenswrapper[19170]: I0313 01:45:10.549641 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256-scripts\") pod \"8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256\" (UID: \"8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256\") " Mar 13 01:45:10.553711 master-0 kubenswrapper[19170]: I0313 01:45:10.549679 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256-config-data\") pod \"8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256\" (UID: \"8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256\") " Mar 13 01:45:10.553711 master-0 kubenswrapper[19170]: I0313 01:45:10.549699 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/162f6d26-33ce-450b-9f01-ace6b922c69b-httpd-run\") pod \"162f6d26-33ce-450b-9f01-ace6b922c69b\" (UID: \"162f6d26-33ce-450b-9f01-ace6b922c69b\") " Mar 13 01:45:10.553711 master-0 kubenswrapper[19170]: I0313 01:45:10.552256 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/162f6d26-33ce-450b-9f01-ace6b922c69b-logs" (OuterVolumeSpecName: "logs") pod "162f6d26-33ce-450b-9f01-ace6b922c69b" (UID: "162f6d26-33ce-450b-9f01-ace6b922c69b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 01:45:10.553711 master-0 kubenswrapper[19170]: I0313 01:45:10.552447 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256-logs" (OuterVolumeSpecName: "logs") pod "8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256" (UID: "8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 01:45:10.553711 master-0 kubenswrapper[19170]: I0313 01:45:10.552466 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256" (UID: "8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 01:45:10.554295 master-0 kubenswrapper[19170]: I0313 01:45:10.554059 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/162f6d26-33ce-450b-9f01-ace6b922c69b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "162f6d26-33ce-450b-9f01-ace6b922c69b" (UID: "162f6d26-33ce-450b-9f01-ace6b922c69b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 01:45:10.564930 master-0 kubenswrapper[19170]: I0313 01:45:10.561919 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256-scripts" (OuterVolumeSpecName: "scripts") pod "8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256" (UID: "8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:45:10.564930 master-0 kubenswrapper[19170]: I0313 01:45:10.564817 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/162f6d26-33ce-450b-9f01-ace6b922c69b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "162f6d26-33ce-450b-9f01-ace6b922c69b" (UID: "162f6d26-33ce-450b-9f01-ace6b922c69b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:45:10.581666 master-0 kubenswrapper[19170]: I0313 01:45:10.579479 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256" (UID: "8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:45:10.587655 master-0 kubenswrapper[19170]: I0313 01:45:10.583665 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/162f6d26-33ce-450b-9f01-ace6b922c69b-scripts" (OuterVolumeSpecName: "scripts") pod "162f6d26-33ce-450b-9f01-ace6b922c69b" (UID: "162f6d26-33ce-450b-9f01-ace6b922c69b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:45:10.587655 master-0 kubenswrapper[19170]: I0313 01:45:10.583670 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256-config-data" (OuterVolumeSpecName: "config-data") pod "8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256" (UID: "8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:45:10.587655 master-0 kubenswrapper[19170]: I0313 01:45:10.583788 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/162f6d26-33ce-450b-9f01-ace6b922c69b-kube-api-access-46zhq" (OuterVolumeSpecName: "kube-api-access-46zhq") pod "162f6d26-33ce-450b-9f01-ace6b922c69b" (UID: "162f6d26-33ce-450b-9f01-ace6b922c69b"). InnerVolumeSpecName "kube-api-access-46zhq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:45:10.587655 master-0 kubenswrapper[19170]: I0313 01:45:10.583963 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256-kube-api-access-gc5fk" (OuterVolumeSpecName: "kube-api-access-gc5fk") pod "8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256" (UID: "8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256"). InnerVolumeSpecName "kube-api-access-gc5fk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:45:10.621639 master-0 kubenswrapper[19170]: I0313 01:45:10.614826 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/162f6d26-33ce-450b-9f01-ace6b922c69b-config-data" (OuterVolumeSpecName: "config-data") pod "162f6d26-33ce-450b-9f01-ace6b922c69b" (UID: "162f6d26-33ce-450b-9f01-ace6b922c69b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:45:10.626899 master-0 kubenswrapper[19170]: I0313 01:45:10.626383 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56b5f74d97-8m7sq"] Mar 13 01:45:10.653955 master-0 kubenswrapper[19170]: I0313 01:45:10.653021 19170 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:10.653955 master-0 kubenswrapper[19170]: I0313 01:45:10.653057 19170 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256-config-data\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:10.653955 master-0 kubenswrapper[19170]: I0313 01:45:10.653069 19170 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/162f6d26-33ce-450b-9f01-ace6b922c69b-httpd-run\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:10.653955 master-0 kubenswrapper[19170]: I0313 01:45:10.653080 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gc5fk\" (UniqueName: \"kubernetes.io/projected/8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256-kube-api-access-gc5fk\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:10.653955 master-0 kubenswrapper[19170]: I0313 01:45:10.653089 19170 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256-httpd-run\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:10.653955 master-0 kubenswrapper[19170]: I0313 01:45:10.653098 19170 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256-logs\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:10.653955 master-0 kubenswrapper[19170]: I0313 01:45:10.653107 19170 
reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/162f6d26-33ce-450b-9f01-ace6b922c69b-logs\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:10.653955 master-0 kubenswrapper[19170]: I0313 01:45:10.653115 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46zhq\" (UniqueName: \"kubernetes.io/projected/162f6d26-33ce-450b-9f01-ace6b922c69b-kube-api-access-46zhq\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:10.653955 master-0 kubenswrapper[19170]: I0313 01:45:10.653123 19170 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/162f6d26-33ce-450b-9f01-ace6b922c69b-config-data\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:10.653955 master-0 kubenswrapper[19170]: I0313 01:45:10.653131 19170 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/162f6d26-33ce-450b-9f01-ace6b922c69b-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:10.653955 master-0 kubenswrapper[19170]: I0313 01:45:10.653138 19170 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/162f6d26-33ce-450b-9f01-ace6b922c69b-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:10.653955 master-0 kubenswrapper[19170]: I0313 01:45:10.653149 19170 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:10.673972 master-0 kubenswrapper[19170]: I0313 01:45:10.667184 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56b5f74d97-8m7sq"] Mar 13 01:45:11.237753 master-0 kubenswrapper[19170]: I0313 01:45:11.237677 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-7dab-account-create-update-w8wtd" Mar 13 01:45:11.271763 master-0 kubenswrapper[19170]: I0313 01:45:11.269297 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fbdc4df5f-jkj9p" Mar 13 01:45:11.299883 master-0 kubenswrapper[19170]: I0313 01:45:11.299841 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7lj9\" (UniqueName: \"kubernetes.io/projected/2b2398a0-76d0-4e91-abf0-47f6d43feb12-kube-api-access-q7lj9\") pod \"2b2398a0-76d0-4e91-abf0-47f6d43feb12\" (UID: \"2b2398a0-76d0-4e91-abf0-47f6d43feb12\") " Mar 13 01:45:11.300416 master-0 kubenswrapper[19170]: I0313 01:45:11.300395 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b2398a0-76d0-4e91-abf0-47f6d43feb12-operator-scripts\") pod \"2b2398a0-76d0-4e91-abf0-47f6d43feb12\" (UID: \"2b2398a0-76d0-4e91-abf0-47f6d43feb12\") " Mar 13 01:45:11.302258 master-0 kubenswrapper[19170]: I0313 01:45:11.302239 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b2398a0-76d0-4e91-abf0-47f6d43feb12-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2b2398a0-76d0-4e91-abf0-47f6d43feb12" (UID: "2b2398a0-76d0-4e91-abf0-47f6d43feb12"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:45:11.303373 master-0 kubenswrapper[19170]: I0313 01:45:11.303235 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b2398a0-76d0-4e91-abf0-47f6d43feb12-kube-api-access-q7lj9" (OuterVolumeSpecName: "kube-api-access-q7lj9") pod "2b2398a0-76d0-4e91-abf0-47f6d43feb12" (UID: "2b2398a0-76d0-4e91-abf0-47f6d43feb12"). InnerVolumeSpecName "kube-api-access-q7lj9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:45:11.381974 master-0 kubenswrapper[19170]: I0313 01:45:11.381263 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-60c39735-cd4e-4baf-8a95-3babad891e79\" (UniqueName: \"kubernetes.io/csi/topolvm.io^deb39bd4-fea6-43ec-aeb4-a117c91529eb\") pod \"glance-b9844-default-external-api-0\" (UID: \"162f6d26-33ce-450b-9f01-ace6b922c69b\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:11.407687 master-0 kubenswrapper[19170]: I0313 01:45:11.403268 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2kqs\" (UniqueName: \"kubernetes.io/projected/881f7f20-50c9-4b96-b0a9-a14578bd3b6a-kube-api-access-p2kqs\") pod \"881f7f20-50c9-4b96-b0a9-a14578bd3b6a\" (UID: \"881f7f20-50c9-4b96-b0a9-a14578bd3b6a\") " Mar 13 01:45:11.407687 master-0 kubenswrapper[19170]: I0313 01:45:11.403331 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/881f7f20-50c9-4b96-b0a9-a14578bd3b6a-dns-swift-storage-0\") pod \"881f7f20-50c9-4b96-b0a9-a14578bd3b6a\" (UID: \"881f7f20-50c9-4b96-b0a9-a14578bd3b6a\") " Mar 13 01:45:11.407687 master-0 kubenswrapper[19170]: I0313 01:45:11.403352 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/881f7f20-50c9-4b96-b0a9-a14578bd3b6a-config\") pod \"881f7f20-50c9-4b96-b0a9-a14578bd3b6a\" (UID: \"881f7f20-50c9-4b96-b0a9-a14578bd3b6a\") " Mar 13 01:45:11.407687 master-0 kubenswrapper[19170]: I0313 01:45:11.403442 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/881f7f20-50c9-4b96-b0a9-a14578bd3b6a-ovsdbserver-sb\") pod \"881f7f20-50c9-4b96-b0a9-a14578bd3b6a\" (UID: \"881f7f20-50c9-4b96-b0a9-a14578bd3b6a\") " Mar 13 01:45:11.407687 master-0 
kubenswrapper[19170]: I0313 01:45:11.403497 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/881f7f20-50c9-4b96-b0a9-a14578bd3b6a-dns-svc\") pod \"881f7f20-50c9-4b96-b0a9-a14578bd3b6a\" (UID: \"881f7f20-50c9-4b96-b0a9-a14578bd3b6a\") " Mar 13 01:45:11.407687 master-0 kubenswrapper[19170]: I0313 01:45:11.403596 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/881f7f20-50c9-4b96-b0a9-a14578bd3b6a-ovsdbserver-nb\") pod \"881f7f20-50c9-4b96-b0a9-a14578bd3b6a\" (UID: \"881f7f20-50c9-4b96-b0a9-a14578bd3b6a\") " Mar 13 01:45:11.407687 master-0 kubenswrapper[19170]: I0313 01:45:11.404078 19170 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b2398a0-76d0-4e91-abf0-47f6d43feb12-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:11.407687 master-0 kubenswrapper[19170]: I0313 01:45:11.404102 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7lj9\" (UniqueName: \"kubernetes.io/projected/2b2398a0-76d0-4e91-abf0-47f6d43feb12-kube-api-access-q7lj9\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:11.436102 master-0 kubenswrapper[19170]: I0313 01:45:11.434902 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/881f7f20-50c9-4b96-b0a9-a14578bd3b6a-kube-api-access-p2kqs" (OuterVolumeSpecName: "kube-api-access-p2kqs") pod "881f7f20-50c9-4b96-b0a9-a14578bd3b6a" (UID: "881f7f20-50c9-4b96-b0a9-a14578bd3b6a"). InnerVolumeSpecName "kube-api-access-p2kqs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:45:11.437034 master-0 kubenswrapper[19170]: I0313 01:45:11.436983 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/881f7f20-50c9-4b96-b0a9-a14578bd3b6a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "881f7f20-50c9-4b96-b0a9-a14578bd3b6a" (UID: "881f7f20-50c9-4b96-b0a9-a14578bd3b6a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:45:11.453133 master-0 kubenswrapper[19170]: I0313 01:45:11.453063 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/881f7f20-50c9-4b96-b0a9-a14578bd3b6a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "881f7f20-50c9-4b96-b0a9-a14578bd3b6a" (UID: "881f7f20-50c9-4b96-b0a9-a14578bd3b6a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:45:11.463450 master-0 kubenswrapper[19170]: I0313 01:45:11.463403 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/881f7f20-50c9-4b96-b0a9-a14578bd3b6a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "881f7f20-50c9-4b96-b0a9-a14578bd3b6a" (UID: "881f7f20-50c9-4b96-b0a9-a14578bd3b6a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:45:11.466382 master-0 kubenswrapper[19170]: I0313 01:45:11.466354 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/881f7f20-50c9-4b96-b0a9-a14578bd3b6a-config" (OuterVolumeSpecName: "config") pod "881f7f20-50c9-4b96-b0a9-a14578bd3b6a" (UID: "881f7f20-50c9-4b96-b0a9-a14578bd3b6a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:45:11.484317 master-0 kubenswrapper[19170]: I0313 01:45:11.469094 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/881f7f20-50c9-4b96-b0a9-a14578bd3b6a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "881f7f20-50c9-4b96-b0a9-a14578bd3b6a" (UID: "881f7f20-50c9-4b96-b0a9-a14578bd3b6a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:45:11.504701 master-0 kubenswrapper[19170]: I0313 01:45:11.493982 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26e2c508-1828-4edc-b60e-eea9598f292e" path="/var/lib/kubelet/pods/26e2c508-1828-4edc-b60e-eea9598f292e/volumes" Mar 13 01:45:11.520493 master-0 kubenswrapper[19170]: I0313 01:45:11.513861 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^deb39bd4-fea6-43ec-aeb4-a117c91529eb\") pod \"162f6d26-33ce-450b-9f01-ace6b922c69b\" (UID: \"162f6d26-33ce-450b-9f01-ace6b922c69b\") " Mar 13 01:45:11.520493 master-0 kubenswrapper[19170]: I0313 01:45:11.514419 19170 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/881f7f20-50c9-4b96-b0a9-a14578bd3b6a-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:11.520493 master-0 kubenswrapper[19170]: I0313 01:45:11.514433 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2kqs\" (UniqueName: \"kubernetes.io/projected/881f7f20-50c9-4b96-b0a9-a14578bd3b6a-kube-api-access-p2kqs\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:11.520493 master-0 kubenswrapper[19170]: I0313 01:45:11.514443 19170 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/881f7f20-50c9-4b96-b0a9-a14578bd3b6a-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 13 
01:45:11.520493 master-0 kubenswrapper[19170]: I0313 01:45:11.514451 19170 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/881f7f20-50c9-4b96-b0a9-a14578bd3b6a-config\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:11.520493 master-0 kubenswrapper[19170]: I0313 01:45:11.514459 19170 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/881f7f20-50c9-4b96-b0a9-a14578bd3b6a-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:11.520493 master-0 kubenswrapper[19170]: I0313 01:45:11.514467 19170 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/881f7f20-50c9-4b96-b0a9-a14578bd3b6a-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:11.526994 master-0 kubenswrapper[19170]: I0313 01:45:11.524457 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fbdc4df5f-jkj9p" Mar 13 01:45:11.526994 master-0 kubenswrapper[19170]: I0313 01:45:11.524728 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fbdc4df5f-jkj9p" event={"ID":"881f7f20-50c9-4b96-b0a9-a14578bd3b6a","Type":"ContainerDied","Data":"5691717671dfd567adad8a4888951adf10a26b714dcde264d081299cc7542f4c"} Mar 13 01:45:11.526994 master-0 kubenswrapper[19170]: I0313 01:45:11.524834 19170 scope.go:117] "RemoveContainer" containerID="faf79cbab7d1a38961c0e09626ecc0a7c900d7d688084fb93dc693b8b6718300" Mar 13 01:45:11.565361 master-0 kubenswrapper[19170]: I0313 01:45:11.565320 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-7dab-account-create-update-w8wtd" Mar 13 01:45:11.565454 master-0 kubenswrapper[19170]: I0313 01:45:11.565399 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-7dab-account-create-update-w8wtd" event={"ID":"2b2398a0-76d0-4e91-abf0-47f6d43feb12","Type":"ContainerDied","Data":"8cfb1f79f6b3d33e3bffb7e1707dfe6f92544578db5f43b445388519919d9e4e"} Mar 13 01:45:11.565454 master-0 kubenswrapper[19170]: I0313 01:45:11.565439 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8cfb1f79f6b3d33e3bffb7e1707dfe6f92544578db5f43b445388519919d9e4e" Mar 13 01:45:11.581443 master-0 kubenswrapper[19170]: I0313 01:45:11.581329 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56c5578c7c-7p22w" event={"ID":"9a89c46c-68a8-4baa-b926-3cbd5b85161c","Type":"ContainerStarted","Data":"4670c669e9fc6d1cbd60890090cca9047749c7b1b6b564a16bc19e770fad5603"} Mar 13 01:45:11.582555 master-0 kubenswrapper[19170]: I0313 01:45:11.582165 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56c5578c7c-7p22w" Mar 13 01:45:11.587856 master-0 kubenswrapper[19170]: I0313 01:45:11.587824 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:11.588051 master-0 kubenswrapper[19170]: I0313 01:45:11.588031 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:11.700844 master-0 kubenswrapper[19170]: I0313 01:45:11.700778 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fbdc4df5f-jkj9p"] Mar 13 01:45:11.734243 master-0 kubenswrapper[19170]: I0313 01:45:11.734101 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fbdc4df5f-jkj9p"] Mar 13 01:45:11.749651 master-0 kubenswrapper[19170]: I0313 01:45:11.749063 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56c5578c7c-7p22w" podStartSLOduration=4.749042675 podStartE2EDuration="4.749042675s" podCreationTimestamp="2026-03-13 01:45:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:45:11.676030877 +0000 UTC m=+1572.484151837" watchObservedRunningTime="2026-03-13 01:45:11.749042675 +0000 UTC m=+1572.557163635" Mar 13 01:45:11.862725 master-0 kubenswrapper[19170]: I0313 01:45:11.862675 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b9844-default-internal-api-0"] Mar 13 01:45:11.871947 master-0 kubenswrapper[19170]: I0313 01:45:11.871407 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-b9844-default-internal-api-0"] Mar 13 01:45:11.881444 master-0 kubenswrapper[19170]: I0313 01:45:11.881395 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-b9844-default-internal-api-0"] Mar 13 01:45:11.881928 master-0 kubenswrapper[19170]: E0313 01:45:11.881910 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="881f7f20-50c9-4b96-b0a9-a14578bd3b6a" containerName="init" Mar 13 01:45:11.881928 master-0 kubenswrapper[19170]: I0313 01:45:11.881928 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="881f7f20-50c9-4b96-b0a9-a14578bd3b6a" containerName="init" Mar 13 01:45:11.882043 
master-0 kubenswrapper[19170]: E0313 01:45:11.881954 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b2398a0-76d0-4e91-abf0-47f6d43feb12" containerName="mariadb-account-create-update" Mar 13 01:45:11.882043 master-0 kubenswrapper[19170]: I0313 01:45:11.881961 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b2398a0-76d0-4e91-abf0-47f6d43feb12" containerName="mariadb-account-create-update" Mar 13 01:45:11.882452 master-0 kubenswrapper[19170]: I0313 01:45:11.882433 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b2398a0-76d0-4e91-abf0-47f6d43feb12" containerName="mariadb-account-create-update" Mar 13 01:45:11.882503 master-0 kubenswrapper[19170]: I0313 01:45:11.882478 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="881f7f20-50c9-4b96-b0a9-a14578bd3b6a" containerName="init" Mar 13 01:45:11.883570 master-0 kubenswrapper[19170]: I0313 01:45:11.883540 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:11.885563 master-0 kubenswrapper[19170]: I0313 01:45:11.885532 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-b9844-default-internal-config-data" Mar 13 01:45:11.891878 master-0 kubenswrapper[19170]: I0313 01:45:11.891339 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b9844-default-internal-api-0"] Mar 13 01:45:11.940768 master-0 kubenswrapper[19170]: I0313 01:45:11.939129 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65b35883-5cf2-442f-b296-139de330497c-logs\") pod \"glance-b9844-default-internal-api-0\" (UID: \"65b35883-5cf2-442f-b296-139de330497c\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:11.940768 master-0 kubenswrapper[19170]: I0313 01:45:11.939376 19170 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c95ts\" (UniqueName: \"kubernetes.io/projected/65b35883-5cf2-442f-b296-139de330497c-kube-api-access-c95ts\") pod \"glance-b9844-default-internal-api-0\" (UID: \"65b35883-5cf2-442f-b296-139de330497c\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:11.940768 master-0 kubenswrapper[19170]: I0313 01:45:11.939432 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65b35883-5cf2-442f-b296-139de330497c-scripts\") pod \"glance-b9844-default-internal-api-0\" (UID: \"65b35883-5cf2-442f-b296-139de330497c\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:11.940768 master-0 kubenswrapper[19170]: I0313 01:45:11.939513 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/65b35883-5cf2-442f-b296-139de330497c-httpd-run\") pod \"glance-b9844-default-internal-api-0\" (UID: \"65b35883-5cf2-442f-b296-139de330497c\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:11.940768 master-0 kubenswrapper[19170]: I0313 01:45:11.939569 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65b35883-5cf2-442f-b296-139de330497c-combined-ca-bundle\") pod \"glance-b9844-default-internal-api-0\" (UID: \"65b35883-5cf2-442f-b296-139de330497c\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:11.940768 master-0 kubenswrapper[19170]: I0313 01:45:11.939675 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65b35883-5cf2-442f-b296-139de330497c-config-data\") pod \"glance-b9844-default-internal-api-0\" (UID: 
\"65b35883-5cf2-442f-b296-139de330497c\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:12.043354 master-0 kubenswrapper[19170]: I0313 01:45:12.043310 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65b35883-5cf2-442f-b296-139de330497c-config-data\") pod \"glance-b9844-default-internal-api-0\" (UID: \"65b35883-5cf2-442f-b296-139de330497c\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:12.046535 master-0 kubenswrapper[19170]: I0313 01:45:12.044394 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65b35883-5cf2-442f-b296-139de330497c-logs\") pod \"glance-b9844-default-internal-api-0\" (UID: \"65b35883-5cf2-442f-b296-139de330497c\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:12.046535 master-0 kubenswrapper[19170]: I0313 01:45:12.044451 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65b35883-5cf2-442f-b296-139de330497c-logs\") pod \"glance-b9844-default-internal-api-0\" (UID: \"65b35883-5cf2-442f-b296-139de330497c\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:12.046535 master-0 kubenswrapper[19170]: I0313 01:45:12.044701 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c95ts\" (UniqueName: \"kubernetes.io/projected/65b35883-5cf2-442f-b296-139de330497c-kube-api-access-c95ts\") pod \"glance-b9844-default-internal-api-0\" (UID: \"65b35883-5cf2-442f-b296-139de330497c\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:12.046535 master-0 kubenswrapper[19170]: I0313 01:45:12.045020 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65b35883-5cf2-442f-b296-139de330497c-scripts\") pod 
\"glance-b9844-default-internal-api-0\" (UID: \"65b35883-5cf2-442f-b296-139de330497c\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:12.048476 master-0 kubenswrapper[19170]: I0313 01:45:12.047993 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/65b35883-5cf2-442f-b296-139de330497c-httpd-run\") pod \"glance-b9844-default-internal-api-0\" (UID: \"65b35883-5cf2-442f-b296-139de330497c\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:12.048476 master-0 kubenswrapper[19170]: I0313 01:45:12.048326 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/65b35883-5cf2-442f-b296-139de330497c-httpd-run\") pod \"glance-b9844-default-internal-api-0\" (UID: \"65b35883-5cf2-442f-b296-139de330497c\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:12.048476 master-0 kubenswrapper[19170]: I0313 01:45:12.048454 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65b35883-5cf2-442f-b296-139de330497c-combined-ca-bundle\") pod \"glance-b9844-default-internal-api-0\" (UID: \"65b35883-5cf2-442f-b296-139de330497c\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:12.053271 master-0 kubenswrapper[19170]: I0313 01:45:12.053109 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65b35883-5cf2-442f-b296-139de330497c-config-data\") pod \"glance-b9844-default-internal-api-0\" (UID: \"65b35883-5cf2-442f-b296-139de330497c\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:12.056678 master-0 kubenswrapper[19170]: I0313 01:45:12.055740 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/65b35883-5cf2-442f-b296-139de330497c-combined-ca-bundle\") pod \"glance-b9844-default-internal-api-0\" (UID: \"65b35883-5cf2-442f-b296-139de330497c\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:12.071107 master-0 kubenswrapper[19170]: I0313 01:45:12.067306 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65b35883-5cf2-442f-b296-139de330497c-scripts\") pod \"glance-b9844-default-internal-api-0\" (UID: \"65b35883-5cf2-442f-b296-139de330497c\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:12.091511 master-0 kubenswrapper[19170]: I0313 01:45:12.091464 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c95ts\" (UniqueName: \"kubernetes.io/projected/65b35883-5cf2-442f-b296-139de330497c-kube-api-access-c95ts\") pod \"glance-b9844-default-internal-api-0\" (UID: \"65b35883-5cf2-442f-b296-139de330497c\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:12.261314 master-0 kubenswrapper[19170]: I0313 01:45:12.260917 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-create-wmh5p" Mar 13 01:45:12.353837 master-0 kubenswrapper[19170]: I0313 01:45:12.353694 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35017229-acfd-498d-82eb-8fc288e299b4-operator-scripts\") pod \"35017229-acfd-498d-82eb-8fc288e299b4\" (UID: \"35017229-acfd-498d-82eb-8fc288e299b4\") " Mar 13 01:45:12.353837 master-0 kubenswrapper[19170]: I0313 01:45:12.353773 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjtvs\" (UniqueName: \"kubernetes.io/projected/35017229-acfd-498d-82eb-8fc288e299b4-kube-api-access-cjtvs\") pod \"35017229-acfd-498d-82eb-8fc288e299b4\" (UID: \"35017229-acfd-498d-82eb-8fc288e299b4\") " Mar 13 01:45:12.356779 master-0 kubenswrapper[19170]: I0313 01:45:12.356731 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35017229-acfd-498d-82eb-8fc288e299b4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "35017229-acfd-498d-82eb-8fc288e299b4" (UID: "35017229-acfd-498d-82eb-8fc288e299b4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:45:12.358392 master-0 kubenswrapper[19170]: I0313 01:45:12.358349 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35017229-acfd-498d-82eb-8fc288e299b4-kube-api-access-cjtvs" (OuterVolumeSpecName: "kube-api-access-cjtvs") pod "35017229-acfd-498d-82eb-8fc288e299b4" (UID: "35017229-acfd-498d-82eb-8fc288e299b4"). InnerVolumeSpecName "kube-api-access-cjtvs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:45:12.463976 master-0 kubenswrapper[19170]: I0313 01:45:12.460798 19170 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35017229-acfd-498d-82eb-8fc288e299b4-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:12.463976 master-0 kubenswrapper[19170]: I0313 01:45:12.460831 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjtvs\" (UniqueName: \"kubernetes.io/projected/35017229-acfd-498d-82eb-8fc288e299b4-kube-api-access-cjtvs\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:12.605254 master-0 kubenswrapper[19170]: I0313 01:45:12.605150 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-create-wmh5p" Mar 13 01:45:12.605424 master-0 kubenswrapper[19170]: I0313 01:45:12.605150 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-wmh5p" event={"ID":"35017229-acfd-498d-82eb-8fc288e299b4","Type":"ContainerDied","Data":"3a4d120f5b4eff6a6f3e60893f985fbc6ff6baf1c9c3057afd29fae04bf9eb86"} Mar 13 01:45:12.605424 master-0 kubenswrapper[19170]: I0313 01:45:12.605328 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a4d120f5b4eff6a6f3e60893f985fbc6ff6baf1c9c3057afd29fae04bf9eb86" Mar 13 01:45:13.110497 master-0 kubenswrapper[19170]: I0313 01:45:13.110454 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0423f75b-fda4-4619-84ea-2ae13248e4cb\" (UniqueName: \"kubernetes.io/csi/topolvm.io^82ad30d6-a170-4ee2-bd43-8bf3a311ef5c\") pod \"glance-b9844-default-internal-api-0\" (UID: \"8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:13.111126 master-0 kubenswrapper[19170]: I0313 01:45:13.111081 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/csi/topolvm.io^deb39bd4-fea6-43ec-aeb4-a117c91529eb" (OuterVolumeSpecName: "glance") pod "162f6d26-33ce-450b-9f01-ace6b922c69b" (UID: "162f6d26-33ce-450b-9f01-ace6b922c69b"). InnerVolumeSpecName "pvc-60c39735-cd4e-4baf-8a95-3babad891e79". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 13 01:45:13.177177 master-0 kubenswrapper[19170]: I0313 01:45:13.176795 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^82ad30d6-a170-4ee2-bd43-8bf3a311ef5c\") pod \"8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256\" (UID: \"8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256\") " Mar 13 01:45:13.178106 master-0 kubenswrapper[19170]: I0313 01:45:13.178079 19170 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-60c39735-cd4e-4baf-8a95-3babad891e79\" (UniqueName: \"kubernetes.io/csi/topolvm.io^deb39bd4-fea6-43ec-aeb4-a117c91529eb\") on node \"master-0\" " Mar 13 01:45:13.205736 master-0 kubenswrapper[19170]: I0313 01:45:13.205688 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^82ad30d6-a170-4ee2-bd43-8bf3a311ef5c" (OuterVolumeSpecName: "glance") pod "8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256" (UID: "8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256"). InnerVolumeSpecName "pvc-0423f75b-fda4-4619-84ea-2ae13248e4cb". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 13 01:45:13.214554 master-0 kubenswrapper[19170]: I0313 01:45:13.214517 19170 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 13 01:45:13.214886 master-0 kubenswrapper[19170]: I0313 01:45:13.214861 19170 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-60c39735-cd4e-4baf-8a95-3babad891e79" (UniqueName: "kubernetes.io/csi/topolvm.io^deb39bd4-fea6-43ec-aeb4-a117c91529eb") on node "master-0" Mar 13 01:45:13.281903 master-0 kubenswrapper[19170]: I0313 01:45:13.281845 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0423f75b-fda4-4619-84ea-2ae13248e4cb\" (UniqueName: \"kubernetes.io/csi/topolvm.io^82ad30d6-a170-4ee2-bd43-8bf3a311ef5c\") pod \"glance-b9844-default-internal-api-0\" (UID: \"65b35883-5cf2-442f-b296-139de330497c\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:13.282346 master-0 kubenswrapper[19170]: I0313 01:45:13.282059 19170 reconciler_common.go:293] "Volume detached for volume \"pvc-60c39735-cd4e-4baf-8a95-3babad891e79\" (UniqueName: \"kubernetes.io/csi/topolvm.io^deb39bd4-fea6-43ec-aeb4-a117c91529eb\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:13.434247 master-0 kubenswrapper[19170]: I0313 01:45:13.434191 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="881f7f20-50c9-4b96-b0a9-a14578bd3b6a" path="/var/lib/kubelet/pods/881f7f20-50c9-4b96-b0a9-a14578bd3b6a/volumes" Mar 13 01:45:13.435306 master-0 kubenswrapper[19170]: I0313 01:45:13.435257 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256" path="/var/lib/kubelet/pods/8a5b5ffc-f6b2-4969-8b24-9d3ab4ea2256/volumes" Mar 13 01:45:13.616658 master-0 kubenswrapper[19170]: I0313 01:45:13.615102 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b9844-default-external-api-0"] Mar 13 01:45:13.661158 master-0 kubenswrapper[19170]: I0313 01:45:13.661118 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-b9844-default-external-api-0"] Mar 13 01:45:13.699949 master-0 kubenswrapper[19170]: I0313 01:45:13.699895 
19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-b9844-default-external-api-0"] Mar 13 01:45:13.703284 master-0 kubenswrapper[19170]: E0313 01:45:13.701233 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35017229-acfd-498d-82eb-8fc288e299b4" containerName="mariadb-database-create" Mar 13 01:45:13.703284 master-0 kubenswrapper[19170]: I0313 01:45:13.701269 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="35017229-acfd-498d-82eb-8fc288e299b4" containerName="mariadb-database-create" Mar 13 01:45:13.703284 master-0 kubenswrapper[19170]: I0313 01:45:13.701488 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="35017229-acfd-498d-82eb-8fc288e299b4" containerName="mariadb-database-create" Mar 13 01:45:13.703284 master-0 kubenswrapper[19170]: I0313 01:45:13.702555 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:13.704529 master-0 kubenswrapper[19170]: I0313 01:45:13.704374 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-b9844-default-external-config-data" Mar 13 01:45:13.722019 master-0 kubenswrapper[19170]: I0313 01:45:13.719977 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b9844-default-external-api-0"] Mar 13 01:45:13.813091 master-0 kubenswrapper[19170]: I0313 01:45:13.812618 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f05049d-6514-404b-9823-7128787e6180-logs\") pod \"glance-b9844-default-external-api-0\" (UID: \"9f05049d-6514-404b-9823-7128787e6180\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:13.813091 master-0 kubenswrapper[19170]: I0313 01:45:13.812717 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/9f05049d-6514-404b-9823-7128787e6180-scripts\") pod \"glance-b9844-default-external-api-0\" (UID: \"9f05049d-6514-404b-9823-7128787e6180\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:13.813091 master-0 kubenswrapper[19170]: I0313 01:45:13.812744 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-60c39735-cd4e-4baf-8a95-3babad891e79\" (UniqueName: \"kubernetes.io/csi/topolvm.io^deb39bd4-fea6-43ec-aeb4-a117c91529eb\") pod \"glance-b9844-default-external-api-0\" (UID: \"9f05049d-6514-404b-9823-7128787e6180\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:13.813091 master-0 kubenswrapper[19170]: I0313 01:45:13.812767 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f05049d-6514-404b-9823-7128787e6180-httpd-run\") pod \"glance-b9844-default-external-api-0\" (UID: \"9f05049d-6514-404b-9823-7128787e6180\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:13.813091 master-0 kubenswrapper[19170]: I0313 01:45:13.812783 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f05049d-6514-404b-9823-7128787e6180-combined-ca-bundle\") pod \"glance-b9844-default-external-api-0\" (UID: \"9f05049d-6514-404b-9823-7128787e6180\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:13.813091 master-0 kubenswrapper[19170]: I0313 01:45:13.812806 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f05049d-6514-404b-9823-7128787e6180-config-data\") pod \"glance-b9844-default-external-api-0\" (UID: \"9f05049d-6514-404b-9823-7128787e6180\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:13.813091 master-0 
kubenswrapper[19170]: I0313 01:45:13.812839 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx898\" (UniqueName: \"kubernetes.io/projected/9f05049d-6514-404b-9823-7128787e6180-kube-api-access-sx898\") pod \"glance-b9844-default-external-api-0\" (UID: \"9f05049d-6514-404b-9823-7128787e6180\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:13.915078 master-0 kubenswrapper[19170]: I0313 01:45:13.914981 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f05049d-6514-404b-9823-7128787e6180-scripts\") pod \"glance-b9844-default-external-api-0\" (UID: \"9f05049d-6514-404b-9823-7128787e6180\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:13.915078 master-0 kubenswrapper[19170]: I0313 01:45:13.915037 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-60c39735-cd4e-4baf-8a95-3babad891e79\" (UniqueName: \"kubernetes.io/csi/topolvm.io^deb39bd4-fea6-43ec-aeb4-a117c91529eb\") pod \"glance-b9844-default-external-api-0\" (UID: \"9f05049d-6514-404b-9823-7128787e6180\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:13.915078 master-0 kubenswrapper[19170]: I0313 01:45:13.915067 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f05049d-6514-404b-9823-7128787e6180-httpd-run\") pod \"glance-b9844-default-external-api-0\" (UID: \"9f05049d-6514-404b-9823-7128787e6180\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:13.915078 master-0 kubenswrapper[19170]: I0313 01:45:13.915083 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f05049d-6514-404b-9823-7128787e6180-combined-ca-bundle\") pod \"glance-b9844-default-external-api-0\" (UID: 
\"9f05049d-6514-404b-9823-7128787e6180\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:13.915388 master-0 kubenswrapper[19170]: I0313 01:45:13.915109 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f05049d-6514-404b-9823-7128787e6180-config-data\") pod \"glance-b9844-default-external-api-0\" (UID: \"9f05049d-6514-404b-9823-7128787e6180\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:13.915388 master-0 kubenswrapper[19170]: I0313 01:45:13.915142 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx898\" (UniqueName: \"kubernetes.io/projected/9f05049d-6514-404b-9823-7128787e6180-kube-api-access-sx898\") pod \"glance-b9844-default-external-api-0\" (UID: \"9f05049d-6514-404b-9823-7128787e6180\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:13.915388 master-0 kubenswrapper[19170]: I0313 01:45:13.915229 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f05049d-6514-404b-9823-7128787e6180-logs\") pod \"glance-b9844-default-external-api-0\" (UID: \"9f05049d-6514-404b-9823-7128787e6180\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:13.915725 master-0 kubenswrapper[19170]: I0313 01:45:13.915694 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f05049d-6514-404b-9823-7128787e6180-logs\") pod \"glance-b9844-default-external-api-0\" (UID: \"9f05049d-6514-404b-9823-7128787e6180\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:13.917355 master-0 kubenswrapper[19170]: I0313 01:45:13.917323 19170 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 13 01:45:13.917441 master-0 kubenswrapper[19170]: I0313 01:45:13.917352 19170 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-60c39735-cd4e-4baf-8a95-3babad891e79\" (UniqueName: \"kubernetes.io/csi/topolvm.io^deb39bd4-fea6-43ec-aeb4-a117c91529eb\") pod \"glance-b9844-default-external-api-0\" (UID: \"9f05049d-6514-404b-9823-7128787e6180\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/dc901c00d5023825c7bc5ea32a35825a63b53b24b9433470d442fe02f9c29cd0/globalmount\"" pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:13.919326 master-0 kubenswrapper[19170]: I0313 01:45:13.919278 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f05049d-6514-404b-9823-7128787e6180-httpd-run\") pod \"glance-b9844-default-external-api-0\" (UID: \"9f05049d-6514-404b-9823-7128787e6180\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:13.921652 master-0 kubenswrapper[19170]: I0313 01:45:13.921615 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f05049d-6514-404b-9823-7128787e6180-combined-ca-bundle\") pod \"glance-b9844-default-external-api-0\" (UID: \"9f05049d-6514-404b-9823-7128787e6180\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:13.931186 master-0 kubenswrapper[19170]: I0313 01:45:13.931001 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f05049d-6514-404b-9823-7128787e6180-scripts\") pod \"glance-b9844-default-external-api-0\" (UID: \"9f05049d-6514-404b-9823-7128787e6180\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:13.932342 master-0 kubenswrapper[19170]: I0313 01:45:13.932294 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9f05049d-6514-404b-9823-7128787e6180-config-data\") pod \"glance-b9844-default-external-api-0\" (UID: \"9f05049d-6514-404b-9823-7128787e6180\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:13.933980 master-0 kubenswrapper[19170]: I0313 01:45:13.933948 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx898\" (UniqueName: \"kubernetes.io/projected/9f05049d-6514-404b-9823-7128787e6180-kube-api-access-sx898\") pod \"glance-b9844-default-external-api-0\" (UID: \"9f05049d-6514-404b-9823-7128787e6180\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:14.684226 master-0 kubenswrapper[19170]: I0313 01:45:14.684171 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0423f75b-fda4-4619-84ea-2ae13248e4cb\" (UniqueName: \"kubernetes.io/csi/topolvm.io^82ad30d6-a170-4ee2-bd43-8bf3a311ef5c\") pod \"glance-b9844-default-internal-api-0\" (UID: \"65b35883-5cf2-442f-b296-139de330497c\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:14.919344 master-0 kubenswrapper[19170]: I0313 01:45:14.919181 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:15.433052 master-0 kubenswrapper[19170]: I0313 01:45:15.432982 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="162f6d26-33ce-450b-9f01-ace6b922c69b" path="/var/lib/kubelet/pods/162f6d26-33ce-450b-9f01-ace6b922c69b/volumes" Mar 13 01:45:15.665498 master-0 kubenswrapper[19170]: I0313 01:45:15.665356 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mmsc6" event={"ID":"e445697b-863d-4e96-8ccb-581dacecddcd","Type":"ContainerStarted","Data":"26c720c0b5d40fee814f609e5fa85619f7b58f50441d5ded3c73027f849fbba7"} Mar 13 01:45:15.667539 master-0 kubenswrapper[19170]: I0313 01:45:15.667480 19170 generic.go:334] "Generic (PLEG): container finished" podID="8b60380b-c057-44ba-a4fa-e88d67bde60a" containerID="a64028f80ad60b57b1351892013ed1c1fbf2f482c5391221d690745b5adcc42a" exitCode=0 Mar 13 01:45:15.667539 master-0 kubenswrapper[19170]: I0313 01:45:15.667531 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hq7k2" event={"ID":"8b60380b-c057-44ba-a4fa-e88d67bde60a","Type":"ContainerDied","Data":"a64028f80ad60b57b1351892013ed1c1fbf2f482c5391221d690745b5adcc42a"} Mar 13 01:45:15.934649 master-0 kubenswrapper[19170]: W0313 01:45:15.931199 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65b35883_5cf2_442f_b296_139de330497c.slice/crio-005317ec2ac54ee5ebd6d223ea494e7e4095d87d48dc8d2a3b4fe2468c5481a5 WatchSource:0}: Error finding container 005317ec2ac54ee5ebd6d223ea494e7e4095d87d48dc8d2a3b4fe2468c5481a5: Status 404 returned error can't find the container with id 005317ec2ac54ee5ebd6d223ea494e7e4095d87d48dc8d2a3b4fe2468c5481a5 Mar 13 01:45:15.935472 master-0 kubenswrapper[19170]: I0313 01:45:15.935423 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b9844-default-internal-api-0"] 
Mar 13 01:45:16.688547 master-0 kubenswrapper[19170]: I0313 01:45:16.688118 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b9844-default-internal-api-0" event={"ID":"65b35883-5cf2-442f-b296-139de330497c","Type":"ContainerStarted","Data":"c6ffc4b7e72283fb8122d4d8f8c25808e43b58e5fc99e95db2ecef6c012727c3"} Mar 13 01:45:16.688547 master-0 kubenswrapper[19170]: I0313 01:45:16.688212 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b9844-default-internal-api-0" event={"ID":"65b35883-5cf2-442f-b296-139de330497c","Type":"ContainerStarted","Data":"005317ec2ac54ee5ebd6d223ea494e7e4095d87d48dc8d2a3b4fe2468c5481a5"} Mar 13 01:45:16.715198 master-0 kubenswrapper[19170]: I0313 01:45:16.715106 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-mmsc6" podStartSLOduration=4.546527849 podStartE2EDuration="10.715083385s" podCreationTimestamp="2026-03-13 01:45:06 +0000 UTC" firstStartedPulling="2026-03-13 01:45:08.681262032 +0000 UTC m=+1569.489382992" lastFinishedPulling="2026-03-13 01:45:14.849817568 +0000 UTC m=+1575.657938528" observedRunningTime="2026-03-13 01:45:16.702877121 +0000 UTC m=+1577.510998091" watchObservedRunningTime="2026-03-13 01:45:16.715083385 +0000 UTC m=+1577.523204345" Mar 13 01:45:16.897164 master-0 kubenswrapper[19170]: I0313 01:45:16.897107 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-60c39735-cd4e-4baf-8a95-3babad891e79\" (UniqueName: \"kubernetes.io/csi/topolvm.io^deb39bd4-fea6-43ec-aeb4-a117c91529eb\") pod \"glance-b9844-default-external-api-0\" (UID: \"9f05049d-6514-404b-9823-7128787e6180\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:17.027734 master-0 kubenswrapper[19170]: I0313 01:45:17.026099 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:17.128966 master-0 kubenswrapper[19170]: I0313 01:45:17.128901 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b9844-default-external-api-0"] Mar 13 01:45:17.255404 master-0 kubenswrapper[19170]: I0313 01:45:17.239899 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b9844-default-internal-api-0"] Mar 13 01:45:17.294937 master-0 kubenswrapper[19170]: I0313 01:45:17.270490 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-hq7k2" Mar 13 01:45:17.319039 master-0 kubenswrapper[19170]: I0313 01:45:17.316510 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-db-sync-l9ds6"] Mar 13 01:45:17.319039 master-0 kubenswrapper[19170]: E0313 01:45:17.317146 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b60380b-c057-44ba-a4fa-e88d67bde60a" containerName="keystone-bootstrap" Mar 13 01:45:17.319039 master-0 kubenswrapper[19170]: I0313 01:45:17.317164 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b60380b-c057-44ba-a4fa-e88d67bde60a" containerName="keystone-bootstrap" Mar 13 01:45:17.319039 master-0 kubenswrapper[19170]: I0313 01:45:17.317408 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b60380b-c057-44ba-a4fa-e88d67bde60a" containerName="keystone-bootstrap" Mar 13 01:45:17.319039 master-0 kubenswrapper[19170]: I0313 01:45:17.318596 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-sync-l9ds6" Mar 13 01:45:17.341161 master-0 kubenswrapper[19170]: I0313 01:45:17.341078 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-sync-l9ds6"] Mar 13 01:45:17.341245 master-0 kubenswrapper[19170]: I0313 01:45:17.340875 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-config-data" Mar 13 01:45:17.341487 master-0 kubenswrapper[19170]: I0313 01:45:17.341458 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-scripts" Mar 13 01:45:17.426458 master-0 kubenswrapper[19170]: I0313 01:45:17.426406 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b60380b-c057-44ba-a4fa-e88d67bde60a-config-data\") pod \"8b60380b-c057-44ba-a4fa-e88d67bde60a\" (UID: \"8b60380b-c057-44ba-a4fa-e88d67bde60a\") " Mar 13 01:45:17.426660 master-0 kubenswrapper[19170]: I0313 01:45:17.426560 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sx595\" (UniqueName: \"kubernetes.io/projected/8b60380b-c057-44ba-a4fa-e88d67bde60a-kube-api-access-sx595\") pod \"8b60380b-c057-44ba-a4fa-e88d67bde60a\" (UID: \"8b60380b-c057-44ba-a4fa-e88d67bde60a\") " Mar 13 01:45:17.426660 master-0 kubenswrapper[19170]: I0313 01:45:17.426618 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b60380b-c057-44ba-a4fa-e88d67bde60a-combined-ca-bundle\") pod \"8b60380b-c057-44ba-a4fa-e88d67bde60a\" (UID: \"8b60380b-c057-44ba-a4fa-e88d67bde60a\") " Mar 13 01:45:17.426759 master-0 kubenswrapper[19170]: I0313 01:45:17.426688 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8b60380b-c057-44ba-a4fa-e88d67bde60a-credential-keys\") pod 
\"8b60380b-c057-44ba-a4fa-e88d67bde60a\" (UID: \"8b60380b-c057-44ba-a4fa-e88d67bde60a\") " Mar 13 01:45:17.426759 master-0 kubenswrapper[19170]: I0313 01:45:17.426744 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8b60380b-c057-44ba-a4fa-e88d67bde60a-fernet-keys\") pod \"8b60380b-c057-44ba-a4fa-e88d67bde60a\" (UID: \"8b60380b-c057-44ba-a4fa-e88d67bde60a\") " Mar 13 01:45:17.426824 master-0 kubenswrapper[19170]: I0313 01:45:17.426781 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b60380b-c057-44ba-a4fa-e88d67bde60a-scripts\") pod \"8b60380b-c057-44ba-a4fa-e88d67bde60a\" (UID: \"8b60380b-c057-44ba-a4fa-e88d67bde60a\") " Mar 13 01:45:17.427425 master-0 kubenswrapper[19170]: I0313 01:45:17.427101 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc3c71f3-cecf-4916-8841-0e557aad23d6-scripts\") pod \"ironic-db-sync-l9ds6\" (UID: \"dc3c71f3-cecf-4916-8841-0e557aad23d6\") " pod="openstack/ironic-db-sync-l9ds6" Mar 13 01:45:17.427425 master-0 kubenswrapper[19170]: I0313 01:45:17.427159 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56r7z\" (UniqueName: \"kubernetes.io/projected/dc3c71f3-cecf-4916-8841-0e557aad23d6-kube-api-access-56r7z\") pod \"ironic-db-sync-l9ds6\" (UID: \"dc3c71f3-cecf-4916-8841-0e557aad23d6\") " pod="openstack/ironic-db-sync-l9ds6" Mar 13 01:45:17.427425 master-0 kubenswrapper[19170]: I0313 01:45:17.427315 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/dc3c71f3-cecf-4916-8841-0e557aad23d6-config-data-merged\") pod \"ironic-db-sync-l9ds6\" (UID: \"dc3c71f3-cecf-4916-8841-0e557aad23d6\") " 
pod="openstack/ironic-db-sync-l9ds6" Mar 13 01:45:17.427425 master-0 kubenswrapper[19170]: I0313 01:45:17.427353 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc3c71f3-cecf-4916-8841-0e557aad23d6-combined-ca-bundle\") pod \"ironic-db-sync-l9ds6\" (UID: \"dc3c71f3-cecf-4916-8841-0e557aad23d6\") " pod="openstack/ironic-db-sync-l9ds6" Mar 13 01:45:17.427588 master-0 kubenswrapper[19170]: I0313 01:45:17.427454 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc3c71f3-cecf-4916-8841-0e557aad23d6-config-data\") pod \"ironic-db-sync-l9ds6\" (UID: \"dc3c71f3-cecf-4916-8841-0e557aad23d6\") " pod="openstack/ironic-db-sync-l9ds6" Mar 13 01:45:17.427588 master-0 kubenswrapper[19170]: I0313 01:45:17.427479 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/dc3c71f3-cecf-4916-8841-0e557aad23d6-etc-podinfo\") pod \"ironic-db-sync-l9ds6\" (UID: \"dc3c71f3-cecf-4916-8841-0e557aad23d6\") " pod="openstack/ironic-db-sync-l9ds6" Mar 13 01:45:17.433062 master-0 kubenswrapper[19170]: I0313 01:45:17.431573 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b60380b-c057-44ba-a4fa-e88d67bde60a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "8b60380b-c057-44ba-a4fa-e88d67bde60a" (UID: "8b60380b-c057-44ba-a4fa-e88d67bde60a"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:45:17.433062 master-0 kubenswrapper[19170]: I0313 01:45:17.432059 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b60380b-c057-44ba-a4fa-e88d67bde60a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8b60380b-c057-44ba-a4fa-e88d67bde60a" (UID: "8b60380b-c057-44ba-a4fa-e88d67bde60a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:45:17.435006 master-0 kubenswrapper[19170]: I0313 01:45:17.434387 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b60380b-c057-44ba-a4fa-e88d67bde60a-kube-api-access-sx595" (OuterVolumeSpecName: "kube-api-access-sx595") pod "8b60380b-c057-44ba-a4fa-e88d67bde60a" (UID: "8b60380b-c057-44ba-a4fa-e88d67bde60a"). InnerVolumeSpecName "kube-api-access-sx595". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:45:17.435236 master-0 kubenswrapper[19170]: I0313 01:45:17.435199 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b60380b-c057-44ba-a4fa-e88d67bde60a-scripts" (OuterVolumeSpecName: "scripts") pod "8b60380b-c057-44ba-a4fa-e88d67bde60a" (UID: "8b60380b-c057-44ba-a4fa-e88d67bde60a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:45:17.451443 master-0 kubenswrapper[19170]: I0313 01:45:17.451398 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b60380b-c057-44ba-a4fa-e88d67bde60a-config-data" (OuterVolumeSpecName: "config-data") pod "8b60380b-c057-44ba-a4fa-e88d67bde60a" (UID: "8b60380b-c057-44ba-a4fa-e88d67bde60a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:45:17.454357 master-0 kubenswrapper[19170]: I0313 01:45:17.454310 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b60380b-c057-44ba-a4fa-e88d67bde60a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b60380b-c057-44ba-a4fa-e88d67bde60a" (UID: "8b60380b-c057-44ba-a4fa-e88d67bde60a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:45:17.541666 master-0 kubenswrapper[19170]: I0313 01:45:17.531525 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/dc3c71f3-cecf-4916-8841-0e557aad23d6-config-data-merged\") pod \"ironic-db-sync-l9ds6\" (UID: \"dc3c71f3-cecf-4916-8841-0e557aad23d6\") " pod="openstack/ironic-db-sync-l9ds6" Mar 13 01:45:17.541666 master-0 kubenswrapper[19170]: I0313 01:45:17.531588 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc3c71f3-cecf-4916-8841-0e557aad23d6-combined-ca-bundle\") pod \"ironic-db-sync-l9ds6\" (UID: \"dc3c71f3-cecf-4916-8841-0e557aad23d6\") " pod="openstack/ironic-db-sync-l9ds6" Mar 13 01:45:17.541666 master-0 kubenswrapper[19170]: I0313 01:45:17.531863 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc3c71f3-cecf-4916-8841-0e557aad23d6-config-data\") pod \"ironic-db-sync-l9ds6\" (UID: \"dc3c71f3-cecf-4916-8841-0e557aad23d6\") " pod="openstack/ironic-db-sync-l9ds6" Mar 13 01:45:17.541666 master-0 kubenswrapper[19170]: I0313 01:45:17.531916 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/dc3c71f3-cecf-4916-8841-0e557aad23d6-etc-podinfo\") pod \"ironic-db-sync-l9ds6\" (UID: 
\"dc3c71f3-cecf-4916-8841-0e557aad23d6\") " pod="openstack/ironic-db-sync-l9ds6" Mar 13 01:45:17.541666 master-0 kubenswrapper[19170]: I0313 01:45:17.532100 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc3c71f3-cecf-4916-8841-0e557aad23d6-scripts\") pod \"ironic-db-sync-l9ds6\" (UID: \"dc3c71f3-cecf-4916-8841-0e557aad23d6\") " pod="openstack/ironic-db-sync-l9ds6" Mar 13 01:45:17.541666 master-0 kubenswrapper[19170]: I0313 01:45:17.532216 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56r7z\" (UniqueName: \"kubernetes.io/projected/dc3c71f3-cecf-4916-8841-0e557aad23d6-kube-api-access-56r7z\") pod \"ironic-db-sync-l9ds6\" (UID: \"dc3c71f3-cecf-4916-8841-0e557aad23d6\") " pod="openstack/ironic-db-sync-l9ds6" Mar 13 01:45:17.541666 master-0 kubenswrapper[19170]: I0313 01:45:17.532771 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sx595\" (UniqueName: \"kubernetes.io/projected/8b60380b-c057-44ba-a4fa-e88d67bde60a-kube-api-access-sx595\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:17.541666 master-0 kubenswrapper[19170]: I0313 01:45:17.532786 19170 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b60380b-c057-44ba-a4fa-e88d67bde60a-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:17.541666 master-0 kubenswrapper[19170]: I0313 01:45:17.532795 19170 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8b60380b-c057-44ba-a4fa-e88d67bde60a-credential-keys\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:17.541666 master-0 kubenswrapper[19170]: I0313 01:45:17.532803 19170 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8b60380b-c057-44ba-a4fa-e88d67bde60a-fernet-keys\") on node \"master-0\" 
DevicePath \"\"" Mar 13 01:45:17.541666 master-0 kubenswrapper[19170]: I0313 01:45:17.532813 19170 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b60380b-c057-44ba-a4fa-e88d67bde60a-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:17.541666 master-0 kubenswrapper[19170]: I0313 01:45:17.532822 19170 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b60380b-c057-44ba-a4fa-e88d67bde60a-config-data\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:17.541666 master-0 kubenswrapper[19170]: I0313 01:45:17.533533 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/dc3c71f3-cecf-4916-8841-0e557aad23d6-config-data-merged\") pod \"ironic-db-sync-l9ds6\" (UID: \"dc3c71f3-cecf-4916-8841-0e557aad23d6\") " pod="openstack/ironic-db-sync-l9ds6" Mar 13 01:45:17.541666 master-0 kubenswrapper[19170]: I0313 01:45:17.536359 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc3c71f3-cecf-4916-8841-0e557aad23d6-combined-ca-bundle\") pod \"ironic-db-sync-l9ds6\" (UID: \"dc3c71f3-cecf-4916-8841-0e557aad23d6\") " pod="openstack/ironic-db-sync-l9ds6" Mar 13 01:45:17.541666 master-0 kubenswrapper[19170]: I0313 01:45:17.538557 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/dc3c71f3-cecf-4916-8841-0e557aad23d6-etc-podinfo\") pod \"ironic-db-sync-l9ds6\" (UID: \"dc3c71f3-cecf-4916-8841-0e557aad23d6\") " pod="openstack/ironic-db-sync-l9ds6" Mar 13 01:45:17.541666 master-0 kubenswrapper[19170]: I0313 01:45:17.540576 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc3c71f3-cecf-4916-8841-0e557aad23d6-scripts\") pod \"ironic-db-sync-l9ds6\" (UID: 
\"dc3c71f3-cecf-4916-8841-0e557aad23d6\") " pod="openstack/ironic-db-sync-l9ds6" Mar 13 01:45:17.542585 master-0 kubenswrapper[19170]: I0313 01:45:17.542023 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc3c71f3-cecf-4916-8841-0e557aad23d6-config-data\") pod \"ironic-db-sync-l9ds6\" (UID: \"dc3c71f3-cecf-4916-8841-0e557aad23d6\") " pod="openstack/ironic-db-sync-l9ds6" Mar 13 01:45:17.549779 master-0 kubenswrapper[19170]: I0313 01:45:17.548219 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56r7z\" (UniqueName: \"kubernetes.io/projected/dc3c71f3-cecf-4916-8841-0e557aad23d6-kube-api-access-56r7z\") pod \"ironic-db-sync-l9ds6\" (UID: \"dc3c71f3-cecf-4916-8841-0e557aad23d6\") " pod="openstack/ironic-db-sync-l9ds6" Mar 13 01:45:17.669736 master-0 kubenswrapper[19170]: I0313 01:45:17.669560 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-sync-l9ds6" Mar 13 01:45:17.701540 master-0 kubenswrapper[19170]: I0313 01:45:17.701380 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-hq7k2" Mar 13 01:45:17.701540 master-0 kubenswrapper[19170]: I0313 01:45:17.701393 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hq7k2" event={"ID":"8b60380b-c057-44ba-a4fa-e88d67bde60a","Type":"ContainerDied","Data":"4f64665a54c6c8854f17354d839e7f739edb028c539a11ce295616c703f7c01f"} Mar 13 01:45:17.701867 master-0 kubenswrapper[19170]: I0313 01:45:17.701559 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f64665a54c6c8854f17354d839e7f739edb028c539a11ce295616c703f7c01f" Mar 13 01:45:17.703518 master-0 kubenswrapper[19170]: I0313 01:45:17.703479 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b9844-default-internal-api-0" event={"ID":"65b35883-5cf2-442f-b296-139de330497c","Type":"ContainerStarted","Data":"d0964f804766d67708fd67f768828aff39d72de08f1d65a0966190537601f6be"} Mar 13 01:45:17.703716 master-0 kubenswrapper[19170]: I0313 01:45:17.703646 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-b9844-default-internal-api-0" podUID="65b35883-5cf2-442f-b296-139de330497c" containerName="glance-log" containerID="cri-o://c6ffc4b7e72283fb8122d4d8f8c25808e43b58e5fc99e95db2ecef6c012727c3" gracePeriod=30 Mar 13 01:45:17.703793 master-0 kubenswrapper[19170]: I0313 01:45:17.703691 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-b9844-default-internal-api-0" podUID="65b35883-5cf2-442f-b296-139de330497c" containerName="glance-httpd" containerID="cri-o://d0964f804766d67708fd67f768828aff39d72de08f1d65a0966190537601f6be" gracePeriod=30 Mar 13 01:45:17.793465 master-0 kubenswrapper[19170]: I0313 01:45:17.793373 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-b9844-default-internal-api-0" podStartSLOduration=6.793350538 podStartE2EDuration="6.793350538s" 
podCreationTimestamp="2026-03-13 01:45:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:45:17.787993577 +0000 UTC m=+1578.596114537" watchObservedRunningTime="2026-03-13 01:45:17.793350538 +0000 UTC m=+1578.601471498" Mar 13 01:45:18.062694 master-0 kubenswrapper[19170]: E0313 01:45:18.060841 19170 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65b35883_5cf2_442f_b296_139de330497c.slice/crio-conmon-d0964f804766d67708fd67f768828aff39d72de08f1d65a0966190537601f6be.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65b35883_5cf2_442f_b296_139de330497c.slice/crio-d0964f804766d67708fd67f768828aff39d72de08f1d65a0966190537601f6be.scope\": RecentStats: unable to find data in memory cache]" Mar 13 01:45:18.339880 master-0 kubenswrapper[19170]: I0313 01:45:18.339800 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-sync-l9ds6"] Mar 13 01:45:18.348388 master-0 kubenswrapper[19170]: W0313 01:45:18.346776 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc3c71f3_cecf_4916_8841_0e557aad23d6.slice/crio-464285cf1424e1182ab2d281529561afcee621bd4c4764a14903339d0a63b090 WatchSource:0}: Error finding container 464285cf1424e1182ab2d281529561afcee621bd4c4764a14903339d0a63b090: Status 404 returned error can't find the container with id 464285cf1424e1182ab2d281529561afcee621bd4c4764a14903339d0a63b090 Mar 13 01:45:18.433719 master-0 kubenswrapper[19170]: I0313 01:45:18.432982 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56c5578c7c-7p22w" Mar 13 01:45:18.474101 master-0 kubenswrapper[19170]: I0313 01:45:18.470386 
19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b9844-default-external-api-0"] Mar 13 01:45:18.497370 master-0 kubenswrapper[19170]: I0313 01:45:18.495859 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-hq7k2"] Mar 13 01:45:18.522108 master-0 kubenswrapper[19170]: I0313 01:45:18.521422 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-hq7k2"] Mar 13 01:45:18.602895 master-0 kubenswrapper[19170]: I0313 01:45:18.600026 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76986c7db5-cp72m"] Mar 13 01:45:18.602895 master-0 kubenswrapper[19170]: I0313 01:45:18.600310 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76986c7db5-cp72m" podUID="8679fcd5-4aee-4742-8f0f-fa761f7f5b88" containerName="dnsmasq-dns" containerID="cri-o://f72d6958b3b63b4c7165dcb5dd5530fc00e3702e0113f143add503cb55582c35" gracePeriod=10 Mar 13 01:45:18.665529 master-0 kubenswrapper[19170]: I0313 01:45:18.665099 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-cx8d7"] Mar 13 01:45:18.668919 master-0 kubenswrapper[19170]: I0313 01:45:18.666752 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-cx8d7" Mar 13 01:45:18.697427 master-0 kubenswrapper[19170]: I0313 01:45:18.697164 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 13 01:45:18.697427 master-0 kubenswrapper[19170]: I0313 01:45:18.697215 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 13 01:45:18.697427 master-0 kubenswrapper[19170]: I0313 01:45:18.697300 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 13 01:45:18.706816 master-0 kubenswrapper[19170]: I0313 01:45:18.706738 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-cx8d7"] Mar 13 01:45:18.809533 master-0 kubenswrapper[19170]: I0313 01:45:18.805564 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-l9ds6" event={"ID":"dc3c71f3-cecf-4916-8841-0e557aad23d6","Type":"ContainerStarted","Data":"464285cf1424e1182ab2d281529561afcee621bd4c4764a14903339d0a63b090"} Mar 13 01:45:18.818722 master-0 kubenswrapper[19170]: I0313 01:45:18.818318 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b9844-default-external-api-0" event={"ID":"9f05049d-6514-404b-9823-7128787e6180","Type":"ContainerStarted","Data":"b8354c826a4e72e97c0c6fd21642fa8048d556932c1291688e3e2e9d4245cbe9"} Mar 13 01:45:18.829425 master-0 kubenswrapper[19170]: I0313 01:45:18.829351 19170 generic.go:334] "Generic (PLEG): container finished" podID="65b35883-5cf2-442f-b296-139de330497c" containerID="d0964f804766d67708fd67f768828aff39d72de08f1d65a0966190537601f6be" exitCode=0 Mar 13 01:45:18.829425 master-0 kubenswrapper[19170]: I0313 01:45:18.829420 19170 generic.go:334] "Generic (PLEG): container finished" podID="65b35883-5cf2-442f-b296-139de330497c" containerID="c6ffc4b7e72283fb8122d4d8f8c25808e43b58e5fc99e95db2ecef6c012727c3" exitCode=143 Mar 13 01:45:18.829692 master-0 
kubenswrapper[19170]: I0313 01:45:18.829456 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b9844-default-internal-api-0" event={"ID":"65b35883-5cf2-442f-b296-139de330497c","Type":"ContainerDied","Data":"d0964f804766d67708fd67f768828aff39d72de08f1d65a0966190537601f6be"} Mar 13 01:45:18.829692 master-0 kubenswrapper[19170]: I0313 01:45:18.829503 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b9844-default-internal-api-0" event={"ID":"65b35883-5cf2-442f-b296-139de330497c","Type":"ContainerDied","Data":"c6ffc4b7e72283fb8122d4d8f8c25808e43b58e5fc99e95db2ecef6c012727c3"} Mar 13 01:45:18.836540 master-0 kubenswrapper[19170]: I0313 01:45:18.833769 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e032a333-aa26-44b3-8874-6610d050833e-fernet-keys\") pod \"keystone-bootstrap-cx8d7\" (UID: \"e032a333-aa26-44b3-8874-6610d050833e\") " pod="openstack/keystone-bootstrap-cx8d7" Mar 13 01:45:18.836540 master-0 kubenswrapper[19170]: I0313 01:45:18.833967 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9qqp\" (UniqueName: \"kubernetes.io/projected/e032a333-aa26-44b3-8874-6610d050833e-kube-api-access-d9qqp\") pod \"keystone-bootstrap-cx8d7\" (UID: \"e032a333-aa26-44b3-8874-6610d050833e\") " pod="openstack/keystone-bootstrap-cx8d7" Mar 13 01:45:18.836540 master-0 kubenswrapper[19170]: I0313 01:45:18.834068 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e032a333-aa26-44b3-8874-6610d050833e-combined-ca-bundle\") pod \"keystone-bootstrap-cx8d7\" (UID: \"e032a333-aa26-44b3-8874-6610d050833e\") " pod="openstack/keystone-bootstrap-cx8d7" Mar 13 01:45:18.836540 master-0 kubenswrapper[19170]: I0313 01:45:18.834166 19170 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e032a333-aa26-44b3-8874-6610d050833e-config-data\") pod \"keystone-bootstrap-cx8d7\" (UID: \"e032a333-aa26-44b3-8874-6610d050833e\") " pod="openstack/keystone-bootstrap-cx8d7" Mar 13 01:45:18.836540 master-0 kubenswrapper[19170]: I0313 01:45:18.834252 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e032a333-aa26-44b3-8874-6610d050833e-credential-keys\") pod \"keystone-bootstrap-cx8d7\" (UID: \"e032a333-aa26-44b3-8874-6610d050833e\") " pod="openstack/keystone-bootstrap-cx8d7" Mar 13 01:45:18.836540 master-0 kubenswrapper[19170]: I0313 01:45:18.834286 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e032a333-aa26-44b3-8874-6610d050833e-scripts\") pod \"keystone-bootstrap-cx8d7\" (UID: \"e032a333-aa26-44b3-8874-6610d050833e\") " pod="openstack/keystone-bootstrap-cx8d7" Mar 13 01:45:18.941654 master-0 kubenswrapper[19170]: I0313 01:45:18.941465 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e032a333-aa26-44b3-8874-6610d050833e-credential-keys\") pod \"keystone-bootstrap-cx8d7\" (UID: \"e032a333-aa26-44b3-8874-6610d050833e\") " pod="openstack/keystone-bootstrap-cx8d7" Mar 13 01:45:18.941654 master-0 kubenswrapper[19170]: I0313 01:45:18.941656 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e032a333-aa26-44b3-8874-6610d050833e-scripts\") pod \"keystone-bootstrap-cx8d7\" (UID: \"e032a333-aa26-44b3-8874-6610d050833e\") " pod="openstack/keystone-bootstrap-cx8d7" Mar 13 01:45:18.942326 master-0 kubenswrapper[19170]: I0313 01:45:18.941721 19170 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e032a333-aa26-44b3-8874-6610d050833e-fernet-keys\") pod \"keystone-bootstrap-cx8d7\" (UID: \"e032a333-aa26-44b3-8874-6610d050833e\") " pod="openstack/keystone-bootstrap-cx8d7" Mar 13 01:45:18.942326 master-0 kubenswrapper[19170]: I0313 01:45:18.941782 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9qqp\" (UniqueName: \"kubernetes.io/projected/e032a333-aa26-44b3-8874-6610d050833e-kube-api-access-d9qqp\") pod \"keystone-bootstrap-cx8d7\" (UID: \"e032a333-aa26-44b3-8874-6610d050833e\") " pod="openstack/keystone-bootstrap-cx8d7" Mar 13 01:45:18.952093 master-0 kubenswrapper[19170]: I0313 01:45:18.951530 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e032a333-aa26-44b3-8874-6610d050833e-scripts\") pod \"keystone-bootstrap-cx8d7\" (UID: \"e032a333-aa26-44b3-8874-6610d050833e\") " pod="openstack/keystone-bootstrap-cx8d7" Mar 13 01:45:18.952989 master-0 kubenswrapper[19170]: I0313 01:45:18.952959 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e032a333-aa26-44b3-8874-6610d050833e-combined-ca-bundle\") pod \"keystone-bootstrap-cx8d7\" (UID: \"e032a333-aa26-44b3-8874-6610d050833e\") " pod="openstack/keystone-bootstrap-cx8d7" Mar 13 01:45:18.953249 master-0 kubenswrapper[19170]: I0313 01:45:18.953228 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e032a333-aa26-44b3-8874-6610d050833e-config-data\") pod \"keystone-bootstrap-cx8d7\" (UID: \"e032a333-aa26-44b3-8874-6610d050833e\") " pod="openstack/keystone-bootstrap-cx8d7" Mar 13 01:45:18.957409 master-0 kubenswrapper[19170]: I0313 01:45:18.956650 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e032a333-aa26-44b3-8874-6610d050833e-fernet-keys\") pod \"keystone-bootstrap-cx8d7\" (UID: \"e032a333-aa26-44b3-8874-6610d050833e\") " pod="openstack/keystone-bootstrap-cx8d7" Mar 13 01:45:18.957409 master-0 kubenswrapper[19170]: I0313 01:45:18.956822 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e032a333-aa26-44b3-8874-6610d050833e-combined-ca-bundle\") pod \"keystone-bootstrap-cx8d7\" (UID: \"e032a333-aa26-44b3-8874-6610d050833e\") " pod="openstack/keystone-bootstrap-cx8d7" Mar 13 01:45:18.963926 master-0 kubenswrapper[19170]: I0313 01:45:18.963895 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e032a333-aa26-44b3-8874-6610d050833e-config-data\") pod \"keystone-bootstrap-cx8d7\" (UID: \"e032a333-aa26-44b3-8874-6610d050833e\") " pod="openstack/keystone-bootstrap-cx8d7" Mar 13 01:45:18.967228 master-0 kubenswrapper[19170]: I0313 01:45:18.967169 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9qqp\" (UniqueName: \"kubernetes.io/projected/e032a333-aa26-44b3-8874-6610d050833e-kube-api-access-d9qqp\") pod \"keystone-bootstrap-cx8d7\" (UID: \"e032a333-aa26-44b3-8874-6610d050833e\") " pod="openstack/keystone-bootstrap-cx8d7" Mar 13 01:45:18.973481 master-0 kubenswrapper[19170]: I0313 01:45:18.973247 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e032a333-aa26-44b3-8874-6610d050833e-credential-keys\") pod \"keystone-bootstrap-cx8d7\" (UID: \"e032a333-aa26-44b3-8874-6610d050833e\") " pod="openstack/keystone-bootstrap-cx8d7" Mar 13 01:45:19.013165 master-0 kubenswrapper[19170]: I0313 01:45:19.013102 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:19.057478 master-0 kubenswrapper[19170]: I0313 01:45:19.057420 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-cx8d7" Mar 13 01:45:19.157878 master-0 kubenswrapper[19170]: I0313 01:45:19.156755 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65b35883-5cf2-442f-b296-139de330497c-logs\") pod \"65b35883-5cf2-442f-b296-139de330497c\" (UID: \"65b35883-5cf2-442f-b296-139de330497c\") " Mar 13 01:45:19.157878 master-0 kubenswrapper[19170]: I0313 01:45:19.156825 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c95ts\" (UniqueName: \"kubernetes.io/projected/65b35883-5cf2-442f-b296-139de330497c-kube-api-access-c95ts\") pod \"65b35883-5cf2-442f-b296-139de330497c\" (UID: \"65b35883-5cf2-442f-b296-139de330497c\") " Mar 13 01:45:19.157878 master-0 kubenswrapper[19170]: I0313 01:45:19.156940 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/65b35883-5cf2-442f-b296-139de330497c-httpd-run\") pod \"65b35883-5cf2-442f-b296-139de330497c\" (UID: \"65b35883-5cf2-442f-b296-139de330497c\") " Mar 13 01:45:19.157878 master-0 kubenswrapper[19170]: I0313 01:45:19.157252 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65b35883-5cf2-442f-b296-139de330497c-logs" (OuterVolumeSpecName: "logs") pod "65b35883-5cf2-442f-b296-139de330497c" (UID: "65b35883-5cf2-442f-b296-139de330497c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 01:45:19.157878 master-0 kubenswrapper[19170]: I0313 01:45:19.157328 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65b35883-5cf2-442f-b296-139de330497c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "65b35883-5cf2-442f-b296-139de330497c" (UID: "65b35883-5cf2-442f-b296-139de330497c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 01:45:19.159056 master-0 kubenswrapper[19170]: I0313 01:45:19.158949 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^82ad30d6-a170-4ee2-bd43-8bf3a311ef5c\") pod \"65b35883-5cf2-442f-b296-139de330497c\" (UID: \"65b35883-5cf2-442f-b296-139de330497c\") " Mar 13 01:45:19.159215 master-0 kubenswrapper[19170]: I0313 01:45:19.159035 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65b35883-5cf2-442f-b296-139de330497c-config-data\") pod \"65b35883-5cf2-442f-b296-139de330497c\" (UID: \"65b35883-5cf2-442f-b296-139de330497c\") " Mar 13 01:45:19.159215 master-0 kubenswrapper[19170]: I0313 01:45:19.159173 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65b35883-5cf2-442f-b296-139de330497c-combined-ca-bundle\") pod \"65b35883-5cf2-442f-b296-139de330497c\" (UID: \"65b35883-5cf2-442f-b296-139de330497c\") " Mar 13 01:45:19.159454 master-0 kubenswrapper[19170]: I0313 01:45:19.159375 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65b35883-5cf2-442f-b296-139de330497c-scripts\") pod \"65b35883-5cf2-442f-b296-139de330497c\" (UID: \"65b35883-5cf2-442f-b296-139de330497c\") " Mar 13 01:45:19.160463 master-0 kubenswrapper[19170]: I0313 
01:45:19.160087 19170 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/65b35883-5cf2-442f-b296-139de330497c-httpd-run\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:19.160463 master-0 kubenswrapper[19170]: I0313 01:45:19.160108 19170 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65b35883-5cf2-442f-b296-139de330497c-logs\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:19.160787 master-0 kubenswrapper[19170]: I0313 01:45:19.160742 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65b35883-5cf2-442f-b296-139de330497c-kube-api-access-c95ts" (OuterVolumeSpecName: "kube-api-access-c95ts") pod "65b35883-5cf2-442f-b296-139de330497c" (UID: "65b35883-5cf2-442f-b296-139de330497c"). InnerVolumeSpecName "kube-api-access-c95ts". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:45:19.163316 master-0 kubenswrapper[19170]: I0313 01:45:19.163087 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65b35883-5cf2-442f-b296-139de330497c-scripts" (OuterVolumeSpecName: "scripts") pod "65b35883-5cf2-442f-b296-139de330497c" (UID: "65b35883-5cf2-442f-b296-139de330497c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:45:19.204113 master-0 kubenswrapper[19170]: I0313 01:45:19.204044 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65b35883-5cf2-442f-b296-139de330497c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65b35883-5cf2-442f-b296-139de330497c" (UID: "65b35883-5cf2-442f-b296-139de330497c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:45:19.209978 master-0 kubenswrapper[19170]: I0313 01:45:19.209918 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^82ad30d6-a170-4ee2-bd43-8bf3a311ef5c" (OuterVolumeSpecName: "glance") pod "65b35883-5cf2-442f-b296-139de330497c" (UID: "65b35883-5cf2-442f-b296-139de330497c"). InnerVolumeSpecName "pvc-0423f75b-fda4-4619-84ea-2ae13248e4cb". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 13 01:45:19.261818 master-0 kubenswrapper[19170]: I0313 01:45:19.261702 19170 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-0423f75b-fda4-4619-84ea-2ae13248e4cb\" (UniqueName: \"kubernetes.io/csi/topolvm.io^82ad30d6-a170-4ee2-bd43-8bf3a311ef5c\") on node \"master-0\" " Mar 13 01:45:19.261818 master-0 kubenswrapper[19170]: I0313 01:45:19.261753 19170 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65b35883-5cf2-442f-b296-139de330497c-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:19.261818 master-0 kubenswrapper[19170]: I0313 01:45:19.261768 19170 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65b35883-5cf2-442f-b296-139de330497c-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:19.261818 master-0 kubenswrapper[19170]: I0313 01:45:19.261781 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c95ts\" (UniqueName: \"kubernetes.io/projected/65b35883-5cf2-442f-b296-139de330497c-kube-api-access-c95ts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:19.304545 master-0 kubenswrapper[19170]: I0313 01:45:19.290803 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65b35883-5cf2-442f-b296-139de330497c-config-data" (OuterVolumeSpecName: "config-data") pod "65b35883-5cf2-442f-b296-139de330497c" (UID: 
"65b35883-5cf2-442f-b296-139de330497c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:45:19.326146 master-0 kubenswrapper[19170]: I0313 01:45:19.325956 19170 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 13 01:45:19.326146 master-0 kubenswrapper[19170]: I0313 01:45:19.326115 19170 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-0423f75b-fda4-4619-84ea-2ae13248e4cb" (UniqueName: "kubernetes.io/csi/topolvm.io^82ad30d6-a170-4ee2-bd43-8bf3a311ef5c") on node "master-0" Mar 13 01:45:19.364666 master-0 kubenswrapper[19170]: I0313 01:45:19.364592 19170 reconciler_common.go:293] "Volume detached for volume \"pvc-0423f75b-fda4-4619-84ea-2ae13248e4cb\" (UniqueName: \"kubernetes.io/csi/topolvm.io^82ad30d6-a170-4ee2-bd43-8bf3a311ef5c\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:19.364666 master-0 kubenswrapper[19170]: I0313 01:45:19.364626 19170 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65b35883-5cf2-442f-b296-139de330497c-config-data\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:19.367750 master-0 kubenswrapper[19170]: I0313 01:45:19.366904 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76986c7db5-cp72m" Mar 13 01:45:19.449679 master-0 kubenswrapper[19170]: I0313 01:45:19.444269 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b60380b-c057-44ba-a4fa-e88d67bde60a" path="/var/lib/kubelet/pods/8b60380b-c057-44ba-a4fa-e88d67bde60a/volumes" Mar 13 01:45:19.582004 master-0 kubenswrapper[19170]: I0313 01:45:19.581941 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzwt7\" (UniqueName: \"kubernetes.io/projected/8679fcd5-4aee-4742-8f0f-fa761f7f5b88-kube-api-access-jzwt7\") pod \"8679fcd5-4aee-4742-8f0f-fa761f7f5b88\" (UID: \"8679fcd5-4aee-4742-8f0f-fa761f7f5b88\") " Mar 13 01:45:19.587404 master-0 kubenswrapper[19170]: I0313 01:45:19.582030 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8679fcd5-4aee-4742-8f0f-fa761f7f5b88-config\") pod \"8679fcd5-4aee-4742-8f0f-fa761f7f5b88\" (UID: \"8679fcd5-4aee-4742-8f0f-fa761f7f5b88\") " Mar 13 01:45:19.587404 master-0 kubenswrapper[19170]: I0313 01:45:19.582074 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8679fcd5-4aee-4742-8f0f-fa761f7f5b88-dns-swift-storage-0\") pod \"8679fcd5-4aee-4742-8f0f-fa761f7f5b88\" (UID: \"8679fcd5-4aee-4742-8f0f-fa761f7f5b88\") " Mar 13 01:45:19.587404 master-0 kubenswrapper[19170]: I0313 01:45:19.582142 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8679fcd5-4aee-4742-8f0f-fa761f7f5b88-ovsdbserver-sb\") pod \"8679fcd5-4aee-4742-8f0f-fa761f7f5b88\" (UID: \"8679fcd5-4aee-4742-8f0f-fa761f7f5b88\") " Mar 13 01:45:19.587404 master-0 kubenswrapper[19170]: I0313 01:45:19.582237 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/8679fcd5-4aee-4742-8f0f-fa761f7f5b88-ovsdbserver-nb\") pod \"8679fcd5-4aee-4742-8f0f-fa761f7f5b88\" (UID: \"8679fcd5-4aee-4742-8f0f-fa761f7f5b88\") " Mar 13 01:45:19.587404 master-0 kubenswrapper[19170]: I0313 01:45:19.582276 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8679fcd5-4aee-4742-8f0f-fa761f7f5b88-dns-svc\") pod \"8679fcd5-4aee-4742-8f0f-fa761f7f5b88\" (UID: \"8679fcd5-4aee-4742-8f0f-fa761f7f5b88\") " Mar 13 01:45:19.588984 master-0 kubenswrapper[19170]: I0313 01:45:19.588938 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8679fcd5-4aee-4742-8f0f-fa761f7f5b88-kube-api-access-jzwt7" (OuterVolumeSpecName: "kube-api-access-jzwt7") pod "8679fcd5-4aee-4742-8f0f-fa761f7f5b88" (UID: "8679fcd5-4aee-4742-8f0f-fa761f7f5b88"). InnerVolumeSpecName "kube-api-access-jzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:45:19.618716 master-0 kubenswrapper[19170]: I0313 01:45:19.618662 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-cx8d7"] Mar 13 01:45:19.636834 master-0 kubenswrapper[19170]: W0313 01:45:19.636699 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode032a333_aa26_44b3_8874_6610d050833e.slice/crio-82d850814d922c65a84caae3e8626bee3bdd020117951e31cf613f879de5d5de WatchSource:0}: Error finding container 82d850814d922c65a84caae3e8626bee3bdd020117951e31cf613f879de5d5de: Status 404 returned error can't find the container with id 82d850814d922c65a84caae3e8626bee3bdd020117951e31cf613f879de5d5de Mar 13 01:45:19.640689 master-0 kubenswrapper[19170]: I0313 01:45:19.640613 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8679fcd5-4aee-4742-8f0f-fa761f7f5b88-ovsdbserver-nb" 
(OuterVolumeSpecName: "ovsdbserver-nb") pod "8679fcd5-4aee-4742-8f0f-fa761f7f5b88" (UID: "8679fcd5-4aee-4742-8f0f-fa761f7f5b88"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:45:19.661492 master-0 kubenswrapper[19170]: I0313 01:45:19.661440 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8679fcd5-4aee-4742-8f0f-fa761f7f5b88-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8679fcd5-4aee-4742-8f0f-fa761f7f5b88" (UID: "8679fcd5-4aee-4742-8f0f-fa761f7f5b88"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:45:19.670078 master-0 kubenswrapper[19170]: I0313 01:45:19.670031 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8679fcd5-4aee-4742-8f0f-fa761f7f5b88-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8679fcd5-4aee-4742-8f0f-fa761f7f5b88" (UID: "8679fcd5-4aee-4742-8f0f-fa761f7f5b88"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:45:19.673721 master-0 kubenswrapper[19170]: I0313 01:45:19.673621 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8679fcd5-4aee-4742-8f0f-fa761f7f5b88-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8679fcd5-4aee-4742-8f0f-fa761f7f5b88" (UID: "8679fcd5-4aee-4742-8f0f-fa761f7f5b88"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:45:19.685852 master-0 kubenswrapper[19170]: I0313 01:45:19.685802 19170 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8679fcd5-4aee-4742-8f0f-fa761f7f5b88-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:19.685852 master-0 kubenswrapper[19170]: I0313 01:45:19.685841 19170 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8679fcd5-4aee-4742-8f0f-fa761f7f5b88-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:19.685852 master-0 kubenswrapper[19170]: I0313 01:45:19.685859 19170 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8679fcd5-4aee-4742-8f0f-fa761f7f5b88-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:19.686406 master-0 kubenswrapper[19170]: I0313 01:45:19.685871 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzwt7\" (UniqueName: \"kubernetes.io/projected/8679fcd5-4aee-4742-8f0f-fa761f7f5b88-kube-api-access-jzwt7\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:19.686406 master-0 kubenswrapper[19170]: I0313 01:45:19.685882 19170 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8679fcd5-4aee-4742-8f0f-fa761f7f5b88-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:19.700813 master-0 kubenswrapper[19170]: I0313 01:45:19.700712 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8679fcd5-4aee-4742-8f0f-fa761f7f5b88-config" (OuterVolumeSpecName: "config") pod "8679fcd5-4aee-4742-8f0f-fa761f7f5b88" (UID: "8679fcd5-4aee-4742-8f0f-fa761f7f5b88"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:45:19.788776 master-0 kubenswrapper[19170]: I0313 01:45:19.788707 19170 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8679fcd5-4aee-4742-8f0f-fa761f7f5b88-config\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:19.859684 master-0 kubenswrapper[19170]: I0313 01:45:19.858086 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b9844-default-external-api-0" event={"ID":"9f05049d-6514-404b-9823-7128787e6180","Type":"ContainerStarted","Data":"b28be5b9ba285c4637e404d29769ea6b5c3a1352e7a608eae6639721d9670d97"} Mar 13 01:45:19.864704 master-0 kubenswrapper[19170]: I0313 01:45:19.860561 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cx8d7" event={"ID":"e032a333-aa26-44b3-8874-6610d050833e","Type":"ContainerStarted","Data":"82d850814d922c65a84caae3e8626bee3bdd020117951e31cf613f879de5d5de"} Mar 13 01:45:19.864704 master-0 kubenswrapper[19170]: I0313 01:45:19.863107 19170 generic.go:334] "Generic (PLEG): container finished" podID="e445697b-863d-4e96-8ccb-581dacecddcd" containerID="26c720c0b5d40fee814f609e5fa85619f7b58f50441d5ded3c73027f849fbba7" exitCode=0 Mar 13 01:45:19.864704 master-0 kubenswrapper[19170]: I0313 01:45:19.863170 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mmsc6" event={"ID":"e445697b-863d-4e96-8ccb-581dacecddcd","Type":"ContainerDied","Data":"26c720c0b5d40fee814f609e5fa85619f7b58f50441d5ded3c73027f849fbba7"} Mar 13 01:45:19.868684 master-0 kubenswrapper[19170]: I0313 01:45:19.868417 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:19.868684 master-0 kubenswrapper[19170]: I0313 01:45:19.868415 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b9844-default-internal-api-0" event={"ID":"65b35883-5cf2-442f-b296-139de330497c","Type":"ContainerDied","Data":"005317ec2ac54ee5ebd6d223ea494e7e4095d87d48dc8d2a3b4fe2468c5481a5"} Mar 13 01:45:19.868684 master-0 kubenswrapper[19170]: I0313 01:45:19.868619 19170 scope.go:117] "RemoveContainer" containerID="d0964f804766d67708fd67f768828aff39d72de08f1d65a0966190537601f6be" Mar 13 01:45:19.873524 master-0 kubenswrapper[19170]: I0313 01:45:19.873405 19170 generic.go:334] "Generic (PLEG): container finished" podID="8679fcd5-4aee-4742-8f0f-fa761f7f5b88" containerID="f72d6958b3b63b4c7165dcb5dd5530fc00e3702e0113f143add503cb55582c35" exitCode=0 Mar 13 01:45:19.873524 master-0 kubenswrapper[19170]: I0313 01:45:19.873441 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76986c7db5-cp72m" event={"ID":"8679fcd5-4aee-4742-8f0f-fa761f7f5b88","Type":"ContainerDied","Data":"f72d6958b3b63b4c7165dcb5dd5530fc00e3702e0113f143add503cb55582c35"} Mar 13 01:45:19.873524 master-0 kubenswrapper[19170]: I0313 01:45:19.873464 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76986c7db5-cp72m" event={"ID":"8679fcd5-4aee-4742-8f0f-fa761f7f5b88","Type":"ContainerDied","Data":"620aeb8240c3c5fd35392c7249faef1b781c228bde7aacd7f3a86e3b3f6af909"} Mar 13 01:45:19.873524 master-0 kubenswrapper[19170]: I0313 01:45:19.873512 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76986c7db5-cp72m" Mar 13 01:45:19.885268 master-0 kubenswrapper[19170]: I0313 01:45:19.883910 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-cx8d7" podStartSLOduration=1.8838904250000001 podStartE2EDuration="1.883890425s" podCreationTimestamp="2026-03-13 01:45:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:45:19.881599871 +0000 UTC m=+1580.689720831" watchObservedRunningTime="2026-03-13 01:45:19.883890425 +0000 UTC m=+1580.692011385" Mar 13 01:45:19.954076 master-0 kubenswrapper[19170]: I0313 01:45:19.954005 19170 scope.go:117] "RemoveContainer" containerID="c6ffc4b7e72283fb8122d4d8f8c25808e43b58e5fc99e95db2ecef6c012727c3" Mar 13 01:45:19.985862 master-0 kubenswrapper[19170]: I0313 01:45:19.985810 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b9844-default-internal-api-0"] Mar 13 01:45:20.007996 master-0 kubenswrapper[19170]: I0313 01:45:20.007938 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-b9844-default-internal-api-0"] Mar 13 01:45:20.037894 master-0 kubenswrapper[19170]: I0313 01:45:20.033041 19170 scope.go:117] "RemoveContainer" containerID="f72d6958b3b63b4c7165dcb5dd5530fc00e3702e0113f143add503cb55582c35" Mar 13 01:45:20.056453 master-0 kubenswrapper[19170]: I0313 01:45:20.056383 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76986c7db5-cp72m"] Mar 13 01:45:20.071521 master-0 kubenswrapper[19170]: I0313 01:45:20.071458 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76986c7db5-cp72m"] Mar 13 01:45:20.080213 master-0 kubenswrapper[19170]: I0313 01:45:20.080007 19170 scope.go:117] "RemoveContainer" containerID="8d6c2c705ad73536235b05189413975bfd39f4ea5d81ee51093ad695ce664c84" Mar 13 01:45:20.091836 master-0 
kubenswrapper[19170]: I0313 01:45:20.091798 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-b9844-default-internal-api-0"] Mar 13 01:45:20.092276 master-0 kubenswrapper[19170]: E0313 01:45:20.092253 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8679fcd5-4aee-4742-8f0f-fa761f7f5b88" containerName="dnsmasq-dns" Mar 13 01:45:20.092276 master-0 kubenswrapper[19170]: I0313 01:45:20.092271 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="8679fcd5-4aee-4742-8f0f-fa761f7f5b88" containerName="dnsmasq-dns" Mar 13 01:45:20.092361 master-0 kubenswrapper[19170]: E0313 01:45:20.092288 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65b35883-5cf2-442f-b296-139de330497c" containerName="glance-log" Mar 13 01:45:20.092361 master-0 kubenswrapper[19170]: I0313 01:45:20.092297 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="65b35883-5cf2-442f-b296-139de330497c" containerName="glance-log" Mar 13 01:45:20.092361 master-0 kubenswrapper[19170]: E0313 01:45:20.092350 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8679fcd5-4aee-4742-8f0f-fa761f7f5b88" containerName="init" Mar 13 01:45:20.092361 master-0 kubenswrapper[19170]: I0313 01:45:20.092357 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="8679fcd5-4aee-4742-8f0f-fa761f7f5b88" containerName="init" Mar 13 01:45:20.092482 master-0 kubenswrapper[19170]: E0313 01:45:20.092380 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65b35883-5cf2-442f-b296-139de330497c" containerName="glance-httpd" Mar 13 01:45:20.092482 master-0 kubenswrapper[19170]: I0313 01:45:20.092387 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="65b35883-5cf2-442f-b296-139de330497c" containerName="glance-httpd" Mar 13 01:45:20.092596 master-0 kubenswrapper[19170]: I0313 01:45:20.092569 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="65b35883-5cf2-442f-b296-139de330497c" 
containerName="glance-log" Mar 13 01:45:20.092644 master-0 kubenswrapper[19170]: I0313 01:45:20.092611 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="8679fcd5-4aee-4742-8f0f-fa761f7f5b88" containerName="dnsmasq-dns" Mar 13 01:45:20.092644 master-0 kubenswrapper[19170]: I0313 01:45:20.092626 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="65b35883-5cf2-442f-b296-139de330497c" containerName="glance-httpd" Mar 13 01:45:20.093830 master-0 kubenswrapper[19170]: I0313 01:45:20.093806 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:20.097616 master-0 kubenswrapper[19170]: I0313 01:45:20.097579 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 13 01:45:20.097815 master-0 kubenswrapper[19170]: I0313 01:45:20.097772 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-b9844-default-internal-config-data" Mar 13 01:45:20.105535 master-0 kubenswrapper[19170]: I0313 01:45:20.105283 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b9844-default-internal-api-0"] Mar 13 01:45:20.106866 master-0 kubenswrapper[19170]: I0313 01:45:20.106716 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af162e22-de53-4f80-a7a9-877bda3e9740-scripts\") pod \"glance-b9844-default-internal-api-0\" (UID: \"af162e22-de53-4f80-a7a9-877bda3e9740\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:20.107624 master-0 kubenswrapper[19170]: I0313 01:45:20.107343 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af162e22-de53-4f80-a7a9-877bda3e9740-logs\") pod \"glance-b9844-default-internal-api-0\" (UID: 
\"af162e22-de53-4f80-a7a9-877bda3e9740\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:20.107877 master-0 kubenswrapper[19170]: I0313 01:45:20.107805 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0423f75b-fda4-4619-84ea-2ae13248e4cb\" (UniqueName: \"kubernetes.io/csi/topolvm.io^82ad30d6-a170-4ee2-bd43-8bf3a311ef5c\") pod \"glance-b9844-default-internal-api-0\" (UID: \"af162e22-de53-4f80-a7a9-877bda3e9740\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:20.107877 master-0 kubenswrapper[19170]: I0313 01:45:20.107830 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af162e22-de53-4f80-a7a9-877bda3e9740-config-data\") pod \"glance-b9844-default-internal-api-0\" (UID: \"af162e22-de53-4f80-a7a9-877bda3e9740\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:20.107877 master-0 kubenswrapper[19170]: I0313 01:45:20.107860 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/af162e22-de53-4f80-a7a9-877bda3e9740-httpd-run\") pod \"glance-b9844-default-internal-api-0\" (UID: \"af162e22-de53-4f80-a7a9-877bda3e9740\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:20.107971 master-0 kubenswrapper[19170]: I0313 01:45:20.107882 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af162e22-de53-4f80-a7a9-877bda3e9740-internal-tls-certs\") pod \"glance-b9844-default-internal-api-0\" (UID: \"af162e22-de53-4f80-a7a9-877bda3e9740\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:20.107971 master-0 kubenswrapper[19170]: I0313 01:45:20.107938 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-s62hg\" (UniqueName: \"kubernetes.io/projected/af162e22-de53-4f80-a7a9-877bda3e9740-kube-api-access-s62hg\") pod \"glance-b9844-default-internal-api-0\" (UID: \"af162e22-de53-4f80-a7a9-877bda3e9740\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:20.108074 master-0 kubenswrapper[19170]: I0313 01:45:20.107982 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af162e22-de53-4f80-a7a9-877bda3e9740-combined-ca-bundle\") pod \"glance-b9844-default-internal-api-0\" (UID: \"af162e22-de53-4f80-a7a9-877bda3e9740\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:20.168913 master-0 kubenswrapper[19170]: I0313 01:45:20.168143 19170 scope.go:117] "RemoveContainer" containerID="f72d6958b3b63b4c7165dcb5dd5530fc00e3702e0113f143add503cb55582c35" Mar 13 01:45:20.168913 master-0 kubenswrapper[19170]: E0313 01:45:20.168781 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f72d6958b3b63b4c7165dcb5dd5530fc00e3702e0113f143add503cb55582c35\": container with ID starting with f72d6958b3b63b4c7165dcb5dd5530fc00e3702e0113f143add503cb55582c35 not found: ID does not exist" containerID="f72d6958b3b63b4c7165dcb5dd5530fc00e3702e0113f143add503cb55582c35" Mar 13 01:45:20.168913 master-0 kubenswrapper[19170]: I0313 01:45:20.168827 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f72d6958b3b63b4c7165dcb5dd5530fc00e3702e0113f143add503cb55582c35"} err="failed to get container status \"f72d6958b3b63b4c7165dcb5dd5530fc00e3702e0113f143add503cb55582c35\": rpc error: code = NotFound desc = could not find container \"f72d6958b3b63b4c7165dcb5dd5530fc00e3702e0113f143add503cb55582c35\": container with ID starting with f72d6958b3b63b4c7165dcb5dd5530fc00e3702e0113f143add503cb55582c35 not found: ID 
does not exist" Mar 13 01:45:20.171495 master-0 kubenswrapper[19170]: I0313 01:45:20.168858 19170 scope.go:117] "RemoveContainer" containerID="8d6c2c705ad73536235b05189413975bfd39f4ea5d81ee51093ad695ce664c84" Mar 13 01:45:20.171495 master-0 kubenswrapper[19170]: E0313 01:45:20.169416 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d6c2c705ad73536235b05189413975bfd39f4ea5d81ee51093ad695ce664c84\": container with ID starting with 8d6c2c705ad73536235b05189413975bfd39f4ea5d81ee51093ad695ce664c84 not found: ID does not exist" containerID="8d6c2c705ad73536235b05189413975bfd39f4ea5d81ee51093ad695ce664c84" Mar 13 01:45:20.171495 master-0 kubenswrapper[19170]: I0313 01:45:20.169471 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d6c2c705ad73536235b05189413975bfd39f4ea5d81ee51093ad695ce664c84"} err="failed to get container status \"8d6c2c705ad73536235b05189413975bfd39f4ea5d81ee51093ad695ce664c84\": rpc error: code = NotFound desc = could not find container \"8d6c2c705ad73536235b05189413975bfd39f4ea5d81ee51093ad695ce664c84\": container with ID starting with 8d6c2c705ad73536235b05189413975bfd39f4ea5d81ee51093ad695ce664c84 not found: ID does not exist" Mar 13 01:45:20.210721 master-0 kubenswrapper[19170]: I0313 01:45:20.210450 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/af162e22-de53-4f80-a7a9-877bda3e9740-httpd-run\") pod \"glance-b9844-default-internal-api-0\" (UID: \"af162e22-de53-4f80-a7a9-877bda3e9740\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:20.210721 master-0 kubenswrapper[19170]: I0313 01:45:20.210694 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af162e22-de53-4f80-a7a9-877bda3e9740-internal-tls-certs\") pod 
\"glance-b9844-default-internal-api-0\" (UID: \"af162e22-de53-4f80-a7a9-877bda3e9740\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:20.213608 master-0 kubenswrapper[19170]: I0313 01:45:20.211164 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s62hg\" (UniqueName: \"kubernetes.io/projected/af162e22-de53-4f80-a7a9-877bda3e9740-kube-api-access-s62hg\") pod \"glance-b9844-default-internal-api-0\" (UID: \"af162e22-de53-4f80-a7a9-877bda3e9740\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:20.213608 master-0 kubenswrapper[19170]: I0313 01:45:20.211519 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af162e22-de53-4f80-a7a9-877bda3e9740-combined-ca-bundle\") pod \"glance-b9844-default-internal-api-0\" (UID: \"af162e22-de53-4f80-a7a9-877bda3e9740\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:20.213608 master-0 kubenswrapper[19170]: I0313 01:45:20.211784 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af162e22-de53-4f80-a7a9-877bda3e9740-scripts\") pod \"glance-b9844-default-internal-api-0\" (UID: \"af162e22-de53-4f80-a7a9-877bda3e9740\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:20.213608 master-0 kubenswrapper[19170]: I0313 01:45:20.211884 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af162e22-de53-4f80-a7a9-877bda3e9740-logs\") pod \"glance-b9844-default-internal-api-0\" (UID: \"af162e22-de53-4f80-a7a9-877bda3e9740\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:20.213608 master-0 kubenswrapper[19170]: I0313 01:45:20.211931 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/af162e22-de53-4f80-a7a9-877bda3e9740-config-data\") pod \"glance-b9844-default-internal-api-0\" (UID: \"af162e22-de53-4f80-a7a9-877bda3e9740\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:20.213608 master-0 kubenswrapper[19170]: I0313 01:45:20.212190 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0423f75b-fda4-4619-84ea-2ae13248e4cb\" (UniqueName: \"kubernetes.io/csi/topolvm.io^82ad30d6-a170-4ee2-bd43-8bf3a311ef5c\") pod \"glance-b9844-default-internal-api-0\" (UID: \"af162e22-de53-4f80-a7a9-877bda3e9740\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:20.213608 master-0 kubenswrapper[19170]: I0313 01:45:20.212318 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af162e22-de53-4f80-a7a9-877bda3e9740-logs\") pod \"glance-b9844-default-internal-api-0\" (UID: \"af162e22-de53-4f80-a7a9-877bda3e9740\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:20.214445 master-0 kubenswrapper[19170]: I0313 01:45:20.214392 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/af162e22-de53-4f80-a7a9-877bda3e9740-httpd-run\") pod \"glance-b9844-default-internal-api-0\" (UID: \"af162e22-de53-4f80-a7a9-877bda3e9740\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:20.219087 master-0 kubenswrapper[19170]: I0313 01:45:20.219042 19170 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 13 01:45:20.219176 master-0 kubenswrapper[19170]: I0313 01:45:20.219086 19170 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0423f75b-fda4-4619-84ea-2ae13248e4cb\" (UniqueName: \"kubernetes.io/csi/topolvm.io^82ad30d6-a170-4ee2-bd43-8bf3a311ef5c\") pod \"glance-b9844-default-internal-api-0\" (UID: \"af162e22-de53-4f80-a7a9-877bda3e9740\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/34cbbf4fc438e2d42966b02d86ef558a7de843a3eec70be4281d747be1ba2c15/globalmount\"" pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:20.223689 master-0 kubenswrapper[19170]: I0313 01:45:20.223641 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af162e22-de53-4f80-a7a9-877bda3e9740-config-data\") pod \"glance-b9844-default-internal-api-0\" (UID: \"af162e22-de53-4f80-a7a9-877bda3e9740\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:20.225461 master-0 kubenswrapper[19170]: I0313 01:45:20.225205 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af162e22-de53-4f80-a7a9-877bda3e9740-internal-tls-certs\") pod \"glance-b9844-default-internal-api-0\" (UID: \"af162e22-de53-4f80-a7a9-877bda3e9740\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:20.225550 master-0 kubenswrapper[19170]: I0313 01:45:20.225522 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af162e22-de53-4f80-a7a9-877bda3e9740-combined-ca-bundle\") pod \"glance-b9844-default-internal-api-0\" (UID: \"af162e22-de53-4f80-a7a9-877bda3e9740\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:20.232518 master-0 kubenswrapper[19170]: I0313 01:45:20.232463 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s62hg\" (UniqueName: 
\"kubernetes.io/projected/af162e22-de53-4f80-a7a9-877bda3e9740-kube-api-access-s62hg\") pod \"glance-b9844-default-internal-api-0\" (UID: \"af162e22-de53-4f80-a7a9-877bda3e9740\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:20.242422 master-0 kubenswrapper[19170]: I0313 01:45:20.242377 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af162e22-de53-4f80-a7a9-877bda3e9740-scripts\") pod \"glance-b9844-default-internal-api-0\" (UID: \"af162e22-de53-4f80-a7a9-877bda3e9740\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:20.901826 master-0 kubenswrapper[19170]: I0313 01:45:20.894391 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cx8d7" event={"ID":"e032a333-aa26-44b3-8874-6610d050833e","Type":"ContainerStarted","Data":"7b649789005ddb0275b6f67d1487d332af1e1f14a0f43dfc0a9f72dda7ad1631"} Mar 13 01:45:20.917451 master-0 kubenswrapper[19170]: I0313 01:45:20.917367 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-b9844-default-external-api-0" podUID="9f05049d-6514-404b-9823-7128787e6180" containerName="glance-log" containerID="cri-o://b28be5b9ba285c4637e404d29769ea6b5c3a1352e7a608eae6639721d9670d97" gracePeriod=30 Mar 13 01:45:20.917667 master-0 kubenswrapper[19170]: I0313 01:45:20.917507 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b9844-default-external-api-0" event={"ID":"9f05049d-6514-404b-9823-7128787e6180","Type":"ContainerStarted","Data":"debc112def1df092ceb651a5bc98840adf2388fe005096c7b56ef2a250565893"} Mar 13 01:45:20.917989 master-0 kubenswrapper[19170]: I0313 01:45:20.917967 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-b9844-default-external-api-0" podUID="9f05049d-6514-404b-9823-7128787e6180" containerName="glance-httpd" 
containerID="cri-o://debc112def1df092ceb651a5bc98840adf2388fe005096c7b56ef2a250565893" gracePeriod=30 Mar 13 01:45:21.060322 master-0 kubenswrapper[19170]: I0313 01:45:21.060238 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-b9844-default-external-api-0" podStartSLOduration=8.060218533 podStartE2EDuration="8.060218533s" podCreationTimestamp="2026-03-13 01:45:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:45:21.036219607 +0000 UTC m=+1581.844340587" watchObservedRunningTime="2026-03-13 01:45:21.060218533 +0000 UTC m=+1581.868339493" Mar 13 01:45:21.368167 master-0 kubenswrapper[19170]: I0313 01:45:21.367713 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-mmsc6" Mar 13 01:45:21.450939 master-0 kubenswrapper[19170]: I0313 01:45:21.450798 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65b35883-5cf2-442f-b296-139de330497c" path="/var/lib/kubelet/pods/65b35883-5cf2-442f-b296-139de330497c/volumes" Mar 13 01:45:21.452229 master-0 kubenswrapper[19170]: I0313 01:45:21.452198 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8679fcd5-4aee-4742-8f0f-fa761f7f5b88" path="/var/lib/kubelet/pods/8679fcd5-4aee-4742-8f0f-fa761f7f5b88/volumes" Mar 13 01:45:21.517207 master-0 kubenswrapper[19170]: I0313 01:45:21.517157 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0423f75b-fda4-4619-84ea-2ae13248e4cb\" (UniqueName: \"kubernetes.io/csi/topolvm.io^82ad30d6-a170-4ee2-bd43-8bf3a311ef5c\") pod \"glance-b9844-default-internal-api-0\" (UID: \"af162e22-de53-4f80-a7a9-877bda3e9740\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:21.548556 master-0 kubenswrapper[19170]: I0313 01:45:21.548512 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e445697b-863d-4e96-8ccb-581dacecddcd-logs\") pod \"e445697b-863d-4e96-8ccb-581dacecddcd\" (UID: \"e445697b-863d-4e96-8ccb-581dacecddcd\") " Mar 13 01:45:21.548757 master-0 kubenswrapper[19170]: I0313 01:45:21.548636 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e445697b-863d-4e96-8ccb-581dacecddcd-combined-ca-bundle\") pod \"e445697b-863d-4e96-8ccb-581dacecddcd\" (UID: \"e445697b-863d-4e96-8ccb-581dacecddcd\") " Mar 13 01:45:21.548757 master-0 kubenswrapper[19170]: I0313 01:45:21.548753 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e445697b-863d-4e96-8ccb-581dacecddcd-config-data\") pod \"e445697b-863d-4e96-8ccb-581dacecddcd\" (UID: \"e445697b-863d-4e96-8ccb-581dacecddcd\") " Mar 13 01:45:21.548870 master-0 kubenswrapper[19170]: I0313 01:45:21.548855 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e445697b-863d-4e96-8ccb-581dacecddcd-scripts\") pod \"e445697b-863d-4e96-8ccb-581dacecddcd\" (UID: \"e445697b-863d-4e96-8ccb-581dacecddcd\") " Mar 13 01:45:21.548912 master-0 kubenswrapper[19170]: I0313 01:45:21.548903 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwz2g\" (UniqueName: \"kubernetes.io/projected/e445697b-863d-4e96-8ccb-581dacecddcd-kube-api-access-zwz2g\") pod \"e445697b-863d-4e96-8ccb-581dacecddcd\" (UID: \"e445697b-863d-4e96-8ccb-581dacecddcd\") " Mar 13 01:45:21.549878 master-0 kubenswrapper[19170]: I0313 01:45:21.549841 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e445697b-863d-4e96-8ccb-581dacecddcd-logs" (OuterVolumeSpecName: "logs") pod "e445697b-863d-4e96-8ccb-581dacecddcd" (UID: 
"e445697b-863d-4e96-8ccb-581dacecddcd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 01:45:21.553340 master-0 kubenswrapper[19170]: I0313 01:45:21.553293 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e445697b-863d-4e96-8ccb-581dacecddcd-scripts" (OuterVolumeSpecName: "scripts") pod "e445697b-863d-4e96-8ccb-581dacecddcd" (UID: "e445697b-863d-4e96-8ccb-581dacecddcd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:45:21.557668 master-0 kubenswrapper[19170]: I0313 01:45:21.557615 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e445697b-863d-4e96-8ccb-581dacecddcd-kube-api-access-zwz2g" (OuterVolumeSpecName: "kube-api-access-zwz2g") pod "e445697b-863d-4e96-8ccb-581dacecddcd" (UID: "e445697b-863d-4e96-8ccb-581dacecddcd"). InnerVolumeSpecName "kube-api-access-zwz2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:45:21.574017 master-0 kubenswrapper[19170]: I0313 01:45:21.573956 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e445697b-863d-4e96-8ccb-581dacecddcd-config-data" (OuterVolumeSpecName: "config-data") pod "e445697b-863d-4e96-8ccb-581dacecddcd" (UID: "e445697b-863d-4e96-8ccb-581dacecddcd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:45:21.577434 master-0 kubenswrapper[19170]: I0313 01:45:21.577397 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e445697b-863d-4e96-8ccb-581dacecddcd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e445697b-863d-4e96-8ccb-581dacecddcd" (UID: "e445697b-863d-4e96-8ccb-581dacecddcd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:45:21.652424 master-0 kubenswrapper[19170]: I0313 01:45:21.652354 19170 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e445697b-863d-4e96-8ccb-581dacecddcd-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:21.652623 master-0 kubenswrapper[19170]: I0313 01:45:21.652437 19170 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e445697b-863d-4e96-8ccb-581dacecddcd-config-data\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:21.652623 master-0 kubenswrapper[19170]: I0313 01:45:21.652467 19170 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e445697b-863d-4e96-8ccb-581dacecddcd-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:21.652623 master-0 kubenswrapper[19170]: I0313 01:45:21.652493 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwz2g\" (UniqueName: \"kubernetes.io/projected/e445697b-863d-4e96-8ccb-581dacecddcd-kube-api-access-zwz2g\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:21.652623 master-0 kubenswrapper[19170]: I0313 01:45:21.652520 19170 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e445697b-863d-4e96-8ccb-581dacecddcd-logs\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:21.671775 master-0 kubenswrapper[19170]: I0313 01:45:21.671719 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:21.929072 master-0 kubenswrapper[19170]: I0313 01:45:21.929010 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mmsc6" event={"ID":"e445697b-863d-4e96-8ccb-581dacecddcd","Type":"ContainerDied","Data":"fffe11b304c370c5ab28ef903f2fa890b532d8e5e9a3e334c80a139a6a1b3721"} Mar 13 01:45:21.929072 master-0 kubenswrapper[19170]: I0313 01:45:21.929072 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fffe11b304c370c5ab28ef903f2fa890b532d8e5e9a3e334c80a139a6a1b3721" Mar 13 01:45:21.929353 master-0 kubenswrapper[19170]: I0313 01:45:21.929026 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-mmsc6" Mar 13 01:45:21.932172 master-0 kubenswrapper[19170]: I0313 01:45:21.932123 19170 generic.go:334] "Generic (PLEG): container finished" podID="9f05049d-6514-404b-9823-7128787e6180" containerID="debc112def1df092ceb651a5bc98840adf2388fe005096c7b56ef2a250565893" exitCode=0 Mar 13 01:45:21.932254 master-0 kubenswrapper[19170]: I0313 01:45:21.932176 19170 generic.go:334] "Generic (PLEG): container finished" podID="9f05049d-6514-404b-9823-7128787e6180" containerID="b28be5b9ba285c4637e404d29769ea6b5c3a1352e7a608eae6639721d9670d97" exitCode=143 Mar 13 01:45:21.932254 master-0 kubenswrapper[19170]: I0313 01:45:21.932205 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b9844-default-external-api-0" event={"ID":"9f05049d-6514-404b-9823-7128787e6180","Type":"ContainerDied","Data":"debc112def1df092ceb651a5bc98840adf2388fe005096c7b56ef2a250565893"} Mar 13 01:45:21.932254 master-0 kubenswrapper[19170]: I0313 01:45:21.932245 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b9844-default-external-api-0" 
event={"ID":"9f05049d-6514-404b-9823-7128787e6180","Type":"ContainerDied","Data":"b28be5b9ba285c4637e404d29769ea6b5c3a1352e7a608eae6639721d9670d97"} Mar 13 01:45:22.159008 master-0 kubenswrapper[19170]: I0313 01:45:22.155802 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-769f4cdbc8-8mz4m"] Mar 13 01:45:22.159008 master-0 kubenswrapper[19170]: E0313 01:45:22.156322 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e445697b-863d-4e96-8ccb-581dacecddcd" containerName="placement-db-sync" Mar 13 01:45:22.159008 master-0 kubenswrapper[19170]: I0313 01:45:22.156337 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="e445697b-863d-4e96-8ccb-581dacecddcd" containerName="placement-db-sync" Mar 13 01:45:22.159008 master-0 kubenswrapper[19170]: I0313 01:45:22.156541 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="e445697b-863d-4e96-8ccb-581dacecddcd" containerName="placement-db-sync" Mar 13 01:45:22.159008 master-0 kubenswrapper[19170]: I0313 01:45:22.157751 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-769f4cdbc8-8mz4m" Mar 13 01:45:22.164316 master-0 kubenswrapper[19170]: I0313 01:45:22.164278 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 13 01:45:22.164475 master-0 kubenswrapper[19170]: I0313 01:45:22.164426 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 13 01:45:22.164541 master-0 kubenswrapper[19170]: I0313 01:45:22.164494 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 13 01:45:22.165415 master-0 kubenswrapper[19170]: I0313 01:45:22.165388 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 13 01:45:22.187927 master-0 kubenswrapper[19170]: I0313 01:45:22.187792 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-769f4cdbc8-8mz4m"] Mar 13 01:45:22.270561 master-0 kubenswrapper[19170]: I0313 01:45:22.270518 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c426k\" (UniqueName: \"kubernetes.io/projected/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-kube-api-access-c426k\") pod \"placement-769f4cdbc8-8mz4m\" (UID: \"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7\") " pod="openstack/placement-769f4cdbc8-8mz4m" Mar 13 01:45:22.270963 master-0 kubenswrapper[19170]: I0313 01:45:22.270945 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-combined-ca-bundle\") pod \"placement-769f4cdbc8-8mz4m\" (UID: \"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7\") " pod="openstack/placement-769f4cdbc8-8mz4m" Mar 13 01:45:22.271056 master-0 kubenswrapper[19170]: I0313 01:45:22.271039 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-scripts\") pod \"placement-769f4cdbc8-8mz4m\" (UID: \"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7\") " pod="openstack/placement-769f4cdbc8-8mz4m" Mar 13 01:45:22.271156 master-0 kubenswrapper[19170]: I0313 01:45:22.271143 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-internal-tls-certs\") pod \"placement-769f4cdbc8-8mz4m\" (UID: \"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7\") " pod="openstack/placement-769f4cdbc8-8mz4m" Mar 13 01:45:22.271237 master-0 kubenswrapper[19170]: I0313 01:45:22.271223 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-logs\") pod \"placement-769f4cdbc8-8mz4m\" (UID: \"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7\") " pod="openstack/placement-769f4cdbc8-8mz4m" Mar 13 01:45:22.271570 master-0 kubenswrapper[19170]: I0313 01:45:22.271554 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-public-tls-certs\") pod \"placement-769f4cdbc8-8mz4m\" (UID: \"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7\") " pod="openstack/placement-769f4cdbc8-8mz4m" Mar 13 01:45:22.271720 master-0 kubenswrapper[19170]: I0313 01:45:22.271704 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-config-data\") pod \"placement-769f4cdbc8-8mz4m\" (UID: \"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7\") " pod="openstack/placement-769f4cdbc8-8mz4m" Mar 13 01:45:22.373590 master-0 kubenswrapper[19170]: I0313 01:45:22.373531 19170 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-combined-ca-bundle\") pod \"placement-769f4cdbc8-8mz4m\" (UID: \"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7\") " pod="openstack/placement-769f4cdbc8-8mz4m" Mar 13 01:45:22.373590 master-0 kubenswrapper[19170]: I0313 01:45:22.373591 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-scripts\") pod \"placement-769f4cdbc8-8mz4m\" (UID: \"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7\") " pod="openstack/placement-769f4cdbc8-8mz4m" Mar 13 01:45:22.374417 master-0 kubenswrapper[19170]: I0313 01:45:22.373632 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-internal-tls-certs\") pod \"placement-769f4cdbc8-8mz4m\" (UID: \"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7\") " pod="openstack/placement-769f4cdbc8-8mz4m" Mar 13 01:45:22.374417 master-0 kubenswrapper[19170]: I0313 01:45:22.373650 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-logs\") pod \"placement-769f4cdbc8-8mz4m\" (UID: \"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7\") " pod="openstack/placement-769f4cdbc8-8mz4m" Mar 13 01:45:22.374417 master-0 kubenswrapper[19170]: I0313 01:45:22.373688 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-public-tls-certs\") pod \"placement-769f4cdbc8-8mz4m\" (UID: \"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7\") " pod="openstack/placement-769f4cdbc8-8mz4m" Mar 13 01:45:22.374417 master-0 kubenswrapper[19170]: I0313 01:45:22.373741 19170 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-config-data\") pod \"placement-769f4cdbc8-8mz4m\" (UID: \"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7\") " pod="openstack/placement-769f4cdbc8-8mz4m" Mar 13 01:45:22.374417 master-0 kubenswrapper[19170]: I0313 01:45:22.373832 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c426k\" (UniqueName: \"kubernetes.io/projected/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-kube-api-access-c426k\") pod \"placement-769f4cdbc8-8mz4m\" (UID: \"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7\") " pod="openstack/placement-769f4cdbc8-8mz4m" Mar 13 01:45:22.375246 master-0 kubenswrapper[19170]: I0313 01:45:22.375175 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-logs\") pod \"placement-769f4cdbc8-8mz4m\" (UID: \"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7\") " pod="openstack/placement-769f4cdbc8-8mz4m" Mar 13 01:45:22.377601 master-0 kubenswrapper[19170]: I0313 01:45:22.377537 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-scripts\") pod \"placement-769f4cdbc8-8mz4m\" (UID: \"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7\") " pod="openstack/placement-769f4cdbc8-8mz4m" Mar 13 01:45:22.377853 master-0 kubenswrapper[19170]: I0313 01:45:22.377833 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-public-tls-certs\") pod \"placement-769f4cdbc8-8mz4m\" (UID: \"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7\") " pod="openstack/placement-769f4cdbc8-8mz4m" Mar 13 01:45:22.378295 master-0 kubenswrapper[19170]: I0313 01:45:22.378195 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-config-data\") pod \"placement-769f4cdbc8-8mz4m\" (UID: \"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7\") " pod="openstack/placement-769f4cdbc8-8mz4m" Mar 13 01:45:22.378957 master-0 kubenswrapper[19170]: I0313 01:45:22.378886 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-internal-tls-certs\") pod \"placement-769f4cdbc8-8mz4m\" (UID: \"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7\") " pod="openstack/placement-769f4cdbc8-8mz4m" Mar 13 01:45:22.383400 master-0 kubenswrapper[19170]: I0313 01:45:22.383359 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-combined-ca-bundle\") pod \"placement-769f4cdbc8-8mz4m\" (UID: \"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7\") " pod="openstack/placement-769f4cdbc8-8mz4m" Mar 13 01:45:22.394241 master-0 kubenswrapper[19170]: I0313 01:45:22.394184 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c426k\" (UniqueName: \"kubernetes.io/projected/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-kube-api-access-c426k\") pod \"placement-769f4cdbc8-8mz4m\" (UID: \"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7\") " pod="openstack/placement-769f4cdbc8-8mz4m" Mar 13 01:45:22.513922 master-0 kubenswrapper[19170]: I0313 01:45:22.513792 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-769f4cdbc8-8mz4m" Mar 13 01:45:30.658330 master-0 kubenswrapper[19170]: I0313 01:45:30.658104 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:30.781469 master-0 kubenswrapper[19170]: I0313 01:45:30.781380 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f05049d-6514-404b-9823-7128787e6180-combined-ca-bundle\") pod \"9f05049d-6514-404b-9823-7128787e6180\" (UID: \"9f05049d-6514-404b-9823-7128787e6180\") " Mar 13 01:45:30.781840 master-0 kubenswrapper[19170]: I0313 01:45:30.781810 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f05049d-6514-404b-9823-7128787e6180-logs\") pod \"9f05049d-6514-404b-9823-7128787e6180\" (UID: \"9f05049d-6514-404b-9823-7128787e6180\") " Mar 13 01:45:30.781976 master-0 kubenswrapper[19170]: I0313 01:45:30.781951 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^deb39bd4-fea6-43ec-aeb4-a117c91529eb\") pod \"9f05049d-6514-404b-9823-7128787e6180\" (UID: \"9f05049d-6514-404b-9823-7128787e6180\") " Mar 13 01:45:30.782063 master-0 kubenswrapper[19170]: I0313 01:45:30.782040 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f05049d-6514-404b-9823-7128787e6180-config-data\") pod \"9f05049d-6514-404b-9823-7128787e6180\" (UID: \"9f05049d-6514-404b-9823-7128787e6180\") " Mar 13 01:45:30.782111 master-0 kubenswrapper[19170]: I0313 01:45:30.782072 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sx898\" (UniqueName: \"kubernetes.io/projected/9f05049d-6514-404b-9823-7128787e6180-kube-api-access-sx898\") pod \"9f05049d-6514-404b-9823-7128787e6180\" (UID: \"9f05049d-6514-404b-9823-7128787e6180\") " Mar 13 01:45:30.782250 master-0 kubenswrapper[19170]: I0313 01:45:30.782227 19170 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f05049d-6514-404b-9823-7128787e6180-scripts\") pod \"9f05049d-6514-404b-9823-7128787e6180\" (UID: \"9f05049d-6514-404b-9823-7128787e6180\") " Mar 13 01:45:30.782287 master-0 kubenswrapper[19170]: I0313 01:45:30.782247 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f05049d-6514-404b-9823-7128787e6180-httpd-run\") pod \"9f05049d-6514-404b-9823-7128787e6180\" (UID: \"9f05049d-6514-404b-9823-7128787e6180\") " Mar 13 01:45:30.782507 master-0 kubenswrapper[19170]: I0313 01:45:30.782456 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f05049d-6514-404b-9823-7128787e6180-logs" (OuterVolumeSpecName: "logs") pod "9f05049d-6514-404b-9823-7128787e6180" (UID: "9f05049d-6514-404b-9823-7128787e6180"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 01:45:30.782846 master-0 kubenswrapper[19170]: I0313 01:45:30.782816 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f05049d-6514-404b-9823-7128787e6180-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9f05049d-6514-404b-9823-7128787e6180" (UID: "9f05049d-6514-404b-9823-7128787e6180"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 01:45:30.785714 master-0 kubenswrapper[19170]: I0313 01:45:30.785671 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f05049d-6514-404b-9823-7128787e6180-kube-api-access-sx898" (OuterVolumeSpecName: "kube-api-access-sx898") pod "9f05049d-6514-404b-9823-7128787e6180" (UID: "9f05049d-6514-404b-9823-7128787e6180"). InnerVolumeSpecName "kube-api-access-sx898". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:45:30.786474 master-0 kubenswrapper[19170]: I0313 01:45:30.786450 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f05049d-6514-404b-9823-7128787e6180-scripts" (OuterVolumeSpecName: "scripts") pod "9f05049d-6514-404b-9823-7128787e6180" (UID: "9f05049d-6514-404b-9823-7128787e6180"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:45:30.786664 master-0 kubenswrapper[19170]: I0313 01:45:30.786614 19170 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f05049d-6514-404b-9823-7128787e6180-logs\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:30.786729 master-0 kubenswrapper[19170]: I0313 01:45:30.786666 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sx898\" (UniqueName: \"kubernetes.io/projected/9f05049d-6514-404b-9823-7128787e6180-kube-api-access-sx898\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:30.786729 master-0 kubenswrapper[19170]: I0313 01:45:30.786679 19170 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9f05049d-6514-404b-9823-7128787e6180-httpd-run\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:30.786729 master-0 kubenswrapper[19170]: I0313 01:45:30.786690 19170 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f05049d-6514-404b-9823-7128787e6180-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:30.807435 master-0 kubenswrapper[19170]: I0313 01:45:30.807311 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^deb39bd4-fea6-43ec-aeb4-a117c91529eb" (OuterVolumeSpecName: "glance") pod "9f05049d-6514-404b-9823-7128787e6180" (UID: "9f05049d-6514-404b-9823-7128787e6180"). 
InnerVolumeSpecName "pvc-60c39735-cd4e-4baf-8a95-3babad891e79". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 13 01:45:30.812064 master-0 kubenswrapper[19170]: I0313 01:45:30.811975 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f05049d-6514-404b-9823-7128787e6180-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f05049d-6514-404b-9823-7128787e6180" (UID: "9f05049d-6514-404b-9823-7128787e6180"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:45:30.844471 master-0 kubenswrapper[19170]: I0313 01:45:30.844400 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f05049d-6514-404b-9823-7128787e6180-config-data" (OuterVolumeSpecName: "config-data") pod "9f05049d-6514-404b-9823-7128787e6180" (UID: "9f05049d-6514-404b-9823-7128787e6180"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:45:30.889588 master-0 kubenswrapper[19170]: I0313 01:45:30.889517 19170 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f05049d-6514-404b-9823-7128787e6180-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:30.889805 master-0 kubenswrapper[19170]: I0313 01:45:30.889596 19170 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-60c39735-cd4e-4baf-8a95-3babad891e79\" (UniqueName: \"kubernetes.io/csi/topolvm.io^deb39bd4-fea6-43ec-aeb4-a117c91529eb\") on node \"master-0\" " Mar 13 01:45:30.889805 master-0 kubenswrapper[19170]: I0313 01:45:30.889616 19170 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f05049d-6514-404b-9823-7128787e6180-config-data\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:30.910339 master-0 kubenswrapper[19170]: I0313 01:45:30.910298 19170 
csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 13 01:45:30.910554 master-0 kubenswrapper[19170]: I0313 01:45:30.910454 19170 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-60c39735-cd4e-4baf-8a95-3babad891e79" (UniqueName: "kubernetes.io/csi/topolvm.io^deb39bd4-fea6-43ec-aeb4-a117c91529eb") on node "master-0" Mar 13 01:45:30.992018 master-0 kubenswrapper[19170]: I0313 01:45:30.991963 19170 reconciler_common.go:293] "Volume detached for volume \"pvc-60c39735-cd4e-4baf-8a95-3babad891e79\" (UniqueName: \"kubernetes.io/csi/topolvm.io^deb39bd4-fea6-43ec-aeb4-a117c91529eb\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:31.048921 master-0 kubenswrapper[19170]: I0313 01:45:31.048870 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b9844-default-external-api-0" event={"ID":"9f05049d-6514-404b-9823-7128787e6180","Type":"ContainerDied","Data":"b8354c826a4e72e97c0c6fd21642fa8048d556932c1291688e3e2e9d4245cbe9"} Mar 13 01:45:31.048921 master-0 kubenswrapper[19170]: I0313 01:45:31.048926 19170 scope.go:117] "RemoveContainer" containerID="debc112def1df092ceb651a5bc98840adf2388fe005096c7b56ef2a250565893" Mar 13 01:45:31.049140 master-0 kubenswrapper[19170]: I0313 01:45:31.049029 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:31.107983 master-0 kubenswrapper[19170]: I0313 01:45:31.107936 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b9844-default-external-api-0"] Mar 13 01:45:31.145569 master-0 kubenswrapper[19170]: I0313 01:45:31.145271 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-b9844-default-external-api-0"] Mar 13 01:45:31.169276 master-0 kubenswrapper[19170]: I0313 01:45:31.168759 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-b9844-default-external-api-0"] Mar 13 01:45:31.170791 master-0 kubenswrapper[19170]: E0313 01:45:31.169880 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f05049d-6514-404b-9823-7128787e6180" containerName="glance-log" Mar 13 01:45:31.170791 master-0 kubenswrapper[19170]: I0313 01:45:31.169900 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f05049d-6514-404b-9823-7128787e6180" containerName="glance-log" Mar 13 01:45:31.170791 master-0 kubenswrapper[19170]: E0313 01:45:31.169959 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f05049d-6514-404b-9823-7128787e6180" containerName="glance-httpd" Mar 13 01:45:31.170791 master-0 kubenswrapper[19170]: I0313 01:45:31.169966 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f05049d-6514-404b-9823-7128787e6180" containerName="glance-httpd" Mar 13 01:45:31.173706 master-0 kubenswrapper[19170]: I0313 01:45:31.171183 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f05049d-6514-404b-9823-7128787e6180" containerName="glance-httpd" Mar 13 01:45:31.173706 master-0 kubenswrapper[19170]: I0313 01:45:31.171710 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f05049d-6514-404b-9823-7128787e6180" containerName="glance-log" Mar 13 01:45:31.178448 master-0 kubenswrapper[19170]: I0313 01:45:31.176523 19170 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/glance-b9844-default-external-api-0"] Mar 13 01:45:31.178448 master-0 kubenswrapper[19170]: I0313 01:45:31.176684 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:31.179778 master-0 kubenswrapper[19170]: I0313 01:45:31.179737 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-b9844-default-external-config-data" Mar 13 01:45:31.180280 master-0 kubenswrapper[19170]: I0313 01:45:31.180234 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 13 01:45:31.301432 master-0 kubenswrapper[19170]: I0313 01:45:31.301369 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-60c39735-cd4e-4baf-8a95-3babad891e79\" (UniqueName: \"kubernetes.io/csi/topolvm.io^deb39bd4-fea6-43ec-aeb4-a117c91529eb\") pod \"glance-b9844-default-external-api-0\" (UID: \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:31.301691 master-0 kubenswrapper[19170]: I0313 01:45:31.301522 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-logs\") pod \"glance-b9844-default-external-api-0\" (UID: \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:31.301691 master-0 kubenswrapper[19170]: I0313 01:45:31.301572 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-httpd-run\") pod \"glance-b9844-default-external-api-0\" (UID: \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:31.301691 master-0 
kubenswrapper[19170]: I0313 01:45:31.301615 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-public-tls-certs\") pod \"glance-b9844-default-external-api-0\" (UID: \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:31.301803 master-0 kubenswrapper[19170]: I0313 01:45:31.301734 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cfhk\" (UniqueName: \"kubernetes.io/projected/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-kube-api-access-4cfhk\") pod \"glance-b9844-default-external-api-0\" (UID: \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:31.301884 master-0 kubenswrapper[19170]: I0313 01:45:31.301856 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-config-data\") pod \"glance-b9844-default-external-api-0\" (UID: \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:31.302112 master-0 kubenswrapper[19170]: I0313 01:45:31.302085 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-combined-ca-bundle\") pod \"glance-b9844-default-external-api-0\" (UID: \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:31.302152 master-0 kubenswrapper[19170]: I0313 01:45:31.302136 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-scripts\") pod 
\"glance-b9844-default-external-api-0\" (UID: \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:31.403838 master-0 kubenswrapper[19170]: I0313 01:45:31.403718 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-logs\") pod \"glance-b9844-default-external-api-0\" (UID: \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:31.404039 master-0 kubenswrapper[19170]: I0313 01:45:31.403901 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-httpd-run\") pod \"glance-b9844-default-external-api-0\" (UID: \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:31.404039 master-0 kubenswrapper[19170]: I0313 01:45:31.403929 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-public-tls-certs\") pod \"glance-b9844-default-external-api-0\" (UID: \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:31.404161 master-0 kubenswrapper[19170]: I0313 01:45:31.404110 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cfhk\" (UniqueName: \"kubernetes.io/projected/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-kube-api-access-4cfhk\") pod \"glance-b9844-default-external-api-0\" (UID: \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:31.404262 master-0 kubenswrapper[19170]: I0313 01:45:31.404238 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-config-data\") pod \"glance-b9844-default-external-api-0\" (UID: \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:31.404365 master-0 kubenswrapper[19170]: I0313 01:45:31.404341 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-httpd-run\") pod \"glance-b9844-default-external-api-0\" (UID: \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:31.404412 master-0 kubenswrapper[19170]: I0313 01:45:31.404335 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-logs\") pod \"glance-b9844-default-external-api-0\" (UID: \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:31.404476 master-0 kubenswrapper[19170]: I0313 01:45:31.404456 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-combined-ca-bundle\") pod \"glance-b9844-default-external-api-0\" (UID: \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:31.404519 master-0 kubenswrapper[19170]: I0313 01:45:31.404501 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-scripts\") pod \"glance-b9844-default-external-api-0\" (UID: \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:31.406230 master-0 kubenswrapper[19170]: I0313 01:45:31.406190 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-60c39735-cd4e-4baf-8a95-3babad891e79\" (UniqueName: \"kubernetes.io/csi/topolvm.io^deb39bd4-fea6-43ec-aeb4-a117c91529eb\") pod \"glance-b9844-default-external-api-0\" (UID: \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:31.423548 master-0 kubenswrapper[19170]: I0313 01:45:31.423501 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-scripts\") pod \"glance-b9844-default-external-api-0\" (UID: \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:31.423671 master-0 kubenswrapper[19170]: I0313 01:45:31.423590 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-public-tls-certs\") pod \"glance-b9844-default-external-api-0\" (UID: \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:31.423868 master-0 kubenswrapper[19170]: I0313 01:45:31.423831 19170 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 13 01:45:31.423907 master-0 kubenswrapper[19170]: I0313 01:45:31.423882 19170 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-60c39735-cd4e-4baf-8a95-3babad891e79\" (UniqueName: \"kubernetes.io/csi/topolvm.io^deb39bd4-fea6-43ec-aeb4-a117c91529eb\") pod \"glance-b9844-default-external-api-0\" (UID: \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/dc901c00d5023825c7bc5ea32a35825a63b53b24b9433470d442fe02f9c29cd0/globalmount\"" pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:31.424110 master-0 kubenswrapper[19170]: I0313 01:45:31.424068 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-combined-ca-bundle\") pod \"glance-b9844-default-external-api-0\" (UID: \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:31.424438 master-0 kubenswrapper[19170]: I0313 01:45:31.424395 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-config-data\") pod \"glance-b9844-default-external-api-0\" (UID: \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:31.429714 master-0 kubenswrapper[19170]: I0313 01:45:31.429672 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cfhk\" (UniqueName: \"kubernetes.io/projected/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-kube-api-access-4cfhk\") pod \"glance-b9844-default-external-api-0\" (UID: \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:31.441900 master-0 kubenswrapper[19170]: I0313 01:45:31.441855 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="9f05049d-6514-404b-9823-7128787e6180" path="/var/lib/kubelet/pods/9f05049d-6514-404b-9823-7128787e6180/volumes"
Mar 13 01:45:32.768038 master-0 kubenswrapper[19170]: I0313 01:45:32.767976 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-60c39735-cd4e-4baf-8a95-3babad891e79\" (UniqueName: \"kubernetes.io/csi/topolvm.io^deb39bd4-fea6-43ec-aeb4-a117c91529eb\") pod \"glance-b9844-default-external-api-0\" (UID: \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\") " pod="openstack/glance-b9844-default-external-api-0"
Mar 13 01:45:32.837897 master-0 kubenswrapper[19170]: I0313 01:45:32.837832 19170 scope.go:117] "RemoveContainer" containerID="b28be5b9ba285c4637e404d29769ea6b5c3a1352e7a608eae6639721d9670d97"
Mar 13 01:45:33.005779 master-0 kubenswrapper[19170]: I0313 01:45:33.005731 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b9844-default-external-api-0"
Mar 13 01:45:33.076759 master-0 kubenswrapper[19170]: I0313 01:45:33.076703 19170 generic.go:334] "Generic (PLEG): container finished" podID="e032a333-aa26-44b3-8874-6610d050833e" containerID="7b649789005ddb0275b6f67d1487d332af1e1f14a0f43dfc0a9f72dda7ad1631" exitCode=0
Mar 13 01:45:33.076967 master-0 kubenswrapper[19170]: I0313 01:45:33.076773 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cx8d7" event={"ID":"e032a333-aa26-44b3-8874-6610d050833e","Type":"ContainerDied","Data":"7b649789005ddb0275b6f67d1487d332af1e1f14a0f43dfc0a9f72dda7ad1631"}
Mar 13 01:45:34.587989 master-0 kubenswrapper[19170]: I0313 01:45:34.587945 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-cx8d7"
Mar 13 01:45:34.678401 master-0 kubenswrapper[19170]: I0313 01:45:34.678355 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e032a333-aa26-44b3-8874-6610d050833e-combined-ca-bundle\") pod \"e032a333-aa26-44b3-8874-6610d050833e\" (UID: \"e032a333-aa26-44b3-8874-6610d050833e\") "
Mar 13 01:45:34.678401 master-0 kubenswrapper[19170]: I0313 01:45:34.678408 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e032a333-aa26-44b3-8874-6610d050833e-config-data\") pod \"e032a333-aa26-44b3-8874-6610d050833e\" (UID: \"e032a333-aa26-44b3-8874-6610d050833e\") "
Mar 13 01:45:34.678667 master-0 kubenswrapper[19170]: I0313 01:45:34.678472 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e032a333-aa26-44b3-8874-6610d050833e-credential-keys\") pod \"e032a333-aa26-44b3-8874-6610d050833e\" (UID: \"e032a333-aa26-44b3-8874-6610d050833e\") "
Mar 13 01:45:34.678667 master-0 kubenswrapper[19170]: I0313 01:45:34.678575 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e032a333-aa26-44b3-8874-6610d050833e-scripts\") pod \"e032a333-aa26-44b3-8874-6610d050833e\" (UID: \"e032a333-aa26-44b3-8874-6610d050833e\") "
Mar 13 01:45:34.678770 master-0 kubenswrapper[19170]: I0313 01:45:34.678744 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e032a333-aa26-44b3-8874-6610d050833e-fernet-keys\") pod \"e032a333-aa26-44b3-8874-6610d050833e\" (UID: \"e032a333-aa26-44b3-8874-6610d050833e\") "
Mar 13 01:45:34.678824 master-0 kubenswrapper[19170]: I0313 01:45:34.678792 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9qqp\" (UniqueName: \"kubernetes.io/projected/e032a333-aa26-44b3-8874-6610d050833e-kube-api-access-d9qqp\") pod \"e032a333-aa26-44b3-8874-6610d050833e\" (UID: \"e032a333-aa26-44b3-8874-6610d050833e\") "
Mar 13 01:45:34.685281 master-0 kubenswrapper[19170]: I0313 01:45:34.684954 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e032a333-aa26-44b3-8874-6610d050833e-scripts" (OuterVolumeSpecName: "scripts") pod "e032a333-aa26-44b3-8874-6610d050833e" (UID: "e032a333-aa26-44b3-8874-6610d050833e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 01:45:34.690017 master-0 kubenswrapper[19170]: I0313 01:45:34.689939 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e032a333-aa26-44b3-8874-6610d050833e-kube-api-access-d9qqp" (OuterVolumeSpecName: "kube-api-access-d9qqp") pod "e032a333-aa26-44b3-8874-6610d050833e" (UID: "e032a333-aa26-44b3-8874-6610d050833e"). InnerVolumeSpecName "kube-api-access-d9qqp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 01:45:34.690907 master-0 kubenswrapper[19170]: I0313 01:45:34.690867 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e032a333-aa26-44b3-8874-6610d050833e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e032a333-aa26-44b3-8874-6610d050833e" (UID: "e032a333-aa26-44b3-8874-6610d050833e"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 01:45:34.693432 master-0 kubenswrapper[19170]: I0313 01:45:34.693351 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e032a333-aa26-44b3-8874-6610d050833e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e032a333-aa26-44b3-8874-6610d050833e" (UID: "e032a333-aa26-44b3-8874-6610d050833e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 01:45:34.709001 master-0 kubenswrapper[19170]: I0313 01:45:34.708936 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e032a333-aa26-44b3-8874-6610d050833e-config-data" (OuterVolumeSpecName: "config-data") pod "e032a333-aa26-44b3-8874-6610d050833e" (UID: "e032a333-aa26-44b3-8874-6610d050833e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 01:45:34.716138 master-0 kubenswrapper[19170]: I0313 01:45:34.716081 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e032a333-aa26-44b3-8874-6610d050833e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e032a333-aa26-44b3-8874-6610d050833e" (UID: "e032a333-aa26-44b3-8874-6610d050833e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 01:45:34.781427 master-0 kubenswrapper[19170]: I0313 01:45:34.781354 19170 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e032a333-aa26-44b3-8874-6610d050833e-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 13 01:45:34.781427 master-0 kubenswrapper[19170]: I0313 01:45:34.781418 19170 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e032a333-aa26-44b3-8874-6610d050833e-config-data\") on node \"master-0\" DevicePath \"\""
Mar 13 01:45:34.781427 master-0 kubenswrapper[19170]: I0313 01:45:34.781428 19170 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e032a333-aa26-44b3-8874-6610d050833e-credential-keys\") on node \"master-0\" DevicePath \"\""
Mar 13 01:45:34.781427 master-0 kubenswrapper[19170]: I0313 01:45:34.781436 19170 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e032a333-aa26-44b3-8874-6610d050833e-scripts\") on node \"master-0\" DevicePath \"\""
Mar 13 01:45:34.781427 master-0 kubenswrapper[19170]: I0313 01:45:34.781444 19170 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e032a333-aa26-44b3-8874-6610d050833e-fernet-keys\") on node \"master-0\" DevicePath \"\""
Mar 13 01:45:34.781427 master-0 kubenswrapper[19170]: I0313 01:45:34.781453 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9qqp\" (UniqueName: \"kubernetes.io/projected/e032a333-aa26-44b3-8874-6610d050833e-kube-api-access-d9qqp\") on node \"master-0\" DevicePath \"\""
Mar 13 01:45:34.853919 master-0 kubenswrapper[19170]: I0313 01:45:34.848707 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-769f4cdbc8-8mz4m"]
Mar 13 01:45:34.922474 master-0 kubenswrapper[19170]: I0313 01:45:34.922415 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b9844-default-internal-api-0"]
Mar 13 01:45:35.037165 master-0 kubenswrapper[19170]: I0313 01:45:35.037117 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b9844-default-external-api-0"]
Mar 13 01:45:35.037843 master-0 kubenswrapper[19170]: W0313 01:45:35.037814 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod979e6bc4_2aa2_4326_b7f2_c45f50b41c28.slice/crio-e02e3404f42e9542570be268f76aa082a1bc70e35e1e9b025dfba6bbec1ce0f7 WatchSource:0}: Error finding container e02e3404f42e9542570be268f76aa082a1bc70e35e1e9b025dfba6bbec1ce0f7: Status 404 returned error can't find the container with id e02e3404f42e9542570be268f76aa082a1bc70e35e1e9b025dfba6bbec1ce0f7
Mar 13 01:45:35.100751 master-0 kubenswrapper[19170]: I0313 01:45:35.100701 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b9844-default-internal-api-0" event={"ID":"af162e22-de53-4f80-a7a9-877bda3e9740","Type":"ContainerStarted","Data":"8ed4b668ae7343ab5625ddc5e12a1073ba8b3211e8548abb39d41a344a4a0cd8"}
Mar 13 01:45:35.105361 master-0 kubenswrapper[19170]: I0313 01:45:35.105331 19170 generic.go:334] "Generic (PLEG): container finished" podID="dc3c71f3-cecf-4916-8841-0e557aad23d6" containerID="519082719ba0fa74b658a87c7972a090312c92d36f114e54b290253b29b3375b" exitCode=0
Mar 13 01:45:35.105511 master-0 kubenswrapper[19170]: I0313 01:45:35.105492 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-l9ds6" event={"ID":"dc3c71f3-cecf-4916-8841-0e557aad23d6","Type":"ContainerDied","Data":"519082719ba0fa74b658a87c7972a090312c92d36f114e54b290253b29b3375b"}
Mar 13 01:45:35.108601 master-0 kubenswrapper[19170]: I0313 01:45:35.108557 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b9844-default-external-api-0" event={"ID":"979e6bc4-2aa2-4326-b7f2-c45f50b41c28","Type":"ContainerStarted","Data":"e02e3404f42e9542570be268f76aa082a1bc70e35e1e9b025dfba6bbec1ce0f7"}
Mar 13 01:45:35.123401 master-0 kubenswrapper[19170]: I0313 01:45:35.123342 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-769f4cdbc8-8mz4m" event={"ID":"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7","Type":"ContainerStarted","Data":"eada95478f376b51f72504d808ebcc6bb95b656dd0f9a7257ed10e59c9d8f2f5"}
Mar 13 01:45:35.123598 master-0 kubenswrapper[19170]: I0313 01:45:35.123409 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-769f4cdbc8-8mz4m" event={"ID":"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7","Type":"ContainerStarted","Data":"41dae79279507ab750e879bdb1a3aed5eafacd96cfced3e8d249a548a1a296ad"}
Mar 13 01:45:35.141728 master-0 kubenswrapper[19170]: I0313 01:45:35.130119 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-051b7-db-sync-5fc74" event={"ID":"7dbc43a0-77ee-41aa-87b2-730586a6fae4","Type":"ContainerStarted","Data":"4dcf91abc5e80dc29b12424d96606b7c2592ac7ba11ee256455f262200ef68f4"}
Mar 13 01:45:35.141938 master-0 kubenswrapper[19170]: I0313 01:45:35.141898 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cx8d7" event={"ID":"e032a333-aa26-44b3-8874-6610d050833e","Type":"ContainerDied","Data":"82d850814d922c65a84caae3e8626bee3bdd020117951e31cf613f879de5d5de"}
Mar 13 01:45:35.143380 master-0 kubenswrapper[19170]: I0313 01:45:35.141945 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82d850814d922c65a84caae3e8626bee3bdd020117951e31cf613f879de5d5de"
Mar 13 01:45:35.143380 master-0 kubenswrapper[19170]: I0313 01:45:35.142261 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-cx8d7"
Mar 13 01:45:35.170198 master-0 kubenswrapper[19170]: I0313 01:45:35.170081 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-051b7-db-sync-5fc74" podStartSLOduration=3.793404111 podStartE2EDuration="29.170058893s" podCreationTimestamp="2026-03-13 01:45:06 +0000 UTC" firstStartedPulling="2026-03-13 01:45:08.882450023 +0000 UTC m=+1569.690570983" lastFinishedPulling="2026-03-13 01:45:34.259104795 +0000 UTC m=+1595.067225765" observedRunningTime="2026-03-13 01:45:35.152740695 +0000 UTC m=+1595.960861675" watchObservedRunningTime="2026-03-13 01:45:35.170058893 +0000 UTC m=+1595.978179863"
Mar 13 01:45:35.243314 master-0 kubenswrapper[19170]: I0313 01:45:35.243189 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5bb5b98c8c-485mq"]
Mar 13 01:45:35.243792 master-0 kubenswrapper[19170]: E0313 01:45:35.243764 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e032a333-aa26-44b3-8874-6610d050833e" containerName="keystone-bootstrap"
Mar 13 01:45:35.243792 master-0 kubenswrapper[19170]: I0313 01:45:35.243789 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="e032a333-aa26-44b3-8874-6610d050833e" containerName="keystone-bootstrap"
Mar 13 01:45:35.244138 master-0 kubenswrapper[19170]: I0313 01:45:35.244114 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="e032a333-aa26-44b3-8874-6610d050833e" containerName="keystone-bootstrap"
Mar 13 01:45:35.255522 master-0 kubenswrapper[19170]: I0313 01:45:35.255458 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5bb5b98c8c-485mq"
Mar 13 01:45:35.263916 master-0 kubenswrapper[19170]: I0313 01:45:35.263197 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 13 01:45:35.263916 master-0 kubenswrapper[19170]: I0313 01:45:35.263478 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 13 01:45:35.263916 master-0 kubenswrapper[19170]: I0313 01:45:35.263603 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 13 01:45:35.263916 master-0 kubenswrapper[19170]: I0313 01:45:35.263725 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Mar 13 01:45:35.263916 master-0 kubenswrapper[19170]: I0313 01:45:35.263823 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Mar 13 01:45:35.268661 master-0 kubenswrapper[19170]: I0313 01:45:35.268050 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5bb5b98c8c-485mq"]
Mar 13 01:45:35.402930 master-0 kubenswrapper[19170]: I0313 01:45:35.402820 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/883d32b4-eae8-461a-a021-1cbd1be9291a-fernet-keys\") pod \"keystone-5bb5b98c8c-485mq\" (UID: \"883d32b4-eae8-461a-a021-1cbd1be9291a\") " pod="openstack/keystone-5bb5b98c8c-485mq"
Mar 13 01:45:35.403240 master-0 kubenswrapper[19170]: I0313 01:45:35.403224 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/883d32b4-eae8-461a-a021-1cbd1be9291a-credential-keys\") pod \"keystone-5bb5b98c8c-485mq\" (UID: \"883d32b4-eae8-461a-a021-1cbd1be9291a\") " pod="openstack/keystone-5bb5b98c8c-485mq"
Mar 13 01:45:35.403377 master-0 kubenswrapper[19170]: I0313 01:45:35.403363 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp9v4\" (UniqueName: \"kubernetes.io/projected/883d32b4-eae8-461a-a021-1cbd1be9291a-kube-api-access-cp9v4\") pod \"keystone-5bb5b98c8c-485mq\" (UID: \"883d32b4-eae8-461a-a021-1cbd1be9291a\") " pod="openstack/keystone-5bb5b98c8c-485mq"
Mar 13 01:45:35.403563 master-0 kubenswrapper[19170]: I0313 01:45:35.403548 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/883d32b4-eae8-461a-a021-1cbd1be9291a-public-tls-certs\") pod \"keystone-5bb5b98c8c-485mq\" (UID: \"883d32b4-eae8-461a-a021-1cbd1be9291a\") " pod="openstack/keystone-5bb5b98c8c-485mq"
Mar 13 01:45:35.403716 master-0 kubenswrapper[19170]: I0313 01:45:35.403683 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/883d32b4-eae8-461a-a021-1cbd1be9291a-internal-tls-certs\") pod \"keystone-5bb5b98c8c-485mq\" (UID: \"883d32b4-eae8-461a-a021-1cbd1be9291a\") " pod="openstack/keystone-5bb5b98c8c-485mq"
Mar 13 01:45:35.404752 master-0 kubenswrapper[19170]: I0313 01:45:35.404694 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/883d32b4-eae8-461a-a021-1cbd1be9291a-config-data\") pod \"keystone-5bb5b98c8c-485mq\" (UID: \"883d32b4-eae8-461a-a021-1cbd1be9291a\") " pod="openstack/keystone-5bb5b98c8c-485mq"
Mar 13 01:45:35.404834 master-0 kubenswrapper[19170]: I0313 01:45:35.404768 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/883d32b4-eae8-461a-a021-1cbd1be9291a-scripts\") pod \"keystone-5bb5b98c8c-485mq\" (UID: \"883d32b4-eae8-461a-a021-1cbd1be9291a\") " pod="openstack/keystone-5bb5b98c8c-485mq"
Mar 13 01:45:35.405017 master-0 kubenswrapper[19170]: I0313 01:45:35.404983 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/883d32b4-eae8-461a-a021-1cbd1be9291a-combined-ca-bundle\") pod \"keystone-5bb5b98c8c-485mq\" (UID: \"883d32b4-eae8-461a-a021-1cbd1be9291a\") " pod="openstack/keystone-5bb5b98c8c-485mq"
Mar 13 01:45:35.508668 master-0 kubenswrapper[19170]: I0313 01:45:35.507319 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/883d32b4-eae8-461a-a021-1cbd1be9291a-config-data\") pod \"keystone-5bb5b98c8c-485mq\" (UID: \"883d32b4-eae8-461a-a021-1cbd1be9291a\") " pod="openstack/keystone-5bb5b98c8c-485mq"
Mar 13 01:45:35.508668 master-0 kubenswrapper[19170]: I0313 01:45:35.507408 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/883d32b4-eae8-461a-a021-1cbd1be9291a-scripts\") pod \"keystone-5bb5b98c8c-485mq\" (UID: \"883d32b4-eae8-461a-a021-1cbd1be9291a\") " pod="openstack/keystone-5bb5b98c8c-485mq"
Mar 13 01:45:35.508668 master-0 kubenswrapper[19170]: I0313 01:45:35.507468 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/883d32b4-eae8-461a-a021-1cbd1be9291a-combined-ca-bundle\") pod \"keystone-5bb5b98c8c-485mq\" (UID: \"883d32b4-eae8-461a-a021-1cbd1be9291a\") " pod="openstack/keystone-5bb5b98c8c-485mq"
Mar 13 01:45:35.508668 master-0 kubenswrapper[19170]: I0313 01:45:35.507495 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/883d32b4-eae8-461a-a021-1cbd1be9291a-fernet-keys\") pod \"keystone-5bb5b98c8c-485mq\" (UID: \"883d32b4-eae8-461a-a021-1cbd1be9291a\") " pod="openstack/keystone-5bb5b98c8c-485mq"
Mar 13 01:45:35.508668 master-0 kubenswrapper[19170]: I0313 01:45:35.507668 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/883d32b4-eae8-461a-a021-1cbd1be9291a-credential-keys\") pod \"keystone-5bb5b98c8c-485mq\" (UID: \"883d32b4-eae8-461a-a021-1cbd1be9291a\") " pod="openstack/keystone-5bb5b98c8c-485mq"
Mar 13 01:45:35.508668 master-0 kubenswrapper[19170]: I0313 01:45:35.507710 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp9v4\" (UniqueName: \"kubernetes.io/projected/883d32b4-eae8-461a-a021-1cbd1be9291a-kube-api-access-cp9v4\") pod \"keystone-5bb5b98c8c-485mq\" (UID: \"883d32b4-eae8-461a-a021-1cbd1be9291a\") " pod="openstack/keystone-5bb5b98c8c-485mq"
Mar 13 01:45:35.508668 master-0 kubenswrapper[19170]: I0313 01:45:35.507792 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/883d32b4-eae8-461a-a021-1cbd1be9291a-public-tls-certs\") pod \"keystone-5bb5b98c8c-485mq\" (UID: \"883d32b4-eae8-461a-a021-1cbd1be9291a\") " pod="openstack/keystone-5bb5b98c8c-485mq"
Mar 13 01:45:35.508668 master-0 kubenswrapper[19170]: I0313 01:45:35.507819 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/883d32b4-eae8-461a-a021-1cbd1be9291a-internal-tls-certs\") pod \"keystone-5bb5b98c8c-485mq\" (UID: \"883d32b4-eae8-461a-a021-1cbd1be9291a\") " pod="openstack/keystone-5bb5b98c8c-485mq"
Mar 13 01:45:35.512594 master-0 kubenswrapper[19170]: I0313 01:45:35.512270 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/883d32b4-eae8-461a-a021-1cbd1be9291a-config-data\") pod \"keystone-5bb5b98c8c-485mq\" (UID: \"883d32b4-eae8-461a-a021-1cbd1be9291a\") " pod="openstack/keystone-5bb5b98c8c-485mq"
Mar 13 01:45:35.513085 master-0 kubenswrapper[19170]: I0313 01:45:35.512790 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/883d32b4-eae8-461a-a021-1cbd1be9291a-public-tls-certs\") pod \"keystone-5bb5b98c8c-485mq\" (UID: \"883d32b4-eae8-461a-a021-1cbd1be9291a\") " pod="openstack/keystone-5bb5b98c8c-485mq"
Mar 13 01:45:35.520800 master-0 kubenswrapper[19170]: I0313 01:45:35.513296 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/883d32b4-eae8-461a-a021-1cbd1be9291a-scripts\") pod \"keystone-5bb5b98c8c-485mq\" (UID: \"883d32b4-eae8-461a-a021-1cbd1be9291a\") " pod="openstack/keystone-5bb5b98c8c-485mq"
Mar 13 01:45:35.520800 master-0 kubenswrapper[19170]: I0313 01:45:35.515438 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/883d32b4-eae8-461a-a021-1cbd1be9291a-combined-ca-bundle\") pod \"keystone-5bb5b98c8c-485mq\" (UID: \"883d32b4-eae8-461a-a021-1cbd1be9291a\") " pod="openstack/keystone-5bb5b98c8c-485mq"
Mar 13 01:45:35.520800 master-0 kubenswrapper[19170]: I0313 01:45:35.515808 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/883d32b4-eae8-461a-a021-1cbd1be9291a-internal-tls-certs\") pod \"keystone-5bb5b98c8c-485mq\" (UID: \"883d32b4-eae8-461a-a021-1cbd1be9291a\") " pod="openstack/keystone-5bb5b98c8c-485mq"
Mar 13 01:45:35.524172 master-0 kubenswrapper[19170]: I0313 01:45:35.523390 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp9v4\" (UniqueName: \"kubernetes.io/projected/883d32b4-eae8-461a-a021-1cbd1be9291a-kube-api-access-cp9v4\") pod \"keystone-5bb5b98c8c-485mq\" (UID: \"883d32b4-eae8-461a-a021-1cbd1be9291a\") " pod="openstack/keystone-5bb5b98c8c-485mq"
Mar 13 01:45:35.524172 master-0 kubenswrapper[19170]: I0313 01:45:35.524098 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/883d32b4-eae8-461a-a021-1cbd1be9291a-credential-keys\") pod \"keystone-5bb5b98c8c-485mq\" (UID: \"883d32b4-eae8-461a-a021-1cbd1be9291a\") " pod="openstack/keystone-5bb5b98c8c-485mq"
Mar 13 01:45:35.528504 master-0 kubenswrapper[19170]: I0313 01:45:35.528444 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/883d32b4-eae8-461a-a021-1cbd1be9291a-fernet-keys\") pod \"keystone-5bb5b98c8c-485mq\" (UID: \"883d32b4-eae8-461a-a021-1cbd1be9291a\") " pod="openstack/keystone-5bb5b98c8c-485mq"
Mar 13 01:45:35.680986 master-0 kubenswrapper[19170]: I0313 01:45:35.678355 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5bb5b98c8c-485mq"
Mar 13 01:45:36.153979 master-0 kubenswrapper[19170]: I0313 01:45:36.152811 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5bb5b98c8c-485mq"]
Mar 13 01:45:36.185871 master-0 kubenswrapper[19170]: I0313 01:45:36.185789 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b9844-default-external-api-0" event={"ID":"979e6bc4-2aa2-4326-b7f2-c45f50b41c28","Type":"ContainerStarted","Data":"5e01aa9d899d75b739283495ae642b747304de2ceb477f26cbcca8b0085e0c92"}
Mar 13 01:45:36.198840 master-0 kubenswrapper[19170]: I0313 01:45:36.198779 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-769f4cdbc8-8mz4m" event={"ID":"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7","Type":"ContainerStarted","Data":"c5e6417ef0054652c8fb4b74c37596118e7448530a224c9701dcdb46b0eaaeb8"}
Mar 13 01:45:36.199224 master-0 kubenswrapper[19170]: I0313 01:45:36.199196 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-769f4cdbc8-8mz4m"
Mar 13 01:45:36.199224 master-0 kubenswrapper[19170]: I0313 01:45:36.199221 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-769f4cdbc8-8mz4m"
Mar 13 01:45:36.209803 master-0 kubenswrapper[19170]: I0313 01:45:36.209749 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b9844-default-internal-api-0" event={"ID":"af162e22-de53-4f80-a7a9-877bda3e9740","Type":"ContainerStarted","Data":"c632611af070ea18c1e344e383398eb132439553e422bc44171a9cf4e56b8ec1"}
Mar 13 01:45:36.214612 master-0 kubenswrapper[19170]: I0313 01:45:36.214566 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-l9ds6" event={"ID":"dc3c71f3-cecf-4916-8841-0e557aad23d6","Type":"ContainerStarted","Data":"06ea23d8a95a4410decaf530dfff4e7d87e2ccde7b2a68c947ff45b405941a71"}
Mar 13 01:45:36.248991 master-0 kubenswrapper[19170]: I0313 01:45:36.248923 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-769f4cdbc8-8mz4m" podStartSLOduration=14.248901503 podStartE2EDuration="14.248901503s" podCreationTimestamp="2026-03-13 01:45:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:45:36.220320268 +0000 UTC m=+1597.028441238" watchObservedRunningTime="2026-03-13 01:45:36.248901503 +0000 UTC m=+1597.057022463"
Mar 13 01:45:36.257417 master-0 kubenswrapper[19170]: I0313 01:45:36.257350 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-db-sync-l9ds6" podStartSLOduration=3.34238544 podStartE2EDuration="19.257332991s" podCreationTimestamp="2026-03-13 01:45:17 +0000 UTC" firstStartedPulling="2026-03-13 01:45:18.352468918 +0000 UTC m=+1579.160589908" lastFinishedPulling="2026-03-13 01:45:34.267416499 +0000 UTC m=+1595.075537459" observedRunningTime="2026-03-13 01:45:36.246448434 +0000 UTC m=+1597.054569394" watchObservedRunningTime="2026-03-13 01:45:36.257332991 +0000 UTC m=+1597.065453951"
Mar 13 01:45:37.232043 master-0 kubenswrapper[19170]: I0313 01:45:37.231477 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b9844-default-internal-api-0" event={"ID":"af162e22-de53-4f80-a7a9-877bda3e9740","Type":"ContainerStarted","Data":"576b62567482197d09934570c36b691fedbd52b01043c1e3d7665d2eada20ed5"}
Mar 13 01:45:37.235299 master-0 kubenswrapper[19170]: I0313 01:45:37.235159 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5bb5b98c8c-485mq" event={"ID":"883d32b4-eae8-461a-a021-1cbd1be9291a","Type":"ContainerStarted","Data":"ec9107c88fa82f8b1003a6f6290f15a4715d19c81312ccdf41605109d951233a"}
Mar 13 01:45:37.235299 master-0 kubenswrapper[19170]: I0313 01:45:37.235254 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5bb5b98c8c-485mq" event={"ID":"883d32b4-eae8-461a-a021-1cbd1be9291a","Type":"ContainerStarted","Data":"c8ea74f3b8be56699816d915b35570cc0dcfe9c848a9c439e25095b1fe3a131f"}
Mar 13 01:45:37.236448 master-0 kubenswrapper[19170]: I0313 01:45:37.235348 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5bb5b98c8c-485mq"
Mar 13 01:45:37.238973 master-0 kubenswrapper[19170]: I0313 01:45:37.238856 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b9844-default-external-api-0" event={"ID":"979e6bc4-2aa2-4326-b7f2-c45f50b41c28","Type":"ContainerStarted","Data":"0c2095656099b3b45238a1bf4e00a44679bb9446b938e58a90f59baf2f39dfba"}
Mar 13 01:45:37.268364 master-0 kubenswrapper[19170]: I0313 01:45:37.268239 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-b9844-default-internal-api-0" podStartSLOduration=18.268212656 podStartE2EDuration="18.268212656s" podCreationTimestamp="2026-03-13 01:45:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:45:37.264627175 +0000 UTC m=+1598.072748155" watchObservedRunningTime="2026-03-13 01:45:37.268212656 +0000 UTC m=+1598.076333626"
Mar 13 01:45:37.339660 master-0 kubenswrapper[19170]: I0313 01:45:37.338930 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5bb5b98c8c-485mq" podStartSLOduration=2.338908508 podStartE2EDuration="2.338908508s" podCreationTimestamp="2026-03-13 01:45:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:45:37.309076418 +0000 UTC m=+1598.117197378" watchObservedRunningTime="2026-03-13 01:45:37.338908508 +0000 UTC m=+1598.147029478"
Mar 13 01:45:37.358813 master-0 kubenswrapper[19170]: I0313 01:45:37.357872 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-b9844-default-external-api-0" podStartSLOduration=6.357847402 podStartE2EDuration="6.357847402s" podCreationTimestamp="2026-03-13 01:45:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:45:37.338825886 +0000 UTC m=+1598.146946846" watchObservedRunningTime="2026-03-13 01:45:37.357847402 +0000 UTC m=+1598.165968372"
Mar 13 01:45:41.673037 master-0 kubenswrapper[19170]: I0313 01:45:41.672928 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-b9844-default-internal-api-0"
Mar 13 01:45:41.673037 master-0 kubenswrapper[19170]: I0313 01:45:41.673023 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-b9844-default-internal-api-0"
Mar 13 01:45:41.707093 master-0 kubenswrapper[19170]: I0313 01:45:41.707007 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-b9844-default-internal-api-0"
Mar 13 01:45:41.717688 master-0 kubenswrapper[19170]: I0313 01:45:41.717614 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-b9844-default-internal-api-0"
Mar 13 01:45:42.323945 master-0 kubenswrapper[19170]: I0313 01:45:42.323890 19170 generic.go:334] "Generic (PLEG): container finished" podID="fe1a1216-7989-411d-8b62-6d12fefcc8ae" containerID="7c9a685dd5818a5aa39096c57a07cd4601ed40fd49b319969ad79d4826f591b7" exitCode=0
Mar 13 01:45:42.325181 master-0 kubenswrapper[19170]: I0313 01:45:42.325110 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hmwsf" event={"ID":"fe1a1216-7989-411d-8b62-6d12fefcc8ae","Type":"ContainerDied","Data":"7c9a685dd5818a5aa39096c57a07cd4601ed40fd49b319969ad79d4826f591b7"}
Mar 13 01:45:42.325287 master-0 kubenswrapper[19170]: I0313 01:45:42.325203 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-b9844-default-internal-api-0"
Mar 13 01:45:42.325287 master-0 kubenswrapper[19170]: I0313 01:45:42.325221 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-b9844-default-internal-api-0"
Mar 13 01:45:43.005963 master-0 kubenswrapper[19170]: I0313 01:45:43.005888 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-b9844-default-external-api-0"
Mar 13 01:45:43.005963 master-0 kubenswrapper[19170]: I0313 01:45:43.005957 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-b9844-default-external-api-0"
Mar 13 01:45:43.038786 master-0 kubenswrapper[19170]: I0313 01:45:43.038692 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-b9844-default-external-api-0"
Mar 13 01:45:43.059318 master-0 kubenswrapper[19170]: I0313 01:45:43.058165 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-b9844-default-external-api-0"
Mar 13 01:45:43.334745 master-0 kubenswrapper[19170]: I0313 01:45:43.334041 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-b9844-default-external-api-0"
Mar 13 01:45:43.334745 master-0 kubenswrapper[19170]: I0313 01:45:43.334091 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-b9844-default-external-api-0"
Mar 13 01:45:43.816784 master-0 kubenswrapper[19170]: I0313 01:45:43.816728 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-hmwsf"
Mar 13 01:45:43.939056 master-0 kubenswrapper[19170]: I0313 01:45:43.938992 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fe1a1216-7989-411d-8b62-6d12fefcc8ae-config\") pod \"fe1a1216-7989-411d-8b62-6d12fefcc8ae\" (UID: \"fe1a1216-7989-411d-8b62-6d12fefcc8ae\") "
Mar 13 01:45:43.939283 master-0 kubenswrapper[19170]: I0313 01:45:43.939150 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe1a1216-7989-411d-8b62-6d12fefcc8ae-combined-ca-bundle\") pod \"fe1a1216-7989-411d-8b62-6d12fefcc8ae\" (UID: \"fe1a1216-7989-411d-8b62-6d12fefcc8ae\") "
Mar 13 01:45:43.939283 master-0 kubenswrapper[19170]: I0313 01:45:43.939212 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcb4q\" (UniqueName: \"kubernetes.io/projected/fe1a1216-7989-411d-8b62-6d12fefcc8ae-kube-api-access-pcb4q\") pod \"fe1a1216-7989-411d-8b62-6d12fefcc8ae\" (UID: \"fe1a1216-7989-411d-8b62-6d12fefcc8ae\") "
Mar 13 01:45:43.946613 master-0 kubenswrapper[19170]: I0313 01:45:43.942682 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe1a1216-7989-411d-8b62-6d12fefcc8ae-kube-api-access-pcb4q" (OuterVolumeSpecName: "kube-api-access-pcb4q") pod "fe1a1216-7989-411d-8b62-6d12fefcc8ae" (UID: "fe1a1216-7989-411d-8b62-6d12fefcc8ae"). InnerVolumeSpecName "kube-api-access-pcb4q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 01:45:43.965919 master-0 kubenswrapper[19170]: I0313 01:45:43.965864 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe1a1216-7989-411d-8b62-6d12fefcc8ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe1a1216-7989-411d-8b62-6d12fefcc8ae" (UID: "fe1a1216-7989-411d-8b62-6d12fefcc8ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 01:45:43.966319 master-0 kubenswrapper[19170]: I0313 01:45:43.966274 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe1a1216-7989-411d-8b62-6d12fefcc8ae-config" (OuterVolumeSpecName: "config") pod "fe1a1216-7989-411d-8b62-6d12fefcc8ae" (UID: "fe1a1216-7989-411d-8b62-6d12fefcc8ae"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 01:45:44.046677 master-0 kubenswrapper[19170]: I0313 01:45:44.046071 19170 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/fe1a1216-7989-411d-8b62-6d12fefcc8ae-config\") on node \"master-0\" DevicePath \"\""
Mar 13 01:45:44.046677 master-0 kubenswrapper[19170]: I0313 01:45:44.046115 19170 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe1a1216-7989-411d-8b62-6d12fefcc8ae-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 13 01:45:44.046677 master-0 kubenswrapper[19170]: I0313 01:45:44.046126 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcb4q\" (UniqueName: \"kubernetes.io/projected/fe1a1216-7989-411d-8b62-6d12fefcc8ae-kube-api-access-pcb4q\") on node \"master-0\" DevicePath \"\""
Mar 13 01:45:44.344341 master-0 kubenswrapper[19170]: I0313 01:45:44.344210 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hmwsf" event={"ID":"fe1a1216-7989-411d-8b62-6d12fefcc8ae","Type":"ContainerDied","Data":"d1b61caa55cdd875c3acc60560406d760cb6a3e28ccf87c877019df03450b942"}
Mar 13 01:45:44.344341 master-0 kubenswrapper[19170]: I0313 01:45:44.344274 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1b61caa55cdd875c3acc60560406d760cb6a3e28ccf87c877019df03450b942"
Mar 13 01:45:44.344679 master-0 kubenswrapper[19170]: I0313 01:45:44.344437 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-hmwsf"
Mar 13 01:45:44.344679 master-0 kubenswrapper[19170]: I0313 01:45:44.344524 19170 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 13 01:45:44.344679 master-0 kubenswrapper[19170]: I0313 01:45:44.344579 19170 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 13 01:45:44.666362 master-0 kubenswrapper[19170]: I0313 01:45:44.658768 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5df4f6b69c-hxt7t"]
Mar 13 01:45:44.666362 master-0 kubenswrapper[19170]: E0313 01:45:44.659541 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe1a1216-7989-411d-8b62-6d12fefcc8ae" containerName="neutron-db-sync"
Mar 13 01:45:44.666362 master-0 kubenswrapper[19170]: I0313 01:45:44.659557 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe1a1216-7989-411d-8b62-6d12fefcc8ae" containerName="neutron-db-sync"
Mar 13 01:45:44.666362 master-0 kubenswrapper[19170]: I0313 01:45:44.659836 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe1a1216-7989-411d-8b62-6d12fefcc8ae" containerName="neutron-db-sync"
Mar 13 01:45:44.667617 master-0 kubenswrapper[19170]: I0313 01:45:44.666750 19170 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-5df4f6b69c-hxt7t" Mar 13 01:45:44.680700 master-0 kubenswrapper[19170]: I0313 01:45:44.679240 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5df4f6b69c-hxt7t"] Mar 13 01:45:44.765755 master-0 kubenswrapper[19170]: I0313 01:45:44.762135 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/695ba6a1-30ee-4283-b610-072ab694be17-dns-swift-storage-0\") pod \"dnsmasq-dns-5df4f6b69c-hxt7t\" (UID: \"695ba6a1-30ee-4283-b610-072ab694be17\") " pod="openstack/dnsmasq-dns-5df4f6b69c-hxt7t" Mar 13 01:45:44.765755 master-0 kubenswrapper[19170]: I0313 01:45:44.762310 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/695ba6a1-30ee-4283-b610-072ab694be17-ovsdbserver-nb\") pod \"dnsmasq-dns-5df4f6b69c-hxt7t\" (UID: \"695ba6a1-30ee-4283-b610-072ab694be17\") " pod="openstack/dnsmasq-dns-5df4f6b69c-hxt7t" Mar 13 01:45:44.765755 master-0 kubenswrapper[19170]: I0313 01:45:44.762358 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq598\" (UniqueName: \"kubernetes.io/projected/695ba6a1-30ee-4283-b610-072ab694be17-kube-api-access-lq598\") pod \"dnsmasq-dns-5df4f6b69c-hxt7t\" (UID: \"695ba6a1-30ee-4283-b610-072ab694be17\") " pod="openstack/dnsmasq-dns-5df4f6b69c-hxt7t" Mar 13 01:45:44.765755 master-0 kubenswrapper[19170]: I0313 01:45:44.762408 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/695ba6a1-30ee-4283-b610-072ab694be17-config\") pod \"dnsmasq-dns-5df4f6b69c-hxt7t\" (UID: \"695ba6a1-30ee-4283-b610-072ab694be17\") " pod="openstack/dnsmasq-dns-5df4f6b69c-hxt7t" Mar 13 01:45:44.765755 master-0 kubenswrapper[19170]: 
I0313 01:45:44.762439 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/695ba6a1-30ee-4283-b610-072ab694be17-ovsdbserver-sb\") pod \"dnsmasq-dns-5df4f6b69c-hxt7t\" (UID: \"695ba6a1-30ee-4283-b610-072ab694be17\") " pod="openstack/dnsmasq-dns-5df4f6b69c-hxt7t" Mar 13 01:45:44.765755 master-0 kubenswrapper[19170]: I0313 01:45:44.762459 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/695ba6a1-30ee-4283-b610-072ab694be17-dns-svc\") pod \"dnsmasq-dns-5df4f6b69c-hxt7t\" (UID: \"695ba6a1-30ee-4283-b610-072ab694be17\") " pod="openstack/dnsmasq-dns-5df4f6b69c-hxt7t" Mar 13 01:45:44.815089 master-0 kubenswrapper[19170]: I0313 01:45:44.811574 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5df95cbc76-9kdvv"] Mar 13 01:45:44.815089 master-0 kubenswrapper[19170]: I0313 01:45:44.814138 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5df95cbc76-9kdvv" Mar 13 01:45:44.818700 master-0 kubenswrapper[19170]: I0313 01:45:44.817079 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 13 01:45:44.818700 master-0 kubenswrapper[19170]: I0313 01:45:44.817335 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 13 01:45:44.818700 master-0 kubenswrapper[19170]: I0313 01:45:44.817949 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 13 01:45:44.823287 master-0 kubenswrapper[19170]: I0313 01:45:44.823172 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5df95cbc76-9kdvv"] Mar 13 01:45:44.864151 master-0 kubenswrapper[19170]: I0313 01:45:44.864107 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/695ba6a1-30ee-4283-b610-072ab694be17-ovsdbserver-nb\") pod \"dnsmasq-dns-5df4f6b69c-hxt7t\" (UID: \"695ba6a1-30ee-4283-b610-072ab694be17\") " pod="openstack/dnsmasq-dns-5df4f6b69c-hxt7t" Mar 13 01:45:44.864151 master-0 kubenswrapper[19170]: I0313 01:45:44.864195 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq598\" (UniqueName: \"kubernetes.io/projected/695ba6a1-30ee-4283-b610-072ab694be17-kube-api-access-lq598\") pod \"dnsmasq-dns-5df4f6b69c-hxt7t\" (UID: \"695ba6a1-30ee-4283-b610-072ab694be17\") " pod="openstack/dnsmasq-dns-5df4f6b69c-hxt7t" Mar 13 01:45:44.864484 master-0 kubenswrapper[19170]: I0313 01:45:44.864244 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/695ba6a1-30ee-4283-b610-072ab694be17-config\") pod \"dnsmasq-dns-5df4f6b69c-hxt7t\" (UID: \"695ba6a1-30ee-4283-b610-072ab694be17\") " pod="openstack/dnsmasq-dns-5df4f6b69c-hxt7t" Mar 13 01:45:44.864484 
master-0 kubenswrapper[19170]: I0313 01:45:44.864271 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/695ba6a1-30ee-4283-b610-072ab694be17-ovsdbserver-sb\") pod \"dnsmasq-dns-5df4f6b69c-hxt7t\" (UID: \"695ba6a1-30ee-4283-b610-072ab694be17\") " pod="openstack/dnsmasq-dns-5df4f6b69c-hxt7t" Mar 13 01:45:44.864484 master-0 kubenswrapper[19170]: I0313 01:45:44.864290 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/695ba6a1-30ee-4283-b610-072ab694be17-dns-svc\") pod \"dnsmasq-dns-5df4f6b69c-hxt7t\" (UID: \"695ba6a1-30ee-4283-b610-072ab694be17\") " pod="openstack/dnsmasq-dns-5df4f6b69c-hxt7t" Mar 13 01:45:44.864484 master-0 kubenswrapper[19170]: I0313 01:45:44.864322 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74m6d\" (UniqueName: \"kubernetes.io/projected/28caadd7-b003-4cb7-83d7-508a641ad738-kube-api-access-74m6d\") pod \"neutron-5df95cbc76-9kdvv\" (UID: \"28caadd7-b003-4cb7-83d7-508a641ad738\") " pod="openstack/neutron-5df95cbc76-9kdvv" Mar 13 01:45:44.864484 master-0 kubenswrapper[19170]: I0313 01:45:44.864348 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/28caadd7-b003-4cb7-83d7-508a641ad738-config\") pod \"neutron-5df95cbc76-9kdvv\" (UID: \"28caadd7-b003-4cb7-83d7-508a641ad738\") " pod="openstack/neutron-5df95cbc76-9kdvv" Mar 13 01:45:44.864484 master-0 kubenswrapper[19170]: I0313 01:45:44.864365 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/28caadd7-b003-4cb7-83d7-508a641ad738-ovndb-tls-certs\") pod \"neutron-5df95cbc76-9kdvv\" (UID: \"28caadd7-b003-4cb7-83d7-508a641ad738\") " 
pod="openstack/neutron-5df95cbc76-9kdvv" Mar 13 01:45:44.864484 master-0 kubenswrapper[19170]: I0313 01:45:44.864405 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/695ba6a1-30ee-4283-b610-072ab694be17-dns-swift-storage-0\") pod \"dnsmasq-dns-5df4f6b69c-hxt7t\" (UID: \"695ba6a1-30ee-4283-b610-072ab694be17\") " pod="openstack/dnsmasq-dns-5df4f6b69c-hxt7t" Mar 13 01:45:44.864484 master-0 kubenswrapper[19170]: I0313 01:45:44.864434 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28caadd7-b003-4cb7-83d7-508a641ad738-combined-ca-bundle\") pod \"neutron-5df95cbc76-9kdvv\" (UID: \"28caadd7-b003-4cb7-83d7-508a641ad738\") " pod="openstack/neutron-5df95cbc76-9kdvv" Mar 13 01:45:44.864484 master-0 kubenswrapper[19170]: I0313 01:45:44.864477 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/28caadd7-b003-4cb7-83d7-508a641ad738-httpd-config\") pod \"neutron-5df95cbc76-9kdvv\" (UID: \"28caadd7-b003-4cb7-83d7-508a641ad738\") " pod="openstack/neutron-5df95cbc76-9kdvv" Mar 13 01:45:44.865349 master-0 kubenswrapper[19170]: I0313 01:45:44.865315 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/695ba6a1-30ee-4283-b610-072ab694be17-ovsdbserver-nb\") pod \"dnsmasq-dns-5df4f6b69c-hxt7t\" (UID: \"695ba6a1-30ee-4283-b610-072ab694be17\") " pod="openstack/dnsmasq-dns-5df4f6b69c-hxt7t" Mar 13 01:45:44.866698 master-0 kubenswrapper[19170]: I0313 01:45:44.866228 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/695ba6a1-30ee-4283-b610-072ab694be17-config\") pod \"dnsmasq-dns-5df4f6b69c-hxt7t\" (UID: 
\"695ba6a1-30ee-4283-b610-072ab694be17\") " pod="openstack/dnsmasq-dns-5df4f6b69c-hxt7t" Mar 13 01:45:44.867481 master-0 kubenswrapper[19170]: I0313 01:45:44.867017 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/695ba6a1-30ee-4283-b610-072ab694be17-dns-swift-storage-0\") pod \"dnsmasq-dns-5df4f6b69c-hxt7t\" (UID: \"695ba6a1-30ee-4283-b610-072ab694be17\") " pod="openstack/dnsmasq-dns-5df4f6b69c-hxt7t" Mar 13 01:45:44.867481 master-0 kubenswrapper[19170]: I0313 01:45:44.867142 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/695ba6a1-30ee-4283-b610-072ab694be17-ovsdbserver-sb\") pod \"dnsmasq-dns-5df4f6b69c-hxt7t\" (UID: \"695ba6a1-30ee-4283-b610-072ab694be17\") " pod="openstack/dnsmasq-dns-5df4f6b69c-hxt7t" Mar 13 01:45:44.875143 master-0 kubenswrapper[19170]: I0313 01:45:44.875080 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/695ba6a1-30ee-4283-b610-072ab694be17-dns-svc\") pod \"dnsmasq-dns-5df4f6b69c-hxt7t\" (UID: \"695ba6a1-30ee-4283-b610-072ab694be17\") " pod="openstack/dnsmasq-dns-5df4f6b69c-hxt7t" Mar 13 01:45:44.886891 master-0 kubenswrapper[19170]: I0313 01:45:44.883373 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq598\" (UniqueName: \"kubernetes.io/projected/695ba6a1-30ee-4283-b610-072ab694be17-kube-api-access-lq598\") pod \"dnsmasq-dns-5df4f6b69c-hxt7t\" (UID: \"695ba6a1-30ee-4283-b610-072ab694be17\") " pod="openstack/dnsmasq-dns-5df4f6b69c-hxt7t" Mar 13 01:45:44.967922 master-0 kubenswrapper[19170]: I0313 01:45:44.967848 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74m6d\" (UniqueName: \"kubernetes.io/projected/28caadd7-b003-4cb7-83d7-508a641ad738-kube-api-access-74m6d\") pod \"neutron-5df95cbc76-9kdvv\" 
(UID: \"28caadd7-b003-4cb7-83d7-508a641ad738\") " pod="openstack/neutron-5df95cbc76-9kdvv" Mar 13 01:45:44.967922 master-0 kubenswrapper[19170]: I0313 01:45:44.967927 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/28caadd7-b003-4cb7-83d7-508a641ad738-config\") pod \"neutron-5df95cbc76-9kdvv\" (UID: \"28caadd7-b003-4cb7-83d7-508a641ad738\") " pod="openstack/neutron-5df95cbc76-9kdvv" Mar 13 01:45:44.968160 master-0 kubenswrapper[19170]: I0313 01:45:44.967949 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/28caadd7-b003-4cb7-83d7-508a641ad738-ovndb-tls-certs\") pod \"neutron-5df95cbc76-9kdvv\" (UID: \"28caadd7-b003-4cb7-83d7-508a641ad738\") " pod="openstack/neutron-5df95cbc76-9kdvv" Mar 13 01:45:44.968160 master-0 kubenswrapper[19170]: I0313 01:45:44.968003 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28caadd7-b003-4cb7-83d7-508a641ad738-combined-ca-bundle\") pod \"neutron-5df95cbc76-9kdvv\" (UID: \"28caadd7-b003-4cb7-83d7-508a641ad738\") " pod="openstack/neutron-5df95cbc76-9kdvv" Mar 13 01:45:44.968160 master-0 kubenswrapper[19170]: I0313 01:45:44.968051 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/28caadd7-b003-4cb7-83d7-508a641ad738-httpd-config\") pod \"neutron-5df95cbc76-9kdvv\" (UID: \"28caadd7-b003-4cb7-83d7-508a641ad738\") " pod="openstack/neutron-5df95cbc76-9kdvv" Mar 13 01:45:44.975375 master-0 kubenswrapper[19170]: I0313 01:45:44.975312 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/28caadd7-b003-4cb7-83d7-508a641ad738-ovndb-tls-certs\") pod \"neutron-5df95cbc76-9kdvv\" (UID: \"28caadd7-b003-4cb7-83d7-508a641ad738\") " 
pod="openstack/neutron-5df95cbc76-9kdvv" Mar 13 01:45:44.979284 master-0 kubenswrapper[19170]: I0313 01:45:44.978872 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/28caadd7-b003-4cb7-83d7-508a641ad738-httpd-config\") pod \"neutron-5df95cbc76-9kdvv\" (UID: \"28caadd7-b003-4cb7-83d7-508a641ad738\") " pod="openstack/neutron-5df95cbc76-9kdvv" Mar 13 01:45:44.985510 master-0 kubenswrapper[19170]: I0313 01:45:44.985465 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/28caadd7-b003-4cb7-83d7-508a641ad738-config\") pod \"neutron-5df95cbc76-9kdvv\" (UID: \"28caadd7-b003-4cb7-83d7-508a641ad738\") " pod="openstack/neutron-5df95cbc76-9kdvv" Mar 13 01:45:44.986050 master-0 kubenswrapper[19170]: I0313 01:45:44.986023 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28caadd7-b003-4cb7-83d7-508a641ad738-combined-ca-bundle\") pod \"neutron-5df95cbc76-9kdvv\" (UID: \"28caadd7-b003-4cb7-83d7-508a641ad738\") " pod="openstack/neutron-5df95cbc76-9kdvv" Mar 13 01:45:44.992997 master-0 kubenswrapper[19170]: I0313 01:45:44.992799 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74m6d\" (UniqueName: \"kubernetes.io/projected/28caadd7-b003-4cb7-83d7-508a641ad738-kube-api-access-74m6d\") pod \"neutron-5df95cbc76-9kdvv\" (UID: \"28caadd7-b003-4cb7-83d7-508a641ad738\") " pod="openstack/neutron-5df95cbc76-9kdvv" Mar 13 01:45:45.023216 master-0 kubenswrapper[19170]: I0313 01:45:45.022427 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:45.035201 master-0 kubenswrapper[19170]: I0313 01:45:45.035118 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5df4f6b69c-hxt7t" Mar 13 01:45:45.137524 master-0 kubenswrapper[19170]: I0313 01:45:45.137117 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:45:45.192832 master-0 kubenswrapper[19170]: I0313 01:45:45.192212 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5df95cbc76-9kdvv" Mar 13 01:45:45.441658 master-0 kubenswrapper[19170]: I0313 01:45:45.412300 19170 generic.go:334] "Generic (PLEG): container finished" podID="7dbc43a0-77ee-41aa-87b2-730586a6fae4" containerID="4dcf91abc5e80dc29b12424d96606b7c2592ac7ba11ee256455f262200ef68f4" exitCode=0 Mar 13 01:45:45.441658 master-0 kubenswrapper[19170]: I0313 01:45:45.413229 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-051b7-db-sync-5fc74" event={"ID":"7dbc43a0-77ee-41aa-87b2-730586a6fae4","Type":"ContainerDied","Data":"4dcf91abc5e80dc29b12424d96606b7c2592ac7ba11ee256455f262200ef68f4"} Mar 13 01:45:46.035390 master-0 kubenswrapper[19170]: I0313 01:45:46.035326 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:46.035612 master-0 kubenswrapper[19170]: I0313 01:45:46.035496 19170 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 01:45:46.119661 master-0 kubenswrapper[19170]: I0313 01:45:46.115702 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5df4f6b69c-hxt7t"] Mar 13 01:45:46.155655 master-0 kubenswrapper[19170]: I0313 01:45:46.155261 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:45:46.260044 master-0 kubenswrapper[19170]: I0313 01:45:46.259835 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5df95cbc76-9kdvv"] Mar 13 01:45:46.281667 
master-0 kubenswrapper[19170]: W0313 01:45:46.281598 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28caadd7_b003_4cb7_83d7_508a641ad738.slice/crio-87572d1e75373b253b42c5ac1f13c35d9b32c2d624682ebc05df8a24aa9957f5 WatchSource:0}: Error finding container 87572d1e75373b253b42c5ac1f13c35d9b32c2d624682ebc05df8a24aa9957f5: Status 404 returned error can't find the container with id 87572d1e75373b253b42c5ac1f13c35d9b32c2d624682ebc05df8a24aa9957f5 Mar 13 01:45:46.427837 master-0 kubenswrapper[19170]: I0313 01:45:46.427766 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5df95cbc76-9kdvv" event={"ID":"28caadd7-b003-4cb7-83d7-508a641ad738","Type":"ContainerStarted","Data":"87572d1e75373b253b42c5ac1f13c35d9b32c2d624682ebc05df8a24aa9957f5"} Mar 13 01:45:46.432970 master-0 kubenswrapper[19170]: I0313 01:45:46.431141 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5df4f6b69c-hxt7t" event={"ID":"695ba6a1-30ee-4283-b610-072ab694be17","Type":"ContainerStarted","Data":"90c5f3195f96aed86e396be92f5dd065b24d2e3538c40ffe8cf80f62443c1885"} Mar 13 01:45:46.432970 master-0 kubenswrapper[19170]: I0313 01:45:46.431174 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5df4f6b69c-hxt7t" event={"ID":"695ba6a1-30ee-4283-b610-072ab694be17","Type":"ContainerStarted","Data":"52e2d48a0dd58ee1bd8acfd60ff5995404c04f19756b29167e2534c3a1ca1547"} Mar 13 01:45:46.978503 master-0 kubenswrapper[19170]: I0313 01:45:46.978464 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-051b7-db-sync-5fc74" Mar 13 01:45:47.081866 master-0 kubenswrapper[19170]: I0313 01:45:47.081601 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dbc43a0-77ee-41aa-87b2-730586a6fae4-combined-ca-bundle\") pod \"7dbc43a0-77ee-41aa-87b2-730586a6fae4\" (UID: \"7dbc43a0-77ee-41aa-87b2-730586a6fae4\") " Mar 13 01:45:47.081866 master-0 kubenswrapper[19170]: I0313 01:45:47.081708 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7dbc43a0-77ee-41aa-87b2-730586a6fae4-db-sync-config-data\") pod \"7dbc43a0-77ee-41aa-87b2-730586a6fae4\" (UID: \"7dbc43a0-77ee-41aa-87b2-730586a6fae4\") " Mar 13 01:45:47.081866 master-0 kubenswrapper[19170]: I0313 01:45:47.081747 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgd56\" (UniqueName: \"kubernetes.io/projected/7dbc43a0-77ee-41aa-87b2-730586a6fae4-kube-api-access-pgd56\") pod \"7dbc43a0-77ee-41aa-87b2-730586a6fae4\" (UID: \"7dbc43a0-77ee-41aa-87b2-730586a6fae4\") " Mar 13 01:45:47.081866 master-0 kubenswrapper[19170]: I0313 01:45:47.081793 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dbc43a0-77ee-41aa-87b2-730586a6fae4-scripts\") pod \"7dbc43a0-77ee-41aa-87b2-730586a6fae4\" (UID: \"7dbc43a0-77ee-41aa-87b2-730586a6fae4\") " Mar 13 01:45:47.081866 master-0 kubenswrapper[19170]: I0313 01:45:47.081832 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7dbc43a0-77ee-41aa-87b2-730586a6fae4-etc-machine-id\") pod \"7dbc43a0-77ee-41aa-87b2-730586a6fae4\" (UID: \"7dbc43a0-77ee-41aa-87b2-730586a6fae4\") " Mar 13 01:45:47.082236 master-0 kubenswrapper[19170]: I0313 01:45:47.081917 19170 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dbc43a0-77ee-41aa-87b2-730586a6fae4-config-data\") pod \"7dbc43a0-77ee-41aa-87b2-730586a6fae4\" (UID: \"7dbc43a0-77ee-41aa-87b2-730586a6fae4\") " Mar 13 01:45:47.087873 master-0 kubenswrapper[19170]: I0313 01:45:47.087821 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dbc43a0-77ee-41aa-87b2-730586a6fae4-scripts" (OuterVolumeSpecName: "scripts") pod "7dbc43a0-77ee-41aa-87b2-730586a6fae4" (UID: "7dbc43a0-77ee-41aa-87b2-730586a6fae4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:45:47.088845 master-0 kubenswrapper[19170]: I0313 01:45:47.088789 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dbc43a0-77ee-41aa-87b2-730586a6fae4-kube-api-access-pgd56" (OuterVolumeSpecName: "kube-api-access-pgd56") pod "7dbc43a0-77ee-41aa-87b2-730586a6fae4" (UID: "7dbc43a0-77ee-41aa-87b2-730586a6fae4"). InnerVolumeSpecName "kube-api-access-pgd56". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:45:47.088916 master-0 kubenswrapper[19170]: I0313 01:45:47.088857 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7dbc43a0-77ee-41aa-87b2-730586a6fae4-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7dbc43a0-77ee-41aa-87b2-730586a6fae4" (UID: "7dbc43a0-77ee-41aa-87b2-730586a6fae4"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:45:47.101345 master-0 kubenswrapper[19170]: I0313 01:45:47.101284 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dbc43a0-77ee-41aa-87b2-730586a6fae4-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7dbc43a0-77ee-41aa-87b2-730586a6fae4" (UID: "7dbc43a0-77ee-41aa-87b2-730586a6fae4"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:45:47.161377 master-0 kubenswrapper[19170]: I0313 01:45:47.160758 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dbc43a0-77ee-41aa-87b2-730586a6fae4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7dbc43a0-77ee-41aa-87b2-730586a6fae4" (UID: "7dbc43a0-77ee-41aa-87b2-730586a6fae4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:45:47.181645 master-0 kubenswrapper[19170]: I0313 01:45:47.180805 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dbc43a0-77ee-41aa-87b2-730586a6fae4-config-data" (OuterVolumeSpecName: "config-data") pod "7dbc43a0-77ee-41aa-87b2-730586a6fae4" (UID: "7dbc43a0-77ee-41aa-87b2-730586a6fae4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:45:47.185245 master-0 kubenswrapper[19170]: I0313 01:45:47.184239 19170 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dbc43a0-77ee-41aa-87b2-730586a6fae4-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:47.185245 master-0 kubenswrapper[19170]: I0313 01:45:47.184280 19170 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7dbc43a0-77ee-41aa-87b2-730586a6fae4-db-sync-config-data\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:47.185245 master-0 kubenswrapper[19170]: I0313 01:45:47.184295 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgd56\" (UniqueName: \"kubernetes.io/projected/7dbc43a0-77ee-41aa-87b2-730586a6fae4-kube-api-access-pgd56\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:47.185245 master-0 kubenswrapper[19170]: I0313 01:45:47.184305 19170 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dbc43a0-77ee-41aa-87b2-730586a6fae4-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:47.185245 master-0 kubenswrapper[19170]: I0313 01:45:47.184313 19170 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7dbc43a0-77ee-41aa-87b2-730586a6fae4-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:47.185245 master-0 kubenswrapper[19170]: I0313 01:45:47.184323 19170 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dbc43a0-77ee-41aa-87b2-730586a6fae4-config-data\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:47.443224 master-0 kubenswrapper[19170]: I0313 01:45:47.443176 19170 generic.go:334] "Generic (PLEG): container finished" podID="695ba6a1-30ee-4283-b610-072ab694be17" 
containerID="90c5f3195f96aed86e396be92f5dd065b24d2e3538c40ffe8cf80f62443c1885" exitCode=0 Mar 13 01:45:47.443433 master-0 kubenswrapper[19170]: I0313 01:45:47.443234 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5df4f6b69c-hxt7t" event={"ID":"695ba6a1-30ee-4283-b610-072ab694be17","Type":"ContainerDied","Data":"90c5f3195f96aed86e396be92f5dd065b24d2e3538c40ffe8cf80f62443c1885"} Mar 13 01:45:47.446715 master-0 kubenswrapper[19170]: I0313 01:45:47.446559 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5df95cbc76-9kdvv" event={"ID":"28caadd7-b003-4cb7-83d7-508a641ad738","Type":"ContainerStarted","Data":"44a94b74db9eb44c8ae267432b09deff59745638400632437a0c42658756ff5c"} Mar 13 01:45:47.446715 master-0 kubenswrapper[19170]: I0313 01:45:47.446613 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5df95cbc76-9kdvv" event={"ID":"28caadd7-b003-4cb7-83d7-508a641ad738","Type":"ContainerStarted","Data":"35334b7bb9a526cd0fec9834e9c7d8199fbdf00ef6f89d2ad12408e67b450d4b"} Mar 13 01:45:47.446715 master-0 kubenswrapper[19170]: I0313 01:45:47.446641 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5df95cbc76-9kdvv" Mar 13 01:45:47.451976 master-0 kubenswrapper[19170]: I0313 01:45:47.451912 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-051b7-db-sync-5fc74" event={"ID":"7dbc43a0-77ee-41aa-87b2-730586a6fae4","Type":"ContainerDied","Data":"bceadd6d63e1b08751e663cb6d9879f48fe664d64c06c529e19222b39f52fd11"} Mar 13 01:45:47.451976 master-0 kubenswrapper[19170]: I0313 01:45:47.451950 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bceadd6d63e1b08751e663cb6d9879f48fe664d64c06c529e19222b39f52fd11" Mar 13 01:45:47.452279 master-0 kubenswrapper[19170]: I0313 01:45:47.452006 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-051b7-db-sync-5fc74" Mar 13 01:45:47.516424 master-0 kubenswrapper[19170]: I0313 01:45:47.516358 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5df95cbc76-9kdvv" podStartSLOduration=3.5163409039999998 podStartE2EDuration="3.516340904s" podCreationTimestamp="2026-03-13 01:45:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:45:47.511932739 +0000 UTC m=+1608.320053699" watchObservedRunningTime="2026-03-13 01:45:47.516340904 +0000 UTC m=+1608.324461864" Mar 13 01:45:47.812497 master-0 kubenswrapper[19170]: I0313 01:45:47.811508 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-051b7-scheduler-0"] Mar 13 01:45:47.812497 master-0 kubenswrapper[19170]: E0313 01:45:47.812117 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dbc43a0-77ee-41aa-87b2-730586a6fae4" containerName="cinder-051b7-db-sync" Mar 13 01:45:47.812497 master-0 kubenswrapper[19170]: I0313 01:45:47.812132 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dbc43a0-77ee-41aa-87b2-730586a6fae4" containerName="cinder-051b7-db-sync" Mar 13 01:45:47.812497 master-0 kubenswrapper[19170]: I0313 01:45:47.812366 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dbc43a0-77ee-41aa-87b2-730586a6fae4" containerName="cinder-051b7-db-sync" Mar 13 01:45:47.833112 master-0 kubenswrapper[19170]: I0313 01:45:47.833058 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:45:47.845598 master-0 kubenswrapper[19170]: I0313 01:45:47.845554 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-051b7-config-data" Mar 13 01:45:47.846283 master-0 kubenswrapper[19170]: I0313 01:45:47.846258 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-051b7-scheduler-config-data" Mar 13 01:45:47.867657 master-0 kubenswrapper[19170]: I0313 01:45:47.858622 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-051b7-scheduler-0"] Mar 13 01:45:47.867657 master-0 kubenswrapper[19170]: I0313 01:45:47.859427 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-051b7-scripts" Mar 13 01:45:47.906688 master-0 kubenswrapper[19170]: I0313 01:45:47.902016 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/01cdc83b-2243-40c6-9709-bb442724340c-config-data-custom\") pod \"cinder-051b7-scheduler-0\" (UID: \"01cdc83b-2243-40c6-9709-bb442724340c\") " pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:45:47.906688 master-0 kubenswrapper[19170]: I0313 01:45:47.902092 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01cdc83b-2243-40c6-9709-bb442724340c-combined-ca-bundle\") pod \"cinder-051b7-scheduler-0\" (UID: \"01cdc83b-2243-40c6-9709-bb442724340c\") " pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:45:47.906688 master-0 kubenswrapper[19170]: I0313 01:45:47.902119 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/01cdc83b-2243-40c6-9709-bb442724340c-etc-machine-id\") pod \"cinder-051b7-scheduler-0\" (UID: 
\"01cdc83b-2243-40c6-9709-bb442724340c\") " pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:45:47.906688 master-0 kubenswrapper[19170]: I0313 01:45:47.902161 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92zdw\" (UniqueName: \"kubernetes.io/projected/01cdc83b-2243-40c6-9709-bb442724340c-kube-api-access-92zdw\") pod \"cinder-051b7-scheduler-0\" (UID: \"01cdc83b-2243-40c6-9709-bb442724340c\") " pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:45:47.906688 master-0 kubenswrapper[19170]: I0313 01:45:47.902180 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01cdc83b-2243-40c6-9709-bb442724340c-scripts\") pod \"cinder-051b7-scheduler-0\" (UID: \"01cdc83b-2243-40c6-9709-bb442724340c\") " pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:45:47.906688 master-0 kubenswrapper[19170]: I0313 01:45:47.902196 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01cdc83b-2243-40c6-9709-bb442724340c-config-data\") pod \"cinder-051b7-scheduler-0\" (UID: \"01cdc83b-2243-40c6-9709-bb442724340c\") " pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:45:47.910649 master-0 kubenswrapper[19170]: I0313 01:45:47.909124 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-051b7-volume-lvm-iscsi-0"] Mar 13 01:45:47.913660 master-0 kubenswrapper[19170]: I0313 01:45:47.910901 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:47.953662 master-0 kubenswrapper[19170]: I0313 01:45:47.946358 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-051b7-volume-lvm-iscsi-config-data" Mar 13 01:45:47.990729 master-0 kubenswrapper[19170]: I0313 01:45:47.982142 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-051b7-volume-lvm-iscsi-0"] Mar 13 01:45:48.039287 master-0 kubenswrapper[19170]: I0313 01:45:48.038949 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01cdc83b-2243-40c6-9709-bb442724340c-combined-ca-bundle\") pod \"cinder-051b7-scheduler-0\" (UID: \"01cdc83b-2243-40c6-9709-bb442724340c\") " pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:45:48.039287 master-0 kubenswrapper[19170]: I0313 01:45:48.039008 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/01cdc83b-2243-40c6-9709-bb442724340c-etc-machine-id\") pod \"cinder-051b7-scheduler-0\" (UID: \"01cdc83b-2243-40c6-9709-bb442724340c\") " pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:45:48.039287 master-0 kubenswrapper[19170]: I0313 01:45:48.039066 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92zdw\" (UniqueName: \"kubernetes.io/projected/01cdc83b-2243-40c6-9709-bb442724340c-kube-api-access-92zdw\") pod \"cinder-051b7-scheduler-0\" (UID: \"01cdc83b-2243-40c6-9709-bb442724340c\") " pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:45:48.039287 master-0 kubenswrapper[19170]: I0313 01:45:48.039098 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-etc-iscsi\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: 
\"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.039287 master-0 kubenswrapper[19170]: I0313 01:45:48.039127 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01cdc83b-2243-40c6-9709-bb442724340c-scripts\") pod \"cinder-051b7-scheduler-0\" (UID: \"01cdc83b-2243-40c6-9709-bb442724340c\") " pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:45:48.039287 master-0 kubenswrapper[19170]: I0313 01:45:48.039148 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01cdc83b-2243-40c6-9709-bb442724340c-config-data\") pod \"cinder-051b7-scheduler-0\" (UID: \"01cdc83b-2243-40c6-9709-bb442724340c\") " pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:45:48.039287 master-0 kubenswrapper[19170]: I0313 01:45:48.039173 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-combined-ca-bundle\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.039287 master-0 kubenswrapper[19170]: I0313 01:45:48.039224 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-var-locks-brick\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.039287 master-0 kubenswrapper[19170]: I0313 01:45:48.039251 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-sys\") pod 
\"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.039287 master-0 kubenswrapper[19170]: I0313 01:45:48.039282 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfplb\" (UniqueName: \"kubernetes.io/projected/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-kube-api-access-kfplb\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.039900 master-0 kubenswrapper[19170]: I0313 01:45:48.039318 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-config-data\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.039900 master-0 kubenswrapper[19170]: I0313 01:45:48.039349 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-scripts\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.039900 master-0 kubenswrapper[19170]: I0313 01:45:48.039377 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-etc-machine-id\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.039900 master-0 kubenswrapper[19170]: I0313 01:45:48.039419 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-etc-nvme\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.039900 master-0 kubenswrapper[19170]: I0313 01:45:48.039438 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-run\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.039900 master-0 kubenswrapper[19170]: I0313 01:45:48.039495 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-lib-modules\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.039900 master-0 kubenswrapper[19170]: I0313 01:45:48.039535 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-var-locks-cinder\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.039900 master-0 kubenswrapper[19170]: I0313 01:45:48.039577 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-var-lib-cinder\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.039900 master-0 kubenswrapper[19170]: I0313 
01:45:48.039602 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-config-data-custom\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.039900 master-0 kubenswrapper[19170]: I0313 01:45:48.039667 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/01cdc83b-2243-40c6-9709-bb442724340c-config-data-custom\") pod \"cinder-051b7-scheduler-0\" (UID: \"01cdc83b-2243-40c6-9709-bb442724340c\") " pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:45:48.039900 master-0 kubenswrapper[19170]: I0313 01:45:48.039695 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-dev\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.064808 master-0 kubenswrapper[19170]: I0313 01:45:48.048373 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/01cdc83b-2243-40c6-9709-bb442724340c-etc-machine-id\") pod \"cinder-051b7-scheduler-0\" (UID: \"01cdc83b-2243-40c6-9709-bb442724340c\") " pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:45:48.064808 master-0 kubenswrapper[19170]: I0313 01:45:48.050318 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01cdc83b-2243-40c6-9709-bb442724340c-config-data\") pod \"cinder-051b7-scheduler-0\" (UID: \"01cdc83b-2243-40c6-9709-bb442724340c\") " pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:45:48.064808 master-0 kubenswrapper[19170]: 
I0313 01:45:48.051803 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01cdc83b-2243-40c6-9709-bb442724340c-scripts\") pod \"cinder-051b7-scheduler-0\" (UID: \"01cdc83b-2243-40c6-9709-bb442724340c\") " pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:45:48.099654 master-0 kubenswrapper[19170]: I0313 01:45:48.091187 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/01cdc83b-2243-40c6-9709-bb442724340c-config-data-custom\") pod \"cinder-051b7-scheduler-0\" (UID: \"01cdc83b-2243-40c6-9709-bb442724340c\") " pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:45:48.121662 master-0 kubenswrapper[19170]: I0313 01:45:48.118395 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01cdc83b-2243-40c6-9709-bb442724340c-combined-ca-bundle\") pod \"cinder-051b7-scheduler-0\" (UID: \"01cdc83b-2243-40c6-9709-bb442724340c\") " pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:45:48.125665 master-0 kubenswrapper[19170]: I0313 01:45:48.119923 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-051b7-backup-0"] Mar 13 01:45:48.125665 master-0 kubenswrapper[19170]: I0313 01:45:48.125386 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.144667 master-0 kubenswrapper[19170]: I0313 01:45:48.136896 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-051b7-backup-config-data" Mar 13 01:45:48.144667 master-0 kubenswrapper[19170]: I0313 01:45:48.143483 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-var-locks-cinder\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.144667 master-0 kubenswrapper[19170]: I0313 01:45:48.143601 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-var-lib-cinder\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.144667 master-0 kubenswrapper[19170]: I0313 01:45:48.143623 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-config-data-custom\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.144667 master-0 kubenswrapper[19170]: I0313 01:45:48.143676 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-dev\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.144667 master-0 kubenswrapper[19170]: I0313 01:45:48.143739 19170 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-etc-iscsi\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.144667 master-0 kubenswrapper[19170]: I0313 01:45:48.143761 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-combined-ca-bundle\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.144667 master-0 kubenswrapper[19170]: I0313 01:45:48.143795 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-var-locks-brick\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.144667 master-0 kubenswrapper[19170]: I0313 01:45:48.143814 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-sys\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.144667 master-0 kubenswrapper[19170]: I0313 01:45:48.143837 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfplb\" (UniqueName: \"kubernetes.io/projected/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-kube-api-access-kfplb\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.144667 master-0 kubenswrapper[19170]: I0313 
01:45:48.143864 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-config-data\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.144667 master-0 kubenswrapper[19170]: I0313 01:45:48.143884 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-scripts\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.144667 master-0 kubenswrapper[19170]: I0313 01:45:48.143906 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-etc-machine-id\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.144667 master-0 kubenswrapper[19170]: I0313 01:45:48.143936 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-etc-nvme\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.144667 master-0 kubenswrapper[19170]: I0313 01:45:48.143953 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-run\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.144667 master-0 kubenswrapper[19170]: I0313 
01:45:48.143973 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-lib-modules\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.144667 master-0 kubenswrapper[19170]: I0313 01:45:48.144060 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-lib-modules\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.144667 master-0 kubenswrapper[19170]: I0313 01:45:48.144258 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-var-locks-cinder\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.175818 master-0 kubenswrapper[19170]: I0313 01:45:48.145417 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-etc-nvme\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.175818 master-0 kubenswrapper[19170]: I0313 01:45:48.145510 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-sys\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.175818 master-0 kubenswrapper[19170]: I0313 01:45:48.145531 19170 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-var-locks-brick\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.175818 master-0 kubenswrapper[19170]: I0313 01:45:48.145584 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-dev\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.175818 master-0 kubenswrapper[19170]: I0313 01:45:48.145592 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-var-lib-cinder\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.175818 master-0 kubenswrapper[19170]: I0313 01:45:48.145539 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-run\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.175818 master-0 kubenswrapper[19170]: I0313 01:45:48.146026 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-etc-iscsi\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.175818 master-0 kubenswrapper[19170]: I0313 01:45:48.146083 19170 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-etc-machine-id\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.175818 master-0 kubenswrapper[19170]: I0313 01:45:48.150823 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5df4f6b69c-hxt7t"] Mar 13 01:45:48.175818 master-0 kubenswrapper[19170]: I0313 01:45:48.152919 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-config-data-custom\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.175818 master-0 kubenswrapper[19170]: I0313 01:45:48.156731 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-scripts\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.175818 master-0 kubenswrapper[19170]: I0313 01:45:48.158643 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-config-data\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.175818 master-0 kubenswrapper[19170]: I0313 01:45:48.158885 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-combined-ca-bundle\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " 
pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.175818 master-0 kubenswrapper[19170]: I0313 01:45:48.159882 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92zdw\" (UniqueName: \"kubernetes.io/projected/01cdc83b-2243-40c6-9709-bb442724340c-kube-api-access-92zdw\") pod \"cinder-051b7-scheduler-0\" (UID: \"01cdc83b-2243-40c6-9709-bb442724340c\") " pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:45:48.175818 master-0 kubenswrapper[19170]: I0313 01:45:48.166547 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfplb\" (UniqueName: \"kubernetes.io/projected/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-kube-api-access-kfplb\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.175818 master-0 kubenswrapper[19170]: I0313 01:45:48.172514 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-051b7-backup-0"] Mar 13 01:45:48.214655 master-0 kubenswrapper[19170]: I0313 01:45:48.214567 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-848b9c6b49-shsxt"] Mar 13 01:45:48.216714 master-0 kubenswrapper[19170]: I0313 01:45:48.216666 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848b9c6b49-shsxt" Mar 13 01:45:48.239960 master-0 kubenswrapper[19170]: I0313 01:45:48.238846 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848b9c6b49-shsxt"] Mar 13 01:45:48.241395 master-0 kubenswrapper[19170]: I0313 01:45:48.241175 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-051b7-api-0"] Mar 13 01:45:48.243392 master-0 kubenswrapper[19170]: I0313 01:45:48.243352 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-051b7-api-0" Mar 13 01:45:48.246271 master-0 kubenswrapper[19170]: I0313 01:45:48.245984 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-etc-machine-id\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.246271 master-0 kubenswrapper[19170]: I0313 01:45:48.246044 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-run\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.246271 master-0 kubenswrapper[19170]: I0313 01:45:48.246103 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k62s8\" (UniqueName: \"kubernetes.io/projected/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-kube-api-access-k62s8\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.246271 master-0 kubenswrapper[19170]: I0313 01:45:48.246127 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-config-data-custom\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.246271 master-0 kubenswrapper[19170]: I0313 01:45:48.246155 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-var-locks-brick\") pod 
\"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.246271 master-0 kubenswrapper[19170]: I0313 01:45:48.246192 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-var-locks-cinder\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.246271 master-0 kubenswrapper[19170]: I0313 01:45:48.246213 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-dev\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.246271 master-0 kubenswrapper[19170]: I0313 01:45:48.246257 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-etc-nvme\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.246271 master-0 kubenswrapper[19170]: I0313 01:45:48.246274 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-config-data\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.246826 master-0 kubenswrapper[19170]: I0313 01:45:48.246310 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-etc-iscsi\") pod 
\"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.246826 master-0 kubenswrapper[19170]: I0313 01:45:48.246330 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-sys\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.246826 master-0 kubenswrapper[19170]: I0313 01:45:48.246358 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-var-lib-cinder\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.246826 master-0 kubenswrapper[19170]: I0313 01:45:48.246380 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-scripts\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.246826 master-0 kubenswrapper[19170]: I0313 01:45:48.246399 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-combined-ca-bundle\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.246826 master-0 kubenswrapper[19170]: I0313 01:45:48.246422 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-lib-modules\") pod 
\"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.247007 master-0 kubenswrapper[19170]: I0313 01:45:48.246884 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-051b7-api-config-data" Mar 13 01:45:48.264024 master-0 kubenswrapper[19170]: I0313 01:45:48.263990 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:45:48.269105 master-0 kubenswrapper[19170]: I0313 01:45:48.269051 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-051b7-api-0"] Mar 13 01:45:48.338470 master-0 kubenswrapper[19170]: I0313 01:45:48.338264 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:48.349793 master-0 kubenswrapper[19170]: I0313 01:45:48.349612 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-lib-modules\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.349793 master-0 kubenswrapper[19170]: I0313 01:45:48.349755 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7dcf653-5c20-4aed-9c67-afbbb448d466-config-data\") pod \"cinder-051b7-api-0\" (UID: \"e7dcf653-5c20-4aed-9c67-afbbb448d466\") " pod="openstack/cinder-051b7-api-0" Mar 13 01:45:48.349793 master-0 kubenswrapper[19170]: I0313 01:45:48.349790 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75f3a8be-2740-470d-994a-e1eecf8327fb-ovsdbserver-nb\") pod \"dnsmasq-dns-848b9c6b49-shsxt\" (UID: 
\"75f3a8be-2740-470d-994a-e1eecf8327fb\") " pod="openstack/dnsmasq-dns-848b9c6b49-shsxt" Mar 13 01:45:48.349950 master-0 kubenswrapper[19170]: I0313 01:45:48.349811 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-etc-machine-id\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.349950 master-0 kubenswrapper[19170]: I0313 01:45:48.349833 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-run\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.349950 master-0 kubenswrapper[19170]: I0313 01:45:48.349853 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/75f3a8be-2740-470d-994a-e1eecf8327fb-dns-swift-storage-0\") pod \"dnsmasq-dns-848b9c6b49-shsxt\" (UID: \"75f3a8be-2740-470d-994a-e1eecf8327fb\") " pod="openstack/dnsmasq-dns-848b9c6b49-shsxt" Mar 13 01:45:48.349950 master-0 kubenswrapper[19170]: I0313 01:45:48.349872 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7dcf653-5c20-4aed-9c67-afbbb448d466-logs\") pod \"cinder-051b7-api-0\" (UID: \"e7dcf653-5c20-4aed-9c67-afbbb448d466\") " pod="openstack/cinder-051b7-api-0" Mar 13 01:45:48.349950 master-0 kubenswrapper[19170]: I0313 01:45:48.349895 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e7dcf653-5c20-4aed-9c67-afbbb448d466-etc-machine-id\") pod \"cinder-051b7-api-0\" (UID: 
\"e7dcf653-5c20-4aed-9c67-afbbb448d466\") " pod="openstack/cinder-051b7-api-0" Mar 13 01:45:48.349950 master-0 kubenswrapper[19170]: I0313 01:45:48.349923 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k62s8\" (UniqueName: \"kubernetes.io/projected/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-kube-api-access-k62s8\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.349950 master-0 kubenswrapper[19170]: I0313 01:45:48.349942 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-config-data-custom\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.350141 master-0 kubenswrapper[19170]: I0313 01:45:48.349969 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-var-locks-brick\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.350141 master-0 kubenswrapper[19170]: I0313 01:45:48.350002 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7dcf653-5c20-4aed-9c67-afbbb448d466-combined-ca-bundle\") pod \"cinder-051b7-api-0\" (UID: \"e7dcf653-5c20-4aed-9c67-afbbb448d466\") " pod="openstack/cinder-051b7-api-0" Mar 13 01:45:48.350141 master-0 kubenswrapper[19170]: I0313 01:45:48.350025 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-var-locks-cinder\") pod \"cinder-051b7-backup-0\" (UID: 
\"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.350141 master-0 kubenswrapper[19170]: I0313 01:45:48.350046 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-dev\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.350141 master-0 kubenswrapper[19170]: I0313 01:45:48.350071 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7dcf653-5c20-4aed-9c67-afbbb448d466-scripts\") pod \"cinder-051b7-api-0\" (UID: \"e7dcf653-5c20-4aed-9c67-afbbb448d466\") " pod="openstack/cinder-051b7-api-0" Mar 13 01:45:48.350141 master-0 kubenswrapper[19170]: I0313 01:45:48.350096 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75f3a8be-2740-470d-994a-e1eecf8327fb-config\") pod \"dnsmasq-dns-848b9c6b49-shsxt\" (UID: \"75f3a8be-2740-470d-994a-e1eecf8327fb\") " pod="openstack/dnsmasq-dns-848b9c6b49-shsxt" Mar 13 01:45:48.350141 master-0 kubenswrapper[19170]: I0313 01:45:48.350122 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-config-data\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.350141 master-0 kubenswrapper[19170]: I0313 01:45:48.350136 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-etc-nvme\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" 
Mar 13 01:45:48.350372 master-0 kubenswrapper[19170]: I0313 01:45:48.350153 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7dcf653-5c20-4aed-9c67-afbbb448d466-config-data-custom\") pod \"cinder-051b7-api-0\" (UID: \"e7dcf653-5c20-4aed-9c67-afbbb448d466\") " pod="openstack/cinder-051b7-api-0" Mar 13 01:45:48.350372 master-0 kubenswrapper[19170]: I0313 01:45:48.350173 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tftvj\" (UniqueName: \"kubernetes.io/projected/e7dcf653-5c20-4aed-9c67-afbbb448d466-kube-api-access-tftvj\") pod \"cinder-051b7-api-0\" (UID: \"e7dcf653-5c20-4aed-9c67-afbbb448d466\") " pod="openstack/cinder-051b7-api-0" Mar 13 01:45:48.350372 master-0 kubenswrapper[19170]: I0313 01:45:48.350196 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtsth\" (UniqueName: \"kubernetes.io/projected/75f3a8be-2740-470d-994a-e1eecf8327fb-kube-api-access-rtsth\") pod \"dnsmasq-dns-848b9c6b49-shsxt\" (UID: \"75f3a8be-2740-470d-994a-e1eecf8327fb\") " pod="openstack/dnsmasq-dns-848b9c6b49-shsxt" Mar 13 01:45:48.350372 master-0 kubenswrapper[19170]: I0313 01:45:48.350219 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-etc-iscsi\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.350372 master-0 kubenswrapper[19170]: I0313 01:45:48.350238 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-sys\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " 
pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.350372 master-0 kubenswrapper[19170]: I0313 01:45:48.350266 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-var-lib-cinder\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.350372 master-0 kubenswrapper[19170]: I0313 01:45:48.350284 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75f3a8be-2740-470d-994a-e1eecf8327fb-dns-svc\") pod \"dnsmasq-dns-848b9c6b49-shsxt\" (UID: \"75f3a8be-2740-470d-994a-e1eecf8327fb\") " pod="openstack/dnsmasq-dns-848b9c6b49-shsxt" Mar 13 01:45:48.350372 master-0 kubenswrapper[19170]: I0313 01:45:48.350303 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75f3a8be-2740-470d-994a-e1eecf8327fb-ovsdbserver-sb\") pod \"dnsmasq-dns-848b9c6b49-shsxt\" (UID: \"75f3a8be-2740-470d-994a-e1eecf8327fb\") " pod="openstack/dnsmasq-dns-848b9c6b49-shsxt" Mar 13 01:45:48.350372 master-0 kubenswrapper[19170]: I0313 01:45:48.350322 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-scripts\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.350372 master-0 kubenswrapper[19170]: I0313 01:45:48.350336 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-combined-ca-bundle\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " 
pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.360221 master-0 kubenswrapper[19170]: I0313 01:45:48.360163 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-dev\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.360322 master-0 kubenswrapper[19170]: I0313 01:45:48.360293 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-etc-machine-id\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.360372 master-0 kubenswrapper[19170]: I0313 01:45:48.360335 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-run\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.360372 master-0 kubenswrapper[19170]: I0313 01:45:48.360334 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-lib-modules\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.360461 master-0 kubenswrapper[19170]: I0313 01:45:48.360442 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-var-locks-brick\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.360508 master-0 kubenswrapper[19170]: I0313 01:45:48.360494 19170 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-var-locks-cinder\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.360567 master-0 kubenswrapper[19170]: I0313 01:45:48.360542 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-var-lib-cinder\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.360614 master-0 kubenswrapper[19170]: I0313 01:45:48.360578 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-etc-iscsi\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.360614 master-0 kubenswrapper[19170]: I0313 01:45:48.360603 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-sys\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.360763 master-0 kubenswrapper[19170]: I0313 01:45:48.360680 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-etc-nvme\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.362355 master-0 kubenswrapper[19170]: I0313 01:45:48.362320 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-combined-ca-bundle\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.364932 master-0 kubenswrapper[19170]: I0313 01:45:48.364887 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-config-data\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.370134 master-0 kubenswrapper[19170]: I0313 01:45:48.368274 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-config-data-custom\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.375045 master-0 kubenswrapper[19170]: I0313 01:45:48.375003 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-scripts\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.382982 master-0 kubenswrapper[19170]: I0313 01:45:48.381503 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k62s8\" (UniqueName: \"kubernetes.io/projected/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-kube-api-access-k62s8\") pod \"cinder-051b7-backup-0\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:48.453204 master-0 kubenswrapper[19170]: I0313 01:45:48.452010 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75f3a8be-2740-470d-994a-e1eecf8327fb-ovsdbserver-sb\") pod 
\"dnsmasq-dns-848b9c6b49-shsxt\" (UID: \"75f3a8be-2740-470d-994a-e1eecf8327fb\") " pod="openstack/dnsmasq-dns-848b9c6b49-shsxt" Mar 13 01:45:48.453204 master-0 kubenswrapper[19170]: I0313 01:45:48.452097 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7dcf653-5c20-4aed-9c67-afbbb448d466-config-data\") pod \"cinder-051b7-api-0\" (UID: \"e7dcf653-5c20-4aed-9c67-afbbb448d466\") " pod="openstack/cinder-051b7-api-0" Mar 13 01:45:48.453204 master-0 kubenswrapper[19170]: I0313 01:45:48.452138 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75f3a8be-2740-470d-994a-e1eecf8327fb-ovsdbserver-nb\") pod \"dnsmasq-dns-848b9c6b49-shsxt\" (UID: \"75f3a8be-2740-470d-994a-e1eecf8327fb\") " pod="openstack/dnsmasq-dns-848b9c6b49-shsxt" Mar 13 01:45:48.453204 master-0 kubenswrapper[19170]: I0313 01:45:48.452183 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/75f3a8be-2740-470d-994a-e1eecf8327fb-dns-swift-storage-0\") pod \"dnsmasq-dns-848b9c6b49-shsxt\" (UID: \"75f3a8be-2740-470d-994a-e1eecf8327fb\") " pod="openstack/dnsmasq-dns-848b9c6b49-shsxt" Mar 13 01:45:48.453204 master-0 kubenswrapper[19170]: I0313 01:45:48.452208 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7dcf653-5c20-4aed-9c67-afbbb448d466-logs\") pod \"cinder-051b7-api-0\" (UID: \"e7dcf653-5c20-4aed-9c67-afbbb448d466\") " pod="openstack/cinder-051b7-api-0" Mar 13 01:45:48.453204 master-0 kubenswrapper[19170]: I0313 01:45:48.452230 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e7dcf653-5c20-4aed-9c67-afbbb448d466-etc-machine-id\") pod \"cinder-051b7-api-0\" (UID: 
\"e7dcf653-5c20-4aed-9c67-afbbb448d466\") " pod="openstack/cinder-051b7-api-0" Mar 13 01:45:48.453204 master-0 kubenswrapper[19170]: I0313 01:45:48.452330 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7dcf653-5c20-4aed-9c67-afbbb448d466-combined-ca-bundle\") pod \"cinder-051b7-api-0\" (UID: \"e7dcf653-5c20-4aed-9c67-afbbb448d466\") " pod="openstack/cinder-051b7-api-0" Mar 13 01:45:48.453204 master-0 kubenswrapper[19170]: I0313 01:45:48.452416 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7dcf653-5c20-4aed-9c67-afbbb448d466-scripts\") pod \"cinder-051b7-api-0\" (UID: \"e7dcf653-5c20-4aed-9c67-afbbb448d466\") " pod="openstack/cinder-051b7-api-0" Mar 13 01:45:48.453204 master-0 kubenswrapper[19170]: I0313 01:45:48.452444 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75f3a8be-2740-470d-994a-e1eecf8327fb-config\") pod \"dnsmasq-dns-848b9c6b49-shsxt\" (UID: \"75f3a8be-2740-470d-994a-e1eecf8327fb\") " pod="openstack/dnsmasq-dns-848b9c6b49-shsxt" Mar 13 01:45:48.453204 master-0 kubenswrapper[19170]: I0313 01:45:48.452681 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7dcf653-5c20-4aed-9c67-afbbb448d466-config-data-custom\") pod \"cinder-051b7-api-0\" (UID: \"e7dcf653-5c20-4aed-9c67-afbbb448d466\") " pod="openstack/cinder-051b7-api-0" Mar 13 01:45:48.453204 master-0 kubenswrapper[19170]: I0313 01:45:48.452713 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tftvj\" (UniqueName: \"kubernetes.io/projected/e7dcf653-5c20-4aed-9c67-afbbb448d466-kube-api-access-tftvj\") pod \"cinder-051b7-api-0\" (UID: \"e7dcf653-5c20-4aed-9c67-afbbb448d466\") " 
pod="openstack/cinder-051b7-api-0" Mar 13 01:45:48.453204 master-0 kubenswrapper[19170]: I0313 01:45:48.452744 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtsth\" (UniqueName: \"kubernetes.io/projected/75f3a8be-2740-470d-994a-e1eecf8327fb-kube-api-access-rtsth\") pod \"dnsmasq-dns-848b9c6b49-shsxt\" (UID: \"75f3a8be-2740-470d-994a-e1eecf8327fb\") " pod="openstack/dnsmasq-dns-848b9c6b49-shsxt" Mar 13 01:45:48.453204 master-0 kubenswrapper[19170]: I0313 01:45:48.452823 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75f3a8be-2740-470d-994a-e1eecf8327fb-dns-svc\") pod \"dnsmasq-dns-848b9c6b49-shsxt\" (UID: \"75f3a8be-2740-470d-994a-e1eecf8327fb\") " pod="openstack/dnsmasq-dns-848b9c6b49-shsxt" Mar 13 01:45:48.453697 master-0 kubenswrapper[19170]: I0313 01:45:48.453674 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75f3a8be-2740-470d-994a-e1eecf8327fb-dns-svc\") pod \"dnsmasq-dns-848b9c6b49-shsxt\" (UID: \"75f3a8be-2740-470d-994a-e1eecf8327fb\") " pod="openstack/dnsmasq-dns-848b9c6b49-shsxt" Mar 13 01:45:48.454125 master-0 kubenswrapper[19170]: I0313 01:45:48.454092 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75f3a8be-2740-470d-994a-e1eecf8327fb-ovsdbserver-sb\") pod \"dnsmasq-dns-848b9c6b49-shsxt\" (UID: \"75f3a8be-2740-470d-994a-e1eecf8327fb\") " pod="openstack/dnsmasq-dns-848b9c6b49-shsxt" Mar 13 01:45:48.455309 master-0 kubenswrapper[19170]: I0313 01:45:48.455130 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75f3a8be-2740-470d-994a-e1eecf8327fb-config\") pod \"dnsmasq-dns-848b9c6b49-shsxt\" (UID: \"75f3a8be-2740-470d-994a-e1eecf8327fb\") " pod="openstack/dnsmasq-dns-848b9c6b49-shsxt" Mar 
13 01:45:48.455538 master-0 kubenswrapper[19170]: I0313 01:45:48.455520 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e7dcf653-5c20-4aed-9c67-afbbb448d466-etc-machine-id\") pod \"cinder-051b7-api-0\" (UID: \"e7dcf653-5c20-4aed-9c67-afbbb448d466\") " pod="openstack/cinder-051b7-api-0" Mar 13 01:45:48.456281 master-0 kubenswrapper[19170]: I0313 01:45:48.456264 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75f3a8be-2740-470d-994a-e1eecf8327fb-ovsdbserver-nb\") pod \"dnsmasq-dns-848b9c6b49-shsxt\" (UID: \"75f3a8be-2740-470d-994a-e1eecf8327fb\") " pod="openstack/dnsmasq-dns-848b9c6b49-shsxt" Mar 13 01:45:48.457451 master-0 kubenswrapper[19170]: I0313 01:45:48.457418 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7dcf653-5c20-4aed-9c67-afbbb448d466-logs\") pod \"cinder-051b7-api-0\" (UID: \"e7dcf653-5c20-4aed-9c67-afbbb448d466\") " pod="openstack/cinder-051b7-api-0" Mar 13 01:45:48.460721 master-0 kubenswrapper[19170]: I0313 01:45:48.458085 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/75f3a8be-2740-470d-994a-e1eecf8327fb-dns-swift-storage-0\") pod \"dnsmasq-dns-848b9c6b49-shsxt\" (UID: \"75f3a8be-2740-470d-994a-e1eecf8327fb\") " pod="openstack/dnsmasq-dns-848b9c6b49-shsxt" Mar 13 01:45:48.461284 master-0 kubenswrapper[19170]: I0313 01:45:48.461230 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7dcf653-5c20-4aed-9c67-afbbb448d466-config-data-custom\") pod \"cinder-051b7-api-0\" (UID: \"e7dcf653-5c20-4aed-9c67-afbbb448d466\") " pod="openstack/cinder-051b7-api-0" Mar 13 01:45:48.461503 master-0 kubenswrapper[19170]: I0313 01:45:48.461469 19170 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7dcf653-5c20-4aed-9c67-afbbb448d466-scripts\") pod \"cinder-051b7-api-0\" (UID: \"e7dcf653-5c20-4aed-9c67-afbbb448d466\") " pod="openstack/cinder-051b7-api-0" Mar 13 01:45:48.465486 master-0 kubenswrapper[19170]: I0313 01:45:48.465454 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7dcf653-5c20-4aed-9c67-afbbb448d466-config-data\") pod \"cinder-051b7-api-0\" (UID: \"e7dcf653-5c20-4aed-9c67-afbbb448d466\") " pod="openstack/cinder-051b7-api-0" Mar 13 01:45:48.477123 master-0 kubenswrapper[19170]: I0313 01:45:48.470281 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7dcf653-5c20-4aed-9c67-afbbb448d466-combined-ca-bundle\") pod \"cinder-051b7-api-0\" (UID: \"e7dcf653-5c20-4aed-9c67-afbbb448d466\") " pod="openstack/cinder-051b7-api-0" Mar 13 01:45:48.487278 master-0 kubenswrapper[19170]: I0313 01:45:48.487237 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tftvj\" (UniqueName: \"kubernetes.io/projected/e7dcf653-5c20-4aed-9c67-afbbb448d466-kube-api-access-tftvj\") pod \"cinder-051b7-api-0\" (UID: \"e7dcf653-5c20-4aed-9c67-afbbb448d466\") " pod="openstack/cinder-051b7-api-0" Mar 13 01:45:48.488461 master-0 kubenswrapper[19170]: I0313 01:45:48.488423 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtsth\" (UniqueName: \"kubernetes.io/projected/75f3a8be-2740-470d-994a-e1eecf8327fb-kube-api-access-rtsth\") pod \"dnsmasq-dns-848b9c6b49-shsxt\" (UID: \"75f3a8be-2740-470d-994a-e1eecf8327fb\") " pod="openstack/dnsmasq-dns-848b9c6b49-shsxt" Mar 13 01:45:48.495186 master-0 kubenswrapper[19170]: I0313 01:45:48.494364 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5df4f6b69c-hxt7t" 
event={"ID":"695ba6a1-30ee-4283-b610-072ab694be17","Type":"ContainerStarted","Data":"cea18abd91f9dc7a004a175ae6b25a747645a79b4801680549a1e75193aa8907"}
Mar 13 01:45:48.495186 master-0 kubenswrapper[19170]: I0313 01:45:48.494487 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5df4f6b69c-hxt7t"
Mar 13 01:45:48.545697 master-0 kubenswrapper[19170]: I0313 01:45:48.545412 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5df4f6b69c-hxt7t" podStartSLOduration=4.54538992 podStartE2EDuration="4.54538992s" podCreationTimestamp="2026-03-13 01:45:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:45:48.513413289 +0000 UTC m=+1609.321534249" watchObservedRunningTime="2026-03-13 01:45:48.54538992 +0000 UTC m=+1609.353510890"
Mar 13 01:45:48.565792 master-0 kubenswrapper[19170]: I0313 01:45:48.563892 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-051b7-backup-0"
Mar 13 01:45:48.606796 master-0 kubenswrapper[19170]: I0313 01:45:48.603404 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848b9c6b49-shsxt"
Mar 13 01:45:48.609298 master-0 kubenswrapper[19170]: I0313 01:45:48.609260 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-051b7-api-0"
Mar 13 01:45:48.912411 master-0 kubenswrapper[19170]: I0313 01:45:48.909016 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-051b7-volume-lvm-iscsi-0"]
Mar 13 01:45:48.989014 master-0 kubenswrapper[19170]: I0313 01:45:48.988586 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-051b7-scheduler-0"]
Mar 13 01:45:48.997042 master-0 kubenswrapper[19170]: W0313 01:45:48.995787 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01cdc83b_2243_40c6_9709_bb442724340c.slice/crio-ec31b9566d8856970f6e30ccf382513e2142665e1beb10602ad7524ab9a79ebc WatchSource:0}: Error finding container ec31b9566d8856970f6e30ccf382513e2142665e1beb10602ad7524ab9a79ebc: Status 404 returned error can't find the container with id ec31b9566d8856970f6e30ccf382513e2142665e1beb10602ad7524ab9a79ebc
Mar 13 01:45:49.293023 master-0 kubenswrapper[19170]: I0313 01:45:49.292962 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-051b7-backup-0"]
Mar 13 01:45:49.360623 master-0 kubenswrapper[19170]: I0313 01:45:49.360577 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848b9c6b49-shsxt"]
Mar 13 01:45:49.414676 master-0 kubenswrapper[19170]: I0313 01:45:49.413841 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-051b7-api-0"]
Mar 13 01:45:49.508353 master-0 kubenswrapper[19170]: I0313 01:45:49.508295 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848b9c6b49-shsxt" event={"ID":"75f3a8be-2740-470d-994a-e1eecf8327fb","Type":"ContainerStarted","Data":"1ba3b643ea8a884b3501c8caaa355e580bef4280a5042d84c9250ecff823b123"}
Mar 13 01:45:49.509697 master-0 kubenswrapper[19170]: I0313 01:45:49.509653 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-051b7-backup-0" event={"ID":"6c8289d1-5df7-4e68-b0b3-ea797ce78d32","Type":"ContainerStarted","Data":"d2e068efbe66c2b3b12f03087018d5dde46a746fd2992d5561331ffc261cb979"}
Mar 13 01:45:49.511273 master-0 kubenswrapper[19170]: I0313 01:45:49.511246 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-051b7-scheduler-0" event={"ID":"01cdc83b-2243-40c6-9709-bb442724340c","Type":"ContainerStarted","Data":"ec31b9566d8856970f6e30ccf382513e2142665e1beb10602ad7524ab9a79ebc"}
Mar 13 01:45:49.517677 master-0 kubenswrapper[19170]: I0313 01:45:49.516854 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-051b7-volume-lvm-iscsi-0" event={"ID":"dd118f31-fa5b-466a-a7a3-0bd0e788bd38","Type":"ContainerStarted","Data":"90d3a8238a7ed55324b42a0b45944a3d67f5d22099d35ec322081abe69a51584"}
Mar 13 01:45:49.525096 master-0 kubenswrapper[19170]: I0313 01:45:49.523807 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5df4f6b69c-hxt7t" podUID="695ba6a1-30ee-4283-b610-072ab694be17" containerName="dnsmasq-dns" containerID="cri-o://cea18abd91f9dc7a004a175ae6b25a747645a79b4801680549a1e75193aa8907" gracePeriod=10
Mar 13 01:45:49.525096 master-0 kubenswrapper[19170]: I0313 01:45:49.524066 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-051b7-api-0" event={"ID":"e7dcf653-5c20-4aed-9c67-afbbb448d466","Type":"ContainerStarted","Data":"b2eb42d5bc3462f10363c73d84019dcfb416aade9ba396a98865889fafc4fbf7"}
Mar 13 01:45:50.319843 master-0 kubenswrapper[19170]: I0313 01:45:50.319800 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5df4f6b69c-hxt7t"
Mar 13 01:45:50.445058 master-0 kubenswrapper[19170]: I0313 01:45:50.442746 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/695ba6a1-30ee-4283-b610-072ab694be17-config\") pod \"695ba6a1-30ee-4283-b610-072ab694be17\" (UID: \"695ba6a1-30ee-4283-b610-072ab694be17\") "
Mar 13 01:45:50.445058 master-0 kubenswrapper[19170]: I0313 01:45:50.442832 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/695ba6a1-30ee-4283-b610-072ab694be17-dns-swift-storage-0\") pod \"695ba6a1-30ee-4283-b610-072ab694be17\" (UID: \"695ba6a1-30ee-4283-b610-072ab694be17\") "
Mar 13 01:45:50.445058 master-0 kubenswrapper[19170]: I0313 01:45:50.442938 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq598\" (UniqueName: \"kubernetes.io/projected/695ba6a1-30ee-4283-b610-072ab694be17-kube-api-access-lq598\") pod \"695ba6a1-30ee-4283-b610-072ab694be17\" (UID: \"695ba6a1-30ee-4283-b610-072ab694be17\") "
Mar 13 01:45:50.445058 master-0 kubenswrapper[19170]: I0313 01:45:50.442972 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/695ba6a1-30ee-4283-b610-072ab694be17-ovsdbserver-sb\") pod \"695ba6a1-30ee-4283-b610-072ab694be17\" (UID: \"695ba6a1-30ee-4283-b610-072ab694be17\") "
Mar 13 01:45:50.445058 master-0 kubenswrapper[19170]: I0313 01:45:50.443129 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/695ba6a1-30ee-4283-b610-072ab694be17-ovsdbserver-nb\") pod \"695ba6a1-30ee-4283-b610-072ab694be17\" (UID: \"695ba6a1-30ee-4283-b610-072ab694be17\") "
Mar 13 01:45:50.445058 master-0 kubenswrapper[19170]: I0313 01:45:50.443189 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/695ba6a1-30ee-4283-b610-072ab694be17-dns-svc\") pod \"695ba6a1-30ee-4283-b610-072ab694be17\" (UID: \"695ba6a1-30ee-4283-b610-072ab694be17\") "
Mar 13 01:45:50.449528 master-0 kubenswrapper[19170]: I0313 01:45:50.449470 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/695ba6a1-30ee-4283-b610-072ab694be17-kube-api-access-lq598" (OuterVolumeSpecName: "kube-api-access-lq598") pod "695ba6a1-30ee-4283-b610-072ab694be17" (UID: "695ba6a1-30ee-4283-b610-072ab694be17"). InnerVolumeSpecName "kube-api-access-lq598". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 01:45:50.545956 master-0 kubenswrapper[19170]: I0313 01:45:50.545710 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-051b7-api-0" event={"ID":"e7dcf653-5c20-4aed-9c67-afbbb448d466","Type":"ContainerStarted","Data":"3012c76edd6d8aef79f0a4c4340ab282597f721005d818c24dc234ade0e5d20c"}
Mar 13 01:45:50.549477 master-0 kubenswrapper[19170]: I0313 01:45:50.549439 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lq598\" (UniqueName: \"kubernetes.io/projected/695ba6a1-30ee-4283-b610-072ab694be17-kube-api-access-lq598\") on node \"master-0\" DevicePath \"\""
Mar 13 01:45:50.555666 master-0 kubenswrapper[19170]: I0313 01:45:50.555388 19170 generic.go:334] "Generic (PLEG): container finished" podID="75f3a8be-2740-470d-994a-e1eecf8327fb" containerID="5da906adcae163035931eac6068f9ab04c2bbd98ca4b8ce4ebee8a8e3b29e3c1" exitCode=0
Mar 13 01:45:50.555666 master-0 kubenswrapper[19170]: I0313 01:45:50.555456 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848b9c6b49-shsxt" event={"ID":"75f3a8be-2740-470d-994a-e1eecf8327fb","Type":"ContainerDied","Data":"5da906adcae163035931eac6068f9ab04c2bbd98ca4b8ce4ebee8a8e3b29e3c1"}
Mar 13 01:45:50.559742 master-0 kubenswrapper[19170]: I0313 01:45:50.559692 19170 generic.go:334] "Generic (PLEG): container finished" podID="695ba6a1-30ee-4283-b610-072ab694be17" containerID="cea18abd91f9dc7a004a175ae6b25a747645a79b4801680549a1e75193aa8907" exitCode=0
Mar 13 01:45:50.559742 master-0 kubenswrapper[19170]: I0313 01:45:50.559739 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5df4f6b69c-hxt7t" event={"ID":"695ba6a1-30ee-4283-b610-072ab694be17","Type":"ContainerDied","Data":"cea18abd91f9dc7a004a175ae6b25a747645a79b4801680549a1e75193aa8907"}
Mar 13 01:45:50.559924 master-0 kubenswrapper[19170]: I0313 01:45:50.559765 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5df4f6b69c-hxt7t" event={"ID":"695ba6a1-30ee-4283-b610-072ab694be17","Type":"ContainerDied","Data":"52e2d48a0dd58ee1bd8acfd60ff5995404c04f19756b29167e2534c3a1ca1547"}
Mar 13 01:45:50.559924 master-0 kubenswrapper[19170]: I0313 01:45:50.559783 19170 scope.go:117] "RemoveContainer" containerID="cea18abd91f9dc7a004a175ae6b25a747645a79b4801680549a1e75193aa8907"
Mar 13 01:45:50.559924 master-0 kubenswrapper[19170]: I0313 01:45:50.559861 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5df4f6b69c-hxt7t"
Mar 13 01:45:50.629907 master-0 kubenswrapper[19170]: I0313 01:45:50.628924 19170 scope.go:117] "RemoveContainer" containerID="90c5f3195f96aed86e396be92f5dd065b24d2e3538c40ffe8cf80f62443c1885"
Mar 13 01:45:50.658846 master-0 kubenswrapper[19170]: I0313 01:45:50.658803 19170 scope.go:117] "RemoveContainer" containerID="cea18abd91f9dc7a004a175ae6b25a747645a79b4801680549a1e75193aa8907"
Mar 13 01:45:50.659267 master-0 kubenswrapper[19170]: E0313 01:45:50.659233 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cea18abd91f9dc7a004a175ae6b25a747645a79b4801680549a1e75193aa8907\": container with ID starting with cea18abd91f9dc7a004a175ae6b25a747645a79b4801680549a1e75193aa8907 not found: ID does not exist" containerID="cea18abd91f9dc7a004a175ae6b25a747645a79b4801680549a1e75193aa8907"
Mar 13 01:45:50.659345 master-0 kubenswrapper[19170]: I0313 01:45:50.659311 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cea18abd91f9dc7a004a175ae6b25a747645a79b4801680549a1e75193aa8907"} err="failed to get container status \"cea18abd91f9dc7a004a175ae6b25a747645a79b4801680549a1e75193aa8907\": rpc error: code = NotFound desc = could not find container \"cea18abd91f9dc7a004a175ae6b25a747645a79b4801680549a1e75193aa8907\": container with ID starting with cea18abd91f9dc7a004a175ae6b25a747645a79b4801680549a1e75193aa8907 not found: ID does not exist"
Mar 13 01:45:50.659345 master-0 kubenswrapper[19170]: I0313 01:45:50.659341 19170 scope.go:117] "RemoveContainer" containerID="90c5f3195f96aed86e396be92f5dd065b24d2e3538c40ffe8cf80f62443c1885"
Mar 13 01:45:50.659624 master-0 kubenswrapper[19170]: E0313 01:45:50.659597 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90c5f3195f96aed86e396be92f5dd065b24d2e3538c40ffe8cf80f62443c1885\": container with ID starting with 90c5f3195f96aed86e396be92f5dd065b24d2e3538c40ffe8cf80f62443c1885 not found: ID does not exist" containerID="90c5f3195f96aed86e396be92f5dd065b24d2e3538c40ffe8cf80f62443c1885"
Mar 13 01:45:50.659693 master-0 kubenswrapper[19170]: I0313 01:45:50.659624 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90c5f3195f96aed86e396be92f5dd065b24d2e3538c40ffe8cf80f62443c1885"} err="failed to get container status \"90c5f3195f96aed86e396be92f5dd065b24d2e3538c40ffe8cf80f62443c1885\": rpc error: code = NotFound desc = could not find container \"90c5f3195f96aed86e396be92f5dd065b24d2e3538c40ffe8cf80f62443c1885\": container with ID starting with 90c5f3195f96aed86e396be92f5dd065b24d2e3538c40ffe8cf80f62443c1885 not found: ID does not exist"
Mar 13 01:45:50.798363 master-0 kubenswrapper[19170]: I0313 01:45:50.798280 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/695ba6a1-30ee-4283-b610-072ab694be17-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "695ba6a1-30ee-4283-b610-072ab694be17" (UID: "695ba6a1-30ee-4283-b610-072ab694be17"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 01:45:50.810998 master-0 kubenswrapper[19170]: I0313 01:45:50.810956 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/695ba6a1-30ee-4283-b610-072ab694be17-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "695ba6a1-30ee-4283-b610-072ab694be17" (UID: "695ba6a1-30ee-4283-b610-072ab694be17"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 01:45:50.827465 master-0 kubenswrapper[19170]: I0313 01:45:50.825456 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/695ba6a1-30ee-4283-b610-072ab694be17-config" (OuterVolumeSpecName: "config") pod "695ba6a1-30ee-4283-b610-072ab694be17" (UID: "695ba6a1-30ee-4283-b610-072ab694be17"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 01:45:50.827465 master-0 kubenswrapper[19170]: I0313 01:45:50.825622 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/695ba6a1-30ee-4283-b610-072ab694be17-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "695ba6a1-30ee-4283-b610-072ab694be17" (UID: "695ba6a1-30ee-4283-b610-072ab694be17"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 01:45:50.838036 master-0 kubenswrapper[19170]: I0313 01:45:50.838002 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/695ba6a1-30ee-4283-b610-072ab694be17-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "695ba6a1-30ee-4283-b610-072ab694be17" (UID: "695ba6a1-30ee-4283-b610-072ab694be17"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 01:45:50.859597 master-0 kubenswrapper[19170]: I0313 01:45:50.857516 19170 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/695ba6a1-30ee-4283-b610-072ab694be17-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\""
Mar 13 01:45:50.859597 master-0 kubenswrapper[19170]: I0313 01:45:50.857551 19170 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/695ba6a1-30ee-4283-b610-072ab694be17-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 13 01:45:50.859597 master-0 kubenswrapper[19170]: I0313 01:45:50.857562 19170 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/695ba6a1-30ee-4283-b610-072ab694be17-config\") on node \"master-0\" DevicePath \"\""
Mar 13 01:45:50.859597 master-0 kubenswrapper[19170]: I0313 01:45:50.857589 19170 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/695ba6a1-30ee-4283-b610-072ab694be17-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\""
Mar 13 01:45:50.859597 master-0 kubenswrapper[19170]: I0313 01:45:50.857602 19170 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/695ba6a1-30ee-4283-b610-072ab694be17-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\""
Mar 13 01:45:51.110675 master-0 kubenswrapper[19170]: I0313 01:45:51.109714 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5df4f6b69c-hxt7t"]
Mar 13 01:45:51.138877 master-0 kubenswrapper[19170]: I0313 01:45:51.136084 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5df4f6b69c-hxt7t"]
Mar 13 01:45:51.434248 master-0 kubenswrapper[19170]: I0313 01:45:51.434202 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="695ba6a1-30ee-4283-b610-072ab694be17" path="/var/lib/kubelet/pods/695ba6a1-30ee-4283-b610-072ab694be17/volumes"
Mar 13 01:45:51.585610 master-0 kubenswrapper[19170]: I0313 01:45:51.585513 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-051b7-backup-0" event={"ID":"6c8289d1-5df7-4e68-b0b3-ea797ce78d32","Type":"ContainerStarted","Data":"ac7ecbed7e9faf9da3b15c4add3d82c7956a60dc56cf5a627ecc7791a11c85d0"}
Mar 13 01:45:51.585992 master-0 kubenswrapper[19170]: I0313 01:45:51.585973 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-051b7-backup-0" event={"ID":"6c8289d1-5df7-4e68-b0b3-ea797ce78d32","Type":"ContainerStarted","Data":"6e8844314fedbd5f0ce4bd82249034233ffb9b589613da70d1272dba9290d61e"}
Mar 13 01:45:51.590207 master-0 kubenswrapper[19170]: I0313 01:45:51.589999 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-051b7-scheduler-0" event={"ID":"01cdc83b-2243-40c6-9709-bb442724340c","Type":"ContainerStarted","Data":"8aba9ac1e3b085e7c04cb214ea7f8e75329fdcb19d7d852f11d653fb879f9e1b"}
Mar 13 01:45:51.593639 master-0 kubenswrapper[19170]: I0313 01:45:51.593566 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-051b7-volume-lvm-iscsi-0" event={"ID":"dd118f31-fa5b-466a-a7a3-0bd0e788bd38","Type":"ContainerStarted","Data":"15e6bcf32a26bd96875fa2be8cdd04200634595fb86139799244a034f1c1bd7d"}
Mar 13 01:45:51.593758 master-0 kubenswrapper[19170]: I0313 01:45:51.593646 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-051b7-volume-lvm-iscsi-0" event={"ID":"dd118f31-fa5b-466a-a7a3-0bd0e788bd38","Type":"ContainerStarted","Data":"2913e807e28ceb7cd27836e7db44b97fa6b8ac764f2166361b61ab84f7c4f07a"}
Mar 13 01:45:51.601487 master-0 kubenswrapper[19170]: I0313 01:45:51.601328 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848b9c6b49-shsxt" event={"ID":"75f3a8be-2740-470d-994a-e1eecf8327fb","Type":"ContainerStarted","Data":"52b28cb0dee1ca9103684cb5e5bd7c4fa4545f0c79c09e206a522ecaa46bf139"}
Mar 13 01:45:51.602501 master-0 kubenswrapper[19170]: I0313 01:45:51.602372 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-848b9c6b49-shsxt"
Mar 13 01:45:51.710660 master-0 kubenswrapper[19170]: I0313 01:45:51.705688 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-051b7-backup-0" podStartSLOduration=3.7821740999999998 podStartE2EDuration="4.705591978s" podCreationTimestamp="2026-03-13 01:45:47 +0000 UTC" firstStartedPulling="2026-03-13 01:45:49.316567548 +0000 UTC m=+1610.124688508" lastFinishedPulling="2026-03-13 01:45:50.239985426 +0000 UTC m=+1611.048106386" observedRunningTime="2026-03-13 01:45:51.647331075 +0000 UTC m=+1612.455452035" watchObservedRunningTime="2026-03-13 01:45:51.705591978 +0000 UTC m=+1612.513712958"
Mar 13 01:45:51.715917 master-0 kubenswrapper[19170]: I0313 01:45:51.714598 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-848b9c6b49-shsxt" podStartSLOduration=4.714578161 podStartE2EDuration="4.714578161s" podCreationTimestamp="2026-03-13 01:45:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:45:51.684428701 +0000 UTC m=+1612.492549661" watchObservedRunningTime="2026-03-13 01:45:51.714578161 +0000 UTC m=+1612.522699171"
Mar 13 01:45:51.758268 master-0 kubenswrapper[19170]: I0313 01:45:51.756873 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-051b7-volume-lvm-iscsi-0" podStartSLOduration=3.656072455 podStartE2EDuration="4.75677629s" podCreationTimestamp="2026-03-13 01:45:47 +0000 UTC" firstStartedPulling="2026-03-13 01:45:48.933785468 +0000 UTC m=+1609.741906428" lastFinishedPulling="2026-03-13 01:45:50.034489303 +0000 UTC m=+1610.842610263" observedRunningTime="2026-03-13 01:45:51.722596547 +0000 UTC m=+1612.530717507" watchObservedRunningTime="2026-03-13 01:45:51.75677629 +0000 UTC m=+1612.564897250"
Mar 13 01:45:51.884411 master-0 kubenswrapper[19170]: I0313 01:45:51.884321 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-051b7-api-0"]
Mar 13 01:45:52.058512 master-0 kubenswrapper[19170]: I0313 01:45:52.058420 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-59fb7b67df-dqvjc"]
Mar 13 01:45:52.059167 master-0 kubenswrapper[19170]: E0313 01:45:52.059070 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="695ba6a1-30ee-4283-b610-072ab694be17" containerName="dnsmasq-dns"
Mar 13 01:45:52.059167 master-0 kubenswrapper[19170]: I0313 01:45:52.059096 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="695ba6a1-30ee-4283-b610-072ab694be17" containerName="dnsmasq-dns"
Mar 13 01:45:52.059167 master-0 kubenswrapper[19170]: E0313 01:45:52.059111 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="695ba6a1-30ee-4283-b610-072ab694be17" containerName="init"
Mar 13 01:45:52.059167 master-0 kubenswrapper[19170]: I0313 01:45:52.059117 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="695ba6a1-30ee-4283-b610-072ab694be17" containerName="init"
Mar 13 01:45:52.060003 master-0 kubenswrapper[19170]: I0313 01:45:52.059954 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="695ba6a1-30ee-4283-b610-072ab694be17" containerName="dnsmasq-dns"
Mar 13 01:45:52.061322 master-0 kubenswrapper[19170]: I0313 01:45:52.061287 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-59fb7b67df-dqvjc"
Mar 13 01:45:52.063727 master-0 kubenswrapper[19170]: I0313 01:45:52.063687 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Mar 13 01:45:52.063982 master-0 kubenswrapper[19170]: I0313 01:45:52.063957 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Mar 13 01:45:52.089703 master-0 kubenswrapper[19170]: I0313 01:45:52.086727 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-59fb7b67df-dqvjc"]
Mar 13 01:45:52.213660 master-0 kubenswrapper[19170]: I0313 01:45:52.213584 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae2bd160-5211-464a-8f97-c011152c3602-internal-tls-certs\") pod \"neutron-59fb7b67df-dqvjc\" (UID: \"ae2bd160-5211-464a-8f97-c011152c3602\") " pod="openstack/neutron-59fb7b67df-dqvjc"
Mar 13 01:45:52.213660 master-0 kubenswrapper[19170]: I0313 01:45:52.213659 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae2bd160-5211-464a-8f97-c011152c3602-ovndb-tls-certs\") pod \"neutron-59fb7b67df-dqvjc\" (UID: \"ae2bd160-5211-464a-8f97-c011152c3602\") " pod="openstack/neutron-59fb7b67df-dqvjc"
Mar 13 01:45:52.213887 master-0 kubenswrapper[19170]: I0313 01:45:52.213682 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ae2bd160-5211-464a-8f97-c011152c3602-config\") pod \"neutron-59fb7b67df-dqvjc\" (UID: \"ae2bd160-5211-464a-8f97-c011152c3602\") " pod="openstack/neutron-59fb7b67df-dqvjc"
Mar 13 01:45:52.213887 master-0 kubenswrapper[19170]: I0313 01:45:52.213730 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae2bd160-5211-464a-8f97-c011152c3602-public-tls-certs\") pod \"neutron-59fb7b67df-dqvjc\" (UID: \"ae2bd160-5211-464a-8f97-c011152c3602\") " pod="openstack/neutron-59fb7b67df-dqvjc"
Mar 13 01:45:52.213887 master-0 kubenswrapper[19170]: I0313 01:45:52.213812 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt459\" (UniqueName: \"kubernetes.io/projected/ae2bd160-5211-464a-8f97-c011152c3602-kube-api-access-gt459\") pod \"neutron-59fb7b67df-dqvjc\" (UID: \"ae2bd160-5211-464a-8f97-c011152c3602\") " pod="openstack/neutron-59fb7b67df-dqvjc"
Mar 13 01:45:52.213887 master-0 kubenswrapper[19170]: I0313 01:45:52.213842 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae2bd160-5211-464a-8f97-c011152c3602-combined-ca-bundle\") pod \"neutron-59fb7b67df-dqvjc\" (UID: \"ae2bd160-5211-464a-8f97-c011152c3602\") " pod="openstack/neutron-59fb7b67df-dqvjc"
Mar 13 01:45:52.214021 master-0 kubenswrapper[19170]: I0313 01:45:52.213905 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ae2bd160-5211-464a-8f97-c011152c3602-httpd-config\") pod \"neutron-59fb7b67df-dqvjc\" (UID: \"ae2bd160-5211-464a-8f97-c011152c3602\") " pod="openstack/neutron-59fb7b67df-dqvjc"
Mar 13 01:45:52.319723 master-0 kubenswrapper[19170]: I0313 01:45:52.316497 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae2bd160-5211-464a-8f97-c011152c3602-public-tls-certs\") pod \"neutron-59fb7b67df-dqvjc\" (UID: \"ae2bd160-5211-464a-8f97-c011152c3602\") " pod="openstack/neutron-59fb7b67df-dqvjc"
Mar 13 01:45:52.319723 master-0 kubenswrapper[19170]: I0313 01:45:52.316610 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt459\" (UniqueName: \"kubernetes.io/projected/ae2bd160-5211-464a-8f97-c011152c3602-kube-api-access-gt459\") pod \"neutron-59fb7b67df-dqvjc\" (UID: \"ae2bd160-5211-464a-8f97-c011152c3602\") " pod="openstack/neutron-59fb7b67df-dqvjc"
Mar 13 01:45:52.319723 master-0 kubenswrapper[19170]: I0313 01:45:52.316661 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae2bd160-5211-464a-8f97-c011152c3602-combined-ca-bundle\") pod \"neutron-59fb7b67df-dqvjc\" (UID: \"ae2bd160-5211-464a-8f97-c011152c3602\") " pod="openstack/neutron-59fb7b67df-dqvjc"
Mar 13 01:45:52.319723 master-0 kubenswrapper[19170]: I0313 01:45:52.317350 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ae2bd160-5211-464a-8f97-c011152c3602-httpd-config\") pod \"neutron-59fb7b67df-dqvjc\" (UID: \"ae2bd160-5211-464a-8f97-c011152c3602\") " pod="openstack/neutron-59fb7b67df-dqvjc"
Mar 13 01:45:52.319723 master-0 kubenswrapper[19170]: I0313 01:45:52.317653 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae2bd160-5211-464a-8f97-c011152c3602-internal-tls-certs\") pod \"neutron-59fb7b67df-dqvjc\" (UID: \"ae2bd160-5211-464a-8f97-c011152c3602\") " pod="openstack/neutron-59fb7b67df-dqvjc"
Mar 13 01:45:52.319723 master-0 kubenswrapper[19170]: I0313 01:45:52.317839 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae2bd160-5211-464a-8f97-c011152c3602-ovndb-tls-certs\") pod \"neutron-59fb7b67df-dqvjc\" (UID: \"ae2bd160-5211-464a-8f97-c011152c3602\") " pod="openstack/neutron-59fb7b67df-dqvjc"
Mar 13 01:45:52.319723 master-0 kubenswrapper[19170]: I0313 01:45:52.317882 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ae2bd160-5211-464a-8f97-c011152c3602-config\") pod \"neutron-59fb7b67df-dqvjc\" (UID: \"ae2bd160-5211-464a-8f97-c011152c3602\") " pod="openstack/neutron-59fb7b67df-dqvjc"
Mar 13 01:45:52.329666 master-0 kubenswrapper[19170]: I0313 01:45:52.328392 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ae2bd160-5211-464a-8f97-c011152c3602-config\") pod \"neutron-59fb7b67df-dqvjc\" (UID: \"ae2bd160-5211-464a-8f97-c011152c3602\") " pod="openstack/neutron-59fb7b67df-dqvjc"
Mar 13 01:45:52.329666 master-0 kubenswrapper[19170]: I0313 01:45:52.329382 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae2bd160-5211-464a-8f97-c011152c3602-combined-ca-bundle\") pod \"neutron-59fb7b67df-dqvjc\" (UID: \"ae2bd160-5211-464a-8f97-c011152c3602\") " pod="openstack/neutron-59fb7b67df-dqvjc"
Mar 13 01:45:52.329666 master-0 kubenswrapper[19170]: I0313 01:45:52.329437 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae2bd160-5211-464a-8f97-c011152c3602-ovndb-tls-certs\") pod \"neutron-59fb7b67df-dqvjc\" (UID: \"ae2bd160-5211-464a-8f97-c011152c3602\") " pod="openstack/neutron-59fb7b67df-dqvjc"
Mar 13 01:45:52.335659 master-0 kubenswrapper[19170]: I0313 01:45:52.331545 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae2bd160-5211-464a-8f97-c011152c3602-public-tls-certs\") pod \"neutron-59fb7b67df-dqvjc\" (UID: \"ae2bd160-5211-464a-8f97-c011152c3602\") " pod="openstack/neutron-59fb7b67df-dqvjc"
Mar 13 01:45:52.335659 master-0 kubenswrapper[19170]: I0313 01:45:52.332000 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ae2bd160-5211-464a-8f97-c011152c3602-httpd-config\") pod \"neutron-59fb7b67df-dqvjc\" (UID: \"ae2bd160-5211-464a-8f97-c011152c3602\") " pod="openstack/neutron-59fb7b67df-dqvjc"
Mar 13 01:45:52.344659 master-0 kubenswrapper[19170]: I0313 01:45:52.341406 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt459\" (UniqueName: \"kubernetes.io/projected/ae2bd160-5211-464a-8f97-c011152c3602-kube-api-access-gt459\") pod \"neutron-59fb7b67df-dqvjc\" (UID: \"ae2bd160-5211-464a-8f97-c011152c3602\") " pod="openstack/neutron-59fb7b67df-dqvjc"
Mar 13 01:45:52.349658 master-0 kubenswrapper[19170]: I0313 01:45:52.348353 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae2bd160-5211-464a-8f97-c011152c3602-internal-tls-certs\") pod \"neutron-59fb7b67df-dqvjc\" (UID: \"ae2bd160-5211-464a-8f97-c011152c3602\") " pod="openstack/neutron-59fb7b67df-dqvjc"
Mar 13 01:45:52.429241 master-0 kubenswrapper[19170]: I0313 01:45:52.429120 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-59fb7b67df-dqvjc"
Mar 13 01:45:52.665899 master-0 kubenswrapper[19170]: I0313 01:45:52.661442 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-051b7-api-0" event={"ID":"e7dcf653-5c20-4aed-9c67-afbbb448d466","Type":"ContainerStarted","Data":"ccc874fe40a708dc0ed92b9f62aef0eaf80a9137e07138d326a4eed1741da821"}
Mar 13 01:45:52.665899 master-0 kubenswrapper[19170]: I0313 01:45:52.662747 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-051b7-api-0"
Mar 13 01:45:52.665899 master-0 kubenswrapper[19170]: I0313 01:45:52.665752 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-051b7-scheduler-0" event={"ID":"01cdc83b-2243-40c6-9709-bb442724340c","Type":"ContainerStarted","Data":"3eed686a2face2531b3b9085d7beece990680252e5928f6b8f4af051572e64e9"}
Mar 13 01:45:52.699933 master-0 kubenswrapper[19170]: I0313 01:45:52.690835 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-051b7-api-0" podStartSLOduration=4.6908125389999995 podStartE2EDuration="4.690812539s" podCreationTimestamp="2026-03-13 01:45:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:45:52.687942438 +0000 UTC m=+1613.496063398" watchObservedRunningTime="2026-03-13 01:45:52.690812539 +0000 UTC m=+1613.498933499"
Mar 13 01:45:52.730679 master-0 kubenswrapper[19170]: I0313 01:45:52.730584 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-051b7-scheduler-0" podStartSLOduration=4.695623948 podStartE2EDuration="5.730563019s" podCreationTimestamp="2026-03-13 01:45:47 +0000 UTC" firstStartedPulling="2026-03-13 01:45:49.002067963 +0000 UTC m=+1609.810188923" lastFinishedPulling="2026-03-13 01:45:50.037007034 +0000 UTC m=+1610.845127994" observedRunningTime="2026-03-13 01:45:52.723092809 +0000 UTC m=+1613.531213759" watchObservedRunningTime="2026-03-13 01:45:52.730563019 +0000 UTC m=+1613.538683979"
Mar 13 01:45:53.081871 master-0 kubenswrapper[19170]: I0313 01:45:53.081810 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-59fb7b67df-dqvjc"]
Mar 13 01:45:53.266670 master-0 kubenswrapper[19170]: I0313 01:45:53.264739 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-051b7-scheduler-0"
Mar 13 01:45:53.339581 master-0 kubenswrapper[19170]: I0313 01:45:53.339468 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-051b7-volume-lvm-iscsi-0"
Mar 13 01:45:53.570663 master-0 kubenswrapper[19170]: I0313 01:45:53.570419 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-051b7-backup-0"
Mar 13 01:45:53.701066 master-0 kubenswrapper[19170]: I0313 01:45:53.701018 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-59fb7b67df-dqvjc" event={"ID":"ae2bd160-5211-464a-8f97-c011152c3602","Type":"ContainerStarted","Data":"de4526f39cd1cf1b584dae713ad5c63962e43c3497c90e433ddec2923085e231"}
Mar 13 01:45:53.701066 master-0 kubenswrapper[19170]: I0313 01:45:53.701060 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-59fb7b67df-dqvjc" event={"ID":"ae2bd160-5211-464a-8f97-c011152c3602","Type":"ContainerStarted","Data":"17a44599e103ce50b3b7e8b2a0819fc66dbfdc7c091c5fc42ec314189a625c8e"}
Mar 13 01:45:53.705032 master-0 kubenswrapper[19170]: I0313 01:45:53.702611 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-051b7-api-0" podUID="e7dcf653-5c20-4aed-9c67-afbbb448d466" containerName="cinder-051b7-api-log" containerID="cri-o://3012c76edd6d8aef79f0a4c4340ab282597f721005d818c24dc234ade0e5d20c" gracePeriod=30
Mar 13 01:45:53.705032 master-0 kubenswrapper[19170]: I0313 01:45:53.703108 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-051b7-api-0" podUID="e7dcf653-5c20-4aed-9c67-afbbb448d466" containerName="cinder-api" containerID="cri-o://ccc874fe40a708dc0ed92b9f62aef0eaf80a9137e07138d326a4eed1741da821" gracePeriod=30
Mar 13 01:45:53.842676 master-0 kubenswrapper[19170]: I0313 01:45:53.840083 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-769f4cdbc8-8mz4m"
Mar 13 01:45:53.848398 master-0 kubenswrapper[19170]: E0313 01:45:53.847006 19170 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7dcf653_5c20_4aed_9c67_afbbb448d466.slice/crio-conmon-3012c76edd6d8aef79f0a4c4340ab282597f721005d818c24dc234ade0e5d20c.scope\": RecentStats: unable to find data in memory cache]"
Mar 13 01:45:53.934663 master-0 kubenswrapper[19170]: I0313 01:45:53.933068 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-769f4cdbc8-8mz4m"
Mar 13 01:45:54.261731 master-0 kubenswrapper[19170]: I0313 01:45:54.260735 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6d877f97b-4xkgq"]
Mar 13 01:45:54.264417 master-0 kubenswrapper[19170]: I0313 01:45:54.263011 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6d877f97b-4xkgq"
Mar 13 01:45:54.280025 master-0 kubenswrapper[19170]: I0313 01:45:54.278950 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a4fdb5b-74d4-45c3-a48b-279be6b529a1-config-data\") pod \"placement-6d877f97b-4xkgq\" (UID: \"3a4fdb5b-74d4-45c3-a48b-279be6b529a1\") " pod="openstack/placement-6d877f97b-4xkgq"
Mar 13 01:45:54.280025 master-0 kubenswrapper[19170]: I0313 01:45:54.279010 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a4fdb5b-74d4-45c3-a48b-279be6b529a1-combined-ca-bundle\") pod \"placement-6d877f97b-4xkgq\" (UID: \"3a4fdb5b-74d4-45c3-a48b-279be6b529a1\") " pod="openstack/placement-6d877f97b-4xkgq"
Mar 13 01:45:54.280025 master-0 kubenswrapper[19170]: I0313 01:45:54.279064 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a4fdb5b-74d4-45c3-a48b-279be6b529a1-internal-tls-certs\") pod \"placement-6d877f97b-4xkgq\" (UID: \"3a4fdb5b-74d4-45c3-a48b-279be6b529a1\") " pod="openstack/placement-6d877f97b-4xkgq"
Mar 13 01:45:54.280025 master-0 kubenswrapper[19170]: I0313 01:45:54.279102 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a4fdb5b-74d4-45c3-a48b-279be6b529a1-public-tls-certs\") pod \"placement-6d877f97b-4xkgq\" (UID: \"3a4fdb5b-74d4-45c3-a48b-279be6b529a1\") " pod="openstack/placement-6d877f97b-4xkgq"
Mar 13 01:45:54.280025 master-0 kubenswrapper[19170]: I0313 01:45:54.279126 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName:
\"kubernetes.io/empty-dir/3a4fdb5b-74d4-45c3-a48b-279be6b529a1-logs\") pod \"placement-6d877f97b-4xkgq\" (UID: \"3a4fdb5b-74d4-45c3-a48b-279be6b529a1\") " pod="openstack/placement-6d877f97b-4xkgq" Mar 13 01:45:54.280025 master-0 kubenswrapper[19170]: I0313 01:45:54.279141 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a4fdb5b-74d4-45c3-a48b-279be6b529a1-scripts\") pod \"placement-6d877f97b-4xkgq\" (UID: \"3a4fdb5b-74d4-45c3-a48b-279be6b529a1\") " pod="openstack/placement-6d877f97b-4xkgq" Mar 13 01:45:54.280025 master-0 kubenswrapper[19170]: I0313 01:45:54.279162 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfrqp\" (UniqueName: \"kubernetes.io/projected/3a4fdb5b-74d4-45c3-a48b-279be6b529a1-kube-api-access-sfrqp\") pod \"placement-6d877f97b-4xkgq\" (UID: \"3a4fdb5b-74d4-45c3-a48b-279be6b529a1\") " pod="openstack/placement-6d877f97b-4xkgq" Mar 13 01:45:54.304035 master-0 kubenswrapper[19170]: I0313 01:45:54.302678 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6d877f97b-4xkgq"] Mar 13 01:45:54.386378 master-0 kubenswrapper[19170]: I0313 01:45:54.386308 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a4fdb5b-74d4-45c3-a48b-279be6b529a1-combined-ca-bundle\") pod \"placement-6d877f97b-4xkgq\" (UID: \"3a4fdb5b-74d4-45c3-a48b-279be6b529a1\") " pod="openstack/placement-6d877f97b-4xkgq" Mar 13 01:45:54.386579 master-0 kubenswrapper[19170]: I0313 01:45:54.386427 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a4fdb5b-74d4-45c3-a48b-279be6b529a1-internal-tls-certs\") pod \"placement-6d877f97b-4xkgq\" (UID: \"3a4fdb5b-74d4-45c3-a48b-279be6b529a1\") " 
pod="openstack/placement-6d877f97b-4xkgq" Mar 13 01:45:54.386579 master-0 kubenswrapper[19170]: I0313 01:45:54.386479 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a4fdb5b-74d4-45c3-a48b-279be6b529a1-public-tls-certs\") pod \"placement-6d877f97b-4xkgq\" (UID: \"3a4fdb5b-74d4-45c3-a48b-279be6b529a1\") " pod="openstack/placement-6d877f97b-4xkgq" Mar 13 01:45:54.386579 master-0 kubenswrapper[19170]: I0313 01:45:54.386522 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a4fdb5b-74d4-45c3-a48b-279be6b529a1-logs\") pod \"placement-6d877f97b-4xkgq\" (UID: \"3a4fdb5b-74d4-45c3-a48b-279be6b529a1\") " pod="openstack/placement-6d877f97b-4xkgq" Mar 13 01:45:54.386579 master-0 kubenswrapper[19170]: I0313 01:45:54.386541 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a4fdb5b-74d4-45c3-a48b-279be6b529a1-scripts\") pod \"placement-6d877f97b-4xkgq\" (UID: \"3a4fdb5b-74d4-45c3-a48b-279be6b529a1\") " pod="openstack/placement-6d877f97b-4xkgq" Mar 13 01:45:54.386579 master-0 kubenswrapper[19170]: I0313 01:45:54.386566 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfrqp\" (UniqueName: \"kubernetes.io/projected/3a4fdb5b-74d4-45c3-a48b-279be6b529a1-kube-api-access-sfrqp\") pod \"placement-6d877f97b-4xkgq\" (UID: \"3a4fdb5b-74d4-45c3-a48b-279be6b529a1\") " pod="openstack/placement-6d877f97b-4xkgq" Mar 13 01:45:54.386763 master-0 kubenswrapper[19170]: I0313 01:45:54.386699 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a4fdb5b-74d4-45c3-a48b-279be6b529a1-config-data\") pod \"placement-6d877f97b-4xkgq\" (UID: \"3a4fdb5b-74d4-45c3-a48b-279be6b529a1\") " pod="openstack/placement-6d877f97b-4xkgq" Mar 
13 01:45:54.399333 master-0 kubenswrapper[19170]: I0313 01:45:54.399210 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a4fdb5b-74d4-45c3-a48b-279be6b529a1-combined-ca-bundle\") pod \"placement-6d877f97b-4xkgq\" (UID: \"3a4fdb5b-74d4-45c3-a48b-279be6b529a1\") " pod="openstack/placement-6d877f97b-4xkgq" Mar 13 01:45:54.399559 master-0 kubenswrapper[19170]: I0313 01:45:54.399462 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a4fdb5b-74d4-45c3-a48b-279be6b529a1-logs\") pod \"placement-6d877f97b-4xkgq\" (UID: \"3a4fdb5b-74d4-45c3-a48b-279be6b529a1\") " pod="openstack/placement-6d877f97b-4xkgq" Mar 13 01:45:54.400130 master-0 kubenswrapper[19170]: I0313 01:45:54.400001 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a4fdb5b-74d4-45c3-a48b-279be6b529a1-scripts\") pod \"placement-6d877f97b-4xkgq\" (UID: \"3a4fdb5b-74d4-45c3-a48b-279be6b529a1\") " pod="openstack/placement-6d877f97b-4xkgq" Mar 13 01:45:54.405993 master-0 kubenswrapper[19170]: I0313 01:45:54.405378 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a4fdb5b-74d4-45c3-a48b-279be6b529a1-internal-tls-certs\") pod \"placement-6d877f97b-4xkgq\" (UID: \"3a4fdb5b-74d4-45c3-a48b-279be6b529a1\") " pod="openstack/placement-6d877f97b-4xkgq" Mar 13 01:45:54.425226 master-0 kubenswrapper[19170]: I0313 01:45:54.413764 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfrqp\" (UniqueName: \"kubernetes.io/projected/3a4fdb5b-74d4-45c3-a48b-279be6b529a1-kube-api-access-sfrqp\") pod \"placement-6d877f97b-4xkgq\" (UID: \"3a4fdb5b-74d4-45c3-a48b-279be6b529a1\") " pod="openstack/placement-6d877f97b-4xkgq" Mar 13 01:45:54.445051 master-0 kubenswrapper[19170]: I0313 
01:45:54.437458 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a4fdb5b-74d4-45c3-a48b-279be6b529a1-config-data\") pod \"placement-6d877f97b-4xkgq\" (UID: \"3a4fdb5b-74d4-45c3-a48b-279be6b529a1\") " pod="openstack/placement-6d877f97b-4xkgq" Mar 13 01:45:54.470189 master-0 kubenswrapper[19170]: I0313 01:45:54.462472 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a4fdb5b-74d4-45c3-a48b-279be6b529a1-public-tls-certs\") pod \"placement-6d877f97b-4xkgq\" (UID: \"3a4fdb5b-74d4-45c3-a48b-279be6b529a1\") " pod="openstack/placement-6d877f97b-4xkgq" Mar 13 01:45:54.605728 master-0 kubenswrapper[19170]: I0313 01:45:54.605685 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-051b7-api-0" Mar 13 01:45:54.614555 master-0 kubenswrapper[19170]: I0313 01:45:54.610152 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6d877f97b-4xkgq" Mar 13 01:45:54.693734 master-0 kubenswrapper[19170]: I0313 01:45:54.693568 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tftvj\" (UniqueName: \"kubernetes.io/projected/e7dcf653-5c20-4aed-9c67-afbbb448d466-kube-api-access-tftvj\") pod \"e7dcf653-5c20-4aed-9c67-afbbb448d466\" (UID: \"e7dcf653-5c20-4aed-9c67-afbbb448d466\") " Mar 13 01:45:54.694032 master-0 kubenswrapper[19170]: I0313 01:45:54.693952 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7dcf653-5c20-4aed-9c67-afbbb448d466-logs\") pod \"e7dcf653-5c20-4aed-9c67-afbbb448d466\" (UID: \"e7dcf653-5c20-4aed-9c67-afbbb448d466\") " Mar 13 01:45:54.694032 master-0 kubenswrapper[19170]: I0313 01:45:54.694015 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7dcf653-5c20-4aed-9c67-afbbb448d466-config-data-custom\") pod \"e7dcf653-5c20-4aed-9c67-afbbb448d466\" (UID: \"e7dcf653-5c20-4aed-9c67-afbbb448d466\") " Mar 13 01:45:54.694159 master-0 kubenswrapper[19170]: I0313 01:45:54.694080 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7dcf653-5c20-4aed-9c67-afbbb448d466-combined-ca-bundle\") pod \"e7dcf653-5c20-4aed-9c67-afbbb448d466\" (UID: \"e7dcf653-5c20-4aed-9c67-afbbb448d466\") " Mar 13 01:45:54.694159 master-0 kubenswrapper[19170]: I0313 01:45:54.694135 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7dcf653-5c20-4aed-9c67-afbbb448d466-scripts\") pod \"e7dcf653-5c20-4aed-9c67-afbbb448d466\" (UID: \"e7dcf653-5c20-4aed-9c67-afbbb448d466\") " Mar 13 01:45:54.694249 master-0 kubenswrapper[19170]: I0313 01:45:54.694182 19170 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7dcf653-5c20-4aed-9c67-afbbb448d466-config-data\") pod \"e7dcf653-5c20-4aed-9c67-afbbb448d466\" (UID: \"e7dcf653-5c20-4aed-9c67-afbbb448d466\") " Mar 13 01:45:54.694305 master-0 kubenswrapper[19170]: I0313 01:45:54.694247 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e7dcf653-5c20-4aed-9c67-afbbb448d466-etc-machine-id\") pod \"e7dcf653-5c20-4aed-9c67-afbbb448d466\" (UID: \"e7dcf653-5c20-4aed-9c67-afbbb448d466\") " Mar 13 01:45:54.696849 master-0 kubenswrapper[19170]: I0313 01:45:54.695675 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7dcf653-5c20-4aed-9c67-afbbb448d466-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e7dcf653-5c20-4aed-9c67-afbbb448d466" (UID: "e7dcf653-5c20-4aed-9c67-afbbb448d466"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:45:54.696849 master-0 kubenswrapper[19170]: I0313 01:45:54.696508 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7dcf653-5c20-4aed-9c67-afbbb448d466-logs" (OuterVolumeSpecName: "logs") pod "e7dcf653-5c20-4aed-9c67-afbbb448d466" (UID: "e7dcf653-5c20-4aed-9c67-afbbb448d466"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 01:45:54.699416 master-0 kubenswrapper[19170]: I0313 01:45:54.699120 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7dcf653-5c20-4aed-9c67-afbbb448d466-kube-api-access-tftvj" (OuterVolumeSpecName: "kube-api-access-tftvj") pod "e7dcf653-5c20-4aed-9c67-afbbb448d466" (UID: "e7dcf653-5c20-4aed-9c67-afbbb448d466"). InnerVolumeSpecName "kube-api-access-tftvj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:45:54.699905 master-0 kubenswrapper[19170]: I0313 01:45:54.699873 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7dcf653-5c20-4aed-9c67-afbbb448d466-scripts" (OuterVolumeSpecName: "scripts") pod "e7dcf653-5c20-4aed-9c67-afbbb448d466" (UID: "e7dcf653-5c20-4aed-9c67-afbbb448d466"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:45:54.718294 master-0 kubenswrapper[19170]: I0313 01:45:54.717034 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7dcf653-5c20-4aed-9c67-afbbb448d466-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e7dcf653-5c20-4aed-9c67-afbbb448d466" (UID: "e7dcf653-5c20-4aed-9c67-afbbb448d466"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:45:54.722407 master-0 kubenswrapper[19170]: I0313 01:45:54.719869 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7dcf653-5c20-4aed-9c67-afbbb448d466-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7dcf653-5c20-4aed-9c67-afbbb448d466" (UID: "e7dcf653-5c20-4aed-9c67-afbbb448d466"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:45:54.773090 master-0 kubenswrapper[19170]: I0313 01:45:54.773039 19170 generic.go:334] "Generic (PLEG): container finished" podID="e7dcf653-5c20-4aed-9c67-afbbb448d466" containerID="ccc874fe40a708dc0ed92b9f62aef0eaf80a9137e07138d326a4eed1741da821" exitCode=0 Mar 13 01:45:54.773090 master-0 kubenswrapper[19170]: I0313 01:45:54.773099 19170 generic.go:334] "Generic (PLEG): container finished" podID="e7dcf653-5c20-4aed-9c67-afbbb448d466" containerID="3012c76edd6d8aef79f0a4c4340ab282597f721005d818c24dc234ade0e5d20c" exitCode=143 Mar 13 01:45:54.773576 master-0 kubenswrapper[19170]: I0313 01:45:54.773144 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-051b7-api-0" event={"ID":"e7dcf653-5c20-4aed-9c67-afbbb448d466","Type":"ContainerDied","Data":"ccc874fe40a708dc0ed92b9f62aef0eaf80a9137e07138d326a4eed1741da821"} Mar 13 01:45:54.773576 master-0 kubenswrapper[19170]: I0313 01:45:54.773195 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-051b7-api-0" event={"ID":"e7dcf653-5c20-4aed-9c67-afbbb448d466","Type":"ContainerDied","Data":"3012c76edd6d8aef79f0a4c4340ab282597f721005d818c24dc234ade0e5d20c"} Mar 13 01:45:54.773576 master-0 kubenswrapper[19170]: I0313 01:45:54.773209 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-051b7-api-0" event={"ID":"e7dcf653-5c20-4aed-9c67-afbbb448d466","Type":"ContainerDied","Data":"b2eb42d5bc3462f10363c73d84019dcfb416aade9ba396a98865889fafc4fbf7"} Mar 13 01:45:54.773576 master-0 kubenswrapper[19170]: I0313 01:45:54.773224 19170 scope.go:117] "RemoveContainer" containerID="ccc874fe40a708dc0ed92b9f62aef0eaf80a9137e07138d326a4eed1741da821" Mar 13 01:45:54.773576 master-0 kubenswrapper[19170]: I0313 01:45:54.773373 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-051b7-api-0" Mar 13 01:45:54.778924 master-0 kubenswrapper[19170]: I0313 01:45:54.778884 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-59fb7b67df-dqvjc" event={"ID":"ae2bd160-5211-464a-8f97-c011152c3602","Type":"ContainerStarted","Data":"6a951f371f29ed9865895cc9186309875c7a96a02108cb285c1a548b419a7049"} Mar 13 01:45:54.780147 master-0 kubenswrapper[19170]: I0313 01:45:54.780124 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-59fb7b67df-dqvjc" Mar 13 01:45:54.783076 master-0 kubenswrapper[19170]: I0313 01:45:54.781802 19170 generic.go:334] "Generic (PLEG): container finished" podID="dc3c71f3-cecf-4916-8841-0e557aad23d6" containerID="06ea23d8a95a4410decaf530dfff4e7d87e2ccde7b2a68c947ff45b405941a71" exitCode=0 Mar 13 01:45:54.783076 master-0 kubenswrapper[19170]: I0313 01:45:54.782535 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-l9ds6" event={"ID":"dc3c71f3-cecf-4916-8841-0e557aad23d6","Type":"ContainerDied","Data":"06ea23d8a95a4410decaf530dfff4e7d87e2ccde7b2a68c947ff45b405941a71"} Mar 13 01:45:54.796973 master-0 kubenswrapper[19170]: I0313 01:45:54.795400 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7dcf653-5c20-4aed-9c67-afbbb448d466-config-data" (OuterVolumeSpecName: "config-data") pod "e7dcf653-5c20-4aed-9c67-afbbb448d466" (UID: "e7dcf653-5c20-4aed-9c67-afbbb448d466"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:45:54.803886 master-0 kubenswrapper[19170]: I0313 01:45:54.801667 19170 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7dcf653-5c20-4aed-9c67-afbbb448d466-logs\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:54.803886 master-0 kubenswrapper[19170]: I0313 01:45:54.801714 19170 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7dcf653-5c20-4aed-9c67-afbbb448d466-config-data-custom\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:54.803886 master-0 kubenswrapper[19170]: I0313 01:45:54.801732 19170 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7dcf653-5c20-4aed-9c67-afbbb448d466-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:54.803886 master-0 kubenswrapper[19170]: I0313 01:45:54.801748 19170 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7dcf653-5c20-4aed-9c67-afbbb448d466-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:54.803886 master-0 kubenswrapper[19170]: I0313 01:45:54.801763 19170 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7dcf653-5c20-4aed-9c67-afbbb448d466-config-data\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:54.803886 master-0 kubenswrapper[19170]: I0313 01:45:54.801782 19170 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e7dcf653-5c20-4aed-9c67-afbbb448d466-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:54.803886 master-0 kubenswrapper[19170]: I0313 01:45:54.801921 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tftvj\" (UniqueName: 
\"kubernetes.io/projected/e7dcf653-5c20-4aed-9c67-afbbb448d466-kube-api-access-tftvj\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:54.815861 master-0 kubenswrapper[19170]: I0313 01:45:54.815762 19170 scope.go:117] "RemoveContainer" containerID="3012c76edd6d8aef79f0a4c4340ab282597f721005d818c24dc234ade0e5d20c" Mar 13 01:45:54.821519 master-0 kubenswrapper[19170]: I0313 01:45:54.821453 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-59fb7b67df-dqvjc" podStartSLOduration=3.821430465 podStartE2EDuration="3.821430465s" podCreationTimestamp="2026-03-13 01:45:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:45:54.801006669 +0000 UTC m=+1615.609127629" watchObservedRunningTime="2026-03-13 01:45:54.821430465 +0000 UTC m=+1615.629551425" Mar 13 01:45:54.889214 master-0 kubenswrapper[19170]: I0313 01:45:54.889184 19170 scope.go:117] "RemoveContainer" containerID="ccc874fe40a708dc0ed92b9f62aef0eaf80a9137e07138d326a4eed1741da821" Mar 13 01:45:54.890030 master-0 kubenswrapper[19170]: E0313 01:45:54.890011 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccc874fe40a708dc0ed92b9f62aef0eaf80a9137e07138d326a4eed1741da821\": container with ID starting with ccc874fe40a708dc0ed92b9f62aef0eaf80a9137e07138d326a4eed1741da821 not found: ID does not exist" containerID="ccc874fe40a708dc0ed92b9f62aef0eaf80a9137e07138d326a4eed1741da821" Mar 13 01:45:54.890153 master-0 kubenswrapper[19170]: I0313 01:45:54.890129 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccc874fe40a708dc0ed92b9f62aef0eaf80a9137e07138d326a4eed1741da821"} err="failed to get container status \"ccc874fe40a708dc0ed92b9f62aef0eaf80a9137e07138d326a4eed1741da821\": rpc error: code = NotFound desc = could not find container 
\"ccc874fe40a708dc0ed92b9f62aef0eaf80a9137e07138d326a4eed1741da821\": container with ID starting with ccc874fe40a708dc0ed92b9f62aef0eaf80a9137e07138d326a4eed1741da821 not found: ID does not exist" Mar 13 01:45:54.890227 master-0 kubenswrapper[19170]: I0313 01:45:54.890215 19170 scope.go:117] "RemoveContainer" containerID="3012c76edd6d8aef79f0a4c4340ab282597f721005d818c24dc234ade0e5d20c" Mar 13 01:45:54.891112 master-0 kubenswrapper[19170]: E0313 01:45:54.891094 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3012c76edd6d8aef79f0a4c4340ab282597f721005d818c24dc234ade0e5d20c\": container with ID starting with 3012c76edd6d8aef79f0a4c4340ab282597f721005d818c24dc234ade0e5d20c not found: ID does not exist" containerID="3012c76edd6d8aef79f0a4c4340ab282597f721005d818c24dc234ade0e5d20c" Mar 13 01:45:54.891207 master-0 kubenswrapper[19170]: I0313 01:45:54.891190 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3012c76edd6d8aef79f0a4c4340ab282597f721005d818c24dc234ade0e5d20c"} err="failed to get container status \"3012c76edd6d8aef79f0a4c4340ab282597f721005d818c24dc234ade0e5d20c\": rpc error: code = NotFound desc = could not find container \"3012c76edd6d8aef79f0a4c4340ab282597f721005d818c24dc234ade0e5d20c\": container with ID starting with 3012c76edd6d8aef79f0a4c4340ab282597f721005d818c24dc234ade0e5d20c not found: ID does not exist" Mar 13 01:45:54.891277 master-0 kubenswrapper[19170]: I0313 01:45:54.891265 19170 scope.go:117] "RemoveContainer" containerID="ccc874fe40a708dc0ed92b9f62aef0eaf80a9137e07138d326a4eed1741da821" Mar 13 01:45:54.891546 master-0 kubenswrapper[19170]: I0313 01:45:54.891527 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccc874fe40a708dc0ed92b9f62aef0eaf80a9137e07138d326a4eed1741da821"} err="failed to get container status 
\"ccc874fe40a708dc0ed92b9f62aef0eaf80a9137e07138d326a4eed1741da821\": rpc error: code = NotFound desc = could not find container \"ccc874fe40a708dc0ed92b9f62aef0eaf80a9137e07138d326a4eed1741da821\": container with ID starting with ccc874fe40a708dc0ed92b9f62aef0eaf80a9137e07138d326a4eed1741da821 not found: ID does not exist" Mar 13 01:45:54.891650 master-0 kubenswrapper[19170]: I0313 01:45:54.891621 19170 scope.go:117] "RemoveContainer" containerID="3012c76edd6d8aef79f0a4c4340ab282597f721005d818c24dc234ade0e5d20c" Mar 13 01:45:54.892893 master-0 kubenswrapper[19170]: I0313 01:45:54.892874 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3012c76edd6d8aef79f0a4c4340ab282597f721005d818c24dc234ade0e5d20c"} err="failed to get container status \"3012c76edd6d8aef79f0a4c4340ab282597f721005d818c24dc234ade0e5d20c\": rpc error: code = NotFound desc = could not find container \"3012c76edd6d8aef79f0a4c4340ab282597f721005d818c24dc234ade0e5d20c\": container with ID starting with 3012c76edd6d8aef79f0a4c4340ab282597f721005d818c24dc234ade0e5d20c not found: ID does not exist" Mar 13 01:45:55.124144 master-0 kubenswrapper[19170]: I0313 01:45:55.123994 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-051b7-api-0"] Mar 13 01:45:55.157860 master-0 kubenswrapper[19170]: I0313 01:45:55.157739 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-051b7-api-0"] Mar 13 01:45:55.171105 master-0 kubenswrapper[19170]: I0313 01:45:55.171055 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6d877f97b-4xkgq"] Mar 13 01:45:55.184090 master-0 kubenswrapper[19170]: I0313 01:45:55.184016 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-051b7-api-0"] Mar 13 01:45:55.184564 master-0 kubenswrapper[19170]: E0313 01:45:55.184537 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7dcf653-5c20-4aed-9c67-afbbb448d466" 
containerName="cinder-api" Mar 13 01:45:55.184564 master-0 kubenswrapper[19170]: I0313 01:45:55.184557 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7dcf653-5c20-4aed-9c67-afbbb448d466" containerName="cinder-api" Mar 13 01:45:55.184725 master-0 kubenswrapper[19170]: E0313 01:45:55.184598 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7dcf653-5c20-4aed-9c67-afbbb448d466" containerName="cinder-051b7-api-log" Mar 13 01:45:55.184725 master-0 kubenswrapper[19170]: I0313 01:45:55.184606 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7dcf653-5c20-4aed-9c67-afbbb448d466" containerName="cinder-051b7-api-log" Mar 13 01:45:55.184873 master-0 kubenswrapper[19170]: I0313 01:45:55.184850 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7dcf653-5c20-4aed-9c67-afbbb448d466" containerName="cinder-api" Mar 13 01:45:55.184938 master-0 kubenswrapper[19170]: I0313 01:45:55.184889 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7dcf653-5c20-4aed-9c67-afbbb448d466" containerName="cinder-051b7-api-log" Mar 13 01:45:55.186124 master-0 kubenswrapper[19170]: I0313 01:45:55.185994 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-051b7-api-0" Mar 13 01:45:55.188350 master-0 kubenswrapper[19170]: I0313 01:45:55.188118 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 13 01:45:55.188350 master-0 kubenswrapper[19170]: I0313 01:45:55.188158 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-051b7-api-config-data" Mar 13 01:45:55.188581 master-0 kubenswrapper[19170]: I0313 01:45:55.188414 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 13 01:45:55.221616 master-0 kubenswrapper[19170]: I0313 01:45:55.220887 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-051b7-api-0"] Mar 13 01:45:55.320255 master-0 kubenswrapper[19170]: I0313 01:45:55.320201 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/22c1b89b-6967-4c47-924d-db2e77287e4d-internal-tls-certs\") pod \"cinder-051b7-api-0\" (UID: \"22c1b89b-6967-4c47-924d-db2e77287e4d\") " pod="openstack/cinder-051b7-api-0" Mar 13 01:45:55.320374 master-0 kubenswrapper[19170]: I0313 01:45:55.320266 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22c1b89b-6967-4c47-924d-db2e77287e4d-combined-ca-bundle\") pod \"cinder-051b7-api-0\" (UID: \"22c1b89b-6967-4c47-924d-db2e77287e4d\") " pod="openstack/cinder-051b7-api-0" Mar 13 01:45:55.320374 master-0 kubenswrapper[19170]: I0313 01:45:55.320360 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22c1b89b-6967-4c47-924d-db2e77287e4d-config-data-custom\") pod \"cinder-051b7-api-0\" (UID: \"22c1b89b-6967-4c47-924d-db2e77287e4d\") " pod="openstack/cinder-051b7-api-0" 
Mar 13 01:45:55.320449 master-0 kubenswrapper[19170]: I0313 01:45:55.320392 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/22c1b89b-6967-4c47-924d-db2e77287e4d-etc-machine-id\") pod \"cinder-051b7-api-0\" (UID: \"22c1b89b-6967-4c47-924d-db2e77287e4d\") " pod="openstack/cinder-051b7-api-0"
Mar 13 01:45:55.320449 master-0 kubenswrapper[19170]: I0313 01:45:55.320438 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbgrm\" (UniqueName: \"kubernetes.io/projected/22c1b89b-6967-4c47-924d-db2e77287e4d-kube-api-access-vbgrm\") pod \"cinder-051b7-api-0\" (UID: \"22c1b89b-6967-4c47-924d-db2e77287e4d\") " pod="openstack/cinder-051b7-api-0"
Mar 13 01:45:55.320512 master-0 kubenswrapper[19170]: I0313 01:45:55.320475 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/22c1b89b-6967-4c47-924d-db2e77287e4d-public-tls-certs\") pod \"cinder-051b7-api-0\" (UID: \"22c1b89b-6967-4c47-924d-db2e77287e4d\") " pod="openstack/cinder-051b7-api-0"
Mar 13 01:45:55.320512 master-0 kubenswrapper[19170]: I0313 01:45:55.320500 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22c1b89b-6967-4c47-924d-db2e77287e4d-config-data\") pod \"cinder-051b7-api-0\" (UID: \"22c1b89b-6967-4c47-924d-db2e77287e4d\") " pod="openstack/cinder-051b7-api-0"
Mar 13 01:45:55.320580 master-0 kubenswrapper[19170]: I0313 01:45:55.320517 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22c1b89b-6967-4c47-924d-db2e77287e4d-scripts\") pod \"cinder-051b7-api-0\" (UID: \"22c1b89b-6967-4c47-924d-db2e77287e4d\") " pod="openstack/cinder-051b7-api-0"
Mar 13 01:45:55.320580 master-0 kubenswrapper[19170]: I0313 01:45:55.320548 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22c1b89b-6967-4c47-924d-db2e77287e4d-logs\") pod \"cinder-051b7-api-0\" (UID: \"22c1b89b-6967-4c47-924d-db2e77287e4d\") " pod="openstack/cinder-051b7-api-0"
Mar 13 01:45:55.423082 master-0 kubenswrapper[19170]: I0313 01:45:55.423034 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbgrm\" (UniqueName: \"kubernetes.io/projected/22c1b89b-6967-4c47-924d-db2e77287e4d-kube-api-access-vbgrm\") pod \"cinder-051b7-api-0\" (UID: \"22c1b89b-6967-4c47-924d-db2e77287e4d\") " pod="openstack/cinder-051b7-api-0"
Mar 13 01:45:55.423266 master-0 kubenswrapper[19170]: I0313 01:45:55.423100 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/22c1b89b-6967-4c47-924d-db2e77287e4d-public-tls-certs\") pod \"cinder-051b7-api-0\" (UID: \"22c1b89b-6967-4c47-924d-db2e77287e4d\") " pod="openstack/cinder-051b7-api-0"
Mar 13 01:45:55.423266 master-0 kubenswrapper[19170]: I0313 01:45:55.423127 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22c1b89b-6967-4c47-924d-db2e77287e4d-config-data\") pod \"cinder-051b7-api-0\" (UID: \"22c1b89b-6967-4c47-924d-db2e77287e4d\") " pod="openstack/cinder-051b7-api-0"
Mar 13 01:45:55.423338 master-0 kubenswrapper[19170]: I0313 01:45:55.423302 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22c1b89b-6967-4c47-924d-db2e77287e4d-scripts\") pod \"cinder-051b7-api-0\" (UID: \"22c1b89b-6967-4c47-924d-db2e77287e4d\") " pod="openstack/cinder-051b7-api-0"
Mar 13 01:45:55.423461 master-0 kubenswrapper[19170]: I0313 01:45:55.423426 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22c1b89b-6967-4c47-924d-db2e77287e4d-logs\") pod \"cinder-051b7-api-0\" (UID: \"22c1b89b-6967-4c47-924d-db2e77287e4d\") " pod="openstack/cinder-051b7-api-0"
Mar 13 01:45:55.423522 master-0 kubenswrapper[19170]: I0313 01:45:55.423474 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/22c1b89b-6967-4c47-924d-db2e77287e4d-internal-tls-certs\") pod \"cinder-051b7-api-0\" (UID: \"22c1b89b-6967-4c47-924d-db2e77287e4d\") " pod="openstack/cinder-051b7-api-0"
Mar 13 01:45:55.423522 master-0 kubenswrapper[19170]: I0313 01:45:55.423498 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22c1b89b-6967-4c47-924d-db2e77287e4d-combined-ca-bundle\") pod \"cinder-051b7-api-0\" (UID: \"22c1b89b-6967-4c47-924d-db2e77287e4d\") " pod="openstack/cinder-051b7-api-0"
Mar 13 01:45:55.423612 master-0 kubenswrapper[19170]: I0313 01:45:55.423593 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22c1b89b-6967-4c47-924d-db2e77287e4d-config-data-custom\") pod \"cinder-051b7-api-0\" (UID: \"22c1b89b-6967-4c47-924d-db2e77287e4d\") " pod="openstack/cinder-051b7-api-0"
Mar 13 01:45:55.423687 master-0 kubenswrapper[19170]: I0313 01:45:55.423660 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/22c1b89b-6967-4c47-924d-db2e77287e4d-etc-machine-id\") pod \"cinder-051b7-api-0\" (UID: \"22c1b89b-6967-4c47-924d-db2e77287e4d\") " pod="openstack/cinder-051b7-api-0"
Mar 13 01:45:55.423978 master-0 kubenswrapper[19170]: I0313 01:45:55.423927 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/22c1b89b-6967-4c47-924d-db2e77287e4d-etc-machine-id\") pod \"cinder-051b7-api-0\" (UID: \"22c1b89b-6967-4c47-924d-db2e77287e4d\") " pod="openstack/cinder-051b7-api-0"
Mar 13 01:45:55.431227 master-0 kubenswrapper[19170]: I0313 01:45:55.431189 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/22c1b89b-6967-4c47-924d-db2e77287e4d-internal-tls-certs\") pod \"cinder-051b7-api-0\" (UID: \"22c1b89b-6967-4c47-924d-db2e77287e4d\") " pod="openstack/cinder-051b7-api-0"
Mar 13 01:45:55.431454 master-0 kubenswrapper[19170]: I0313 01:45:55.431427 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22c1b89b-6967-4c47-924d-db2e77287e4d-scripts\") pod \"cinder-051b7-api-0\" (UID: \"22c1b89b-6967-4c47-924d-db2e77287e4d\") " pod="openstack/cinder-051b7-api-0"
Mar 13 01:45:55.431957 master-0 kubenswrapper[19170]: I0313 01:45:55.431926 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22c1b89b-6967-4c47-924d-db2e77287e4d-logs\") pod \"cinder-051b7-api-0\" (UID: \"22c1b89b-6967-4c47-924d-db2e77287e4d\") " pod="openstack/cinder-051b7-api-0"
Mar 13 01:45:55.434605 master-0 kubenswrapper[19170]: I0313 01:45:55.434479 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22c1b89b-6967-4c47-924d-db2e77287e4d-combined-ca-bundle\") pod \"cinder-051b7-api-0\" (UID: \"22c1b89b-6967-4c47-924d-db2e77287e4d\") " pod="openstack/cinder-051b7-api-0"
Mar 13 01:45:55.434754 master-0 kubenswrapper[19170]: I0313 01:45:55.434722 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22c1b89b-6967-4c47-924d-db2e77287e4d-config-data-custom\") pod \"cinder-051b7-api-0\" (UID: \"22c1b89b-6967-4c47-924d-db2e77287e4d\") " pod="openstack/cinder-051b7-api-0"
Mar 13 01:45:55.438231 master-0 kubenswrapper[19170]: I0313 01:45:55.438197 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/22c1b89b-6967-4c47-924d-db2e77287e4d-public-tls-certs\") pod \"cinder-051b7-api-0\" (UID: \"22c1b89b-6967-4c47-924d-db2e77287e4d\") " pod="openstack/cinder-051b7-api-0"
Mar 13 01:45:55.439025 master-0 kubenswrapper[19170]: I0313 01:45:55.438734 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22c1b89b-6967-4c47-924d-db2e77287e4d-config-data\") pod \"cinder-051b7-api-0\" (UID: \"22c1b89b-6967-4c47-924d-db2e77287e4d\") " pod="openstack/cinder-051b7-api-0"
Mar 13 01:45:55.451691 master-0 kubenswrapper[19170]: I0313 01:45:55.451625 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7dcf653-5c20-4aed-9c67-afbbb448d466" path="/var/lib/kubelet/pods/e7dcf653-5c20-4aed-9c67-afbbb448d466/volumes"
Mar 13 01:45:55.455233 master-0 kubenswrapper[19170]: I0313 01:45:55.455187 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbgrm\" (UniqueName: \"kubernetes.io/projected/22c1b89b-6967-4c47-924d-db2e77287e4d-kube-api-access-vbgrm\") pod \"cinder-051b7-api-0\" (UID: \"22c1b89b-6967-4c47-924d-db2e77287e4d\") " pod="openstack/cinder-051b7-api-0"
Mar 13 01:45:55.507273 master-0 kubenswrapper[19170]: I0313 01:45:55.507234 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-051b7-api-0"
Mar 13 01:45:55.817561 master-0 kubenswrapper[19170]: I0313 01:45:55.817252 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d877f97b-4xkgq" event={"ID":"3a4fdb5b-74d4-45c3-a48b-279be6b529a1","Type":"ContainerStarted","Data":"915edbaa80a28534a110e0bf634b8ceee4fb9ae552274845cf3cf50ccf022828"}
Mar 13 01:45:55.817561 master-0 kubenswrapper[19170]: I0313 01:45:55.817328 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d877f97b-4xkgq" event={"ID":"3a4fdb5b-74d4-45c3-a48b-279be6b529a1","Type":"ContainerStarted","Data":"5cd4938a28a6c45fc0bc9bdd6744fe8f9b1890432ac69a1e6331f384de379ca8"}
Mar 13 01:45:55.817561 master-0 kubenswrapper[19170]: I0313 01:45:55.817340 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d877f97b-4xkgq" event={"ID":"3a4fdb5b-74d4-45c3-a48b-279be6b529a1","Type":"ContainerStarted","Data":"594343ff919c7c713e9c135a575e10c4273fe1bec46d64b186fa372d425182ba"}
Mar 13 01:45:55.820747 master-0 kubenswrapper[19170]: I0313 01:45:55.818847 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6d877f97b-4xkgq"
Mar 13 01:45:55.820747 master-0 kubenswrapper[19170]: I0313 01:45:55.818893 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6d877f97b-4xkgq"
Mar 13 01:45:55.862689 master-0 kubenswrapper[19170]: I0313 01:45:55.861838 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6d877f97b-4xkgq" podStartSLOduration=1.8618201810000001 podStartE2EDuration="1.861820181s" podCreationTimestamp="2026-03-13 01:45:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:45:55.846119729 +0000 UTC m=+1616.654240689" watchObservedRunningTime="2026-03-13 01:45:55.861820181 +0000 UTC m=+1616.669941141"
Mar 13 01:45:56.119784 master-0 kubenswrapper[19170]: I0313 01:45:56.107282 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-051b7-api-0"]
Mar 13 01:45:56.203569 master-0 kubenswrapper[19170]: I0313 01:45:56.203453 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-sync-l9ds6"
Mar 13 01:45:56.345574 master-0 kubenswrapper[19170]: I0313 01:45:56.345502 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/dc3c71f3-cecf-4916-8841-0e557aad23d6-etc-podinfo\") pod \"dc3c71f3-cecf-4916-8841-0e557aad23d6\" (UID: \"dc3c71f3-cecf-4916-8841-0e557aad23d6\") "
Mar 13 01:45:56.345813 master-0 kubenswrapper[19170]: I0313 01:45:56.345600 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc3c71f3-cecf-4916-8841-0e557aad23d6-scripts\") pod \"dc3c71f3-cecf-4916-8841-0e557aad23d6\" (UID: \"dc3c71f3-cecf-4916-8841-0e557aad23d6\") "
Mar 13 01:45:56.345964 master-0 kubenswrapper[19170]: I0313 01:45:56.345919 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56r7z\" (UniqueName: \"kubernetes.io/projected/dc3c71f3-cecf-4916-8841-0e557aad23d6-kube-api-access-56r7z\") pod \"dc3c71f3-cecf-4916-8841-0e557aad23d6\" (UID: \"dc3c71f3-cecf-4916-8841-0e557aad23d6\") "
Mar 13 01:45:56.346254 master-0 kubenswrapper[19170]: I0313 01:45:56.346226 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc3c71f3-cecf-4916-8841-0e557aad23d6-combined-ca-bundle\") pod \"dc3c71f3-cecf-4916-8841-0e557aad23d6\" (UID: \"dc3c71f3-cecf-4916-8841-0e557aad23d6\") "
Mar 13 01:45:56.346321 master-0 kubenswrapper[19170]: I0313 01:45:56.346300 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/dc3c71f3-cecf-4916-8841-0e557aad23d6-config-data-merged\") pod \"dc3c71f3-cecf-4916-8841-0e557aad23d6\" (UID: \"dc3c71f3-cecf-4916-8841-0e557aad23d6\") "
Mar 13 01:45:56.346385 master-0 kubenswrapper[19170]: I0313 01:45:56.346370 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc3c71f3-cecf-4916-8841-0e557aad23d6-config-data\") pod \"dc3c71f3-cecf-4916-8841-0e557aad23d6\" (UID: \"dc3c71f3-cecf-4916-8841-0e557aad23d6\") "
Mar 13 01:45:56.346827 master-0 kubenswrapper[19170]: I0313 01:45:56.346778 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc3c71f3-cecf-4916-8841-0e557aad23d6-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "dc3c71f3-cecf-4916-8841-0e557aad23d6" (UID: "dc3c71f3-cecf-4916-8841-0e557aad23d6"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 01:45:56.347211 master-0 kubenswrapper[19170]: I0313 01:45:56.347189 19170 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/dc3c71f3-cecf-4916-8841-0e557aad23d6-config-data-merged\") on node \"master-0\" DevicePath \"\""
Mar 13 01:45:56.354368 master-0 kubenswrapper[19170]: I0313 01:45:56.349439 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/dc3c71f3-cecf-4916-8841-0e557aad23d6-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "dc3c71f3-cecf-4916-8841-0e557aad23d6" (UID: "dc3c71f3-cecf-4916-8841-0e557aad23d6"). InnerVolumeSpecName "etc-podinfo". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Mar 13 01:45:56.354368 master-0 kubenswrapper[19170]: I0313 01:45:56.350001 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc3c71f3-cecf-4916-8841-0e557aad23d6-scripts" (OuterVolumeSpecName: "scripts") pod "dc3c71f3-cecf-4916-8841-0e557aad23d6" (UID: "dc3c71f3-cecf-4916-8841-0e557aad23d6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 01:45:56.354368 master-0 kubenswrapper[19170]: I0313 01:45:56.350117 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc3c71f3-cecf-4916-8841-0e557aad23d6-kube-api-access-56r7z" (OuterVolumeSpecName: "kube-api-access-56r7z") pod "dc3c71f3-cecf-4916-8841-0e557aad23d6" (UID: "dc3c71f3-cecf-4916-8841-0e557aad23d6"). InnerVolumeSpecName "kube-api-access-56r7z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 01:45:56.373453 master-0 kubenswrapper[19170]: I0313 01:45:56.373385 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc3c71f3-cecf-4916-8841-0e557aad23d6-config-data" (OuterVolumeSpecName: "config-data") pod "dc3c71f3-cecf-4916-8841-0e557aad23d6" (UID: "dc3c71f3-cecf-4916-8841-0e557aad23d6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 01:45:56.421335 master-0 kubenswrapper[19170]: I0313 01:45:56.421278 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc3c71f3-cecf-4916-8841-0e557aad23d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc3c71f3-cecf-4916-8841-0e557aad23d6" (UID: "dc3c71f3-cecf-4916-8841-0e557aad23d6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 01:45:56.450172 master-0 kubenswrapper[19170]: I0313 01:45:56.450135 19170 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc3c71f3-cecf-4916-8841-0e557aad23d6-config-data\") on node \"master-0\" DevicePath \"\""
Mar 13 01:45:56.450172 master-0 kubenswrapper[19170]: I0313 01:45:56.450171 19170 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/dc3c71f3-cecf-4916-8841-0e557aad23d6-etc-podinfo\") on node \"master-0\" DevicePath \"\""
Mar 13 01:45:56.450275 master-0 kubenswrapper[19170]: I0313 01:45:56.450181 19170 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc3c71f3-cecf-4916-8841-0e557aad23d6-scripts\") on node \"master-0\" DevicePath \"\""
Mar 13 01:45:56.450275 master-0 kubenswrapper[19170]: I0313 01:45:56.450191 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56r7z\" (UniqueName: \"kubernetes.io/projected/dc3c71f3-cecf-4916-8841-0e557aad23d6-kube-api-access-56r7z\") on node \"master-0\" DevicePath \"\""
Mar 13 01:45:56.450275 master-0 kubenswrapper[19170]: I0313 01:45:56.450201 19170 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc3c71f3-cecf-4916-8841-0e557aad23d6-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 13 01:45:56.847821 master-0 kubenswrapper[19170]: I0313 01:45:56.847536 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-051b7-api-0" event={"ID":"22c1b89b-6967-4c47-924d-db2e77287e4d","Type":"ContainerStarted","Data":"a770a7ef93c2a089be24b136a577c409db58bdaef12d09c05706dd1ef2525cf1"}
Mar 13 01:45:56.847821 master-0 kubenswrapper[19170]: I0313 01:45:56.847586 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-051b7-api-0" event={"ID":"22c1b89b-6967-4c47-924d-db2e77287e4d","Type":"ContainerStarted","Data":"925e50ad028728c36f791a4c1c52f551d1525a752ad6dcec52456258dd09c8f3"}
Mar 13 01:45:56.853258 master-0 kubenswrapper[19170]: I0313 01:45:56.853097 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-l9ds6" event={"ID":"dc3c71f3-cecf-4916-8841-0e557aad23d6","Type":"ContainerDied","Data":"464285cf1424e1182ab2d281529561afcee621bd4c4764a14903339d0a63b090"}
Mar 13 01:45:56.853258 master-0 kubenswrapper[19170]: I0313 01:45:56.853140 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="464285cf1424e1182ab2d281529561afcee621bd4c4764a14903339d0a63b090"
Mar 13 01:45:56.853258 master-0 kubenswrapper[19170]: I0313 01:45:56.853226 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-sync-l9ds6"
Mar 13 01:45:57.224423 master-0 kubenswrapper[19170]: I0313 01:45:57.224048 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-db-create-xckgg"]
Mar 13 01:45:57.224622 master-0 kubenswrapper[19170]: E0313 01:45:57.224586 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc3c71f3-cecf-4916-8841-0e557aad23d6" containerName="ironic-db-sync"
Mar 13 01:45:57.224622 master-0 kubenswrapper[19170]: I0313 01:45:57.224600 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc3c71f3-cecf-4916-8841-0e557aad23d6" containerName="ironic-db-sync"
Mar 13 01:45:57.224622 master-0 kubenswrapper[19170]: E0313 01:45:57.224622 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc3c71f3-cecf-4916-8841-0e557aad23d6" containerName="init"
Mar 13 01:45:57.224747 master-0 kubenswrapper[19170]: I0313 01:45:57.224631 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc3c71f3-cecf-4916-8841-0e557aad23d6" containerName="init"
Mar 13 01:45:57.224932 master-0 kubenswrapper[19170]: I0313 01:45:57.224906 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc3c71f3-cecf-4916-8841-0e557aad23d6" containerName="ironic-db-sync"
Mar 13 01:45:57.225926 master-0 kubenswrapper[19170]: I0313 01:45:57.225612 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-create-xckgg"
Mar 13 01:45:57.238351 master-0 kubenswrapper[19170]: I0313 01:45:57.238298 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-create-xckgg"]
Mar 13 01:45:57.376784 master-0 kubenswrapper[19170]: I0313 01:45:57.376655 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15bf14df-482c-4bc8-a222-25900daba262-operator-scripts\") pod \"ironic-inspector-db-create-xckgg\" (UID: \"15bf14df-482c-4bc8-a222-25900daba262\") " pod="openstack/ironic-inspector-db-create-xckgg"
Mar 13 01:45:57.376784 master-0 kubenswrapper[19170]: I0313 01:45:57.376723 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pss8h\" (UniqueName: \"kubernetes.io/projected/15bf14df-482c-4bc8-a222-25900daba262-kube-api-access-pss8h\") pod \"ironic-inspector-db-create-xckgg\" (UID: \"15bf14df-482c-4bc8-a222-25900daba262\") " pod="openstack/ironic-inspector-db-create-xckgg"
Mar 13 01:45:57.401379 master-0 kubenswrapper[19170]: I0313 01:45:57.401335 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-f67c-account-create-update-4h2l7"]
Mar 13 01:45:57.402906 master-0 kubenswrapper[19170]: I0313 01:45:57.402822 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-f67c-account-create-update-4h2l7"
Mar 13 01:45:57.427398 master-0 kubenswrapper[19170]: I0313 01:45:57.424048 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-db-secret"
Mar 13 01:45:57.480073 master-0 kubenswrapper[19170]: I0313 01:45:57.478750 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15bf14df-482c-4bc8-a222-25900daba262-operator-scripts\") pod \"ironic-inspector-db-create-xckgg\" (UID: \"15bf14df-482c-4bc8-a222-25900daba262\") " pod="openstack/ironic-inspector-db-create-xckgg"
Mar 13 01:45:57.480073 master-0 kubenswrapper[19170]: I0313 01:45:57.478814 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pss8h\" (UniqueName: \"kubernetes.io/projected/15bf14df-482c-4bc8-a222-25900daba262-kube-api-access-pss8h\") pod \"ironic-inspector-db-create-xckgg\" (UID: \"15bf14df-482c-4bc8-a222-25900daba262\") " pod="openstack/ironic-inspector-db-create-xckgg"
Mar 13 01:45:57.505857 master-0 kubenswrapper[19170]: I0313 01:45:57.504806 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15bf14df-482c-4bc8-a222-25900daba262-operator-scripts\") pod \"ironic-inspector-db-create-xckgg\" (UID: \"15bf14df-482c-4bc8-a222-25900daba262\") " pod="openstack/ironic-inspector-db-create-xckgg"
Mar 13 01:45:57.533672 master-0 kubenswrapper[19170]: I0313 01:45:57.530774 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-neutron-agent-84f79fc9fc-7t7j5"]
Mar 13 01:45:57.540349 master-0 kubenswrapper[19170]: I0313 01:45:57.540305 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pss8h\" (UniqueName: \"kubernetes.io/projected/15bf14df-482c-4bc8-a222-25900daba262-kube-api-access-pss8h\") pod \"ironic-inspector-db-create-xckgg\" (UID: \"15bf14df-482c-4bc8-a222-25900daba262\") " pod="openstack/ironic-inspector-db-create-xckgg"
Mar 13 01:45:57.565642 master-0 kubenswrapper[19170]: I0313 01:45:57.565587 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-f67c-account-create-update-4h2l7"]
Mar 13 01:45:57.565807 master-0 kubenswrapper[19170]: I0313 01:45:57.565720 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-neutron-agent-84f79fc9fc-7t7j5"
Mar 13 01:45:57.570614 master-0 kubenswrapper[19170]: I0313 01:45:57.570569 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-ironic-neutron-agent-config-data"
Mar 13 01:45:57.578052 master-0 kubenswrapper[19170]: I0313 01:45:57.577830 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-create-xckgg"
Mar 13 01:45:57.608538 master-0 kubenswrapper[19170]: I0313 01:45:57.605922 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pscwm\" (UniqueName: \"kubernetes.io/projected/c72cc566-7a61-49ac-bdf3-a2d7e0f0e87d-kube-api-access-pscwm\") pod \"ironic-inspector-f67c-account-create-update-4h2l7\" (UID: \"c72cc566-7a61-49ac-bdf3-a2d7e0f0e87d\") " pod="openstack/ironic-inspector-f67c-account-create-update-4h2l7"
Mar 13 01:45:57.608538 master-0 kubenswrapper[19170]: I0313 01:45:57.606183 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c72cc566-7a61-49ac-bdf3-a2d7e0f0e87d-operator-scripts\") pod \"ironic-inspector-f67c-account-create-update-4h2l7\" (UID: \"c72cc566-7a61-49ac-bdf3-a2d7e0f0e87d\") " pod="openstack/ironic-inspector-f67c-account-create-update-4h2l7"
Mar 13 01:45:57.651786 master-0 kubenswrapper[19170]: I0313 01:45:57.646630 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-neutron-agent-84f79fc9fc-7t7j5"]
Mar 13 01:45:57.734380 master-0 kubenswrapper[19170]: I0313 01:45:57.733059 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c72cc566-7a61-49ac-bdf3-a2d7e0f0e87d-operator-scripts\") pod \"ironic-inspector-f67c-account-create-update-4h2l7\" (UID: \"c72cc566-7a61-49ac-bdf3-a2d7e0f0e87d\") " pod="openstack/ironic-inspector-f67c-account-create-update-4h2l7"
Mar 13 01:45:57.734380 master-0 kubenswrapper[19170]: I0313 01:45:57.733136 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e18f1449-3fb3-43a8-98cd-2f8713bba98d-combined-ca-bundle\") pod \"ironic-neutron-agent-84f79fc9fc-7t7j5\" (UID: \"e18f1449-3fb3-43a8-98cd-2f8713bba98d\") " pod="openstack/ironic-neutron-agent-84f79fc9fc-7t7j5"
Mar 13 01:45:57.734380 master-0 kubenswrapper[19170]: I0313 01:45:57.733256 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e18f1449-3fb3-43a8-98cd-2f8713bba98d-config\") pod \"ironic-neutron-agent-84f79fc9fc-7t7j5\" (UID: \"e18f1449-3fb3-43a8-98cd-2f8713bba98d\") " pod="openstack/ironic-neutron-agent-84f79fc9fc-7t7j5"
Mar 13 01:45:57.734380 master-0 kubenswrapper[19170]: I0313 01:45:57.733284 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pscwm\" (UniqueName: \"kubernetes.io/projected/c72cc566-7a61-49ac-bdf3-a2d7e0f0e87d-kube-api-access-pscwm\") pod \"ironic-inspector-f67c-account-create-update-4h2l7\" (UID: \"c72cc566-7a61-49ac-bdf3-a2d7e0f0e87d\") " pod="openstack/ironic-inspector-f67c-account-create-update-4h2l7"
Mar 13 01:45:57.734380 master-0 kubenswrapper[19170]: I0313 01:45:57.733300 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rjxs\" (UniqueName: \"kubernetes.io/projected/e18f1449-3fb3-43a8-98cd-2f8713bba98d-kube-api-access-7rjxs\") pod \"ironic-neutron-agent-84f79fc9fc-7t7j5\" (UID: \"e18f1449-3fb3-43a8-98cd-2f8713bba98d\") " pod="openstack/ironic-neutron-agent-84f79fc9fc-7t7j5"
Mar 13 01:45:57.734380 master-0 kubenswrapper[19170]: I0313 01:45:57.734031 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c72cc566-7a61-49ac-bdf3-a2d7e0f0e87d-operator-scripts\") pod \"ironic-inspector-f67c-account-create-update-4h2l7\" (UID: \"c72cc566-7a61-49ac-bdf3-a2d7e0f0e87d\") " pod="openstack/ironic-inspector-f67c-account-create-update-4h2l7"
Mar 13 01:45:57.761309 master-0 kubenswrapper[19170]: I0313 01:45:57.759908 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pscwm\" (UniqueName: \"kubernetes.io/projected/c72cc566-7a61-49ac-bdf3-a2d7e0f0e87d-kube-api-access-pscwm\") pod \"ironic-inspector-f67c-account-create-update-4h2l7\" (UID: \"c72cc566-7a61-49ac-bdf3-a2d7e0f0e87d\") " pod="openstack/ironic-inspector-f67c-account-create-update-4h2l7"
Mar 13 01:45:57.766100 master-0 kubenswrapper[19170]: I0313 01:45:57.765831 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848b9c6b49-shsxt"]
Mar 13 01:45:57.766237 master-0 kubenswrapper[19170]: I0313 01:45:57.766143 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-848b9c6b49-shsxt" podUID="75f3a8be-2740-470d-994a-e1eecf8327fb" containerName="dnsmasq-dns" containerID="cri-o://52b28cb0dee1ca9103684cb5e5bd7c4fa4545f0c79c09e206a522ecaa46bf139" gracePeriod=10
Mar 13 01:45:57.767401 master-0 kubenswrapper[19170]: I0313 01:45:57.767359 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-848b9c6b49-shsxt"
Mar 13 01:45:57.843504 master-0 kubenswrapper[19170]: I0313 01:45:57.841012 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e18f1449-3fb3-43a8-98cd-2f8713bba98d-config\") pod \"ironic-neutron-agent-84f79fc9fc-7t7j5\" (UID: \"e18f1449-3fb3-43a8-98cd-2f8713bba98d\") " pod="openstack/ironic-neutron-agent-84f79fc9fc-7t7j5"
Mar 13 01:45:57.843504 master-0 kubenswrapper[19170]: I0313 01:45:57.841069 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rjxs\" (UniqueName: \"kubernetes.io/projected/e18f1449-3fb3-43a8-98cd-2f8713bba98d-kube-api-access-7rjxs\") pod \"ironic-neutron-agent-84f79fc9fc-7t7j5\" (UID: \"e18f1449-3fb3-43a8-98cd-2f8713bba98d\") " pod="openstack/ironic-neutron-agent-84f79fc9fc-7t7j5"
Mar 13 01:45:57.843504 master-0 kubenswrapper[19170]: I0313 01:45:57.841179 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e18f1449-3fb3-43a8-98cd-2f8713bba98d-combined-ca-bundle\") pod \"ironic-neutron-agent-84f79fc9fc-7t7j5\" (UID: \"e18f1449-3fb3-43a8-98cd-2f8713bba98d\") " pod="openstack/ironic-neutron-agent-84f79fc9fc-7t7j5"
Mar 13 01:45:57.856186 master-0 kubenswrapper[19170]: I0313 01:45:57.851271 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e18f1449-3fb3-43a8-98cd-2f8713bba98d-combined-ca-bundle\") pod \"ironic-neutron-agent-84f79fc9fc-7t7j5\" (UID: \"e18f1449-3fb3-43a8-98cd-2f8713bba98d\") " pod="openstack/ironic-neutron-agent-84f79fc9fc-7t7j5"
Mar 13 01:45:57.867227 master-0 kubenswrapper[19170]: I0313 01:45:57.863696 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9c57cd77c-64hkx"]
Mar 13 01:45:57.867227 master-0 kubenswrapper[19170]: I0313 01:45:57.865682 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9c57cd77c-64hkx"
Mar 13 01:45:57.889126 master-0 kubenswrapper[19170]: I0313 01:45:57.888242 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-f67c-account-create-update-4h2l7"
Mar 13 01:45:57.897131 master-0 kubenswrapper[19170]: I0313 01:45:57.896484 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e18f1449-3fb3-43a8-98cd-2f8713bba98d-config\") pod \"ironic-neutron-agent-84f79fc9fc-7t7j5\" (UID: \"e18f1449-3fb3-43a8-98cd-2f8713bba98d\") " pod="openstack/ironic-neutron-agent-84f79fc9fc-7t7j5"
Mar 13 01:45:57.921136 master-0 kubenswrapper[19170]: I0313 01:45:57.918037 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rjxs\" (UniqueName: \"kubernetes.io/projected/e18f1449-3fb3-43a8-98cd-2f8713bba98d-kube-api-access-7rjxs\") pod \"ironic-neutron-agent-84f79fc9fc-7t7j5\" (UID: \"e18f1449-3fb3-43a8-98cd-2f8713bba98d\") " pod="openstack/ironic-neutron-agent-84f79fc9fc-7t7j5"
Mar 13 01:45:57.921418 master-0 kubenswrapper[19170]: I0313 01:45:57.921249 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9c57cd77c-64hkx"]
Mar 13 01:45:57.956841 master-0 kubenswrapper[19170]: I0313 01:45:57.956185 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-neutron-agent-84f79fc9fc-7t7j5"
Mar 13 01:45:58.062689 master-0 kubenswrapper[19170]: I0313 01:45:58.059586 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shwcn\" (UniqueName: \"kubernetes.io/projected/97b91c16-ada4-4c3a-8e56-bd4e6417ded3-kube-api-access-shwcn\") pod \"dnsmasq-dns-9c57cd77c-64hkx\" (UID: \"97b91c16-ada4-4c3a-8e56-bd4e6417ded3\") " pod="openstack/dnsmasq-dns-9c57cd77c-64hkx"
Mar 13 01:45:58.062689 master-0 kubenswrapper[19170]: I0313 01:45:58.059678 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/97b91c16-ada4-4c3a-8e56-bd4e6417ded3-ovsdbserver-sb\") pod \"dnsmasq-dns-9c57cd77c-64hkx\" (UID: \"97b91c16-ada4-4c3a-8e56-bd4e6417ded3\") " pod="openstack/dnsmasq-dns-9c57cd77c-64hkx"
Mar 13 01:45:58.092795 master-0 kubenswrapper[19170]: I0313 01:45:58.078517 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97b91c16-ada4-4c3a-8e56-bd4e6417ded3-config\") pod \"dnsmasq-dns-9c57cd77c-64hkx\" (UID: \"97b91c16-ada4-4c3a-8e56-bd4e6417ded3\") " pod="openstack/dnsmasq-dns-9c57cd77c-64hkx"
Mar 13 01:45:58.092795 master-0 kubenswrapper[19170]: I0313 01:45:58.078597 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/97b91c16-ada4-4c3a-8e56-bd4e6417ded3-dns-swift-storage-0\") pod \"dnsmasq-dns-9c57cd77c-64hkx\" (UID: \"97b91c16-ada4-4c3a-8e56-bd4e6417ded3\") " pod="openstack/dnsmasq-dns-9c57cd77c-64hkx"
Mar 13 01:45:58.092795 master-0 kubenswrapper[19170]: I0313 01:45:58.078673 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97b91c16-ada4-4c3a-8e56-bd4e6417ded3-ovsdbserver-nb\") pod \"dnsmasq-dns-9c57cd77c-64hkx\" (UID: \"97b91c16-ada4-4c3a-8e56-bd4e6417ded3\") " pod="openstack/dnsmasq-dns-9c57cd77c-64hkx"
Mar 13 01:45:58.092795 master-0 kubenswrapper[19170]: I0313 01:45:58.078772 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97b91c16-ada4-4c3a-8e56-bd4e6417ded3-dns-svc\") pod \"dnsmasq-dns-9c57cd77c-64hkx\" (UID: \"97b91c16-ada4-4c3a-8e56-bd4e6417ded3\") " pod="openstack/dnsmasq-dns-9c57cd77c-64hkx"
Mar 13 01:45:58.187823 master-0 kubenswrapper[19170]: I0313 01:45:58.183326 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97b91c16-ada4-4c3a-8e56-bd4e6417ded3-config\") pod \"dnsmasq-dns-9c57cd77c-64hkx\" (UID: \"97b91c16-ada4-4c3a-8e56-bd4e6417ded3\") " pod="openstack/dnsmasq-dns-9c57cd77c-64hkx"
Mar 13 01:45:58.187823 master-0 kubenswrapper[19170]: I0313 01:45:58.183396 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/97b91c16-ada4-4c3a-8e56-bd4e6417ded3-dns-swift-storage-0\") pod \"dnsmasq-dns-9c57cd77c-64hkx\" (UID: \"97b91c16-ada4-4c3a-8e56-bd4e6417ded3\") " pod="openstack/dnsmasq-dns-9c57cd77c-64hkx"
Mar 13 01:45:58.187823 master-0 kubenswrapper[19170]: I0313 01:45:58.183452 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97b91c16-ada4-4c3a-8e56-bd4e6417ded3-ovsdbserver-nb\") pod \"dnsmasq-dns-9c57cd77c-64hkx\" (UID: \"97b91c16-ada4-4c3a-8e56-bd4e6417ded3\") " pod="openstack/dnsmasq-dns-9c57cd77c-64hkx"
Mar 13 01:45:58.187823 master-0 kubenswrapper[19170]: I0313 01:45:58.183487 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName:
\"kubernetes.io/configmap/97b91c16-ada4-4c3a-8e56-bd4e6417ded3-dns-svc\") pod \"dnsmasq-dns-9c57cd77c-64hkx\" (UID: \"97b91c16-ada4-4c3a-8e56-bd4e6417ded3\") " pod="openstack/dnsmasq-dns-9c57cd77c-64hkx" Mar 13 01:45:58.187823 master-0 kubenswrapper[19170]: I0313 01:45:58.183620 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shwcn\" (UniqueName: \"kubernetes.io/projected/97b91c16-ada4-4c3a-8e56-bd4e6417ded3-kube-api-access-shwcn\") pod \"dnsmasq-dns-9c57cd77c-64hkx\" (UID: \"97b91c16-ada4-4c3a-8e56-bd4e6417ded3\") " pod="openstack/dnsmasq-dns-9c57cd77c-64hkx" Mar 13 01:45:58.187823 master-0 kubenswrapper[19170]: I0313 01:45:58.183670 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/97b91c16-ada4-4c3a-8e56-bd4e6417ded3-ovsdbserver-sb\") pod \"dnsmasq-dns-9c57cd77c-64hkx\" (UID: \"97b91c16-ada4-4c3a-8e56-bd4e6417ded3\") " pod="openstack/dnsmasq-dns-9c57cd77c-64hkx" Mar 13 01:45:58.199658 master-0 kubenswrapper[19170]: I0313 01:45:58.198217 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97b91c16-ada4-4c3a-8e56-bd4e6417ded3-ovsdbserver-nb\") pod \"dnsmasq-dns-9c57cd77c-64hkx\" (UID: \"97b91c16-ada4-4c3a-8e56-bd4e6417ded3\") " pod="openstack/dnsmasq-dns-9c57cd77c-64hkx" Mar 13 01:45:58.199658 master-0 kubenswrapper[19170]: I0313 01:45:58.198795 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97b91c16-ada4-4c3a-8e56-bd4e6417ded3-config\") pod \"dnsmasq-dns-9c57cd77c-64hkx\" (UID: \"97b91c16-ada4-4c3a-8e56-bd4e6417ded3\") " pod="openstack/dnsmasq-dns-9c57cd77c-64hkx" Mar 13 01:45:58.199658 master-0 kubenswrapper[19170]: I0313 01:45:58.199540 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/97b91c16-ada4-4c3a-8e56-bd4e6417ded3-dns-swift-storage-0\") pod \"dnsmasq-dns-9c57cd77c-64hkx\" (UID: \"97b91c16-ada4-4c3a-8e56-bd4e6417ded3\") " pod="openstack/dnsmasq-dns-9c57cd77c-64hkx" Mar 13 01:45:58.207907 master-0 kubenswrapper[19170]: I0313 01:45:58.207818 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/97b91c16-ada4-4c3a-8e56-bd4e6417ded3-ovsdbserver-sb\") pod \"dnsmasq-dns-9c57cd77c-64hkx\" (UID: \"97b91c16-ada4-4c3a-8e56-bd4e6417ded3\") " pod="openstack/dnsmasq-dns-9c57cd77c-64hkx" Mar 13 01:45:58.209330 master-0 kubenswrapper[19170]: I0313 01:45:58.209293 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97b91c16-ada4-4c3a-8e56-bd4e6417ded3-dns-svc\") pod \"dnsmasq-dns-9c57cd77c-64hkx\" (UID: \"97b91c16-ada4-4c3a-8e56-bd4e6417ded3\") " pod="openstack/dnsmasq-dns-9c57cd77c-64hkx" Mar 13 01:45:58.224974 master-0 kubenswrapper[19170]: I0313 01:45:58.224519 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-86cc47d7f5-mk8sb"] Mar 13 01:45:58.236386 master-0 kubenswrapper[19170]: I0313 01:45:58.231384 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-86cc47d7f5-mk8sb" Mar 13 01:45:58.236386 master-0 kubenswrapper[19170]: I0313 01:45:58.235850 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-config-data" Mar 13 01:45:58.236386 master-0 kubenswrapper[19170]: I0313 01:45:58.236030 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-api-config-data" Mar 13 01:45:58.236386 master-0 kubenswrapper[19170]: I0313 01:45:58.236177 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 13 01:45:58.237241 master-0 kubenswrapper[19170]: I0313 01:45:58.236789 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-api-scripts" Mar 13 01:45:58.245567 master-0 kubenswrapper[19170]: I0313 01:45:58.245277 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-transport-url-ironic-transport" Mar 13 01:45:58.248137 master-0 kubenswrapper[19170]: I0313 01:45:58.248080 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shwcn\" (UniqueName: \"kubernetes.io/projected/97b91c16-ada4-4c3a-8e56-bd4e6417ded3-kube-api-access-shwcn\") pod \"dnsmasq-dns-9c57cd77c-64hkx\" (UID: \"97b91c16-ada4-4c3a-8e56-bd4e6417ded3\") " pod="openstack/dnsmasq-dns-9c57cd77c-64hkx" Mar 13 01:45:58.297661 master-0 kubenswrapper[19170]: I0313 01:45:58.296434 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-86cc47d7f5-mk8sb"] Mar 13 01:45:58.307652 master-0 kubenswrapper[19170]: I0313 01:45:58.304299 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/bcadd2d8-9133-4e14-af9d-1c42095f91f5-config-data-merged\") pod \"ironic-86cc47d7f5-mk8sb\" (UID: \"bcadd2d8-9133-4e14-af9d-1c42095f91f5\") " pod="openstack/ironic-86cc47d7f5-mk8sb" Mar 13 01:45:58.308415 master-0 
kubenswrapper[19170]: I0313 01:45:58.308385 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcadd2d8-9133-4e14-af9d-1c42095f91f5-logs\") pod \"ironic-86cc47d7f5-mk8sb\" (UID: \"bcadd2d8-9133-4e14-af9d-1c42095f91f5\") " pod="openstack/ironic-86cc47d7f5-mk8sb" Mar 13 01:45:58.308791 master-0 kubenswrapper[19170]: I0313 01:45:58.308775 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcadd2d8-9133-4e14-af9d-1c42095f91f5-combined-ca-bundle\") pod \"ironic-86cc47d7f5-mk8sb\" (UID: \"bcadd2d8-9133-4e14-af9d-1c42095f91f5\") " pod="openstack/ironic-86cc47d7f5-mk8sb" Mar 13 01:45:58.308972 master-0 kubenswrapper[19170]: I0313 01:45:58.308957 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bcadd2d8-9133-4e14-af9d-1c42095f91f5-config-data-custom\") pod \"ironic-86cc47d7f5-mk8sb\" (UID: \"bcadd2d8-9133-4e14-af9d-1c42095f91f5\") " pod="openstack/ironic-86cc47d7f5-mk8sb" Mar 13 01:45:58.309177 master-0 kubenswrapper[19170]: I0313 01:45:58.309154 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcadd2d8-9133-4e14-af9d-1c42095f91f5-scripts\") pod \"ironic-86cc47d7f5-mk8sb\" (UID: \"bcadd2d8-9133-4e14-af9d-1c42095f91f5\") " pod="openstack/ironic-86cc47d7f5-mk8sb" Mar 13 01:45:58.309371 master-0 kubenswrapper[19170]: I0313 01:45:58.309355 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fxxr\" (UniqueName: \"kubernetes.io/projected/bcadd2d8-9133-4e14-af9d-1c42095f91f5-kube-api-access-8fxxr\") pod \"ironic-86cc47d7f5-mk8sb\" (UID: \"bcadd2d8-9133-4e14-af9d-1c42095f91f5\") " 
pod="openstack/ironic-86cc47d7f5-mk8sb" Mar 13 01:45:58.309592 master-0 kubenswrapper[19170]: I0313 01:45:58.309576 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/bcadd2d8-9133-4e14-af9d-1c42095f91f5-etc-podinfo\") pod \"ironic-86cc47d7f5-mk8sb\" (UID: \"bcadd2d8-9133-4e14-af9d-1c42095f91f5\") " pod="openstack/ironic-86cc47d7f5-mk8sb" Mar 13 01:45:58.309767 master-0 kubenswrapper[19170]: I0313 01:45:58.309753 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcadd2d8-9133-4e14-af9d-1c42095f91f5-config-data\") pod \"ironic-86cc47d7f5-mk8sb\" (UID: \"bcadd2d8-9133-4e14-af9d-1c42095f91f5\") " pod="openstack/ironic-86cc47d7f5-mk8sb" Mar 13 01:45:58.458017 master-0 kubenswrapper[19170]: I0313 01:45:58.457454 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcadd2d8-9133-4e14-af9d-1c42095f91f5-logs\") pod \"ironic-86cc47d7f5-mk8sb\" (UID: \"bcadd2d8-9133-4e14-af9d-1c42095f91f5\") " pod="openstack/ironic-86cc47d7f5-mk8sb" Mar 13 01:45:58.458415 master-0 kubenswrapper[19170]: I0313 01:45:58.458381 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcadd2d8-9133-4e14-af9d-1c42095f91f5-combined-ca-bundle\") pod \"ironic-86cc47d7f5-mk8sb\" (UID: \"bcadd2d8-9133-4e14-af9d-1c42095f91f5\") " pod="openstack/ironic-86cc47d7f5-mk8sb" Mar 13 01:45:58.468651 master-0 kubenswrapper[19170]: I0313 01:45:58.457842 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcadd2d8-9133-4e14-af9d-1c42095f91f5-logs\") pod \"ironic-86cc47d7f5-mk8sb\" (UID: \"bcadd2d8-9133-4e14-af9d-1c42095f91f5\") " pod="openstack/ironic-86cc47d7f5-mk8sb" Mar 13 
01:45:58.469088 master-0 kubenswrapper[19170]: I0313 01:45:58.469051 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bcadd2d8-9133-4e14-af9d-1c42095f91f5-config-data-custom\") pod \"ironic-86cc47d7f5-mk8sb\" (UID: \"bcadd2d8-9133-4e14-af9d-1c42095f91f5\") " pod="openstack/ironic-86cc47d7f5-mk8sb" Mar 13 01:45:58.469385 master-0 kubenswrapper[19170]: I0313 01:45:58.469370 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcadd2d8-9133-4e14-af9d-1c42095f91f5-scripts\") pod \"ironic-86cc47d7f5-mk8sb\" (UID: \"bcadd2d8-9133-4e14-af9d-1c42095f91f5\") " pod="openstack/ironic-86cc47d7f5-mk8sb" Mar 13 01:45:58.469492 master-0 kubenswrapper[19170]: I0313 01:45:58.469479 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fxxr\" (UniqueName: \"kubernetes.io/projected/bcadd2d8-9133-4e14-af9d-1c42095f91f5-kube-api-access-8fxxr\") pod \"ironic-86cc47d7f5-mk8sb\" (UID: \"bcadd2d8-9133-4e14-af9d-1c42095f91f5\") " pod="openstack/ironic-86cc47d7f5-mk8sb" Mar 13 01:45:58.469765 master-0 kubenswrapper[19170]: I0313 01:45:58.469750 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/bcadd2d8-9133-4e14-af9d-1c42095f91f5-etc-podinfo\") pod \"ironic-86cc47d7f5-mk8sb\" (UID: \"bcadd2d8-9133-4e14-af9d-1c42095f91f5\") " pod="openstack/ironic-86cc47d7f5-mk8sb" Mar 13 01:45:58.469842 master-0 kubenswrapper[19170]: I0313 01:45:58.469831 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcadd2d8-9133-4e14-af9d-1c42095f91f5-config-data\") pod \"ironic-86cc47d7f5-mk8sb\" (UID: \"bcadd2d8-9133-4e14-af9d-1c42095f91f5\") " pod="openstack/ironic-86cc47d7f5-mk8sb" Mar 13 01:45:58.470089 master-0 
kubenswrapper[19170]: I0313 01:45:58.470075 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/bcadd2d8-9133-4e14-af9d-1c42095f91f5-config-data-merged\") pod \"ironic-86cc47d7f5-mk8sb\" (UID: \"bcadd2d8-9133-4e14-af9d-1c42095f91f5\") " pod="openstack/ironic-86cc47d7f5-mk8sb" Mar 13 01:45:58.474224 master-0 kubenswrapper[19170]: I0313 01:45:58.474197 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/bcadd2d8-9133-4e14-af9d-1c42095f91f5-config-data-merged\") pod \"ironic-86cc47d7f5-mk8sb\" (UID: \"bcadd2d8-9133-4e14-af9d-1c42095f91f5\") " pod="openstack/ironic-86cc47d7f5-mk8sb" Mar 13 01:45:58.494098 master-0 kubenswrapper[19170]: I0313 01:45:58.494054 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/bcadd2d8-9133-4e14-af9d-1c42095f91f5-etc-podinfo\") pod \"ironic-86cc47d7f5-mk8sb\" (UID: \"bcadd2d8-9133-4e14-af9d-1c42095f91f5\") " pod="openstack/ironic-86cc47d7f5-mk8sb" Mar 13 01:45:58.498024 master-0 kubenswrapper[19170]: I0313 01:45:58.494359 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcadd2d8-9133-4e14-af9d-1c42095f91f5-combined-ca-bundle\") pod \"ironic-86cc47d7f5-mk8sb\" (UID: \"bcadd2d8-9133-4e14-af9d-1c42095f91f5\") " pod="openstack/ironic-86cc47d7f5-mk8sb" Mar 13 01:45:58.498024 master-0 kubenswrapper[19170]: I0313 01:45:58.494366 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bcadd2d8-9133-4e14-af9d-1c42095f91f5-config-data-custom\") pod \"ironic-86cc47d7f5-mk8sb\" (UID: \"bcadd2d8-9133-4e14-af9d-1c42095f91f5\") " pod="openstack/ironic-86cc47d7f5-mk8sb" Mar 13 01:45:58.544988 master-0 kubenswrapper[19170]: I0313 01:45:58.544949 
19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fxxr\" (UniqueName: \"kubernetes.io/projected/bcadd2d8-9133-4e14-af9d-1c42095f91f5-kube-api-access-8fxxr\") pod \"ironic-86cc47d7f5-mk8sb\" (UID: \"bcadd2d8-9133-4e14-af9d-1c42095f91f5\") " pod="openstack/ironic-86cc47d7f5-mk8sb" Mar 13 01:45:58.545375 master-0 kubenswrapper[19170]: I0313 01:45:58.545292 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcadd2d8-9133-4e14-af9d-1c42095f91f5-scripts\") pod \"ironic-86cc47d7f5-mk8sb\" (UID: \"bcadd2d8-9133-4e14-af9d-1c42095f91f5\") " pod="openstack/ironic-86cc47d7f5-mk8sb" Mar 13 01:45:58.545774 master-0 kubenswrapper[19170]: I0313 01:45:58.545753 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcadd2d8-9133-4e14-af9d-1c42095f91f5-config-data\") pod \"ironic-86cc47d7f5-mk8sb\" (UID: \"bcadd2d8-9133-4e14-af9d-1c42095f91f5\") " pod="openstack/ironic-86cc47d7f5-mk8sb" Mar 13 01:45:58.591889 master-0 kubenswrapper[19170]: I0313 01:45:58.591824 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9c57cd77c-64hkx" Mar 13 01:45:58.612298 master-0 kubenswrapper[19170]: I0313 01:45:58.612208 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-create-xckgg"] Mar 13 01:45:58.711377 master-0 kubenswrapper[19170]: I0313 01:45:58.702390 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-86cc47d7f5-mk8sb" Mar 13 01:45:58.862701 master-0 kubenswrapper[19170]: I0313 01:45:58.862657 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:45:58.866392 master-0 kubenswrapper[19170]: I0313 01:45:58.866051 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-f67c-account-create-update-4h2l7"] Mar 13 01:45:58.949077 master-0 kubenswrapper[19170]: I0313 01:45:58.949035 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-051b7-scheduler-0"] Mar 13 01:45:59.003677 master-0 kubenswrapper[19170]: I0313 01:45:59.003590 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-create-xckgg" event={"ID":"15bf14df-482c-4bc8-a222-25900daba262","Type":"ContainerStarted","Data":"ba7f7a4786d0a732b2a26c9556818342343b0b6c54ebdae306f738020d1fecfb"} Mar 13 01:45:59.025825 master-0 kubenswrapper[19170]: I0313 01:45:59.025740 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:45:59.038418 master-0 kubenswrapper[19170]: I0313 01:45:59.038314 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848b9c6b49-shsxt" Mar 13 01:45:59.047145 master-0 kubenswrapper[19170]: I0313 01:45:59.043976 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-051b7-api-0" event={"ID":"22c1b89b-6967-4c47-924d-db2e77287e4d","Type":"ContainerStarted","Data":"14f5f3d92961f02301fe03a51feb4b4deca9c05836cbe7516af27c71880bd5f9"} Mar 13 01:45:59.050650 master-0 kubenswrapper[19170]: I0313 01:45:59.050565 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-051b7-api-0" Mar 13 01:45:59.084746 master-0 kubenswrapper[19170]: I0313 01:45:59.084694 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-neutron-agent-84f79fc9fc-7t7j5"] Mar 13 01:45:59.092709 master-0 kubenswrapper[19170]: I0313 01:45:59.092598 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-f67c-account-create-update-4h2l7" event={"ID":"c72cc566-7a61-49ac-bdf3-a2d7e0f0e87d","Type":"ContainerStarted","Data":"92e17d8d977df013c61dd97238eb2783b1bbcbbf92650a181ed851406d2e16dc"} Mar 13 01:45:59.120752 master-0 kubenswrapper[19170]: W0313 01:45:59.119244 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode18f1449_3fb3_43a8_98cd_2f8713bba98d.slice/crio-7212a61a355508ebf15907644929a895fabd52989d79d82e9a61438d9768c2e4 WatchSource:0}: Error finding container 7212a61a355508ebf15907644929a895fabd52989d79d82e9a61438d9768c2e4: Status 404 returned error can't find the container with id 7212a61a355508ebf15907644929a895fabd52989d79d82e9a61438d9768c2e4 Mar 13 01:45:59.131507 master-0 kubenswrapper[19170]: I0313 01:45:59.129780 19170 generic.go:334] "Generic (PLEG): container finished" podID="75f3a8be-2740-470d-994a-e1eecf8327fb" containerID="52b28cb0dee1ca9103684cb5e5bd7c4fa4545f0c79c09e206a522ecaa46bf139" exitCode=0 Mar 13 01:45:59.131507 master-0 kubenswrapper[19170]: I0313 
01:45:59.130001 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-051b7-scheduler-0" podUID="01cdc83b-2243-40c6-9709-bb442724340c" containerName="cinder-scheduler" containerID="cri-o://8aba9ac1e3b085e7c04cb214ea7f8e75329fdcb19d7d852f11d653fb879f9e1b" gracePeriod=30 Mar 13 01:45:59.131507 master-0 kubenswrapper[19170]: I0313 01:45:59.130138 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848b9c6b49-shsxt" Mar 13 01:45:59.131507 master-0 kubenswrapper[19170]: I0313 01:45:59.130615 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848b9c6b49-shsxt" event={"ID":"75f3a8be-2740-470d-994a-e1eecf8327fb","Type":"ContainerDied","Data":"52b28cb0dee1ca9103684cb5e5bd7c4fa4545f0c79c09e206a522ecaa46bf139"} Mar 13 01:45:59.131507 master-0 kubenswrapper[19170]: I0313 01:45:59.130659 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848b9c6b49-shsxt" event={"ID":"75f3a8be-2740-470d-994a-e1eecf8327fb","Type":"ContainerDied","Data":"1ba3b643ea8a884b3501c8caaa355e580bef4280a5042d84c9250ecff823b123"} Mar 13 01:45:59.131507 master-0 kubenswrapper[19170]: I0313 01:45:59.130676 19170 scope.go:117] "RemoveContainer" containerID="52b28cb0dee1ca9103684cb5e5bd7c4fa4545f0c79c09e206a522ecaa46bf139" Mar 13 01:45:59.131507 master-0 kubenswrapper[19170]: I0313 01:45:59.130776 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-051b7-scheduler-0" podUID="01cdc83b-2243-40c6-9709-bb442724340c" containerName="probe" containerID="cri-o://3eed686a2face2531b3b9085d7beece990680252e5928f6b8f4af051572e64e9" gracePeriod=30 Mar 13 01:45:59.245582 master-0 kubenswrapper[19170]: I0313 01:45:59.244417 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75f3a8be-2740-470d-994a-e1eecf8327fb-config\") pod 
\"75f3a8be-2740-470d-994a-e1eecf8327fb\" (UID: \"75f3a8be-2740-470d-994a-e1eecf8327fb\") " Mar 13 01:45:59.245582 master-0 kubenswrapper[19170]: I0313 01:45:59.244543 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtsth\" (UniqueName: \"kubernetes.io/projected/75f3a8be-2740-470d-994a-e1eecf8327fb-kube-api-access-rtsth\") pod \"75f3a8be-2740-470d-994a-e1eecf8327fb\" (UID: \"75f3a8be-2740-470d-994a-e1eecf8327fb\") " Mar 13 01:45:59.245582 master-0 kubenswrapper[19170]: I0313 01:45:59.244578 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75f3a8be-2740-470d-994a-e1eecf8327fb-ovsdbserver-nb\") pod \"75f3a8be-2740-470d-994a-e1eecf8327fb\" (UID: \"75f3a8be-2740-470d-994a-e1eecf8327fb\") " Mar 13 01:45:59.245582 master-0 kubenswrapper[19170]: I0313 01:45:59.244615 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75f3a8be-2740-470d-994a-e1eecf8327fb-dns-svc\") pod \"75f3a8be-2740-470d-994a-e1eecf8327fb\" (UID: \"75f3a8be-2740-470d-994a-e1eecf8327fb\") " Mar 13 01:45:59.245582 master-0 kubenswrapper[19170]: I0313 01:45:59.245341 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/75f3a8be-2740-470d-994a-e1eecf8327fb-dns-swift-storage-0\") pod \"75f3a8be-2740-470d-994a-e1eecf8327fb\" (UID: \"75f3a8be-2740-470d-994a-e1eecf8327fb\") " Mar 13 01:45:59.253152 master-0 kubenswrapper[19170]: I0313 01:45:59.246170 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75f3a8be-2740-470d-994a-e1eecf8327fb-ovsdbserver-sb\") pod \"75f3a8be-2740-470d-994a-e1eecf8327fb\" (UID: \"75f3a8be-2740-470d-994a-e1eecf8327fb\") " Mar 13 01:45:59.302403 master-0 kubenswrapper[19170]: 
I0313 01:45:59.302336 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-051b7-volume-lvm-iscsi-0"] Mar 13 01:45:59.302934 master-0 kubenswrapper[19170]: I0313 01:45:59.302918 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-051b7-backup-0" Mar 13 01:45:59.310863 master-0 kubenswrapper[19170]: I0313 01:45:59.310644 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-051b7-api-0" podStartSLOduration=4.3106116629999995 podStartE2EDuration="4.310611663s" podCreationTimestamp="2026-03-13 01:45:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:45:59.125171966 +0000 UTC m=+1619.933292926" watchObservedRunningTime="2026-03-13 01:45:59.310611663 +0000 UTC m=+1620.118732623" Mar 13 01:45:59.312155 master-0 kubenswrapper[19170]: I0313 01:45:59.311999 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75f3a8be-2740-470d-994a-e1eecf8327fb-kube-api-access-rtsth" (OuterVolumeSpecName: "kube-api-access-rtsth") pod "75f3a8be-2740-470d-994a-e1eecf8327fb" (UID: "75f3a8be-2740-470d-994a-e1eecf8327fb"). InnerVolumeSpecName "kube-api-access-rtsth". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:45:59.344009 master-0 kubenswrapper[19170]: I0313 01:45:59.343231 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9c57cd77c-64hkx"] Mar 13 01:45:59.347852 master-0 kubenswrapper[19170]: I0313 01:45:59.346260 19170 scope.go:117] "RemoveContainer" containerID="5da906adcae163035931eac6068f9ab04c2bbd98ca4b8ce4ebee8a8e3b29e3c1" Mar 13 01:45:59.364817 master-0 kubenswrapper[19170]: I0313 01:45:59.360939 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtsth\" (UniqueName: \"kubernetes.io/projected/75f3a8be-2740-470d-994a-e1eecf8327fb-kube-api-access-rtsth\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:59.392474 master-0 kubenswrapper[19170]: I0313 01:45:59.392423 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-051b7-backup-0"] Mar 13 01:45:59.407525 master-0 kubenswrapper[19170]: I0313 01:45:59.406592 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-conductor-0"] Mar 13 01:45:59.407525 master-0 kubenswrapper[19170]: E0313 01:45:59.407155 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75f3a8be-2740-470d-994a-e1eecf8327fb" containerName="init" Mar 13 01:45:59.407525 master-0 kubenswrapper[19170]: I0313 01:45:59.407175 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="75f3a8be-2740-470d-994a-e1eecf8327fb" containerName="init" Mar 13 01:45:59.407525 master-0 kubenswrapper[19170]: E0313 01:45:59.407232 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75f3a8be-2740-470d-994a-e1eecf8327fb" containerName="dnsmasq-dns" Mar 13 01:45:59.407525 master-0 kubenswrapper[19170]: I0313 01:45:59.407239 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="75f3a8be-2740-470d-994a-e1eecf8327fb" containerName="dnsmasq-dns" Mar 13 01:45:59.407525 master-0 kubenswrapper[19170]: I0313 01:45:59.407492 19170 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="75f3a8be-2740-470d-994a-e1eecf8327fb" containerName="dnsmasq-dns" Mar 13 01:45:59.411379 master-0 kubenswrapper[19170]: I0313 01:45:59.411259 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-conductor-0" Mar 13 01:45:59.417444 master-0 kubenswrapper[19170]: I0313 01:45:59.416105 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-conductor-config-data" Mar 13 01:45:59.417444 master-0 kubenswrapper[19170]: I0313 01:45:59.416285 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-conductor-scripts" Mar 13 01:45:59.464289 master-0 kubenswrapper[19170]: I0313 01:45:59.464232 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hvrb\" (UniqueName: \"kubernetes.io/projected/3785df35-68b7-4d28-8b4a-39c3136ce823-kube-api-access-6hvrb\") pod \"ironic-conductor-0\" (UID: \"3785df35-68b7-4d28-8b4a-39c3136ce823\") " pod="openstack/ironic-conductor-0" Mar 13 01:45:59.464289 master-0 kubenswrapper[19170]: I0313 01:45:59.464289 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/3785df35-68b7-4d28-8b4a-39c3136ce823-config-data-merged\") pod \"ironic-conductor-0\" (UID: \"3785df35-68b7-4d28-8b4a-39c3136ce823\") " pod="openstack/ironic-conductor-0" Mar 13 01:45:59.465002 master-0 kubenswrapper[19170]: I0313 01:45:59.464383 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4cb08518-316b-4115-97b8-5f0c654b6aad\" (UniqueName: \"kubernetes.io/csi/topolvm.io^e1525093-ba79-4899-9c06-240e9f4d5b86\") pod \"ironic-conductor-0\" (UID: \"3785df35-68b7-4d28-8b4a-39c3136ce823\") " pod="openstack/ironic-conductor-0" Mar 13 01:45:59.465002 master-0 kubenswrapper[19170]: I0313 01:45:59.464428 19170 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/3785df35-68b7-4d28-8b4a-39c3136ce823-etc-podinfo\") pod \"ironic-conductor-0\" (UID: \"3785df35-68b7-4d28-8b4a-39c3136ce823\") " pod="openstack/ironic-conductor-0" Mar 13 01:45:59.465002 master-0 kubenswrapper[19170]: I0313 01:45:59.464464 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3785df35-68b7-4d28-8b4a-39c3136ce823-config-data-custom\") pod \"ironic-conductor-0\" (UID: \"3785df35-68b7-4d28-8b4a-39c3136ce823\") " pod="openstack/ironic-conductor-0" Mar 13 01:45:59.465002 master-0 kubenswrapper[19170]: I0313 01:45:59.464493 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3785df35-68b7-4d28-8b4a-39c3136ce823-config-data\") pod \"ironic-conductor-0\" (UID: \"3785df35-68b7-4d28-8b4a-39c3136ce823\") " pod="openstack/ironic-conductor-0" Mar 13 01:45:59.465002 master-0 kubenswrapper[19170]: I0313 01:45:59.464528 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3785df35-68b7-4d28-8b4a-39c3136ce823-scripts\") pod \"ironic-conductor-0\" (UID: \"3785df35-68b7-4d28-8b4a-39c3136ce823\") " pod="openstack/ironic-conductor-0" Mar 13 01:45:59.465002 master-0 kubenswrapper[19170]: I0313 01:45:59.464591 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3785df35-68b7-4d28-8b4a-39c3136ce823-combined-ca-bundle\") pod \"ironic-conductor-0\" (UID: \"3785df35-68b7-4d28-8b4a-39c3136ce823\") " pod="openstack/ironic-conductor-0" Mar 13 01:45:59.554097 master-0 kubenswrapper[19170]: I0313 01:45:59.552437 19170 
scope.go:117] "RemoveContainer" containerID="52b28cb0dee1ca9103684cb5e5bd7c4fa4545f0c79c09e206a522ecaa46bf139" Mar 13 01:45:59.558735 master-0 kubenswrapper[19170]: E0313 01:45:59.556273 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52b28cb0dee1ca9103684cb5e5bd7c4fa4545f0c79c09e206a522ecaa46bf139\": container with ID starting with 52b28cb0dee1ca9103684cb5e5bd7c4fa4545f0c79c09e206a522ecaa46bf139 not found: ID does not exist" containerID="52b28cb0dee1ca9103684cb5e5bd7c4fa4545f0c79c09e206a522ecaa46bf139" Mar 13 01:45:59.558735 master-0 kubenswrapper[19170]: I0313 01:45:59.556338 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52b28cb0dee1ca9103684cb5e5bd7c4fa4545f0c79c09e206a522ecaa46bf139"} err="failed to get container status \"52b28cb0dee1ca9103684cb5e5bd7c4fa4545f0c79c09e206a522ecaa46bf139\": rpc error: code = NotFound desc = could not find container \"52b28cb0dee1ca9103684cb5e5bd7c4fa4545f0c79c09e206a522ecaa46bf139\": container with ID starting with 52b28cb0dee1ca9103684cb5e5bd7c4fa4545f0c79c09e206a522ecaa46bf139 not found: ID does not exist" Mar 13 01:45:59.558735 master-0 kubenswrapper[19170]: I0313 01:45:59.556371 19170 scope.go:117] "RemoveContainer" containerID="5da906adcae163035931eac6068f9ab04c2bbd98ca4b8ce4ebee8a8e3b29e3c1" Mar 13 01:45:59.558735 master-0 kubenswrapper[19170]: E0313 01:45:59.557092 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5da906adcae163035931eac6068f9ab04c2bbd98ca4b8ce4ebee8a8e3b29e3c1\": container with ID starting with 5da906adcae163035931eac6068f9ab04c2bbd98ca4b8ce4ebee8a8e3b29e3c1 not found: ID does not exist" containerID="5da906adcae163035931eac6068f9ab04c2bbd98ca4b8ce4ebee8a8e3b29e3c1" Mar 13 01:45:59.558735 master-0 kubenswrapper[19170]: I0313 01:45:59.557180 19170 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"5da906adcae163035931eac6068f9ab04c2bbd98ca4b8ce4ebee8a8e3b29e3c1"} err="failed to get container status \"5da906adcae163035931eac6068f9ab04c2bbd98ca4b8ce4ebee8a8e3b29e3c1\": rpc error: code = NotFound desc = could not find container \"5da906adcae163035931eac6068f9ab04c2bbd98ca4b8ce4ebee8a8e3b29e3c1\": container with ID starting with 5da906adcae163035931eac6068f9ab04c2bbd98ca4b8ce4ebee8a8e3b29e3c1 not found: ID does not exist" Mar 13 01:45:59.574584 master-0 kubenswrapper[19170]: I0313 01:45:59.574518 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3785df35-68b7-4d28-8b4a-39c3136ce823-scripts\") pod \"ironic-conductor-0\" (UID: \"3785df35-68b7-4d28-8b4a-39c3136ce823\") " pod="openstack/ironic-conductor-0" Mar 13 01:45:59.574811 master-0 kubenswrapper[19170]: I0313 01:45:59.574746 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3785df35-68b7-4d28-8b4a-39c3136ce823-combined-ca-bundle\") pod \"ironic-conductor-0\" (UID: \"3785df35-68b7-4d28-8b4a-39c3136ce823\") " pod="openstack/ironic-conductor-0" Mar 13 01:45:59.575007 master-0 kubenswrapper[19170]: I0313 01:45:59.574978 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hvrb\" (UniqueName: \"kubernetes.io/projected/3785df35-68b7-4d28-8b4a-39c3136ce823-kube-api-access-6hvrb\") pod \"ironic-conductor-0\" (UID: \"3785df35-68b7-4d28-8b4a-39c3136ce823\") " pod="openstack/ironic-conductor-0" Mar 13 01:45:59.575082 master-0 kubenswrapper[19170]: I0313 01:45:59.575025 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/3785df35-68b7-4d28-8b4a-39c3136ce823-config-data-merged\") pod \"ironic-conductor-0\" (UID: \"3785df35-68b7-4d28-8b4a-39c3136ce823\") " 
pod="openstack/ironic-conductor-0" Mar 13 01:45:59.575468 master-0 kubenswrapper[19170]: I0313 01:45:59.575261 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4cb08518-316b-4115-97b8-5f0c654b6aad\" (UniqueName: \"kubernetes.io/csi/topolvm.io^e1525093-ba79-4899-9c06-240e9f4d5b86\") pod \"ironic-conductor-0\" (UID: \"3785df35-68b7-4d28-8b4a-39c3136ce823\") " pod="openstack/ironic-conductor-0" Mar 13 01:45:59.575468 master-0 kubenswrapper[19170]: I0313 01:45:59.575345 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/3785df35-68b7-4d28-8b4a-39c3136ce823-etc-podinfo\") pod \"ironic-conductor-0\" (UID: \"3785df35-68b7-4d28-8b4a-39c3136ce823\") " pod="openstack/ironic-conductor-0" Mar 13 01:45:59.575468 master-0 kubenswrapper[19170]: I0313 01:45:59.575436 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3785df35-68b7-4d28-8b4a-39c3136ce823-config-data-custom\") pod \"ironic-conductor-0\" (UID: \"3785df35-68b7-4d28-8b4a-39c3136ce823\") " pod="openstack/ironic-conductor-0" Mar 13 01:45:59.575575 master-0 kubenswrapper[19170]: I0313 01:45:59.575485 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3785df35-68b7-4d28-8b4a-39c3136ce823-config-data\") pod \"ironic-conductor-0\" (UID: \"3785df35-68b7-4d28-8b4a-39c3136ce823\") " pod="openstack/ironic-conductor-0" Mar 13 01:45:59.583512 master-0 kubenswrapper[19170]: I0313 01:45:59.583108 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/3785df35-68b7-4d28-8b4a-39c3136ce823-config-data-merged\") pod \"ironic-conductor-0\" (UID: \"3785df35-68b7-4d28-8b4a-39c3136ce823\") " pod="openstack/ironic-conductor-0" Mar 13 01:45:59.629421 master-0 
kubenswrapper[19170]: I0313 01:45:59.629390 19170 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 01:45:59.630149 master-0 kubenswrapper[19170]: I0313 01:45:59.630127 19170 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4cb08518-316b-4115-97b8-5f0c654b6aad\" (UniqueName: \"kubernetes.io/csi/topolvm.io^e1525093-ba79-4899-9c06-240e9f4d5b86\") pod \"ironic-conductor-0\" (UID: \"3785df35-68b7-4d28-8b4a-39c3136ce823\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/72c7508c24a020cf9b3d2312ac300881614e9139f2f19ef8179ccf734cf3e163/globalmount\"" pod="openstack/ironic-conductor-0" Mar 13 01:45:59.681429 master-0 kubenswrapper[19170]: I0313 01:45:59.681229 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/3785df35-68b7-4d28-8b4a-39c3136ce823-etc-podinfo\") pod \"ironic-conductor-0\" (UID: \"3785df35-68b7-4d28-8b4a-39c3136ce823\") " pod="openstack/ironic-conductor-0" Mar 13 01:45:59.681429 master-0 kubenswrapper[19170]: I0313 01:45:59.681404 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3785df35-68b7-4d28-8b4a-39c3136ce823-config-data-custom\") pod \"ironic-conductor-0\" (UID: \"3785df35-68b7-4d28-8b4a-39c3136ce823\") " pod="openstack/ironic-conductor-0" Mar 13 01:45:59.682427 master-0 kubenswrapper[19170]: I0313 01:45:59.681388 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3785df35-68b7-4d28-8b4a-39c3136ce823-scripts\") pod \"ironic-conductor-0\" (UID: \"3785df35-68b7-4d28-8b4a-39c3136ce823\") " pod="openstack/ironic-conductor-0" Mar 13 01:45:59.682930 master-0 kubenswrapper[19170]: I0313 01:45:59.682706 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3785df35-68b7-4d28-8b4a-39c3136ce823-combined-ca-bundle\") pod \"ironic-conductor-0\" (UID: \"3785df35-68b7-4d28-8b4a-39c3136ce823\") " pod="openstack/ironic-conductor-0" Mar 13 01:45:59.683861 master-0 kubenswrapper[19170]: I0313 01:45:59.683718 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3785df35-68b7-4d28-8b4a-39c3136ce823-config-data\") pod \"ironic-conductor-0\" (UID: \"3785df35-68b7-4d28-8b4a-39c3136ce823\") " pod="openstack/ironic-conductor-0" Mar 13 01:45:59.689793 master-0 kubenswrapper[19170]: I0313 01:45:59.689525 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hvrb\" (UniqueName: \"kubernetes.io/projected/3785df35-68b7-4d28-8b4a-39c3136ce823-kube-api-access-6hvrb\") pod \"ironic-conductor-0\" (UID: \"3785df35-68b7-4d28-8b4a-39c3136ce823\") " pod="openstack/ironic-conductor-0" Mar 13 01:45:59.853416 master-0 kubenswrapper[19170]: I0313 01:45:59.853363 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75f3a8be-2740-470d-994a-e1eecf8327fb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "75f3a8be-2740-470d-994a-e1eecf8327fb" (UID: "75f3a8be-2740-470d-994a-e1eecf8327fb"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:45:59.927684 master-0 kubenswrapper[19170]: I0313 01:45:59.927522 19170 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/75f3a8be-2740-470d-994a-e1eecf8327fb-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 13 01:45:59.988102 master-0 kubenswrapper[19170]: I0313 01:45:59.988058 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75f3a8be-2740-470d-994a-e1eecf8327fb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "75f3a8be-2740-470d-994a-e1eecf8327fb" (UID: "75f3a8be-2740-470d-994a-e1eecf8327fb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:46:00.004116 master-0 kubenswrapper[19170]: I0313 01:46:00.004065 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75f3a8be-2740-470d-994a-e1eecf8327fb-config" (OuterVolumeSpecName: "config") pod "75f3a8be-2740-470d-994a-e1eecf8327fb" (UID: "75f3a8be-2740-470d-994a-e1eecf8327fb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:46:00.014738 master-0 kubenswrapper[19170]: I0313 01:46:00.013576 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75f3a8be-2740-470d-994a-e1eecf8327fb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "75f3a8be-2740-470d-994a-e1eecf8327fb" (UID: "75f3a8be-2740-470d-994a-e1eecf8327fb"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:46:00.019228 master-0 kubenswrapper[19170]: I0313 01:46:00.019180 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75f3a8be-2740-470d-994a-e1eecf8327fb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "75f3a8be-2740-470d-994a-e1eecf8327fb" (UID: "75f3a8be-2740-470d-994a-e1eecf8327fb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:46:00.035969 master-0 kubenswrapper[19170]: I0313 01:46:00.033829 19170 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75f3a8be-2740-470d-994a-e1eecf8327fb-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:00.035969 master-0 kubenswrapper[19170]: I0313 01:46:00.033912 19170 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75f3a8be-2740-470d-994a-e1eecf8327fb-config\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:00.035969 master-0 kubenswrapper[19170]: I0313 01:46:00.033923 19170 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75f3a8be-2740-470d-994a-e1eecf8327fb-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:00.035969 master-0 kubenswrapper[19170]: I0313 01:46:00.033932 19170 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75f3a8be-2740-470d-994a-e1eecf8327fb-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:00.114484 master-0 kubenswrapper[19170]: I0313 01:46:00.114438 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-conductor-0"] Mar 13 01:46:00.114620 master-0 kubenswrapper[19170]: I0313 01:46:00.114501 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-86cc47d7f5-mk8sb"] Mar 13 01:46:00.164654 
master-0 kubenswrapper[19170]: I0313 01:46:00.164507 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-create-xckgg" event={"ID":"15bf14df-482c-4bc8-a222-25900daba262","Type":"ContainerStarted","Data":"e67f6eb160137cba7b251a1a03247acafec598302319c03f5f1e18f2586899d1"} Mar 13 01:46:00.168652 master-0 kubenswrapper[19170]: I0313 01:46:00.167485 19170 generic.go:334] "Generic (PLEG): container finished" podID="01cdc83b-2243-40c6-9709-bb442724340c" containerID="3eed686a2face2531b3b9085d7beece990680252e5928f6b8f4af051572e64e9" exitCode=0 Mar 13 01:46:00.168652 master-0 kubenswrapper[19170]: I0313 01:46:00.167556 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-051b7-scheduler-0" event={"ID":"01cdc83b-2243-40c6-9709-bb442724340c","Type":"ContainerDied","Data":"3eed686a2face2531b3b9085d7beece990680252e5928f6b8f4af051572e64e9"} Mar 13 01:46:00.168652 master-0 kubenswrapper[19170]: I0313 01:46:00.168588 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-f67c-account-create-update-4h2l7" event={"ID":"c72cc566-7a61-49ac-bdf3-a2d7e0f0e87d","Type":"ContainerStarted","Data":"ac7e0da1c47b229ae199edcd6263504784836c1d1bba485bd9e9e7c11f1c9e82"} Mar 13 01:46:00.174169 master-0 kubenswrapper[19170]: I0313 01:46:00.170516 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9c57cd77c-64hkx" event={"ID":"97b91c16-ada4-4c3a-8e56-bd4e6417ded3","Type":"ContainerStarted","Data":"ec84eb6576222a3979a88aeb20e70b776c6615129b337bcac33878d99776c0b4"} Mar 13 01:46:00.178908 master-0 kubenswrapper[19170]: I0313 01:46:00.174261 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-84f79fc9fc-7t7j5" event={"ID":"e18f1449-3fb3-43a8-98cd-2f8713bba98d","Type":"ContainerStarted","Data":"7212a61a355508ebf15907644929a895fabd52989d79d82e9a61438d9768c2e4"} Mar 13 01:46:00.191648 master-0 kubenswrapper[19170]: I0313 01:46:00.188030 19170 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-inspector-db-create-xckgg" podStartSLOduration=3.188013445 podStartE2EDuration="3.188013445s" podCreationTimestamp="2026-03-13 01:45:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:46:00.177430257 +0000 UTC m=+1620.985551217" watchObservedRunningTime="2026-03-13 01:46:00.188013445 +0000 UTC m=+1620.996134405" Mar 13 01:46:00.202204 master-0 kubenswrapper[19170]: I0313 01:46:00.198222 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-051b7-backup-0" podUID="6c8289d1-5df7-4e68-b0b3-ea797ce78d32" containerName="cinder-backup" containerID="cri-o://6e8844314fedbd5f0ce4bd82249034233ffb9b589613da70d1272dba9290d61e" gracePeriod=30 Mar 13 01:46:00.202204 master-0 kubenswrapper[19170]: I0313 01:46:00.198333 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-86cc47d7f5-mk8sb" event={"ID":"bcadd2d8-9133-4e14-af9d-1c42095f91f5","Type":"ContainerStarted","Data":"4865a89b78ff8fbceda9b5950940d733b12f5cd684365b375f8d75f88f1e0833"} Mar 13 01:46:00.202204 master-0 kubenswrapper[19170]: I0313 01:46:00.198502 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-051b7-volume-lvm-iscsi-0" podUID="dd118f31-fa5b-466a-a7a3-0bd0e788bd38" containerName="cinder-volume" containerID="cri-o://2913e807e28ceb7cd27836e7db44b97fa6b8ac764f2166361b61ab84f7c4f07a" gracePeriod=30 Mar 13 01:46:00.202204 master-0 kubenswrapper[19170]: I0313 01:46:00.198663 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-051b7-volume-lvm-iscsi-0" podUID="dd118f31-fa5b-466a-a7a3-0bd0e788bd38" containerName="probe" containerID="cri-o://15e6bcf32a26bd96875fa2be8cdd04200634595fb86139799244a034f1c1bd7d" gracePeriod=30 Mar 13 01:46:00.202204 master-0 kubenswrapper[19170]: 
I0313 01:46:00.198703 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-051b7-backup-0" podUID="6c8289d1-5df7-4e68-b0b3-ea797ce78d32" containerName="probe" containerID="cri-o://ac7ecbed7e9faf9da3b15c4add3d82c7956a60dc56cf5a627ecc7791a11c85d0" gracePeriod=30 Mar 13 01:46:00.209665 master-0 kubenswrapper[19170]: I0313 01:46:00.206306 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-inspector-f67c-account-create-update-4h2l7" podStartSLOduration=3.20628588 podStartE2EDuration="3.20628588s" podCreationTimestamp="2026-03-13 01:45:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:46:00.198019667 +0000 UTC m=+1621.006140627" watchObservedRunningTime="2026-03-13 01:46:00.20628588 +0000 UTC m=+1621.014406840" Mar 13 01:46:00.275209 master-0 kubenswrapper[19170]: I0313 01:46:00.275112 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848b9c6b49-shsxt"] Mar 13 01:46:00.341251 master-0 kubenswrapper[19170]: I0313 01:46:00.302624 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-848b9c6b49-shsxt"] Mar 13 01:46:01.020065 master-0 kubenswrapper[19170]: I0313 01:46:01.020016 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:46:01.163713 master-0 kubenswrapper[19170]: I0313 01:46:01.162601 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/01cdc83b-2243-40c6-9709-bb442724340c-config-data-custom\") pod \"01cdc83b-2243-40c6-9709-bb442724340c\" (UID: \"01cdc83b-2243-40c6-9709-bb442724340c\") " Mar 13 01:46:01.163713 master-0 kubenswrapper[19170]: I0313 01:46:01.162677 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01cdc83b-2243-40c6-9709-bb442724340c-scripts\") pod \"01cdc83b-2243-40c6-9709-bb442724340c\" (UID: \"01cdc83b-2243-40c6-9709-bb442724340c\") " Mar 13 01:46:01.163713 master-0 kubenswrapper[19170]: I0313 01:46:01.162762 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01cdc83b-2243-40c6-9709-bb442724340c-config-data\") pod \"01cdc83b-2243-40c6-9709-bb442724340c\" (UID: \"01cdc83b-2243-40c6-9709-bb442724340c\") " Mar 13 01:46:01.163713 master-0 kubenswrapper[19170]: I0313 01:46:01.162901 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/01cdc83b-2243-40c6-9709-bb442724340c-etc-machine-id\") pod \"01cdc83b-2243-40c6-9709-bb442724340c\" (UID: \"01cdc83b-2243-40c6-9709-bb442724340c\") " Mar 13 01:46:01.163713 master-0 kubenswrapper[19170]: I0313 01:46:01.163044 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92zdw\" (UniqueName: \"kubernetes.io/projected/01cdc83b-2243-40c6-9709-bb442724340c-kube-api-access-92zdw\") pod \"01cdc83b-2243-40c6-9709-bb442724340c\" (UID: \"01cdc83b-2243-40c6-9709-bb442724340c\") " Mar 13 01:46:01.163713 master-0 kubenswrapper[19170]: I0313 01:46:01.163073 19170 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01cdc83b-2243-40c6-9709-bb442724340c-combined-ca-bundle\") pod \"01cdc83b-2243-40c6-9709-bb442724340c\" (UID: \"01cdc83b-2243-40c6-9709-bb442724340c\") " Mar 13 01:46:01.169099 master-0 kubenswrapper[19170]: I0313 01:46:01.167411 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01cdc83b-2243-40c6-9709-bb442724340c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "01cdc83b-2243-40c6-9709-bb442724340c" (UID: "01cdc83b-2243-40c6-9709-bb442724340c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:46:01.173028 master-0 kubenswrapper[19170]: I0313 01:46:01.172822 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01cdc83b-2243-40c6-9709-bb442724340c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "01cdc83b-2243-40c6-9709-bb442724340c" (UID: "01cdc83b-2243-40c6-9709-bb442724340c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:46:01.174062 master-0 kubenswrapper[19170]: I0313 01:46:01.174008 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01cdc83b-2243-40c6-9709-bb442724340c-scripts" (OuterVolumeSpecName: "scripts") pod "01cdc83b-2243-40c6-9709-bb442724340c" (UID: "01cdc83b-2243-40c6-9709-bb442724340c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:46:01.177841 master-0 kubenswrapper[19170]: I0313 01:46:01.177543 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01cdc83b-2243-40c6-9709-bb442724340c-kube-api-access-92zdw" (OuterVolumeSpecName: "kube-api-access-92zdw") pod "01cdc83b-2243-40c6-9709-bb442724340c" (UID: "01cdc83b-2243-40c6-9709-bb442724340c"). InnerVolumeSpecName "kube-api-access-92zdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:46:01.236309 master-0 kubenswrapper[19170]: I0313 01:46:01.236238 19170 generic.go:334] "Generic (PLEG): container finished" podID="15bf14df-482c-4bc8-a222-25900daba262" containerID="e67f6eb160137cba7b251a1a03247acafec598302319c03f5f1e18f2586899d1" exitCode=0 Mar 13 01:46:01.236309 master-0 kubenswrapper[19170]: I0313 01:46:01.236305 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-create-xckgg" event={"ID":"15bf14df-482c-4bc8-a222-25900daba262","Type":"ContainerDied","Data":"e67f6eb160137cba7b251a1a03247acafec598302319c03f5f1e18f2586899d1"} Mar 13 01:46:01.239179 master-0 kubenswrapper[19170]: I0313 01:46:01.239104 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01cdc83b-2243-40c6-9709-bb442724340c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01cdc83b-2243-40c6-9709-bb442724340c" (UID: "01cdc83b-2243-40c6-9709-bb442724340c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:46:01.246162 master-0 kubenswrapper[19170]: I0313 01:46:01.246050 19170 generic.go:334] "Generic (PLEG): container finished" podID="01cdc83b-2243-40c6-9709-bb442724340c" containerID="8aba9ac1e3b085e7c04cb214ea7f8e75329fdcb19d7d852f11d653fb879f9e1b" exitCode=0 Mar 13 01:46:01.246351 master-0 kubenswrapper[19170]: I0313 01:46:01.246246 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-051b7-scheduler-0" event={"ID":"01cdc83b-2243-40c6-9709-bb442724340c","Type":"ContainerDied","Data":"8aba9ac1e3b085e7c04cb214ea7f8e75329fdcb19d7d852f11d653fb879f9e1b"} Mar 13 01:46:01.246351 master-0 kubenswrapper[19170]: I0313 01:46:01.246271 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-051b7-scheduler-0" event={"ID":"01cdc83b-2243-40c6-9709-bb442724340c","Type":"ContainerDied","Data":"ec31b9566d8856970f6e30ccf382513e2142665e1beb10602ad7524ab9a79ebc"} Mar 13 01:46:01.246351 master-0 kubenswrapper[19170]: I0313 01:46:01.246289 19170 scope.go:117] "RemoveContainer" containerID="3eed686a2face2531b3b9085d7beece990680252e5928f6b8f4af051572e64e9" Mar 13 01:46:01.246351 master-0 kubenswrapper[19170]: I0313 01:46:01.246361 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:46:01.256554 master-0 kubenswrapper[19170]: I0313 01:46:01.256417 19170 generic.go:334] "Generic (PLEG): container finished" podID="dd118f31-fa5b-466a-a7a3-0bd0e788bd38" containerID="2913e807e28ceb7cd27836e7db44b97fa6b8ac764f2166361b61ab84f7c4f07a" exitCode=0 Mar 13 01:46:01.256554 master-0 kubenswrapper[19170]: I0313 01:46:01.256459 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-051b7-volume-lvm-iscsi-0" event={"ID":"dd118f31-fa5b-466a-a7a3-0bd0e788bd38","Type":"ContainerDied","Data":"2913e807e28ceb7cd27836e7db44b97fa6b8ac764f2166361b61ab84f7c4f07a"} Mar 13 01:46:01.277292 master-0 kubenswrapper[19170]: I0313 01:46:01.261927 19170 generic.go:334] "Generic (PLEG): container finished" podID="c72cc566-7a61-49ac-bdf3-a2d7e0f0e87d" containerID="ac7e0da1c47b229ae199edcd6263504784836c1d1bba485bd9e9e7c11f1c9e82" exitCode=0 Mar 13 01:46:01.277292 master-0 kubenswrapper[19170]: I0313 01:46:01.262011 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-f67c-account-create-update-4h2l7" event={"ID":"c72cc566-7a61-49ac-bdf3-a2d7e0f0e87d","Type":"ContainerDied","Data":"ac7e0da1c47b229ae199edcd6263504784836c1d1bba485bd9e9e7c11f1c9e82"} Mar 13 01:46:01.277292 master-0 kubenswrapper[19170]: I0313 01:46:01.265545 19170 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/01cdc83b-2243-40c6-9709-bb442724340c-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:01.277292 master-0 kubenswrapper[19170]: I0313 01:46:01.265578 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92zdw\" (UniqueName: \"kubernetes.io/projected/01cdc83b-2243-40c6-9709-bb442724340c-kube-api-access-92zdw\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:01.277292 master-0 kubenswrapper[19170]: I0313 01:46:01.265590 19170 reconciler_common.go:293] "Volume detached for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01cdc83b-2243-40c6-9709-bb442724340c-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:01.277292 master-0 kubenswrapper[19170]: I0313 01:46:01.265598 19170 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/01cdc83b-2243-40c6-9709-bb442724340c-config-data-custom\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:01.277292 master-0 kubenswrapper[19170]: I0313 01:46:01.265609 19170 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01cdc83b-2243-40c6-9709-bb442724340c-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:01.277292 master-0 kubenswrapper[19170]: I0313 01:46:01.270991 19170 generic.go:334] "Generic (PLEG): container finished" podID="97b91c16-ada4-4c3a-8e56-bd4e6417ded3" containerID="35c745adaebf698d13d18c92ad1db570ace63e623fda99191e11659ac373fb96" exitCode=0 Mar 13 01:46:01.277292 master-0 kubenswrapper[19170]: I0313 01:46:01.271573 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9c57cd77c-64hkx" event={"ID":"97b91c16-ada4-4c3a-8e56-bd4e6417ded3","Type":"ContainerDied","Data":"35c745adaebf698d13d18c92ad1db570ace63e623fda99191e11659ac373fb96"} Mar 13 01:46:01.404660 master-0 kubenswrapper[19170]: I0313 01:46:01.403694 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-556665446f-zdfzl"] Mar 13 01:46:01.404660 master-0 kubenswrapper[19170]: E0313 01:46:01.404206 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01cdc83b-2243-40c6-9709-bb442724340c" containerName="probe" Mar 13 01:46:01.404660 master-0 kubenswrapper[19170]: I0313 01:46:01.404219 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="01cdc83b-2243-40c6-9709-bb442724340c" containerName="probe" Mar 13 01:46:01.404660 master-0 kubenswrapper[19170]: E0313 01:46:01.404269 19170 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="01cdc83b-2243-40c6-9709-bb442724340c" containerName="cinder-scheduler" Mar 13 01:46:01.404660 master-0 kubenswrapper[19170]: I0313 01:46:01.404276 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="01cdc83b-2243-40c6-9709-bb442724340c" containerName="cinder-scheduler" Mar 13 01:46:01.404660 master-0 kubenswrapper[19170]: I0313 01:46:01.404503 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="01cdc83b-2243-40c6-9709-bb442724340c" containerName="cinder-scheduler" Mar 13 01:46:01.404660 master-0 kubenswrapper[19170]: I0313 01:46:01.404528 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="01cdc83b-2243-40c6-9709-bb442724340c" containerName="probe" Mar 13 01:46:01.408655 master-0 kubenswrapper[19170]: I0313 01:46:01.406091 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-556665446f-zdfzl" Mar 13 01:46:01.409535 master-0 kubenswrapper[19170]: I0313 01:46:01.409162 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-internal-svc" Mar 13 01:46:01.409535 master-0 kubenswrapper[19170]: I0313 01:46:01.409299 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-public-svc" Mar 13 01:46:01.420858 master-0 kubenswrapper[19170]: I0313 01:46:01.420738 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-556665446f-zdfzl"] Mar 13 01:46:01.445052 master-0 kubenswrapper[19170]: I0313 01:46:01.444970 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01cdc83b-2243-40c6-9709-bb442724340c-config-data" (OuterVolumeSpecName: "config-data") pod "01cdc83b-2243-40c6-9709-bb442724340c" (UID: "01cdc83b-2243-40c6-9709-bb442724340c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:46:01.467510 master-0 kubenswrapper[19170]: I0313 01:46:01.467340 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75f3a8be-2740-470d-994a-e1eecf8327fb" path="/var/lib/kubelet/pods/75f3a8be-2740-470d-994a-e1eecf8327fb/volumes" Mar 13 01:46:01.491353 master-0 kubenswrapper[19170]: I0313 01:46:01.489453 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d67d826d-f6e4-40eb-aa3f-492586407ee0-scripts\") pod \"ironic-556665446f-zdfzl\" (UID: \"d67d826d-f6e4-40eb-aa3f-492586407ee0\") " pod="openstack/ironic-556665446f-zdfzl" Mar 13 01:46:01.491353 master-0 kubenswrapper[19170]: I0313 01:46:01.489501 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2xbq\" (UniqueName: \"kubernetes.io/projected/d67d826d-f6e4-40eb-aa3f-492586407ee0-kube-api-access-g2xbq\") pod \"ironic-556665446f-zdfzl\" (UID: \"d67d826d-f6e4-40eb-aa3f-492586407ee0\") " pod="openstack/ironic-556665446f-zdfzl" Mar 13 01:46:01.491353 master-0 kubenswrapper[19170]: I0313 01:46:01.489775 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d67d826d-f6e4-40eb-aa3f-492586407ee0-combined-ca-bundle\") pod \"ironic-556665446f-zdfzl\" (UID: \"d67d826d-f6e4-40eb-aa3f-492586407ee0\") " pod="openstack/ironic-556665446f-zdfzl" Mar 13 01:46:01.491353 master-0 kubenswrapper[19170]: I0313 01:46:01.489854 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d67d826d-f6e4-40eb-aa3f-492586407ee0-config-data\") pod \"ironic-556665446f-zdfzl\" (UID: \"d67d826d-f6e4-40eb-aa3f-492586407ee0\") " pod="openstack/ironic-556665446f-zdfzl" Mar 13 01:46:01.491353 master-0 
kubenswrapper[19170]: I0313 01:46:01.490029 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d67d826d-f6e4-40eb-aa3f-492586407ee0-config-data-merged\") pod \"ironic-556665446f-zdfzl\" (UID: \"d67d826d-f6e4-40eb-aa3f-492586407ee0\") " pod="openstack/ironic-556665446f-zdfzl" Mar 13 01:46:01.491353 master-0 kubenswrapper[19170]: I0313 01:46:01.490060 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d67d826d-f6e4-40eb-aa3f-492586407ee0-logs\") pod \"ironic-556665446f-zdfzl\" (UID: \"d67d826d-f6e4-40eb-aa3f-492586407ee0\") " pod="openstack/ironic-556665446f-zdfzl" Mar 13 01:46:01.491353 master-0 kubenswrapper[19170]: I0313 01:46:01.490099 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/d67d826d-f6e4-40eb-aa3f-492586407ee0-etc-podinfo\") pod \"ironic-556665446f-zdfzl\" (UID: \"d67d826d-f6e4-40eb-aa3f-492586407ee0\") " pod="openstack/ironic-556665446f-zdfzl" Mar 13 01:46:01.491353 master-0 kubenswrapper[19170]: I0313 01:46:01.490178 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d67d826d-f6e4-40eb-aa3f-492586407ee0-config-data-custom\") pod \"ironic-556665446f-zdfzl\" (UID: \"d67d826d-f6e4-40eb-aa3f-492586407ee0\") " pod="openstack/ironic-556665446f-zdfzl" Mar 13 01:46:01.491353 master-0 kubenswrapper[19170]: I0313 01:46:01.490340 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d67d826d-f6e4-40eb-aa3f-492586407ee0-internal-tls-certs\") pod \"ironic-556665446f-zdfzl\" (UID: \"d67d826d-f6e4-40eb-aa3f-492586407ee0\") " 
pod="openstack/ironic-556665446f-zdfzl" Mar 13 01:46:01.491353 master-0 kubenswrapper[19170]: I0313 01:46:01.490355 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d67d826d-f6e4-40eb-aa3f-492586407ee0-public-tls-certs\") pod \"ironic-556665446f-zdfzl\" (UID: \"d67d826d-f6e4-40eb-aa3f-492586407ee0\") " pod="openstack/ironic-556665446f-zdfzl" Mar 13 01:46:01.494165 master-0 kubenswrapper[19170]: I0313 01:46:01.490446 19170 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01cdc83b-2243-40c6-9709-bb442724340c-config-data\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:01.541653 master-0 kubenswrapper[19170]: I0313 01:46:01.541591 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4cb08518-316b-4115-97b8-5f0c654b6aad\" (UniqueName: \"kubernetes.io/csi/topolvm.io^e1525093-ba79-4899-9c06-240e9f4d5b86\") pod \"ironic-conductor-0\" (UID: \"3785df35-68b7-4d28-8b4a-39c3136ce823\") " pod="openstack/ironic-conductor-0" Mar 13 01:46:01.596667 master-0 kubenswrapper[19170]: I0313 01:46:01.595625 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d67d826d-f6e4-40eb-aa3f-492586407ee0-scripts\") pod \"ironic-556665446f-zdfzl\" (UID: \"d67d826d-f6e4-40eb-aa3f-492586407ee0\") " pod="openstack/ironic-556665446f-zdfzl" Mar 13 01:46:01.596667 master-0 kubenswrapper[19170]: I0313 01:46:01.595680 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2xbq\" (UniqueName: \"kubernetes.io/projected/d67d826d-f6e4-40eb-aa3f-492586407ee0-kube-api-access-g2xbq\") pod \"ironic-556665446f-zdfzl\" (UID: \"d67d826d-f6e4-40eb-aa3f-492586407ee0\") " pod="openstack/ironic-556665446f-zdfzl" Mar 13 01:46:01.596667 master-0 kubenswrapper[19170]: I0313 01:46:01.595799 19170 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d67d826d-f6e4-40eb-aa3f-492586407ee0-combined-ca-bundle\") pod \"ironic-556665446f-zdfzl\" (UID: \"d67d826d-f6e4-40eb-aa3f-492586407ee0\") " pod="openstack/ironic-556665446f-zdfzl" Mar 13 01:46:01.596667 master-0 kubenswrapper[19170]: I0313 01:46:01.596237 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d67d826d-f6e4-40eb-aa3f-492586407ee0-config-data\") pod \"ironic-556665446f-zdfzl\" (UID: \"d67d826d-f6e4-40eb-aa3f-492586407ee0\") " pod="openstack/ironic-556665446f-zdfzl" Mar 13 01:46:01.596667 master-0 kubenswrapper[19170]: I0313 01:46:01.596321 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d67d826d-f6e4-40eb-aa3f-492586407ee0-config-data-merged\") pod \"ironic-556665446f-zdfzl\" (UID: \"d67d826d-f6e4-40eb-aa3f-492586407ee0\") " pod="openstack/ironic-556665446f-zdfzl" Mar 13 01:46:01.596667 master-0 kubenswrapper[19170]: I0313 01:46:01.596352 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d67d826d-f6e4-40eb-aa3f-492586407ee0-logs\") pod \"ironic-556665446f-zdfzl\" (UID: \"d67d826d-f6e4-40eb-aa3f-492586407ee0\") " pod="openstack/ironic-556665446f-zdfzl" Mar 13 01:46:01.596667 master-0 kubenswrapper[19170]: I0313 01:46:01.596374 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/d67d826d-f6e4-40eb-aa3f-492586407ee0-etc-podinfo\") pod \"ironic-556665446f-zdfzl\" (UID: \"d67d826d-f6e4-40eb-aa3f-492586407ee0\") " pod="openstack/ironic-556665446f-zdfzl" Mar 13 01:46:01.596667 master-0 kubenswrapper[19170]: I0313 01:46:01.596422 19170 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d67d826d-f6e4-40eb-aa3f-492586407ee0-config-data-custom\") pod \"ironic-556665446f-zdfzl\" (UID: \"d67d826d-f6e4-40eb-aa3f-492586407ee0\") " pod="openstack/ironic-556665446f-zdfzl" Mar 13 01:46:01.596667 master-0 kubenswrapper[19170]: I0313 01:46:01.596620 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d67d826d-f6e4-40eb-aa3f-492586407ee0-internal-tls-certs\") pod \"ironic-556665446f-zdfzl\" (UID: \"d67d826d-f6e4-40eb-aa3f-492586407ee0\") " pod="openstack/ironic-556665446f-zdfzl" Mar 13 01:46:01.596667 master-0 kubenswrapper[19170]: I0313 01:46:01.596653 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d67d826d-f6e4-40eb-aa3f-492586407ee0-public-tls-certs\") pod \"ironic-556665446f-zdfzl\" (UID: \"d67d826d-f6e4-40eb-aa3f-492586407ee0\") " pod="openstack/ironic-556665446f-zdfzl" Mar 13 01:46:01.619533 master-0 kubenswrapper[19170]: I0313 01:46:01.602266 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d67d826d-f6e4-40eb-aa3f-492586407ee0-config-data-merged\") pod \"ironic-556665446f-zdfzl\" (UID: \"d67d826d-f6e4-40eb-aa3f-492586407ee0\") " pod="openstack/ironic-556665446f-zdfzl" Mar 13 01:46:01.619533 master-0 kubenswrapper[19170]: I0313 01:46:01.602852 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d67d826d-f6e4-40eb-aa3f-492586407ee0-logs\") pod \"ironic-556665446f-zdfzl\" (UID: \"d67d826d-f6e4-40eb-aa3f-492586407ee0\") " pod="openstack/ironic-556665446f-zdfzl" Mar 13 01:46:01.619533 master-0 kubenswrapper[19170]: I0313 01:46:01.605285 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d67d826d-f6e4-40eb-aa3f-492586407ee0-config-data-custom\") pod \"ironic-556665446f-zdfzl\" (UID: \"d67d826d-f6e4-40eb-aa3f-492586407ee0\") " pod="openstack/ironic-556665446f-zdfzl" Mar 13 01:46:01.619533 master-0 kubenswrapper[19170]: I0313 01:46:01.605525 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d67d826d-f6e4-40eb-aa3f-492586407ee0-public-tls-certs\") pod \"ironic-556665446f-zdfzl\" (UID: \"d67d826d-f6e4-40eb-aa3f-492586407ee0\") " pod="openstack/ironic-556665446f-zdfzl" Mar 13 01:46:01.619533 master-0 kubenswrapper[19170]: I0313 01:46:01.609225 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d67d826d-f6e4-40eb-aa3f-492586407ee0-scripts\") pod \"ironic-556665446f-zdfzl\" (UID: \"d67d826d-f6e4-40eb-aa3f-492586407ee0\") " pod="openstack/ironic-556665446f-zdfzl" Mar 13 01:46:01.619533 master-0 kubenswrapper[19170]: I0313 01:46:01.609242 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d67d826d-f6e4-40eb-aa3f-492586407ee0-internal-tls-certs\") pod \"ironic-556665446f-zdfzl\" (UID: \"d67d826d-f6e4-40eb-aa3f-492586407ee0\") " pod="openstack/ironic-556665446f-zdfzl" Mar 13 01:46:01.619533 master-0 kubenswrapper[19170]: I0313 01:46:01.611112 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/d67d826d-f6e4-40eb-aa3f-492586407ee0-etc-podinfo\") pod \"ironic-556665446f-zdfzl\" (UID: \"d67d826d-f6e4-40eb-aa3f-492586407ee0\") " pod="openstack/ironic-556665446f-zdfzl" Mar 13 01:46:01.619533 master-0 kubenswrapper[19170]: I0313 01:46:01.619092 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d67d826d-f6e4-40eb-aa3f-492586407ee0-config-data\") pod \"ironic-556665446f-zdfzl\" (UID: \"d67d826d-f6e4-40eb-aa3f-492586407ee0\") " pod="openstack/ironic-556665446f-zdfzl" Mar 13 01:46:01.626496 master-0 kubenswrapper[19170]: I0313 01:46:01.624820 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-051b7-scheduler-0"] Mar 13 01:46:01.632926 master-0 kubenswrapper[19170]: I0313 01:46:01.632611 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d67d826d-f6e4-40eb-aa3f-492586407ee0-combined-ca-bundle\") pod \"ironic-556665446f-zdfzl\" (UID: \"d67d826d-f6e4-40eb-aa3f-492586407ee0\") " pod="openstack/ironic-556665446f-zdfzl" Mar 13 01:46:01.651106 master-0 kubenswrapper[19170]: I0313 01:46:01.651056 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2xbq\" (UniqueName: \"kubernetes.io/projected/d67d826d-f6e4-40eb-aa3f-492586407ee0-kube-api-access-g2xbq\") pod \"ironic-556665446f-zdfzl\" (UID: \"d67d826d-f6e4-40eb-aa3f-492586407ee0\") " pod="openstack/ironic-556665446f-zdfzl" Mar 13 01:46:01.658828 master-0 kubenswrapper[19170]: I0313 01:46:01.658771 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-051b7-scheduler-0"] Mar 13 01:46:01.685650 master-0 kubenswrapper[19170]: I0313 01:46:01.685074 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-051b7-scheduler-0"] Mar 13 01:46:01.686995 master-0 kubenswrapper[19170]: I0313 01:46:01.686964 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:46:01.692702 master-0 kubenswrapper[19170]: I0313 01:46:01.692668 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-051b7-scheduler-config-data" Mar 13 01:46:01.705869 master-0 kubenswrapper[19170]: I0313 01:46:01.705826 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-conductor-0" Mar 13 01:46:01.708752 master-0 kubenswrapper[19170]: I0313 01:46:01.708674 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-051b7-scheduler-0"] Mar 13 01:46:01.802669 master-0 kubenswrapper[19170]: I0313 01:46:01.801798 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c37c3652-7288-40f4-9a80-9cbf4a23fdf0-config-data\") pod \"cinder-051b7-scheduler-0\" (UID: \"c37c3652-7288-40f4-9a80-9cbf4a23fdf0\") " pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:46:01.802669 master-0 kubenswrapper[19170]: I0313 01:46:01.801890 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c37c3652-7288-40f4-9a80-9cbf4a23fdf0-combined-ca-bundle\") pod \"cinder-051b7-scheduler-0\" (UID: \"c37c3652-7288-40f4-9a80-9cbf4a23fdf0\") " pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:46:01.802669 master-0 kubenswrapper[19170]: I0313 01:46:01.801910 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkhmt\" (UniqueName: \"kubernetes.io/projected/c37c3652-7288-40f4-9a80-9cbf4a23fdf0-kube-api-access-xkhmt\") pod \"cinder-051b7-scheduler-0\" (UID: \"c37c3652-7288-40f4-9a80-9cbf4a23fdf0\") " pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:46:01.802669 master-0 kubenswrapper[19170]: I0313 01:46:01.801956 19170 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c37c3652-7288-40f4-9a80-9cbf4a23fdf0-scripts\") pod \"cinder-051b7-scheduler-0\" (UID: \"c37c3652-7288-40f4-9a80-9cbf4a23fdf0\") " pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:46:01.803085 master-0 kubenswrapper[19170]: I0313 01:46:01.802883 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c37c3652-7288-40f4-9a80-9cbf4a23fdf0-etc-machine-id\") pod \"cinder-051b7-scheduler-0\" (UID: \"c37c3652-7288-40f4-9a80-9cbf4a23fdf0\") " pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:46:01.803085 master-0 kubenswrapper[19170]: I0313 01:46:01.802963 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c37c3652-7288-40f4-9a80-9cbf4a23fdf0-config-data-custom\") pod \"cinder-051b7-scheduler-0\" (UID: \"c37c3652-7288-40f4-9a80-9cbf4a23fdf0\") " pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:46:01.865832 master-0 kubenswrapper[19170]: I0313 01:46:01.865646 19170 scope.go:117] "RemoveContainer" containerID="8aba9ac1e3b085e7c04cb214ea7f8e75329fdcb19d7d852f11d653fb879f9e1b" Mar 13 01:46:01.875542 master-0 kubenswrapper[19170]: I0313 01:46:01.875490 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-556665446f-zdfzl" Mar 13 01:46:01.905742 master-0 kubenswrapper[19170]: I0313 01:46:01.905584 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c37c3652-7288-40f4-9a80-9cbf4a23fdf0-config-data\") pod \"cinder-051b7-scheduler-0\" (UID: \"c37c3652-7288-40f4-9a80-9cbf4a23fdf0\") " pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:46:01.905960 master-0 kubenswrapper[19170]: I0313 01:46:01.905845 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkhmt\" (UniqueName: \"kubernetes.io/projected/c37c3652-7288-40f4-9a80-9cbf4a23fdf0-kube-api-access-xkhmt\") pod \"cinder-051b7-scheduler-0\" (UID: \"c37c3652-7288-40f4-9a80-9cbf4a23fdf0\") " pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:46:01.905960 master-0 kubenswrapper[19170]: I0313 01:46:01.905880 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c37c3652-7288-40f4-9a80-9cbf4a23fdf0-combined-ca-bundle\") pod \"cinder-051b7-scheduler-0\" (UID: \"c37c3652-7288-40f4-9a80-9cbf4a23fdf0\") " pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:46:01.905960 master-0 kubenswrapper[19170]: I0313 01:46:01.905943 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c37c3652-7288-40f4-9a80-9cbf4a23fdf0-scripts\") pod \"cinder-051b7-scheduler-0\" (UID: \"c37c3652-7288-40f4-9a80-9cbf4a23fdf0\") " pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:46:01.906552 master-0 kubenswrapper[19170]: I0313 01:46:01.906057 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c37c3652-7288-40f4-9a80-9cbf4a23fdf0-etc-machine-id\") pod \"cinder-051b7-scheduler-0\" (UID: \"c37c3652-7288-40f4-9a80-9cbf4a23fdf0\") " 
pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:46:01.906552 master-0 kubenswrapper[19170]: I0313 01:46:01.906166 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c37c3652-7288-40f4-9a80-9cbf4a23fdf0-config-data-custom\") pod \"cinder-051b7-scheduler-0\" (UID: \"c37c3652-7288-40f4-9a80-9cbf4a23fdf0\") " pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:46:01.907265 master-0 kubenswrapper[19170]: I0313 01:46:01.907198 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c37c3652-7288-40f4-9a80-9cbf4a23fdf0-etc-machine-id\") pod \"cinder-051b7-scheduler-0\" (UID: \"c37c3652-7288-40f4-9a80-9cbf4a23fdf0\") " pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:46:01.909722 master-0 kubenswrapper[19170]: I0313 01:46:01.909665 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c37c3652-7288-40f4-9a80-9cbf4a23fdf0-config-data-custom\") pod \"cinder-051b7-scheduler-0\" (UID: \"c37c3652-7288-40f4-9a80-9cbf4a23fdf0\") " pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:46:01.916137 master-0 kubenswrapper[19170]: I0313 01:46:01.915768 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c37c3652-7288-40f4-9a80-9cbf4a23fdf0-combined-ca-bundle\") pod \"cinder-051b7-scheduler-0\" (UID: \"c37c3652-7288-40f4-9a80-9cbf4a23fdf0\") " pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:46:01.917839 master-0 kubenswrapper[19170]: I0313 01:46:01.917798 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c37c3652-7288-40f4-9a80-9cbf4a23fdf0-scripts\") pod \"cinder-051b7-scheduler-0\" (UID: \"c37c3652-7288-40f4-9a80-9cbf4a23fdf0\") " pod="openstack/cinder-051b7-scheduler-0" Mar 13 
01:46:01.924737 master-0 kubenswrapper[19170]: I0313 01:46:01.924699 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c37c3652-7288-40f4-9a80-9cbf4a23fdf0-config-data\") pod \"cinder-051b7-scheduler-0\" (UID: \"c37c3652-7288-40f4-9a80-9cbf4a23fdf0\") " pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:46:01.926762 master-0 kubenswrapper[19170]: I0313 01:46:01.926729 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkhmt\" (UniqueName: \"kubernetes.io/projected/c37c3652-7288-40f4-9a80-9cbf4a23fdf0-kube-api-access-xkhmt\") pod \"cinder-051b7-scheduler-0\" (UID: \"c37c3652-7288-40f4-9a80-9cbf4a23fdf0\") " pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:46:01.927939 master-0 kubenswrapper[19170]: I0313 01:46:01.927886 19170 scope.go:117] "RemoveContainer" containerID="3eed686a2face2531b3b9085d7beece990680252e5928f6b8f4af051572e64e9" Mar 13 01:46:01.928564 master-0 kubenswrapper[19170]: E0313 01:46:01.928513 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3eed686a2face2531b3b9085d7beece990680252e5928f6b8f4af051572e64e9\": container with ID starting with 3eed686a2face2531b3b9085d7beece990680252e5928f6b8f4af051572e64e9 not found: ID does not exist" containerID="3eed686a2face2531b3b9085d7beece990680252e5928f6b8f4af051572e64e9" Mar 13 01:46:01.928710 master-0 kubenswrapper[19170]: I0313 01:46:01.928561 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3eed686a2face2531b3b9085d7beece990680252e5928f6b8f4af051572e64e9"} err="failed to get container status \"3eed686a2face2531b3b9085d7beece990680252e5928f6b8f4af051572e64e9\": rpc error: code = NotFound desc = could not find container \"3eed686a2face2531b3b9085d7beece990680252e5928f6b8f4af051572e64e9\": container with ID starting with 
3eed686a2face2531b3b9085d7beece990680252e5928f6b8f4af051572e64e9 not found: ID does not exist" Mar 13 01:46:01.928710 master-0 kubenswrapper[19170]: I0313 01:46:01.928592 19170 scope.go:117] "RemoveContainer" containerID="8aba9ac1e3b085e7c04cb214ea7f8e75329fdcb19d7d852f11d653fb879f9e1b" Mar 13 01:46:01.929011 master-0 kubenswrapper[19170]: E0313 01:46:01.928967 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8aba9ac1e3b085e7c04cb214ea7f8e75329fdcb19d7d852f11d653fb879f9e1b\": container with ID starting with 8aba9ac1e3b085e7c04cb214ea7f8e75329fdcb19d7d852f11d653fb879f9e1b not found: ID does not exist" containerID="8aba9ac1e3b085e7c04cb214ea7f8e75329fdcb19d7d852f11d653fb879f9e1b" Mar 13 01:46:01.929069 master-0 kubenswrapper[19170]: I0313 01:46:01.929006 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aba9ac1e3b085e7c04cb214ea7f8e75329fdcb19d7d852f11d653fb879f9e1b"} err="failed to get container status \"8aba9ac1e3b085e7c04cb214ea7f8e75329fdcb19d7d852f11d653fb879f9e1b\": rpc error: code = NotFound desc = could not find container \"8aba9ac1e3b085e7c04cb214ea7f8e75329fdcb19d7d852f11d653fb879f9e1b\": container with ID starting with 8aba9ac1e3b085e7c04cb214ea7f8e75329fdcb19d7d852f11d653fb879f9e1b not found: ID does not exist" Mar 13 01:46:02.005546 master-0 kubenswrapper[19170]: I0313 01:46:02.005439 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:46:02.293482 master-0 kubenswrapper[19170]: I0313 01:46:02.293341 19170 generic.go:334] "Generic (PLEG): container finished" podID="dd118f31-fa5b-466a-a7a3-0bd0e788bd38" containerID="15e6bcf32a26bd96875fa2be8cdd04200634595fb86139799244a034f1c1bd7d" exitCode=0 Mar 13 01:46:02.293482 master-0 kubenswrapper[19170]: I0313 01:46:02.293432 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-051b7-volume-lvm-iscsi-0" event={"ID":"dd118f31-fa5b-466a-a7a3-0bd0e788bd38","Type":"ContainerDied","Data":"15e6bcf32a26bd96875fa2be8cdd04200634595fb86139799244a034f1c1bd7d"} Mar 13 01:46:02.296842 master-0 kubenswrapper[19170]: I0313 01:46:02.296794 19170 generic.go:334] "Generic (PLEG): container finished" podID="6c8289d1-5df7-4e68-b0b3-ea797ce78d32" containerID="ac7ecbed7e9faf9da3b15c4add3d82c7956a60dc56cf5a627ecc7791a11c85d0" exitCode=0 Mar 13 01:46:02.296919 master-0 kubenswrapper[19170]: I0313 01:46:02.296873 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-051b7-backup-0" event={"ID":"6c8289d1-5df7-4e68-b0b3-ea797ce78d32","Type":"ContainerDied","Data":"ac7ecbed7e9faf9da3b15c4add3d82c7956a60dc56cf5a627ecc7791a11c85d0"} Mar 13 01:46:02.940805 master-0 kubenswrapper[19170]: I0313 01:46:02.940747 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 13 01:46:02.956342 master-0 kubenswrapper[19170]: I0313 01:46:02.956294 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-transport-url-ironic-transport" Mar 13 01:46:03.128152 master-0 kubenswrapper[19170]: I0313 01:46:03.128106 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:03.154120 master-0 kubenswrapper[19170]: I0313 01:46:03.153278 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-db-create-xckgg" Mar 13 01:46:03.195359 master-0 kubenswrapper[19170]: I0313 01:46:03.195316 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-f67c-account-create-update-4h2l7" Mar 13 01:46:03.235660 master-0 kubenswrapper[19170]: I0313 01:46:03.234781 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-var-locks-cinder\") pod \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " Mar 13 01:46:03.235660 master-0 kubenswrapper[19170]: I0313 01:46:03.234893 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-etc-nvme\") pod \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " Mar 13 01:46:03.235660 master-0 kubenswrapper[19170]: I0313 01:46:03.234934 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-var-lib-cinder\") pod \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " Mar 13 01:46:03.235660 master-0 kubenswrapper[19170]: I0313 01:46:03.234990 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pss8h\" (UniqueName: \"kubernetes.io/projected/15bf14df-482c-4bc8-a222-25900daba262-kube-api-access-pss8h\") pod \"15bf14df-482c-4bc8-a222-25900daba262\" (UID: \"15bf14df-482c-4bc8-a222-25900daba262\") " Mar 13 01:46:03.235660 master-0 kubenswrapper[19170]: I0313 01:46:03.235045 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-scripts\") pod \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " Mar 13 01:46:03.235660 master-0 kubenswrapper[19170]: I0313 01:46:03.235168 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-config-data\") pod \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " Mar 13 01:46:03.235660 master-0 kubenswrapper[19170]: I0313 01:46:03.235200 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-etc-iscsi\") pod \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " Mar 13 01:46:03.235660 master-0 kubenswrapper[19170]: I0313 01:46:03.235221 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-dev\") pod \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " Mar 13 01:46:03.235660 master-0 kubenswrapper[19170]: I0313 01:46:03.235260 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-lib-modules\") pod \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " Mar 13 01:46:03.235660 master-0 kubenswrapper[19170]: I0313 01:46:03.235282 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-config-data-custom\") pod \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " Mar 13 01:46:03.235660 master-0 
kubenswrapper[19170]: I0313 01:46:03.235317 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-combined-ca-bundle\") pod \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " Mar 13 01:46:03.235660 master-0 kubenswrapper[19170]: I0313 01:46:03.235343 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-sys\") pod \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " Mar 13 01:46:03.235660 master-0 kubenswrapper[19170]: I0313 01:46:03.235369 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfplb\" (UniqueName: \"kubernetes.io/projected/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-kube-api-access-kfplb\") pod \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " Mar 13 01:46:03.235660 master-0 kubenswrapper[19170]: I0313 01:46:03.235503 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-etc-machine-id\") pod \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " Mar 13 01:46:03.235660 master-0 kubenswrapper[19170]: I0313 01:46:03.235538 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pscwm\" (UniqueName: \"kubernetes.io/projected/c72cc566-7a61-49ac-bdf3-a2d7e0f0e87d-kube-api-access-pscwm\") pod \"c72cc566-7a61-49ac-bdf3-a2d7e0f0e87d\" (UID: \"c72cc566-7a61-49ac-bdf3-a2d7e0f0e87d\") " Mar 13 01:46:03.235660 master-0 kubenswrapper[19170]: I0313 01:46:03.235608 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-run\") pod \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " Mar 13 01:46:03.236342 master-0 kubenswrapper[19170]: I0313 01:46:03.235703 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-var-locks-brick\") pod \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\" (UID: \"dd118f31-fa5b-466a-a7a3-0bd0e788bd38\") " Mar 13 01:46:03.236342 master-0 kubenswrapper[19170]: I0313 01:46:03.235769 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15bf14df-482c-4bc8-a222-25900daba262-operator-scripts\") pod \"15bf14df-482c-4bc8-a222-25900daba262\" (UID: \"15bf14df-482c-4bc8-a222-25900daba262\") " Mar 13 01:46:03.236342 master-0 kubenswrapper[19170]: I0313 01:46:03.235830 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c72cc566-7a61-49ac-bdf3-a2d7e0f0e87d-operator-scripts\") pod \"c72cc566-7a61-49ac-bdf3-a2d7e0f0e87d\" (UID: \"c72cc566-7a61-49ac-bdf3-a2d7e0f0e87d\") " Mar 13 01:46:03.246493 master-0 kubenswrapper[19170]: I0313 01:46:03.237438 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c72cc566-7a61-49ac-bdf3-a2d7e0f0e87d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c72cc566-7a61-49ac-bdf3-a2d7e0f0e87d" (UID: "c72cc566-7a61-49ac-bdf3-a2d7e0f0e87d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:46:03.246493 master-0 kubenswrapper[19170]: I0313 01:46:03.237523 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "dd118f31-fa5b-466a-a7a3-0bd0e788bd38" (UID: "dd118f31-fa5b-466a-a7a3-0bd0e788bd38"). InnerVolumeSpecName "var-locks-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:46:03.246493 master-0 kubenswrapper[19170]: I0313 01:46:03.237584 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "dd118f31-fa5b-466a-a7a3-0bd0e788bd38" (UID: "dd118f31-fa5b-466a-a7a3-0bd0e788bd38"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:46:03.246493 master-0 kubenswrapper[19170]: I0313 01:46:03.237606 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "dd118f31-fa5b-466a-a7a3-0bd0e788bd38" (UID: "dd118f31-fa5b-466a-a7a3-0bd0e788bd38"). InnerVolumeSpecName "var-lib-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:46:03.246493 master-0 kubenswrapper[19170]: I0313 01:46:03.237677 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-sys" (OuterVolumeSpecName: "sys") pod "dd118f31-fa5b-466a-a7a3-0bd0e788bd38" (UID: "dd118f31-fa5b-466a-a7a3-0bd0e788bd38"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:46:03.246493 master-0 kubenswrapper[19170]: I0313 01:46:03.237796 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "dd118f31-fa5b-466a-a7a3-0bd0e788bd38" (UID: "dd118f31-fa5b-466a-a7a3-0bd0e788bd38"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:46:03.246493 master-0 kubenswrapper[19170]: I0313 01:46:03.237849 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-dev" (OuterVolumeSpecName: "dev") pod "dd118f31-fa5b-466a-a7a3-0bd0e788bd38" (UID: "dd118f31-fa5b-466a-a7a3-0bd0e788bd38"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:46:03.246493 master-0 kubenswrapper[19170]: I0313 01:46:03.237819 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "dd118f31-fa5b-466a-a7a3-0bd0e788bd38" (UID: "dd118f31-fa5b-466a-a7a3-0bd0e788bd38"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:46:03.246493 master-0 kubenswrapper[19170]: I0313 01:46:03.237912 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "dd118f31-fa5b-466a-a7a3-0bd0e788bd38" (UID: "dd118f31-fa5b-466a-a7a3-0bd0e788bd38"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:46:03.246493 master-0 kubenswrapper[19170]: I0313 01:46:03.238054 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-run" (OuterVolumeSpecName: "run") pod "dd118f31-fa5b-466a-a7a3-0bd0e788bd38" (UID: "dd118f31-fa5b-466a-a7a3-0bd0e788bd38"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:46:03.246493 master-0 kubenswrapper[19170]: I0313 01:46:03.238121 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "dd118f31-fa5b-466a-a7a3-0bd0e788bd38" (UID: "dd118f31-fa5b-466a-a7a3-0bd0e788bd38"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:46:03.246493 master-0 kubenswrapper[19170]: I0313 01:46:03.241127 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15bf14df-482c-4bc8-a222-25900daba262-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "15bf14df-482c-4bc8-a222-25900daba262" (UID: "15bf14df-482c-4bc8-a222-25900daba262"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:46:03.254055 master-0 kubenswrapper[19170]: I0313 01:46:03.251387 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15bf14df-482c-4bc8-a222-25900daba262-kube-api-access-pss8h" (OuterVolumeSpecName: "kube-api-access-pss8h") pod "15bf14df-482c-4bc8-a222-25900daba262" (UID: "15bf14df-482c-4bc8-a222-25900daba262"). InnerVolumeSpecName "kube-api-access-pss8h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:46:03.254055 master-0 kubenswrapper[19170]: I0313 01:46:03.252192 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-scripts" (OuterVolumeSpecName: "scripts") pod "dd118f31-fa5b-466a-a7a3-0bd0e788bd38" (UID: "dd118f31-fa5b-466a-a7a3-0bd0e788bd38"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:46:03.254055 master-0 kubenswrapper[19170]: I0313 01:46:03.252606 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-kube-api-access-kfplb" (OuterVolumeSpecName: "kube-api-access-kfplb") pod "dd118f31-fa5b-466a-a7a3-0bd0e788bd38" (UID: "dd118f31-fa5b-466a-a7a3-0bd0e788bd38"). InnerVolumeSpecName "kube-api-access-kfplb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:46:03.254055 master-0 kubenswrapper[19170]: I0313 01:46:03.252705 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c72cc566-7a61-49ac-bdf3-a2d7e0f0e87d-kube-api-access-pscwm" (OuterVolumeSpecName: "kube-api-access-pscwm") pod "c72cc566-7a61-49ac-bdf3-a2d7e0f0e87d" (UID: "c72cc566-7a61-49ac-bdf3-a2d7e0f0e87d"). InnerVolumeSpecName "kube-api-access-pscwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:46:03.258653 master-0 kubenswrapper[19170]: I0313 01:46:03.257679 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "dd118f31-fa5b-466a-a7a3-0bd0e788bd38" (UID: "dd118f31-fa5b-466a-a7a3-0bd0e788bd38"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:46:03.323827 master-0 kubenswrapper[19170]: I0313 01:46:03.319905 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-86cc47d7f5-mk8sb" event={"ID":"bcadd2d8-9133-4e14-af9d-1c42095f91f5","Type":"ContainerStarted","Data":"240643fdb7a34753c23fdfea02608c02d8b27d4ac15badf6eec96b67f7236ea1"} Mar 13 01:46:03.325294 master-0 kubenswrapper[19170]: I0313 01:46:03.325253 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-create-xckgg" event={"ID":"15bf14df-482c-4bc8-a222-25900daba262","Type":"ContainerDied","Data":"ba7f7a4786d0a732b2a26c9556818342343b0b6c54ebdae306f738020d1fecfb"} Mar 13 01:46:03.325294 master-0 kubenswrapper[19170]: I0313 01:46:03.325291 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba7f7a4786d0a732b2a26c9556818342343b0b6c54ebdae306f738020d1fecfb" Mar 13 01:46:03.325419 master-0 kubenswrapper[19170]: I0313 01:46:03.325342 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-create-xckgg" Mar 13 01:46:03.337802 master-0 kubenswrapper[19170]: I0313 01:46:03.335874 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-051b7-volume-lvm-iscsi-0" event={"ID":"dd118f31-fa5b-466a-a7a3-0bd0e788bd38","Type":"ContainerDied","Data":"90d3a8238a7ed55324b42a0b45944a3d67f5d22099d35ec322081abe69a51584"} Mar 13 01:46:03.337802 master-0 kubenswrapper[19170]: I0313 01:46:03.335948 19170 scope.go:117] "RemoveContainer" containerID="15e6bcf32a26bd96875fa2be8cdd04200634595fb86139799244a034f1c1bd7d" Mar 13 01:46:03.337802 master-0 kubenswrapper[19170]: I0313 01:46:03.336192 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:03.339255 master-0 kubenswrapper[19170]: I0313 01:46:03.339045 19170 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-var-locks-brick\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:03.339255 master-0 kubenswrapper[19170]: I0313 01:46:03.339071 19170 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15bf14df-482c-4bc8-a222-25900daba262-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:03.339255 master-0 kubenswrapper[19170]: I0313 01:46:03.339084 19170 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c72cc566-7a61-49ac-bdf3-a2d7e0f0e87d-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:03.339255 master-0 kubenswrapper[19170]: I0313 01:46:03.339096 19170 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-var-locks-cinder\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:03.339255 master-0 kubenswrapper[19170]: I0313 01:46:03.339107 19170 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-etc-nvme\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:03.339255 master-0 kubenswrapper[19170]: I0313 01:46:03.339116 19170 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-var-lib-cinder\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:03.339255 master-0 kubenswrapper[19170]: I0313 01:46:03.339127 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pss8h\" (UniqueName: 
\"kubernetes.io/projected/15bf14df-482c-4bc8-a222-25900daba262-kube-api-access-pss8h\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:03.339255 master-0 kubenswrapper[19170]: I0313 01:46:03.339136 19170 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:03.339255 master-0 kubenswrapper[19170]: I0313 01:46:03.339144 19170 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-etc-iscsi\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:03.339255 master-0 kubenswrapper[19170]: I0313 01:46:03.339152 19170 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-dev\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:03.339255 master-0 kubenswrapper[19170]: I0313 01:46:03.339161 19170 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-lib-modules\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:03.339255 master-0 kubenswrapper[19170]: I0313 01:46:03.339201 19170 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-config-data-custom\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:03.339255 master-0 kubenswrapper[19170]: I0313 01:46:03.339213 19170 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-sys\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:03.339255 master-0 kubenswrapper[19170]: I0313 01:46:03.339223 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfplb\" (UniqueName: 
\"kubernetes.io/projected/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-kube-api-access-kfplb\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:03.339255 master-0 kubenswrapper[19170]: I0313 01:46:03.339232 19170 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:03.339783 master-0 kubenswrapper[19170]: I0313 01:46:03.339287 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pscwm\" (UniqueName: \"kubernetes.io/projected/c72cc566-7a61-49ac-bdf3-a2d7e0f0e87d-kube-api-access-pscwm\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:03.339783 master-0 kubenswrapper[19170]: I0313 01:46:03.339298 19170 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-run\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:03.341968 master-0 kubenswrapper[19170]: I0313 01:46:03.341805 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-f67c-account-create-update-4h2l7" event={"ID":"c72cc566-7a61-49ac-bdf3-a2d7e0f0e87d","Type":"ContainerDied","Data":"92e17d8d977df013c61dd97238eb2783b1bbcbbf92650a181ed851406d2e16dc"} Mar 13 01:46:03.342342 master-0 kubenswrapper[19170]: I0313 01:46:03.342310 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92e17d8d977df013c61dd97238eb2783b1bbcbbf92650a181ed851406d2e16dc" Mar 13 01:46:03.342342 master-0 kubenswrapper[19170]: I0313 01:46:03.342141 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-f67c-account-create-update-4h2l7" Mar 13 01:46:03.389909 master-0 kubenswrapper[19170]: I0313 01:46:03.389862 19170 scope.go:117] "RemoveContainer" containerID="2913e807e28ceb7cd27836e7db44b97fa6b8ac764f2166361b61ab84f7c4f07a" Mar 13 01:46:03.440655 master-0 kubenswrapper[19170]: I0313 01:46:03.440184 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01cdc83b-2243-40c6-9709-bb442724340c" path="/var/lib/kubelet/pods/01cdc83b-2243-40c6-9709-bb442724340c/volumes" Mar 13 01:46:03.609275 master-0 kubenswrapper[19170]: I0313 01:46:03.604925 19170 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-848b9c6b49-shsxt" podUID="75f3a8be-2740-470d-994a-e1eecf8327fb" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.232:5353: i/o timeout" Mar 13 01:46:03.611586 master-0 kubenswrapper[19170]: I0313 01:46:03.611518 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd118f31-fa5b-466a-a7a3-0bd0e788bd38" (UID: "dd118f31-fa5b-466a-a7a3-0bd0e788bd38"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:46:03.641017 master-0 kubenswrapper[19170]: W0313 01:46:03.640940 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc37c3652_7288_40f4_9a80_9cbf4a23fdf0.slice/crio-5a083ae4a068a951c9227dd4d1ed7407a657019978b9fea040d344cbe1347ccc WatchSource:0}: Error finding container 5a083ae4a068a951c9227dd4d1ed7407a657019978b9fea040d344cbe1347ccc: Status 404 returned error can't find the container with id 5a083ae4a068a951c9227dd4d1ed7407a657019978b9fea040d344cbe1347ccc Mar 13 01:46:03.659764 master-0 kubenswrapper[19170]: I0313 01:46:03.658772 19170 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:03.719912 master-0 kubenswrapper[19170]: I0313 01:46:03.719858 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-config-data" (OuterVolumeSpecName: "config-data") pod "dd118f31-fa5b-466a-a7a3-0bd0e788bd38" (UID: "dd118f31-fa5b-466a-a7a3-0bd0e788bd38"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:46:03.760930 master-0 kubenswrapper[19170]: I0313 01:46:03.760827 19170 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd118f31-fa5b-466a-a7a3-0bd0e788bd38-config-data\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:03.797117 master-0 kubenswrapper[19170]: I0313 01:46:03.797069 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-conductor-0"] Mar 13 01:46:03.797117 master-0 kubenswrapper[19170]: I0313 01:46:03.797108 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-051b7-scheduler-0"] Mar 13 01:46:03.797117 master-0 kubenswrapper[19170]: I0313 01:46:03.797120 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-556665446f-zdfzl"] Mar 13 01:46:03.997854 master-0 kubenswrapper[19170]: I0313 01:46:03.996979 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-051b7-volume-lvm-iscsi-0"] Mar 13 01:46:04.038479 master-0 kubenswrapper[19170]: I0313 01:46:04.037272 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-051b7-volume-lvm-iscsi-0"] Mar 13 01:46:04.064703 master-0 kubenswrapper[19170]: I0313 01:46:04.063775 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-051b7-volume-lvm-iscsi-0"] Mar 13 01:46:04.064703 master-0 kubenswrapper[19170]: E0313 01:46:04.064330 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c72cc566-7a61-49ac-bdf3-a2d7e0f0e87d" containerName="mariadb-account-create-update" Mar 13 01:46:04.064703 master-0 kubenswrapper[19170]: I0313 01:46:04.064346 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="c72cc566-7a61-49ac-bdf3-a2d7e0f0e87d" containerName="mariadb-account-create-update" Mar 13 01:46:04.064703 master-0 kubenswrapper[19170]: E0313 01:46:04.064362 19170 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="dd118f31-fa5b-466a-a7a3-0bd0e788bd38" containerName="cinder-volume" Mar 13 01:46:04.064703 master-0 kubenswrapper[19170]: I0313 01:46:04.064368 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd118f31-fa5b-466a-a7a3-0bd0e788bd38" containerName="cinder-volume" Mar 13 01:46:04.064703 master-0 kubenswrapper[19170]: E0313 01:46:04.064388 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd118f31-fa5b-466a-a7a3-0bd0e788bd38" containerName="probe" Mar 13 01:46:04.064703 master-0 kubenswrapper[19170]: I0313 01:46:04.064396 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd118f31-fa5b-466a-a7a3-0bd0e788bd38" containerName="probe" Mar 13 01:46:04.064703 master-0 kubenswrapper[19170]: E0313 01:46:04.064443 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15bf14df-482c-4bc8-a222-25900daba262" containerName="mariadb-database-create" Mar 13 01:46:04.064703 master-0 kubenswrapper[19170]: I0313 01:46:04.064449 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="15bf14df-482c-4bc8-a222-25900daba262" containerName="mariadb-database-create" Mar 13 01:46:04.064703 master-0 kubenswrapper[19170]: I0313 01:46:04.064685 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd118f31-fa5b-466a-a7a3-0bd0e788bd38" containerName="cinder-volume" Mar 13 01:46:04.064703 master-0 kubenswrapper[19170]: I0313 01:46:04.064696 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="15bf14df-482c-4bc8-a222-25900daba262" containerName="mariadb-database-create" Mar 13 01:46:04.065175 master-0 kubenswrapper[19170]: I0313 01:46:04.064736 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="c72cc566-7a61-49ac-bdf3-a2d7e0f0e87d" containerName="mariadb-account-create-update" Mar 13 01:46:04.065175 master-0 kubenswrapper[19170]: I0313 01:46:04.064750 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd118f31-fa5b-466a-a7a3-0bd0e788bd38" 
containerName="probe" Mar 13 01:46:04.068844 master-0 kubenswrapper[19170]: I0313 01:46:04.065907 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.075065 master-0 kubenswrapper[19170]: I0313 01:46:04.075009 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-051b7-volume-lvm-iscsi-config-data" Mar 13 01:46:04.078337 master-0 kubenswrapper[19170]: I0313 01:46:04.078282 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-051b7-volume-lvm-iscsi-0"] Mar 13 01:46:04.177394 master-0 kubenswrapper[19170]: I0313 01:46:04.177325 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb5d5951-9545-4252-b3aa-2aa48994edf0-scripts\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.177394 master-0 kubenswrapper[19170]: I0313 01:46:04.177396 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb5d5951-9545-4252-b3aa-2aa48994edf0-config-data-custom\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.177839 master-0 kubenswrapper[19170]: I0313 01:46:04.177420 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/bb5d5951-9545-4252-b3aa-2aa48994edf0-var-locks-brick\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.177839 master-0 kubenswrapper[19170]: I0313 01:46:04.177686 19170 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/bb5d5951-9545-4252-b3aa-2aa48994edf0-etc-iscsi\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.177839 master-0 kubenswrapper[19170]: I0313 01:46:04.177774 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/bb5d5951-9545-4252-b3aa-2aa48994edf0-dev\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.178010 master-0 kubenswrapper[19170]: I0313 01:46:04.177859 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bb5d5951-9545-4252-b3aa-2aa48994edf0-lib-modules\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.178010 master-0 kubenswrapper[19170]: I0313 01:46:04.177901 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb5d5951-9545-4252-b3aa-2aa48994edf0-combined-ca-bundle\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.178010 master-0 kubenswrapper[19170]: I0313 01:46:04.177982 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bb5d5951-9545-4252-b3aa-2aa48994edf0-etc-machine-id\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" 
Mar 13 01:46:04.178166 master-0 kubenswrapper[19170]: I0313 01:46:04.178079 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bb5d5951-9545-4252-b3aa-2aa48994edf0-sys\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.178166 master-0 kubenswrapper[19170]: I0313 01:46:04.178115 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/bb5d5951-9545-4252-b3aa-2aa48994edf0-var-locks-cinder\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.178166 master-0 kubenswrapper[19170]: I0313 01:46:04.178136 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/bb5d5951-9545-4252-b3aa-2aa48994edf0-etc-nvme\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.178166 master-0 kubenswrapper[19170]: I0313 01:46:04.178150 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/bb5d5951-9545-4252-b3aa-2aa48994edf0-var-lib-cinder\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.178166 master-0 kubenswrapper[19170]: I0313 01:46:04.178170 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbcws\" (UniqueName: \"kubernetes.io/projected/bb5d5951-9545-4252-b3aa-2aa48994edf0-kube-api-access-wbcws\") pod 
\"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.178397 master-0 kubenswrapper[19170]: I0313 01:46:04.178204 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb5d5951-9545-4252-b3aa-2aa48994edf0-config-data\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.178397 master-0 kubenswrapper[19170]: I0313 01:46:04.178246 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bb5d5951-9545-4252-b3aa-2aa48994edf0-run\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.280371 master-0 kubenswrapper[19170]: I0313 01:46:04.280292 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bb5d5951-9545-4252-b3aa-2aa48994edf0-sys\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.280371 master-0 kubenswrapper[19170]: I0313 01:46:04.280366 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/bb5d5951-9545-4252-b3aa-2aa48994edf0-var-locks-cinder\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.280651 master-0 kubenswrapper[19170]: I0313 01:46:04.280386 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/bb5d5951-9545-4252-b3aa-2aa48994edf0-etc-nvme\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.280651 master-0 kubenswrapper[19170]: I0313 01:46:04.280420 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/bb5d5951-9545-4252-b3aa-2aa48994edf0-var-lib-cinder\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.280651 master-0 kubenswrapper[19170]: I0313 01:46:04.280437 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbcws\" (UniqueName: \"kubernetes.io/projected/bb5d5951-9545-4252-b3aa-2aa48994edf0-kube-api-access-wbcws\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.280651 master-0 kubenswrapper[19170]: I0313 01:46:04.280458 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb5d5951-9545-4252-b3aa-2aa48994edf0-config-data\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.280651 master-0 kubenswrapper[19170]: I0313 01:46:04.280498 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bb5d5951-9545-4252-b3aa-2aa48994edf0-run\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.280651 master-0 kubenswrapper[19170]: I0313 01:46:04.280588 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/bb5d5951-9545-4252-b3aa-2aa48994edf0-scripts\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.280651 master-0 kubenswrapper[19170]: I0313 01:46:04.280620 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb5d5951-9545-4252-b3aa-2aa48994edf0-config-data-custom\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.280892 master-0 kubenswrapper[19170]: I0313 01:46:04.280692 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/bb5d5951-9545-4252-b3aa-2aa48994edf0-var-locks-brick\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.280892 master-0 kubenswrapper[19170]: I0313 01:46:04.280733 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/bb5d5951-9545-4252-b3aa-2aa48994edf0-etc-iscsi\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.280892 master-0 kubenswrapper[19170]: I0313 01:46:04.280773 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/bb5d5951-9545-4252-b3aa-2aa48994edf0-dev\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.280892 master-0 kubenswrapper[19170]: I0313 01:46:04.280805 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bb5d5951-9545-4252-b3aa-2aa48994edf0-lib-modules\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.280892 master-0 kubenswrapper[19170]: I0313 01:46:04.280841 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb5d5951-9545-4252-b3aa-2aa48994edf0-combined-ca-bundle\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.280892 master-0 kubenswrapper[19170]: I0313 01:46:04.280868 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bb5d5951-9545-4252-b3aa-2aa48994edf0-etc-machine-id\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.281053 master-0 kubenswrapper[19170]: I0313 01:46:04.280993 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bb5d5951-9545-4252-b3aa-2aa48994edf0-etc-machine-id\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.281053 master-0 kubenswrapper[19170]: I0313 01:46:04.281031 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bb5d5951-9545-4252-b3aa-2aa48994edf0-sys\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.281113 master-0 kubenswrapper[19170]: I0313 01:46:04.281089 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/bb5d5951-9545-4252-b3aa-2aa48994edf0-var-locks-cinder\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.281147 master-0 kubenswrapper[19170]: I0313 01:46:04.281124 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/bb5d5951-9545-4252-b3aa-2aa48994edf0-etc-nvme\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.281197 master-0 kubenswrapper[19170]: I0313 01:46:04.281174 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/bb5d5951-9545-4252-b3aa-2aa48994edf0-var-lib-cinder\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.283488 master-0 kubenswrapper[19170]: I0313 01:46:04.281839 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bb5d5951-9545-4252-b3aa-2aa48994edf0-run\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.283488 master-0 kubenswrapper[19170]: I0313 01:46:04.282036 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/bb5d5951-9545-4252-b3aa-2aa48994edf0-etc-iscsi\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.283488 master-0 kubenswrapper[19170]: I0313 01:46:04.282369 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" 
(UniqueName: \"kubernetes.io/host-path/bb5d5951-9545-4252-b3aa-2aa48994edf0-lib-modules\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.283488 master-0 kubenswrapper[19170]: I0313 01:46:04.282456 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/bb5d5951-9545-4252-b3aa-2aa48994edf0-dev\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.283488 master-0 kubenswrapper[19170]: I0313 01:46:04.282523 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/bb5d5951-9545-4252-b3aa-2aa48994edf0-var-locks-brick\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.285769 master-0 kubenswrapper[19170]: I0313 01:46:04.285744 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb5d5951-9545-4252-b3aa-2aa48994edf0-config-data\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.295068 master-0 kubenswrapper[19170]: I0313 01:46:04.294969 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb5d5951-9545-4252-b3aa-2aa48994edf0-config-data-custom\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.296576 master-0 kubenswrapper[19170]: I0313 01:46:04.296528 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/bb5d5951-9545-4252-b3aa-2aa48994edf0-scripts\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.298471 master-0 kubenswrapper[19170]: I0313 01:46:04.298434 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb5d5951-9545-4252-b3aa-2aa48994edf0-combined-ca-bundle\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.303667 master-0 kubenswrapper[19170]: I0313 01:46:04.302262 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbcws\" (UniqueName: \"kubernetes.io/projected/bb5d5951-9545-4252-b3aa-2aa48994edf0-kube-api-access-wbcws\") pod \"cinder-051b7-volume-lvm-iscsi-0\" (UID: \"bb5d5951-9545-4252-b3aa-2aa48994edf0\") " pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.377959 master-0 kubenswrapper[19170]: I0313 01:46:04.377805 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-051b7-scheduler-0" event={"ID":"c37c3652-7288-40f4-9a80-9cbf4a23fdf0","Type":"ContainerStarted","Data":"5a083ae4a068a951c9227dd4d1ed7407a657019978b9fea040d344cbe1347ccc"} Mar 13 01:46:04.393692 master-0 kubenswrapper[19170]: I0313 01:46:04.391434 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9c57cd77c-64hkx" event={"ID":"97b91c16-ada4-4c3a-8e56-bd4e6417ded3","Type":"ContainerStarted","Data":"6a0c643881c9cd9e4fca984b71071c1158db3c62332f3e5c8f45ac5978ca7e5d"} Mar 13 01:46:04.393692 master-0 kubenswrapper[19170]: I0313 01:46:04.392799 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9c57cd77c-64hkx" Mar 13 01:46:04.394687 master-0 kubenswrapper[19170]: I0313 01:46:04.394652 19170 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ironic-conductor-0" event={"ID":"3785df35-68b7-4d28-8b4a-39c3136ce823","Type":"ContainerStarted","Data":"2d5fe2df526094d9f0bc0ebb04ba4aabffe71f62f8bab3158203be71c1b3976f"} Mar 13 01:46:04.394751 master-0 kubenswrapper[19170]: I0313 01:46:04.394696 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"3785df35-68b7-4d28-8b4a-39c3136ce823","Type":"ContainerStarted","Data":"e00f5d65efccfac4536c86ceb63c086480905310cb6f43cb104aee2d27b8b164"} Mar 13 01:46:04.402715 master-0 kubenswrapper[19170]: I0313 01:46:04.397734 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:04.403476 master-0 kubenswrapper[19170]: I0313 01:46:04.402987 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-84f79fc9fc-7t7j5" event={"ID":"e18f1449-3fb3-43a8-98cd-2f8713bba98d","Type":"ContainerStarted","Data":"8fb4ea3c62ecb45c586bedb078fb9dbcd4289bb78ac351f9df89e9e1d574da1a"} Mar 13 01:46:04.403999 master-0 kubenswrapper[19170]: I0313 01:46:04.403971 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-84f79fc9fc-7t7j5" Mar 13 01:46:04.408120 master-0 kubenswrapper[19170]: I0313 01:46:04.408084 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-556665446f-zdfzl" event={"ID":"d67d826d-f6e4-40eb-aa3f-492586407ee0","Type":"ContainerStarted","Data":"d283b7c80747e9af98ed05b6811e0c31836102e61e08108a11e8da3cb81279a8"} Mar 13 01:46:04.408120 master-0 kubenswrapper[19170]: I0313 01:46:04.408121 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-556665446f-zdfzl" event={"ID":"d67d826d-f6e4-40eb-aa3f-492586407ee0","Type":"ContainerStarted","Data":"ac61b784c00ce80bc028cd5a6a0f84e6d2001aee5af1599cd970ae88abfa4db2"} Mar 13 01:46:04.411816 master-0 kubenswrapper[19170]: I0313 01:46:04.411696 19170 generic.go:334] 
"Generic (PLEG): container finished" podID="bcadd2d8-9133-4e14-af9d-1c42095f91f5" containerID="240643fdb7a34753c23fdfea02608c02d8b27d4ac15badf6eec96b67f7236ea1" exitCode=1 Mar 13 01:46:04.411816 master-0 kubenswrapper[19170]: I0313 01:46:04.411746 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-86cc47d7f5-mk8sb" event={"ID":"bcadd2d8-9133-4e14-af9d-1c42095f91f5","Type":"ContainerDied","Data":"240643fdb7a34753c23fdfea02608c02d8b27d4ac15badf6eec96b67f7236ea1"} Mar 13 01:46:04.430927 master-0 kubenswrapper[19170]: I0313 01:46:04.428624 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9c57cd77c-64hkx" podStartSLOduration=7.428605056 podStartE2EDuration="7.428605056s" podCreationTimestamp="2026-03-13 01:45:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:46:04.416668119 +0000 UTC m=+1625.224789099" watchObservedRunningTime="2026-03-13 01:46:04.428605056 +0000 UTC m=+1625.236726016" Mar 13 01:46:04.482965 master-0 kubenswrapper[19170]: I0313 01:46:04.482861 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-neutron-agent-84f79fc9fc-7t7j5" podStartSLOduration=3.7447526780000002 podStartE2EDuration="7.482831854s" podCreationTimestamp="2026-03-13 01:45:57 +0000 UTC" firstStartedPulling="2026-03-13 01:45:59.167293014 +0000 UTC m=+1619.975413974" lastFinishedPulling="2026-03-13 01:46:02.90537219 +0000 UTC m=+1623.713493150" observedRunningTime="2026-03-13 01:46:04.466281368 +0000 UTC m=+1625.274402338" watchObservedRunningTime="2026-03-13 01:46:04.482831854 +0000 UTC m=+1625.290952814" Mar 13 01:46:05.028915 master-0 kubenswrapper[19170]: I0313 01:46:05.028841 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-051b7-volume-lvm-iscsi-0"] Mar 13 01:46:05.300398 master-0 kubenswrapper[19170]: I0313 01:46:05.300368 19170 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-051b7-backup-0" Mar 13 01:46:05.440733 master-0 kubenswrapper[19170]: I0313 01:46:05.440188 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-scripts\") pod \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " Mar 13 01:46:05.440733 master-0 kubenswrapper[19170]: I0313 01:46:05.440280 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-var-locks-cinder\") pod \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " Mar 13 01:46:05.440733 master-0 kubenswrapper[19170]: I0313 01:46:05.440308 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-etc-machine-id\") pod \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " Mar 13 01:46:05.440733 master-0 kubenswrapper[19170]: I0313 01:46:05.440324 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-etc-iscsi\") pod \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " Mar 13 01:46:05.440733 master-0 kubenswrapper[19170]: I0313 01:46:05.440419 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-combined-ca-bundle\") pod \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " Mar 13 01:46:05.440733 master-0 kubenswrapper[19170]: I0313 
01:46:05.440465 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-run\") pod \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " Mar 13 01:46:05.440733 master-0 kubenswrapper[19170]: I0313 01:46:05.440532 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-dev\") pod \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " Mar 13 01:46:05.440733 master-0 kubenswrapper[19170]: I0313 01:46:05.440589 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-sys\") pod \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " Mar 13 01:46:05.440733 master-0 kubenswrapper[19170]: I0313 01:46:05.440707 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-etc-nvme\") pod \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " Mar 13 01:46:05.440733 master-0 kubenswrapper[19170]: I0313 01:46:05.440731 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-var-locks-brick\") pod \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " Mar 13 01:46:05.442652 master-0 kubenswrapper[19170]: I0313 01:46:05.440730 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod 
"6c8289d1-5df7-4e68-b0b3-ea797ce78d32" (UID: "6c8289d1-5df7-4e68-b0b3-ea797ce78d32"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:46:05.442652 master-0 kubenswrapper[19170]: I0313 01:46:05.440791 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "6c8289d1-5df7-4e68-b0b3-ea797ce78d32" (UID: "6c8289d1-5df7-4e68-b0b3-ea797ce78d32"). InnerVolumeSpecName "var-locks-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:46:05.442652 master-0 kubenswrapper[19170]: I0313 01:46:05.440796 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k62s8\" (UniqueName: \"kubernetes.io/projected/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-kube-api-access-k62s8\") pod \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " Mar 13 01:46:05.442652 master-0 kubenswrapper[19170]: I0313 01:46:05.440848 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-config-data\") pod \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " Mar 13 01:46:05.442652 master-0 kubenswrapper[19170]: I0313 01:46:05.440944 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-lib-modules\") pod \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " Mar 13 01:46:05.442652 master-0 kubenswrapper[19170]: I0313 01:46:05.440988 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-config-data-custom\") pod \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " Mar 13 01:46:05.442652 master-0 kubenswrapper[19170]: I0313 01:46:05.441014 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-var-lib-cinder\") pod \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\" (UID: \"6c8289d1-5df7-4e68-b0b3-ea797ce78d32\") " Mar 13 01:46:05.443481 master-0 kubenswrapper[19170]: I0313 01:46:05.443237 19170 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-var-locks-cinder\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:05.443481 master-0 kubenswrapper[19170]: I0313 01:46:05.443265 19170 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-etc-iscsi\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:05.443481 master-0 kubenswrapper[19170]: I0313 01:46:05.443295 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "6c8289d1-5df7-4e68-b0b3-ea797ce78d32" (UID: "6c8289d1-5df7-4e68-b0b3-ea797ce78d32"). InnerVolumeSpecName "var-lib-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:46:05.443481 master-0 kubenswrapper[19170]: I0313 01:46:05.443320 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6c8289d1-5df7-4e68-b0b3-ea797ce78d32" (UID: "6c8289d1-5df7-4e68-b0b3-ea797ce78d32"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:46:05.448765 master-0 kubenswrapper[19170]: I0313 01:46:05.447911 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-kube-api-access-k62s8" (OuterVolumeSpecName: "kube-api-access-k62s8") pod "6c8289d1-5df7-4e68-b0b3-ea797ce78d32" (UID: "6c8289d1-5df7-4e68-b0b3-ea797ce78d32"). InnerVolumeSpecName "kube-api-access-k62s8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:46:05.463258 master-0 kubenswrapper[19170]: I0313 01:46:05.463037 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-scripts" (OuterVolumeSpecName: "scripts") pod "6c8289d1-5df7-4e68-b0b3-ea797ce78d32" (UID: "6c8289d1-5df7-4e68-b0b3-ea797ce78d32"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:46:05.463442 master-0 kubenswrapper[19170]: I0313 01:46:05.463386 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "6c8289d1-5df7-4e68-b0b3-ea797ce78d32" (UID: "6c8289d1-5df7-4e68-b0b3-ea797ce78d32"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:46:05.463988 master-0 kubenswrapper[19170]: I0313 01:46:05.463748 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-run" (OuterVolumeSpecName: "run") pod "6c8289d1-5df7-4e68-b0b3-ea797ce78d32" (UID: "6c8289d1-5df7-4e68-b0b3-ea797ce78d32"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:46:05.463988 master-0 kubenswrapper[19170]: I0313 01:46:05.463821 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-dev" (OuterVolumeSpecName: "dev") pod "6c8289d1-5df7-4e68-b0b3-ea797ce78d32" (UID: "6c8289d1-5df7-4e68-b0b3-ea797ce78d32"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:46:05.463988 master-0 kubenswrapper[19170]: I0313 01:46:05.463841 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-sys" (OuterVolumeSpecName: "sys") pod "6c8289d1-5df7-4e68-b0b3-ea797ce78d32" (UID: "6c8289d1-5df7-4e68-b0b3-ea797ce78d32"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:46:05.463988 master-0 kubenswrapper[19170]: I0313 01:46:05.463861 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "6c8289d1-5df7-4e68-b0b3-ea797ce78d32" (UID: "6c8289d1-5df7-4e68-b0b3-ea797ce78d32"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:46:05.463988 master-0 kubenswrapper[19170]: I0313 01:46:05.463883 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "6c8289d1-5df7-4e68-b0b3-ea797ce78d32" (UID: "6c8289d1-5df7-4e68-b0b3-ea797ce78d32"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 01:46:05.475399 master-0 kubenswrapper[19170]: I0313 01:46:05.475357 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd118f31-fa5b-466a-a7a3-0bd0e788bd38" path="/var/lib/kubelet/pods/dd118f31-fa5b-466a-a7a3-0bd0e788bd38/volumes" Mar 13 01:46:05.480979 master-0 kubenswrapper[19170]: I0313 01:46:05.480931 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6c8289d1-5df7-4e68-b0b3-ea797ce78d32" (UID: "6c8289d1-5df7-4e68-b0b3-ea797ce78d32"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:46:05.497534 master-0 kubenswrapper[19170]: I0313 01:46:05.497452 19170 generic.go:334] "Generic (PLEG): container finished" podID="d67d826d-f6e4-40eb-aa3f-492586407ee0" containerID="d283b7c80747e9af98ed05b6811e0c31836102e61e08108a11e8da3cb81279a8" exitCode=0 Mar 13 01:46:05.504525 master-0 kubenswrapper[19170]: I0313 01:46:05.502922 19170 generic.go:334] "Generic (PLEG): container finished" podID="6c8289d1-5df7-4e68-b0b3-ea797ce78d32" containerID="6e8844314fedbd5f0ce4bd82249034233ffb9b589613da70d1272dba9290d61e" exitCode=0 Mar 13 01:46:05.504525 master-0 kubenswrapper[19170]: I0313 01:46:05.503073 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-051b7-backup-0" Mar 13 01:46:05.544988 master-0 kubenswrapper[19170]: I0313 01:46:05.544749 19170 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:05.544988 master-0 kubenswrapper[19170]: I0313 01:46:05.544786 19170 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-run\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:05.544988 master-0 kubenswrapper[19170]: I0313 01:46:05.544796 19170 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-dev\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:05.544988 master-0 kubenswrapper[19170]: I0313 01:46:05.544805 19170 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-sys\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:05.544988 master-0 kubenswrapper[19170]: I0313 01:46:05.544813 19170 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-etc-nvme\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:05.544988 master-0 kubenswrapper[19170]: I0313 01:46:05.544822 19170 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-var-locks-brick\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:05.544988 master-0 kubenswrapper[19170]: I0313 01:46:05.544830 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k62s8\" (UniqueName: \"kubernetes.io/projected/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-kube-api-access-k62s8\") on node \"master-0\" 
DevicePath \"\"" Mar 13 01:46:05.544988 master-0 kubenswrapper[19170]: I0313 01:46:05.544839 19170 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-lib-modules\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:05.544988 master-0 kubenswrapper[19170]: I0313 01:46:05.544848 19170 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-config-data-custom\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:05.544988 master-0 kubenswrapper[19170]: I0313 01:46:05.544858 19170 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-var-lib-cinder\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:05.544988 master-0 kubenswrapper[19170]: I0313 01:46:05.544866 19170 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:05.557182 master-0 kubenswrapper[19170]: I0313 01:46:05.556780 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c8289d1-5df7-4e68-b0b3-ea797ce78d32" (UID: "6c8289d1-5df7-4e68-b0b3-ea797ce78d32"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:46:05.651354 master-0 kubenswrapper[19170]: I0313 01:46:05.651275 19170 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:05.675777 master-0 kubenswrapper[19170]: I0313 01:46:05.675724 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-556665446f-zdfzl" event={"ID":"d67d826d-f6e4-40eb-aa3f-492586407ee0","Type":"ContainerDied","Data":"d283b7c80747e9af98ed05b6811e0c31836102e61e08108a11e8da3cb81279a8"} Mar 13 01:46:05.675777 master-0 kubenswrapper[19170]: I0313 01:46:05.675776 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-051b7-backup-0" event={"ID":"6c8289d1-5df7-4e68-b0b3-ea797ce78d32","Type":"ContainerDied","Data":"6e8844314fedbd5f0ce4bd82249034233ffb9b589613da70d1272dba9290d61e"} Mar 13 01:46:05.675777 master-0 kubenswrapper[19170]: I0313 01:46:05.675796 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-051b7-backup-0" event={"ID":"6c8289d1-5df7-4e68-b0b3-ea797ce78d32","Type":"ContainerDied","Data":"d2e068efbe66c2b3b12f03087018d5dde46a746fd2992d5561331ffc261cb979"} Mar 13 01:46:05.675961 master-0 kubenswrapper[19170]: I0313 01:46:05.675807 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-051b7-scheduler-0" event={"ID":"c37c3652-7288-40f4-9a80-9cbf4a23fdf0","Type":"ContainerStarted","Data":"9df76b2373daa9f596a70b43f39971518e33de4e72c4a4d5185a83f7b94e7bca"} Mar 13 01:46:05.675961 master-0 kubenswrapper[19170]: I0313 01:46:05.675819 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-051b7-volume-lvm-iscsi-0" event={"ID":"bb5d5951-9545-4252-b3aa-2aa48994edf0","Type":"ContainerStarted","Data":"9d2ef3d65a08ce359c5e7837dbdc43ae0b762c05e200bf9602dceaee6b230804"} Mar 13 01:46:05.675961 master-0 
kubenswrapper[19170]: I0313 01:46:05.675831 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-86cc47d7f5-mk8sb" event={"ID":"bcadd2d8-9133-4e14-af9d-1c42095f91f5","Type":"ContainerStarted","Data":"95e0b854bbd0ee6b426a81ebb0f5c3c0f0d159f00119b8a6ede5895cf4c8ab94"}
Mar 13 01:46:05.675961 master-0 kubenswrapper[19170]: I0313 01:46:05.675856 19170 scope.go:117] "RemoveContainer" containerID="ac7ecbed7e9faf9da3b15c4add3d82c7956a60dc56cf5a627ecc7791a11c85d0"
Mar 13 01:46:05.696995 master-0 kubenswrapper[19170]: I0313 01:46:05.696950 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-config-data" (OuterVolumeSpecName: "config-data") pod "6c8289d1-5df7-4e68-b0b3-ea797ce78d32" (UID: "6c8289d1-5df7-4e68-b0b3-ea797ce78d32"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 01:46:05.741850 master-0 kubenswrapper[19170]: I0313 01:46:05.738846 19170 scope.go:117] "RemoveContainer" containerID="6e8844314fedbd5f0ce4bd82249034233ffb9b589613da70d1272dba9290d61e"
Mar 13 01:46:05.755276 master-0 kubenswrapper[19170]: I0313 01:46:05.755172 19170 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c8289d1-5df7-4e68-b0b3-ea797ce78d32-config-data\") on node \"master-0\" DevicePath \"\""
Mar 13 01:46:05.779965 master-0 kubenswrapper[19170]: I0313 01:46:05.776807 19170 scope.go:117] "RemoveContainer" containerID="ac7ecbed7e9faf9da3b15c4add3d82c7956a60dc56cf5a627ecc7791a11c85d0"
Mar 13 01:46:05.780769 master-0 kubenswrapper[19170]: E0313 01:46:05.780724 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac7ecbed7e9faf9da3b15c4add3d82c7956a60dc56cf5a627ecc7791a11c85d0\": container with ID starting with ac7ecbed7e9faf9da3b15c4add3d82c7956a60dc56cf5a627ecc7791a11c85d0 not found: ID does not exist" containerID="ac7ecbed7e9faf9da3b15c4add3d82c7956a60dc56cf5a627ecc7791a11c85d0"
Mar 13 01:46:05.780831 master-0 kubenswrapper[19170]: I0313 01:46:05.780769 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac7ecbed7e9faf9da3b15c4add3d82c7956a60dc56cf5a627ecc7791a11c85d0"} err="failed to get container status \"ac7ecbed7e9faf9da3b15c4add3d82c7956a60dc56cf5a627ecc7791a11c85d0\": rpc error: code = NotFound desc = could not find container \"ac7ecbed7e9faf9da3b15c4add3d82c7956a60dc56cf5a627ecc7791a11c85d0\": container with ID starting with ac7ecbed7e9faf9da3b15c4add3d82c7956a60dc56cf5a627ecc7791a11c85d0 not found: ID does not exist"
Mar 13 01:46:05.780831 master-0 kubenswrapper[19170]: I0313 01:46:05.780798 19170 scope.go:117] "RemoveContainer" containerID="6e8844314fedbd5f0ce4bd82249034233ffb9b589613da70d1272dba9290d61e"
Mar 13 01:46:05.785769 master-0 kubenswrapper[19170]: E0313 01:46:05.785731 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e8844314fedbd5f0ce4bd82249034233ffb9b589613da70d1272dba9290d61e\": container with ID starting with 6e8844314fedbd5f0ce4bd82249034233ffb9b589613da70d1272dba9290d61e not found: ID does not exist" containerID="6e8844314fedbd5f0ce4bd82249034233ffb9b589613da70d1272dba9290d61e"
Mar 13 01:46:05.785843 master-0 kubenswrapper[19170]: I0313 01:46:05.785786 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e8844314fedbd5f0ce4bd82249034233ffb9b589613da70d1272dba9290d61e"} err="failed to get container status \"6e8844314fedbd5f0ce4bd82249034233ffb9b589613da70d1272dba9290d61e\": rpc error: code = NotFound desc = could not find container \"6e8844314fedbd5f0ce4bd82249034233ffb9b589613da70d1272dba9290d61e\": container with ID starting with 6e8844314fedbd5f0ce4bd82249034233ffb9b589613da70d1272dba9290d61e not found: ID does not exist"
Mar 13 01:46:05.856789 master-0 kubenswrapper[19170]: I0313 01:46:05.856704 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-051b7-backup-0"]
Mar 13 01:46:05.887154 master-0 kubenswrapper[19170]: I0313 01:46:05.887109 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-051b7-backup-0"]
Mar 13 01:46:05.905515 master-0 kubenswrapper[19170]: I0313 01:46:05.905472 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-051b7-backup-0"]
Mar 13 01:46:05.907927 master-0 kubenswrapper[19170]: E0313 01:46:05.906251 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c8289d1-5df7-4e68-b0b3-ea797ce78d32" containerName="cinder-backup"
Mar 13 01:46:05.907927 master-0 kubenswrapper[19170]: I0313 01:46:05.906272 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c8289d1-5df7-4e68-b0b3-ea797ce78d32" containerName="cinder-backup"
Mar 13 01:46:05.907927 master-0 kubenswrapper[19170]: E0313 01:46:05.906331 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c8289d1-5df7-4e68-b0b3-ea797ce78d32" containerName="probe"
Mar 13 01:46:05.907927 master-0 kubenswrapper[19170]: I0313 01:46:05.906340 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c8289d1-5df7-4e68-b0b3-ea797ce78d32" containerName="probe"
Mar 13 01:46:05.907927 master-0 kubenswrapper[19170]: I0313 01:46:05.906577 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c8289d1-5df7-4e68-b0b3-ea797ce78d32" containerName="probe"
Mar 13 01:46:05.907927 master-0 kubenswrapper[19170]: I0313 01:46:05.906618 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c8289d1-5df7-4e68-b0b3-ea797ce78d32" containerName="cinder-backup"
Mar 13 01:46:05.907927 master-0 kubenswrapper[19170]: I0313 01:46:05.907895 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:05.929412 master-0 kubenswrapper[19170]: I0313 01:46:05.913121 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-051b7-backup-0"]
Mar 13 01:46:05.960525 master-0 kubenswrapper[19170]: I0313 01:46:05.953560 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-051b7-backup-config-data"
Mar 13 01:46:06.064234 master-0 kubenswrapper[19170]: I0313 01:46:06.064171 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c04c955-8b3a-4d71-9889-066b1a0732a6-config-data\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.064505 master-0 kubenswrapper[19170]: I0313 01:46:06.064491 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7c04c955-8b3a-4d71-9889-066b1a0732a6-etc-iscsi\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.064613 master-0 kubenswrapper[19170]: I0313 01:46:06.064599 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7c04c955-8b3a-4d71-9889-066b1a0732a6-dev\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.064725 master-0 kubenswrapper[19170]: I0313 01:46:06.064710 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7c04c955-8b3a-4d71-9889-066b1a0732a6-sys\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.064840 master-0 kubenswrapper[19170]: I0313 01:46:06.064822 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/7c04c955-8b3a-4d71-9889-066b1a0732a6-var-lib-cinder\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.064995 master-0 kubenswrapper[19170]: I0313 01:46:06.064967 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7c04c955-8b3a-4d71-9889-066b1a0732a6-config-data-custom\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.065087 master-0 kubenswrapper[19170]: I0313 01:46:06.065074 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7c04c955-8b3a-4d71-9889-066b1a0732a6-lib-modules\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.065225 master-0 kubenswrapper[19170]: I0313 01:46:06.065203 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7c04c955-8b3a-4d71-9889-066b1a0732a6-var-locks-brick\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.065330 master-0 kubenswrapper[19170]: I0313 01:46:06.065317 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7c04c955-8b3a-4d71-9889-066b1a0732a6-etc-nvme\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.070663 master-0 kubenswrapper[19170]: I0313 01:46:06.066015 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7c04c955-8b3a-4d71-9889-066b1a0732a6-run\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.070663 master-0 kubenswrapper[19170]: I0313 01:46:06.066121 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/7c04c955-8b3a-4d71-9889-066b1a0732a6-var-locks-cinder\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.070663 master-0 kubenswrapper[19170]: I0313 01:46:06.066252 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c04c955-8b3a-4d71-9889-066b1a0732a6-scripts\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.070663 master-0 kubenswrapper[19170]: I0313 01:46:06.066332 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfp5b\" (UniqueName: \"kubernetes.io/projected/7c04c955-8b3a-4d71-9889-066b1a0732a6-kube-api-access-mfp5b\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.070663 master-0 kubenswrapper[19170]: I0313 01:46:06.066441 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7c04c955-8b3a-4d71-9889-066b1a0732a6-etc-machine-id\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.070663 master-0 kubenswrapper[19170]: I0313 01:46:06.066495 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c04c955-8b3a-4d71-9889-066b1a0732a6-combined-ca-bundle\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.170289 master-0 kubenswrapper[19170]: I0313 01:46:06.168298 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7c04c955-8b3a-4d71-9889-066b1a0732a6-etc-iscsi\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.170289 master-0 kubenswrapper[19170]: I0313 01:46:06.168362 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7c04c955-8b3a-4d71-9889-066b1a0732a6-dev\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.170289 master-0 kubenswrapper[19170]: I0313 01:46:06.168398 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7c04c955-8b3a-4d71-9889-066b1a0732a6-sys\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.170289 master-0 kubenswrapper[19170]: I0313 01:46:06.168419 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/7c04c955-8b3a-4d71-9889-066b1a0732a6-var-lib-cinder\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.170289 master-0 kubenswrapper[19170]: I0313 01:46:06.168461 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7c04c955-8b3a-4d71-9889-066b1a0732a6-config-data-custom\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.170289 master-0 kubenswrapper[19170]: I0313 01:46:06.168484 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7c04c955-8b3a-4d71-9889-066b1a0732a6-lib-modules\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.170289 master-0 kubenswrapper[19170]: I0313 01:46:06.168523 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7c04c955-8b3a-4d71-9889-066b1a0732a6-var-locks-brick\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.170289 master-0 kubenswrapper[19170]: I0313 01:46:06.168555 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7c04c955-8b3a-4d71-9889-066b1a0732a6-etc-nvme\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.170289 master-0 kubenswrapper[19170]: I0313 01:46:06.168570 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7c04c955-8b3a-4d71-9889-066b1a0732a6-run\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.170289 master-0 kubenswrapper[19170]: I0313 01:46:06.168589 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/7c04c955-8b3a-4d71-9889-066b1a0732a6-var-locks-cinder\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.170289 master-0 kubenswrapper[19170]: I0313 01:46:06.168622 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c04c955-8b3a-4d71-9889-066b1a0732a6-scripts\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.170289 master-0 kubenswrapper[19170]: I0313 01:46:06.168752 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7c04c955-8b3a-4d71-9889-066b1a0732a6-sys\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.170289 master-0 kubenswrapper[19170]: I0313 01:46:06.169192 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/7c04c955-8b3a-4d71-9889-066b1a0732a6-var-lib-cinder\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.170289 master-0 kubenswrapper[19170]: I0313 01:46:06.169303 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7c04c955-8b3a-4d71-9889-066b1a0732a6-lib-modules\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.170289 master-0 kubenswrapper[19170]: I0313 01:46:06.169338 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7c04c955-8b3a-4d71-9889-066b1a0732a6-var-locks-brick\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.170289 master-0 kubenswrapper[19170]: I0313 01:46:06.169368 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7c04c955-8b3a-4d71-9889-066b1a0732a6-etc-nvme\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.170289 master-0 kubenswrapper[19170]: I0313 01:46:06.169384 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7c04c955-8b3a-4d71-9889-066b1a0732a6-run\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.170289 master-0 kubenswrapper[19170]: I0313 01:46:06.169409 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/7c04c955-8b3a-4d71-9889-066b1a0732a6-var-locks-cinder\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.170289 master-0 kubenswrapper[19170]: I0313 01:46:06.169428 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7c04c955-8b3a-4d71-9889-066b1a0732a6-dev\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.170289 master-0 kubenswrapper[19170]: I0313 01:46:06.169444 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7c04c955-8b3a-4d71-9889-066b1a0732a6-etc-iscsi\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.170289 master-0 kubenswrapper[19170]: I0313 01:46:06.169559 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfp5b\" (UniqueName: \"kubernetes.io/projected/7c04c955-8b3a-4d71-9889-066b1a0732a6-kube-api-access-mfp5b\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.170289 master-0 kubenswrapper[19170]: I0313 01:46:06.169602 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7c04c955-8b3a-4d71-9889-066b1a0732a6-etc-machine-id\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.170289 master-0 kubenswrapper[19170]: I0313 01:46:06.169626 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c04c955-8b3a-4d71-9889-066b1a0732a6-combined-ca-bundle\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.170289 master-0 kubenswrapper[19170]: I0313 01:46:06.169679 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c04c955-8b3a-4d71-9889-066b1a0732a6-config-data\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.170289 master-0 kubenswrapper[19170]: I0313 01:46:06.169980 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7c04c955-8b3a-4d71-9889-066b1a0732a6-etc-machine-id\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.181672 master-0 kubenswrapper[19170]: I0313 01:46:06.181140 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c04c955-8b3a-4d71-9889-066b1a0732a6-config-data\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.185675 master-0 kubenswrapper[19170]: I0313 01:46:06.184897 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c04c955-8b3a-4d71-9889-066b1a0732a6-scripts\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.191648 master-0 kubenswrapper[19170]: I0313 01:46:06.187030 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c04c955-8b3a-4d71-9889-066b1a0732a6-combined-ca-bundle\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.196745 master-0 kubenswrapper[19170]: I0313 01:46:06.193487 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7c04c955-8b3a-4d71-9889-066b1a0732a6-config-data-custom\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.235910 master-0 kubenswrapper[19170]: I0313 01:46:06.230369 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfp5b\" (UniqueName: \"kubernetes.io/projected/7c04c955-8b3a-4d71-9889-066b1a0732a6-kube-api-access-mfp5b\") pod \"cinder-051b7-backup-0\" (UID: \"7c04c955-8b3a-4d71-9889-066b1a0732a6\") " pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.292753 master-0 kubenswrapper[19170]: I0313 01:46:06.292623 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-051b7-backup-0"
Mar 13 01:46:06.611194 master-0 kubenswrapper[19170]: I0313 01:46:06.608798 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-051b7-scheduler-0" event={"ID":"c37c3652-7288-40f4-9a80-9cbf4a23fdf0","Type":"ContainerStarted","Data":"a003320d1c880af40905affac501a14e24f72a68c4b37568f16160bacbccd272"}
Mar 13 01:46:06.638929 master-0 kubenswrapper[19170]: I0313 01:46:06.636872 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-051b7-scheduler-0" podStartSLOduration=5.636855851 podStartE2EDuration="5.636855851s" podCreationTimestamp="2026-03-13 01:46:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:46:06.63504944 +0000 UTC m=+1627.443170400" watchObservedRunningTime="2026-03-13 01:46:06.636855851 +0000 UTC m=+1627.444976811"
Mar 13 01:46:06.647493 master-0 kubenswrapper[19170]: I0313 01:46:06.644466 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-556665446f-zdfzl" event={"ID":"d67d826d-f6e4-40eb-aa3f-492586407ee0","Type":"ContainerStarted","Data":"6ece1eaa02653a4c4483498a0cfdc7a1f3532e8b8ab2e4a6eec7baa08c92a51e"}
Mar 13 01:46:06.647493 master-0 kubenswrapper[19170]: I0313 01:46:06.644535 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-556665446f-zdfzl" event={"ID":"d67d826d-f6e4-40eb-aa3f-492586407ee0","Type":"ContainerStarted","Data":"ed2e7358ebc6b35e8e75dbeb0baffaee01930a0652d7d75fdc901ee230b473a9"}
Mar 13 01:46:06.647493 master-0 kubenswrapper[19170]: I0313 01:46:06.646056 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-556665446f-zdfzl"
Mar 13 01:46:06.666142 master-0 kubenswrapper[19170]: I0313 01:46:06.666099 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-051b7-volume-lvm-iscsi-0" event={"ID":"bb5d5951-9545-4252-b3aa-2aa48994edf0","Type":"ContainerStarted","Data":"c89908334f7a1853dd77977879f6967f5e78f7426773be7d7ed629b44d2eeb6d"}
Mar 13 01:46:06.666375 master-0 kubenswrapper[19170]: I0313 01:46:06.666360 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-051b7-volume-lvm-iscsi-0" event={"ID":"bb5d5951-9545-4252-b3aa-2aa48994edf0","Type":"ContainerStarted","Data":"c279fb7dbce272cb7e6b616ba776cbe7218ed61f093c769fcf260961cea5fd53"}
Mar 13 01:46:06.673485 master-0 kubenswrapper[19170]: I0313 01:46:06.673430 19170 generic.go:334] "Generic (PLEG): container finished" podID="bcadd2d8-9133-4e14-af9d-1c42095f91f5" containerID="95e0b854bbd0ee6b426a81ebb0f5c3c0f0d159f00119b8a6ede5895cf4c8ab94" exitCode=0
Mar 13 01:46:06.673745 master-0 kubenswrapper[19170]: I0313 01:46:06.673726 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-86cc47d7f5-mk8sb" event={"ID":"bcadd2d8-9133-4e14-af9d-1c42095f91f5","Type":"ContainerDied","Data":"95e0b854bbd0ee6b426a81ebb0f5c3c0f0d159f00119b8a6ede5895cf4c8ab94"}
Mar 13 01:46:06.673856 master-0 kubenswrapper[19170]: I0313 01:46:06.673843 19170 scope.go:117] "RemoveContainer" containerID="240643fdb7a34753c23fdfea02608c02d8b27d4ac15badf6eec96b67f7236ea1"
Mar 13 01:46:06.690275 master-0 kubenswrapper[19170]: I0313 01:46:06.690227 19170 scope.go:117] "RemoveContainer" containerID="240643fdb7a34753c23fdfea02608c02d8b27d4ac15badf6eec96b67f7236ea1"
Mar 13 01:46:06.726657 master-0 kubenswrapper[19170]: I0313 01:46:06.723513 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-556665446f-zdfzl" podStartSLOduration=5.723496644 podStartE2EDuration="5.723496644s" podCreationTimestamp="2026-03-13 01:46:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:46:06.683025413 +0000 UTC m=+1627.491146373" watchObservedRunningTime="2026-03-13 01:46:06.723496644 +0000 UTC m=+1627.531617594"
Mar 13 01:46:06.777035 master-0 kubenswrapper[19170]: I0313 01:46:06.776990 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-neutron-agent-84f79fc9fc-7t7j5"
Mar 13 01:46:06.789290 master-0 kubenswrapper[19170]: I0313 01:46:06.789198 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-051b7-volume-lvm-iscsi-0" podStartSLOduration=3.789178855 podStartE2EDuration="3.789178855s" podCreationTimestamp="2026-03-13 01:46:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:46:06.76417928 +0000 UTC m=+1627.572300240" watchObservedRunningTime="2026-03-13 01:46:06.789178855 +0000 UTC m=+1627.597299815"
Mar 13 01:46:06.800875 master-0 kubenswrapper[19170]: E0313 01:46:06.794825 19170 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_init_ironic-86cc47d7f5-mk8sb_openstack_bcadd2d8-9133-4e14-af9d-1c42095f91f5_0 in pod sandbox 4865a89b78ff8fbceda9b5950940d733b12f5cd684365b375f8d75f88f1e0833 from index: no such id: '240643fdb7a34753c23fdfea02608c02d8b27d4ac15badf6eec96b67f7236ea1'" containerID="240643fdb7a34753c23fdfea02608c02d8b27d4ac15badf6eec96b67f7236ea1"
Mar 13 01:46:06.800875 master-0 kubenswrapper[19170]: E0313 01:46:06.794873 19170 kuberuntime_container.go:896] "Unhandled Error" err="failed to remove pod init container \"init\": rpc error: code = Unknown desc = failed to delete container k8s_init_ironic-86cc47d7f5-mk8sb_openstack_bcadd2d8-9133-4e14-af9d-1c42095f91f5_0 in pod sandbox 4865a89b78ff8fbceda9b5950940d733b12f5cd684365b375f8d75f88f1e0833 from index: no such id: '240643fdb7a34753c23fdfea02608c02d8b27d4ac15badf6eec96b67f7236ea1'; Skipping pod \"ironic-86cc47d7f5-mk8sb_openstack(bcadd2d8-9133-4e14-af9d-1c42095f91f5)\"" logger="UnhandledError"
Mar 13 01:46:07.011663 master-0 kubenswrapper[19170]: I0313 01:46:07.009938 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-051b7-scheduler-0"
Mar 13 01:46:07.053153 master-0 kubenswrapper[19170]: I0313 01:46:07.050093 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-051b7-backup-0"]
Mar 13 01:46:07.438044 master-0 kubenswrapper[19170]: I0313 01:46:07.437978 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c8289d1-5df7-4e68-b0b3-ea797ce78d32" path="/var/lib/kubelet/pods/6c8289d1-5df7-4e68-b0b3-ea797ce78d32/volumes"
Mar 13 01:46:07.795073 master-0 kubenswrapper[19170]: I0313 01:46:07.793943 19170 generic.go:334] "Generic (PLEG): container finished" podID="e18f1449-3fb3-43a8-98cd-2f8713bba98d" containerID="8fb4ea3c62ecb45c586bedb078fb9dbcd4289bb78ac351f9df89e9e1d574da1a" exitCode=1
Mar 13 01:46:07.795073 master-0 kubenswrapper[19170]: I0313 01:46:07.794063 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-84f79fc9fc-7t7j5" event={"ID":"e18f1449-3fb3-43a8-98cd-2f8713bba98d","Type":"ContainerDied","Data":"8fb4ea3c62ecb45c586bedb078fb9dbcd4289bb78ac351f9df89e9e1d574da1a"}
Mar 13 01:46:07.795073 master-0 kubenswrapper[19170]: I0313 01:46:07.794732 19170 scope.go:117] "RemoveContainer" containerID="8fb4ea3c62ecb45c586bedb078fb9dbcd4289bb78ac351f9df89e9e1d574da1a"
Mar 13 01:46:07.823510 master-0 kubenswrapper[19170]: I0313 01:46:07.817557 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-86cc47d7f5-mk8sb" event={"ID":"bcadd2d8-9133-4e14-af9d-1c42095f91f5","Type":"ContainerStarted","Data":"ae5da2d722ff861c6d25510514887033d97fa5b85ad1f2332b2df4d2991191b9"}
Mar 13 01:46:07.858655 master-0 kubenswrapper[19170]: I0313 01:46:07.842013 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-051b7-backup-0" event={"ID":"7c04c955-8b3a-4d71-9889-066b1a0732a6","Type":"ContainerStarted","Data":"77c65d9260c469cd2f68df7f32fdd84f688c27698f9d541cb56b112d0bf0f0e9"}
Mar 13 01:46:07.858655 master-0 kubenswrapper[19170]: I0313 01:46:07.845141 19170 generic.go:334] "Generic (PLEG): container finished" podID="3785df35-68b7-4d28-8b4a-39c3136ce823" containerID="2d5fe2df526094d9f0bc0ebb04ba4aabffe71f62f8bab3158203be71c1b3976f" exitCode=0
Mar 13 01:46:07.858655 master-0 kubenswrapper[19170]: I0313 01:46:07.845931 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"3785df35-68b7-4d28-8b4a-39c3136ce823","Type":"ContainerDied","Data":"2d5fe2df526094d9f0bc0ebb04ba4aabffe71f62f8bab3158203be71c1b3976f"}
Mar 13 01:46:07.956766 master-0 kubenswrapper[19170]: I0313 01:46:07.956639 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-84f79fc9fc-7t7j5"
Mar 13 01:46:07.956766 master-0 kubenswrapper[19170]: I0313 01:46:07.956697 19170 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ironic-neutron-agent-84f79fc9fc-7t7j5"
Mar 13 01:46:08.594709 master-0 kubenswrapper[19170]: I0313 01:46:08.593895 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9c57cd77c-64hkx"
Mar 13 01:46:08.767170 master-0 kubenswrapper[19170]: I0313 01:46:08.766599 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56c5578c7c-7p22w"]
Mar 13 01:46:08.767170 master-0 kubenswrapper[19170]: I0313 01:46:08.766845 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56c5578c7c-7p22w" podUID="9a89c46c-68a8-4baa-b926-3cbd5b85161c" containerName="dnsmasq-dns" containerID="cri-o://4670c669e9fc6d1cbd60890090cca9047749c7b1b6b564a16bc19e770fad5603" gracePeriod=10
Mar 13 01:46:08.865473 master-0 kubenswrapper[19170]: I0313 01:46:08.864041 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-051b7-api-0"
Mar 13 01:46:08.946656 master-0 kubenswrapper[19170]: I0313 01:46:08.942677 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-84f79fc9fc-7t7j5" event={"ID":"e18f1449-3fb3-43a8-98cd-2f8713bba98d","Type":"ContainerStarted","Data":"a7ad73cd94faf0b9a1c94e6e2e18b473b70290d8191812e3554dd658df9e1991"}
Mar 13 01:46:08.946656 master-0 kubenswrapper[19170]: I0313 01:46:08.943983 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-84f79fc9fc-7t7j5"
Mar 13 01:46:08.974672 master-0 kubenswrapper[19170]: I0313 01:46:08.966990 19170 generic.go:334] "Generic (PLEG): container finished" podID="bcadd2d8-9133-4e14-af9d-1c42095f91f5" containerID="1fc5e2b7358b8d0fe8e7be99d741679726c0f182ffb45cc871d863e78bb9d031" exitCode=1
Mar 13 01:46:08.974672 master-0 kubenswrapper[19170]: I0313 01:46:08.967066 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-86cc47d7f5-mk8sb" event={"ID":"bcadd2d8-9133-4e14-af9d-1c42095f91f5","Type":"ContainerDied","Data":"1fc5e2b7358b8d0fe8e7be99d741679726c0f182ffb45cc871d863e78bb9d031"}
Mar 13 01:46:08.974672 master-0 kubenswrapper[19170]: I0313 01:46:08.967886 19170 scope.go:117] "RemoveContainer" containerID="1fc5e2b7358b8d0fe8e7be99d741679726c0f182ffb45cc871d863e78bb9d031"
Mar 13 01:46:09.020038 master-0 kubenswrapper[19170]: I0313 01:46:09.019998 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-051b7-backup-0" event={"ID":"7c04c955-8b3a-4d71-9889-066b1a0732a6","Type":"ContainerStarted","Data":"f31328dc8c4fce97b8bffb6b9145412276d8c9b853d6915b229ac0f5638aaa31"}
Mar 13 01:46:09.021556 master-0 kubenswrapper[19170]: I0313 01:46:09.021538 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-051b7-backup-0" event={"ID":"7c04c955-8b3a-4d71-9889-066b1a0732a6","Type":"ContainerStarted","Data":"a11c57d6c65ae136397ddde0415db35d61b58f253a9bf6ce609ba1c7a5a405c3"}
Mar 13 01:46:09.140034 master-0 kubenswrapper[19170]: I0313 01:46:09.139837 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-051b7-backup-0" podStartSLOduration=4.139819733 podStartE2EDuration="4.139819733s" podCreationTimestamp="2026-03-13 01:46:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:46:09.076512158 +0000 UTC m=+1629.884633118" watchObservedRunningTime="2026-03-13 01:46:09.139819733 +0000 UTC m=+1629.947940693"
Mar 13 01:46:09.273392 master-0 kubenswrapper[19170]: I0313 01:46:09.270003 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5bb5b98c8c-485mq"
Mar 13 01:46:09.400316 master-0 kubenswrapper[19170]: I0313 01:46:09.399127 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-051b7-volume-lvm-iscsi-0"
Mar 13 01:46:09.893126 master-0 kubenswrapper[19170]: I0313 01:46:09.893085 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56c5578c7c-7p22w"
Mar 13 01:46:10.076128 master-0 kubenswrapper[19170]: I0313 01:46:10.076081 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a89c46c-68a8-4baa-b926-3cbd5b85161c-ovsdbserver-sb\") pod \"9a89c46c-68a8-4baa-b926-3cbd5b85161c\" (UID: \"9a89c46c-68a8-4baa-b926-3cbd5b85161c\") "
Mar 13 01:46:10.076269 master-0 kubenswrapper[19170]: I0313 01:46:10.076255 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a89c46c-68a8-4baa-b926-3cbd5b85161c-dns-swift-storage-0\") pod \"9a89c46c-68a8-4baa-b926-3cbd5b85161c\" (UID: \"9a89c46c-68a8-4baa-b926-3cbd5b85161c\") "
Mar 13 01:46:10.076334 master-0 kubenswrapper[19170]: I0313 01:46:10.076284 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a89c46c-68a8-4baa-b926-3cbd5b85161c-ovsdbserver-nb\") pod \"9a89c46c-68a8-4baa-b926-3cbd5b85161c\" (UID: \"9a89c46c-68a8-4baa-b926-3cbd5b85161c\") "
Mar 13 01:46:10.076449 master-0 kubenswrapper[19170]: I0313 01:46:10.076428 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a89c46c-68a8-4baa-b926-3cbd5b85161c-dns-svc\") pod \"9a89c46c-68a8-4baa-b926-3cbd5b85161c\" (UID: \"9a89c46c-68a8-4baa-b926-3cbd5b85161c\") "
Mar 13 01:46:10.076511 master-0 kubenswrapper[19170]: I0313 01:46:10.076453 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d45br\" (UniqueName: \"kubernetes.io/projected/9a89c46c-68a8-4baa-b926-3cbd5b85161c-kube-api-access-d45br\") pod \"9a89c46c-68a8-4baa-b926-3cbd5b85161c\" (UID: \"9a89c46c-68a8-4baa-b926-3cbd5b85161c\") "
Mar 13 01:46:10.076511 master-0 kubenswrapper[19170]: I0313 01:46:10.076488 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a89c46c-68a8-4baa-b926-3cbd5b85161c-config\") pod \"9a89c46c-68a8-4baa-b926-3cbd5b85161c\" (UID: \"9a89c46c-68a8-4baa-b926-3cbd5b85161c\") "
Mar 13 01:46:10.133658 master-0 kubenswrapper[19170]: I0313 01:46:10.132489 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-86cc47d7f5-mk8sb" event={"ID":"bcadd2d8-9133-4e14-af9d-1c42095f91f5","Type":"ContainerStarted","Data":"125e3dd910cda4da2f92e475626fd3f40a1c069a390937a5a9ff087e9f529403"}
Mar 13 01:46:10.140685 master-0 kubenswrapper[19170]: I0313 01:46:10.133923 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-86cc47d7f5-mk8sb"
Mar 13 01:46:10.161237 master-0 kubenswrapper[19170]: I0313 01:46:10.150358 19170 generic.go:334] "Generic (PLEG): container finished" podID="9a89c46c-68a8-4baa-b926-3cbd5b85161c" containerID="4670c669e9fc6d1cbd60890090cca9047749c7b1b6b564a16bc19e770fad5603" exitCode=0
Mar 13 01:46:10.161237 master-0 kubenswrapper[19170]: I0313 01:46:10.151558 19170 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-56c5578c7c-7p22w" Mar 13 01:46:10.161237 master-0 kubenswrapper[19170]: I0313 01:46:10.151757 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56c5578c7c-7p22w" event={"ID":"9a89c46c-68a8-4baa-b926-3cbd5b85161c","Type":"ContainerDied","Data":"4670c669e9fc6d1cbd60890090cca9047749c7b1b6b564a16bc19e770fad5603"} Mar 13 01:46:10.161237 master-0 kubenswrapper[19170]: I0313 01:46:10.151784 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56c5578c7c-7p22w" event={"ID":"9a89c46c-68a8-4baa-b926-3cbd5b85161c","Type":"ContainerDied","Data":"b54403c6cc2ccfad0abafdd9af5764fbac9e86f86dc37927a2f50a2936a8ca54"} Mar 13 01:46:10.161237 master-0 kubenswrapper[19170]: I0313 01:46:10.151799 19170 scope.go:117] "RemoveContainer" containerID="4670c669e9fc6d1cbd60890090cca9047749c7b1b6b564a16bc19e770fad5603" Mar 13 01:46:10.165109 master-0 kubenswrapper[19170]: I0313 01:46:10.164842 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-86cc47d7f5-mk8sb" podStartSLOduration=9.986101405 podStartE2EDuration="13.164816145s" podCreationTimestamp="2026-03-13 01:45:57 +0000 UTC" firstStartedPulling="2026-03-13 01:45:59.759387743 +0000 UTC m=+1620.567508703" lastFinishedPulling="2026-03-13 01:46:02.938102483 +0000 UTC m=+1623.746223443" observedRunningTime="2026-03-13 01:46:10.157355395 +0000 UTC m=+1630.965476355" watchObservedRunningTime="2026-03-13 01:46:10.164816145 +0000 UTC m=+1630.972937105" Mar 13 01:46:10.207863 master-0 kubenswrapper[19170]: I0313 01:46:10.207787 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a89c46c-68a8-4baa-b926-3cbd5b85161c-kube-api-access-d45br" (OuterVolumeSpecName: "kube-api-access-d45br") pod "9a89c46c-68a8-4baa-b926-3cbd5b85161c" (UID: "9a89c46c-68a8-4baa-b926-3cbd5b85161c"). InnerVolumeSpecName "kube-api-access-d45br". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:46:10.243482 master-0 kubenswrapper[19170]: I0313 01:46:10.243419 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a89c46c-68a8-4baa-b926-3cbd5b85161c-config" (OuterVolumeSpecName: "config") pod "9a89c46c-68a8-4baa-b926-3cbd5b85161c" (UID: "9a89c46c-68a8-4baa-b926-3cbd5b85161c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:46:10.285011 master-0 kubenswrapper[19170]: I0313 01:46:10.284460 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d45br\" (UniqueName: \"kubernetes.io/projected/9a89c46c-68a8-4baa-b926-3cbd5b85161c-kube-api-access-d45br\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:10.285011 master-0 kubenswrapper[19170]: I0313 01:46:10.284495 19170 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a89c46c-68a8-4baa-b926-3cbd5b85161c-config\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:10.296293 master-0 kubenswrapper[19170]: I0313 01:46:10.296216 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a89c46c-68a8-4baa-b926-3cbd5b85161c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9a89c46c-68a8-4baa-b926-3cbd5b85161c" (UID: "9a89c46c-68a8-4baa-b926-3cbd5b85161c"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:46:10.299803 master-0 kubenswrapper[19170]: I0313 01:46:10.299054 19170 scope.go:117] "RemoveContainer" containerID="03565ffe5a54813f38103fbd3d31b4a3d82f022c36b06782a4fa869a37775872" Mar 13 01:46:10.321240 master-0 kubenswrapper[19170]: I0313 01:46:10.321156 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a89c46c-68a8-4baa-b926-3cbd5b85161c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9a89c46c-68a8-4baa-b926-3cbd5b85161c" (UID: "9a89c46c-68a8-4baa-b926-3cbd5b85161c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:46:10.341710 master-0 kubenswrapper[19170]: I0313 01:46:10.339877 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a89c46c-68a8-4baa-b926-3cbd5b85161c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9a89c46c-68a8-4baa-b926-3cbd5b85161c" (UID: "9a89c46c-68a8-4baa-b926-3cbd5b85161c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:46:10.342870 master-0 kubenswrapper[19170]: I0313 01:46:10.342812 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a89c46c-68a8-4baa-b926-3cbd5b85161c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9a89c46c-68a8-4baa-b926-3cbd5b85161c" (UID: "9a89c46c-68a8-4baa-b926-3cbd5b85161c"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:46:10.364664 master-0 kubenswrapper[19170]: I0313 01:46:10.361859 19170 scope.go:117] "RemoveContainer" containerID="4670c669e9fc6d1cbd60890090cca9047749c7b1b6b564a16bc19e770fad5603" Mar 13 01:46:10.364664 master-0 kubenswrapper[19170]: E0313 01:46:10.362253 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4670c669e9fc6d1cbd60890090cca9047749c7b1b6b564a16bc19e770fad5603\": container with ID starting with 4670c669e9fc6d1cbd60890090cca9047749c7b1b6b564a16bc19e770fad5603 not found: ID does not exist" containerID="4670c669e9fc6d1cbd60890090cca9047749c7b1b6b564a16bc19e770fad5603" Mar 13 01:46:10.364664 master-0 kubenswrapper[19170]: I0313 01:46:10.362278 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4670c669e9fc6d1cbd60890090cca9047749c7b1b6b564a16bc19e770fad5603"} err="failed to get container status \"4670c669e9fc6d1cbd60890090cca9047749c7b1b6b564a16bc19e770fad5603\": rpc error: code = NotFound desc = could not find container \"4670c669e9fc6d1cbd60890090cca9047749c7b1b6b564a16bc19e770fad5603\": container with ID starting with 4670c669e9fc6d1cbd60890090cca9047749c7b1b6b564a16bc19e770fad5603 not found: ID does not exist" Mar 13 01:46:10.364664 master-0 kubenswrapper[19170]: I0313 01:46:10.362300 19170 scope.go:117] "RemoveContainer" containerID="03565ffe5a54813f38103fbd3d31b4a3d82f022c36b06782a4fa869a37775872" Mar 13 01:46:10.364664 master-0 kubenswrapper[19170]: E0313 01:46:10.362482 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03565ffe5a54813f38103fbd3d31b4a3d82f022c36b06782a4fa869a37775872\": container with ID starting with 03565ffe5a54813f38103fbd3d31b4a3d82f022c36b06782a4fa869a37775872 not found: ID does not exist" 
containerID="03565ffe5a54813f38103fbd3d31b4a3d82f022c36b06782a4fa869a37775872" Mar 13 01:46:10.364664 master-0 kubenswrapper[19170]: I0313 01:46:10.362498 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03565ffe5a54813f38103fbd3d31b4a3d82f022c36b06782a4fa869a37775872"} err="failed to get container status \"03565ffe5a54813f38103fbd3d31b4a3d82f022c36b06782a4fa869a37775872\": rpc error: code = NotFound desc = could not find container \"03565ffe5a54813f38103fbd3d31b4a3d82f022c36b06782a4fa869a37775872\": container with ID starting with 03565ffe5a54813f38103fbd3d31b4a3d82f022c36b06782a4fa869a37775872 not found: ID does not exist" Mar 13 01:46:10.387660 master-0 kubenswrapper[19170]: I0313 01:46:10.386769 19170 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a89c46c-68a8-4baa-b926-3cbd5b85161c-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:10.387660 master-0 kubenswrapper[19170]: I0313 01:46:10.386815 19170 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a89c46c-68a8-4baa-b926-3cbd5b85161c-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:10.387660 master-0 kubenswrapper[19170]: I0313 01:46:10.386827 19170 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a89c46c-68a8-4baa-b926-3cbd5b85161c-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:10.387660 master-0 kubenswrapper[19170]: I0313 01:46:10.386839 19170 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a89c46c-68a8-4baa-b926-3cbd5b85161c-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:10.505547 master-0 kubenswrapper[19170]: I0313 01:46:10.504376 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-56c5578c7c-7p22w"] Mar 13 01:46:10.519740 master-0 kubenswrapper[19170]: I0313 01:46:10.518812 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56c5578c7c-7p22w"] Mar 13 01:46:11.174532 master-0 kubenswrapper[19170]: I0313 01:46:11.174011 19170 generic.go:334] "Generic (PLEG): container finished" podID="bcadd2d8-9133-4e14-af9d-1c42095f91f5" containerID="125e3dd910cda4da2f92e475626fd3f40a1c069a390937a5a9ff087e9f529403" exitCode=1 Mar 13 01:46:11.175609 master-0 kubenswrapper[19170]: I0313 01:46:11.175572 19170 scope.go:117] "RemoveContainer" containerID="125e3dd910cda4da2f92e475626fd3f40a1c069a390937a5a9ff087e9f529403" Mar 13 01:46:11.175768 master-0 kubenswrapper[19170]: I0313 01:46:11.175714 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-86cc47d7f5-mk8sb" event={"ID":"bcadd2d8-9133-4e14-af9d-1c42095f91f5","Type":"ContainerDied","Data":"125e3dd910cda4da2f92e475626fd3f40a1c069a390937a5a9ff087e9f529403"} Mar 13 01:46:11.175817 master-0 kubenswrapper[19170]: I0313 01:46:11.175779 19170 scope.go:117] "RemoveContainer" containerID="1fc5e2b7358b8d0fe8e7be99d741679726c0f182ffb45cc871d863e78bb9d031" Mar 13 01:46:11.175924 master-0 kubenswrapper[19170]: E0313 01:46:11.175897 19170 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-api pod=ironic-86cc47d7f5-mk8sb_openstack(bcadd2d8-9133-4e14-af9d-1c42095f91f5)\"" pod="openstack/ironic-86cc47d7f5-mk8sb" podUID="bcadd2d8-9133-4e14-af9d-1c42095f91f5" Mar 13 01:46:11.226365 master-0 kubenswrapper[19170]: I0313 01:46:11.224720 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-neutron-agent-84f79fc9fc-7t7j5" Mar 13 01:46:11.297389 master-0 kubenswrapper[19170]: I0313 01:46:11.293303 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/cinder-051b7-backup-0" Mar 13 01:46:11.442480 master-0 kubenswrapper[19170]: I0313 01:46:11.442431 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a89c46c-68a8-4baa-b926-3cbd5b85161c" path="/var/lib/kubelet/pods/9a89c46c-68a8-4baa-b926-3cbd5b85161c/volumes" Mar 13 01:46:11.662187 master-0 kubenswrapper[19170]: I0313 01:46:11.660524 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-556665446f-zdfzl" Mar 13 01:46:11.805551 master-0 kubenswrapper[19170]: I0313 01:46:11.805463 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-86cc47d7f5-mk8sb"] Mar 13 01:46:12.215316 master-0 kubenswrapper[19170]: I0313 01:46:12.212034 19170 scope.go:117] "RemoveContainer" containerID="125e3dd910cda4da2f92e475626fd3f40a1c069a390937a5a9ff087e9f529403" Mar 13 01:46:12.215316 master-0 kubenswrapper[19170]: E0313 01:46:12.212267 19170 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-api pod=ironic-86cc47d7f5-mk8sb_openstack(bcadd2d8-9133-4e14-af9d-1c42095f91f5)\"" pod="openstack/ironic-86cc47d7f5-mk8sb" podUID="bcadd2d8-9133-4e14-af9d-1c42095f91f5" Mar 13 01:46:12.378658 master-0 kubenswrapper[19170]: I0313 01:46:12.377735 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 13 01:46:12.378658 master-0 kubenswrapper[19170]: E0313 01:46:12.378326 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a89c46c-68a8-4baa-b926-3cbd5b85161c" containerName="init" Mar 13 01:46:12.378658 master-0 kubenswrapper[19170]: I0313 01:46:12.378340 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a89c46c-68a8-4baa-b926-3cbd5b85161c" containerName="init" Mar 13 01:46:12.378658 master-0 kubenswrapper[19170]: E0313 01:46:12.378384 19170 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9a89c46c-68a8-4baa-b926-3cbd5b85161c" containerName="dnsmasq-dns" Mar 13 01:46:12.378658 master-0 kubenswrapper[19170]: I0313 01:46:12.378391 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a89c46c-68a8-4baa-b926-3cbd5b85161c" containerName="dnsmasq-dns" Mar 13 01:46:12.378658 master-0 kubenswrapper[19170]: I0313 01:46:12.378649 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a89c46c-68a8-4baa-b926-3cbd5b85161c" containerName="dnsmasq-dns" Mar 13 01:46:12.383652 master-0 kubenswrapper[19170]: I0313 01:46:12.379346 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 13 01:46:12.383652 master-0 kubenswrapper[19170]: I0313 01:46:12.381706 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 13 01:46:12.387653 master-0 kubenswrapper[19170]: I0313 01:46:12.386688 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 13 01:46:12.398651 master-0 kubenswrapper[19170]: I0313 01:46:12.395526 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 13 01:46:12.484284 master-0 kubenswrapper[19170]: I0313 01:46:12.480032 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9qsv\" (UniqueName: \"kubernetes.io/projected/75b17b23-b02f-44af-96a0-6d0960662565-kube-api-access-m9qsv\") pod \"openstackclient\" (UID: \"75b17b23-b02f-44af-96a0-6d0960662565\") " pod="openstack/openstackclient" Mar 13 01:46:12.484284 master-0 kubenswrapper[19170]: I0313 01:46:12.480330 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75b17b23-b02f-44af-96a0-6d0960662565-combined-ca-bundle\") pod \"openstackclient\" (UID: \"75b17b23-b02f-44af-96a0-6d0960662565\") " 
pod="openstack/openstackclient" Mar 13 01:46:12.484284 master-0 kubenswrapper[19170]: I0313 01:46:12.480462 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/75b17b23-b02f-44af-96a0-6d0960662565-openstack-config\") pod \"openstackclient\" (UID: \"75b17b23-b02f-44af-96a0-6d0960662565\") " pod="openstack/openstackclient" Mar 13 01:46:12.484284 master-0 kubenswrapper[19170]: I0313 01:46:12.480504 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/75b17b23-b02f-44af-96a0-6d0960662565-openstack-config-secret\") pod \"openstackclient\" (UID: \"75b17b23-b02f-44af-96a0-6d0960662565\") " pod="openstack/openstackclient" Mar 13 01:46:12.484284 master-0 kubenswrapper[19170]: I0313 01:46:12.482411 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-051b7-scheduler-0" Mar 13 01:46:12.500650 master-0 kubenswrapper[19170]: I0313 01:46:12.500543 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-db-sync-mkktw"] Mar 13 01:46:12.502398 master-0 kubenswrapper[19170]: I0313 01:46:12.502326 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-db-sync-mkktw" Mar 13 01:46:12.509732 master-0 kubenswrapper[19170]: I0313 01:46:12.509422 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-config-data" Mar 13 01:46:12.509824 master-0 kubenswrapper[19170]: I0313 01:46:12.509785 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-scripts" Mar 13 01:46:12.550663 master-0 kubenswrapper[19170]: I0313 01:46:12.544806 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-sync-mkktw"] Mar 13 01:46:12.584783 master-0 kubenswrapper[19170]: I0313 01:46:12.584662 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/b89d6f8a-104e-40ec-bc5d-769918eea937-etc-podinfo\") pod \"ironic-inspector-db-sync-mkktw\" (UID: \"b89d6f8a-104e-40ec-bc5d-769918eea937\") " pod="openstack/ironic-inspector-db-sync-mkktw" Mar 13 01:46:12.585032 master-0 kubenswrapper[19170]: I0313 01:46:12.585017 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b89d6f8a-104e-40ec-bc5d-769918eea937-combined-ca-bundle\") pod \"ironic-inspector-db-sync-mkktw\" (UID: \"b89d6f8a-104e-40ec-bc5d-769918eea937\") " pod="openstack/ironic-inspector-db-sync-mkktw" Mar 13 01:46:12.585145 master-0 kubenswrapper[19170]: I0313 01:46:12.585127 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/b89d6f8a-104e-40ec-bc5d-769918eea937-var-lib-ironic\") pod \"ironic-inspector-db-sync-mkktw\" (UID: \"b89d6f8a-104e-40ec-bc5d-769918eea937\") " pod="openstack/ironic-inspector-db-sync-mkktw" Mar 13 01:46:12.585284 master-0 kubenswrapper[19170]: I0313 01:46:12.585264 19170 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75b17b23-b02f-44af-96a0-6d0960662565-combined-ca-bundle\") pod \"openstackclient\" (UID: \"75b17b23-b02f-44af-96a0-6d0960662565\") " pod="openstack/openstackclient" Mar 13 01:46:12.585501 master-0 kubenswrapper[19170]: I0313 01:46:12.585485 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8fl6\" (UniqueName: \"kubernetes.io/projected/b89d6f8a-104e-40ec-bc5d-769918eea937-kube-api-access-t8fl6\") pod \"ironic-inspector-db-sync-mkktw\" (UID: \"b89d6f8a-104e-40ec-bc5d-769918eea937\") " pod="openstack/ironic-inspector-db-sync-mkktw" Mar 13 01:46:12.585592 master-0 kubenswrapper[19170]: I0313 01:46:12.585580 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/75b17b23-b02f-44af-96a0-6d0960662565-openstack-config\") pod \"openstackclient\" (UID: \"75b17b23-b02f-44af-96a0-6d0960662565\") " pod="openstack/openstackclient" Mar 13 01:46:12.585737 master-0 kubenswrapper[19170]: I0313 01:46:12.585720 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/75b17b23-b02f-44af-96a0-6d0960662565-openstack-config-secret\") pod \"openstackclient\" (UID: \"75b17b23-b02f-44af-96a0-6d0960662565\") " pod="openstack/openstackclient" Mar 13 01:46:12.585867 master-0 kubenswrapper[19170]: I0313 01:46:12.585849 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b89d6f8a-104e-40ec-bc5d-769918eea937-config\") pod \"ironic-inspector-db-sync-mkktw\" (UID: \"b89d6f8a-104e-40ec-bc5d-769918eea937\") " pod="openstack/ironic-inspector-db-sync-mkktw" Mar 13 01:46:12.585967 master-0 kubenswrapper[19170]: I0313 01:46:12.585952 
19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9qsv\" (UniqueName: \"kubernetes.io/projected/75b17b23-b02f-44af-96a0-6d0960662565-kube-api-access-m9qsv\") pod \"openstackclient\" (UID: \"75b17b23-b02f-44af-96a0-6d0960662565\") " pod="openstack/openstackclient" Mar 13 01:46:12.586096 master-0 kubenswrapper[19170]: I0313 01:46:12.586076 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/b89d6f8a-104e-40ec-bc5d-769918eea937-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-db-sync-mkktw\" (UID: \"b89d6f8a-104e-40ec-bc5d-769918eea937\") " pod="openstack/ironic-inspector-db-sync-mkktw" Mar 13 01:46:12.586305 master-0 kubenswrapper[19170]: I0313 01:46:12.586286 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b89d6f8a-104e-40ec-bc5d-769918eea937-scripts\") pod \"ironic-inspector-db-sync-mkktw\" (UID: \"b89d6f8a-104e-40ec-bc5d-769918eea937\") " pod="openstack/ironic-inspector-db-sync-mkktw" Mar 13 01:46:12.592730 master-0 kubenswrapper[19170]: I0313 01:46:12.590701 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/75b17b23-b02f-44af-96a0-6d0960662565-openstack-config\") pod \"openstackclient\" (UID: \"75b17b23-b02f-44af-96a0-6d0960662565\") " pod="openstack/openstackclient" Mar 13 01:46:12.597846 master-0 kubenswrapper[19170]: I0313 01:46:12.597819 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75b17b23-b02f-44af-96a0-6d0960662565-combined-ca-bundle\") pod \"openstackclient\" (UID: \"75b17b23-b02f-44af-96a0-6d0960662565\") " pod="openstack/openstackclient" Mar 13 01:46:12.608806 master-0 kubenswrapper[19170]: 
I0313 01:46:12.603612 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/75b17b23-b02f-44af-96a0-6d0960662565-openstack-config-secret\") pod \"openstackclient\" (UID: \"75b17b23-b02f-44af-96a0-6d0960662565\") " pod="openstack/openstackclient" Mar 13 01:46:12.608806 master-0 kubenswrapper[19170]: I0313 01:46:12.607102 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9qsv\" (UniqueName: \"kubernetes.io/projected/75b17b23-b02f-44af-96a0-6d0960662565-kube-api-access-m9qsv\") pod \"openstackclient\" (UID: \"75b17b23-b02f-44af-96a0-6d0960662565\") " pod="openstack/openstackclient" Mar 13 01:46:12.688294 master-0 kubenswrapper[19170]: I0313 01:46:12.688223 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8fl6\" (UniqueName: \"kubernetes.io/projected/b89d6f8a-104e-40ec-bc5d-769918eea937-kube-api-access-t8fl6\") pod \"ironic-inspector-db-sync-mkktw\" (UID: \"b89d6f8a-104e-40ec-bc5d-769918eea937\") " pod="openstack/ironic-inspector-db-sync-mkktw" Mar 13 01:46:12.688474 master-0 kubenswrapper[19170]: I0313 01:46:12.688332 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b89d6f8a-104e-40ec-bc5d-769918eea937-config\") pod \"ironic-inspector-db-sync-mkktw\" (UID: \"b89d6f8a-104e-40ec-bc5d-769918eea937\") " pod="openstack/ironic-inspector-db-sync-mkktw" Mar 13 01:46:12.688474 master-0 kubenswrapper[19170]: I0313 01:46:12.688366 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/b89d6f8a-104e-40ec-bc5d-769918eea937-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-db-sync-mkktw\" (UID: \"b89d6f8a-104e-40ec-bc5d-769918eea937\") " pod="openstack/ironic-inspector-db-sync-mkktw" Mar 13 01:46:12.688474 
master-0 kubenswrapper[19170]: I0313 01:46:12.688428 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b89d6f8a-104e-40ec-bc5d-769918eea937-scripts\") pod \"ironic-inspector-db-sync-mkktw\" (UID: \"b89d6f8a-104e-40ec-bc5d-769918eea937\") " pod="openstack/ironic-inspector-db-sync-mkktw" Mar 13 01:46:12.688474 master-0 kubenswrapper[19170]: I0313 01:46:12.688457 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/b89d6f8a-104e-40ec-bc5d-769918eea937-etc-podinfo\") pod \"ironic-inspector-db-sync-mkktw\" (UID: \"b89d6f8a-104e-40ec-bc5d-769918eea937\") " pod="openstack/ironic-inspector-db-sync-mkktw" Mar 13 01:46:12.688604 master-0 kubenswrapper[19170]: I0313 01:46:12.688485 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b89d6f8a-104e-40ec-bc5d-769918eea937-combined-ca-bundle\") pod \"ironic-inspector-db-sync-mkktw\" (UID: \"b89d6f8a-104e-40ec-bc5d-769918eea937\") " pod="openstack/ironic-inspector-db-sync-mkktw" Mar 13 01:46:12.688604 master-0 kubenswrapper[19170]: I0313 01:46:12.688507 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/b89d6f8a-104e-40ec-bc5d-769918eea937-var-lib-ironic\") pod \"ironic-inspector-db-sync-mkktw\" (UID: \"b89d6f8a-104e-40ec-bc5d-769918eea937\") " pod="openstack/ironic-inspector-db-sync-mkktw" Mar 13 01:46:12.689006 master-0 kubenswrapper[19170]: I0313 01:46:12.688975 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/b89d6f8a-104e-40ec-bc5d-769918eea937-var-lib-ironic\") pod \"ironic-inspector-db-sync-mkktw\" (UID: \"b89d6f8a-104e-40ec-bc5d-769918eea937\") " pod="openstack/ironic-inspector-db-sync-mkktw" Mar 13 
01:46:12.690151 master-0 kubenswrapper[19170]: I0313 01:46:12.690082 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/b89d6f8a-104e-40ec-bc5d-769918eea937-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-db-sync-mkktw\" (UID: \"b89d6f8a-104e-40ec-bc5d-769918eea937\") " pod="openstack/ironic-inspector-db-sync-mkktw" Mar 13 01:46:12.694050 master-0 kubenswrapper[19170]: I0313 01:46:12.694020 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/b89d6f8a-104e-40ec-bc5d-769918eea937-etc-podinfo\") pod \"ironic-inspector-db-sync-mkktw\" (UID: \"b89d6f8a-104e-40ec-bc5d-769918eea937\") " pod="openstack/ironic-inspector-db-sync-mkktw" Mar 13 01:46:12.695882 master-0 kubenswrapper[19170]: I0313 01:46:12.695846 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b89d6f8a-104e-40ec-bc5d-769918eea937-config\") pod \"ironic-inspector-db-sync-mkktw\" (UID: \"b89d6f8a-104e-40ec-bc5d-769918eea937\") " pod="openstack/ironic-inspector-db-sync-mkktw" Mar 13 01:46:12.699199 master-0 kubenswrapper[19170]: I0313 01:46:12.699157 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b89d6f8a-104e-40ec-bc5d-769918eea937-combined-ca-bundle\") pod \"ironic-inspector-db-sync-mkktw\" (UID: \"b89d6f8a-104e-40ec-bc5d-769918eea937\") " pod="openstack/ironic-inspector-db-sync-mkktw" Mar 13 01:46:12.700913 master-0 kubenswrapper[19170]: I0313 01:46:12.700868 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b89d6f8a-104e-40ec-bc5d-769918eea937-scripts\") pod \"ironic-inspector-db-sync-mkktw\" (UID: \"b89d6f8a-104e-40ec-bc5d-769918eea937\") " 
pod="openstack/ironic-inspector-db-sync-mkktw" Mar 13 01:46:12.704913 master-0 kubenswrapper[19170]: I0313 01:46:12.704847 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8fl6\" (UniqueName: \"kubernetes.io/projected/b89d6f8a-104e-40ec-bc5d-769918eea937-kube-api-access-t8fl6\") pod \"ironic-inspector-db-sync-mkktw\" (UID: \"b89d6f8a-104e-40ec-bc5d-769918eea937\") " pod="openstack/ironic-inspector-db-sync-mkktw" Mar 13 01:46:12.754694 master-0 kubenswrapper[19170]: I0313 01:46:12.754601 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 13 01:46:12.890540 master-0 kubenswrapper[19170]: I0313 01:46:12.889690 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-sync-mkktw" Mar 13 01:46:12.968650 master-0 kubenswrapper[19170]: E0313 01:46:12.966192 19170 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7ad73cd94faf0b9a1c94e6e2e18b473b70290d8191812e3554dd658df9e1991 is running failed: container process not found" containerID="a7ad73cd94faf0b9a1c94e6e2e18b473b70290d8191812e3554dd658df9e1991" cmd=["/bin/true"] Mar 13 01:46:12.968650 master-0 kubenswrapper[19170]: E0313 01:46:12.966523 19170 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7ad73cd94faf0b9a1c94e6e2e18b473b70290d8191812e3554dd658df9e1991 is running failed: container process not found" containerID="a7ad73cd94faf0b9a1c94e6e2e18b473b70290d8191812e3554dd658df9e1991" cmd=["/bin/true"] Mar 13 01:46:12.968650 master-0 kubenswrapper[19170]: E0313 01:46:12.966751 19170 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
a7ad73cd94faf0b9a1c94e6e2e18b473b70290d8191812e3554dd658df9e1991 is running failed: container process not found" containerID="a7ad73cd94faf0b9a1c94e6e2e18b473b70290d8191812e3554dd658df9e1991" cmd=["/bin/true"] Mar 13 01:46:12.968650 master-0 kubenswrapper[19170]: E0313 01:46:12.966780 19170 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7ad73cd94faf0b9a1c94e6e2e18b473b70290d8191812e3554dd658df9e1991 is running failed: container process not found" probeType="Readiness" pod="openstack/ironic-neutron-agent-84f79fc9fc-7t7j5" podUID="e18f1449-3fb3-43a8-98cd-2f8713bba98d" containerName="ironic-neutron-agent" Mar 13 01:46:12.972241 master-0 kubenswrapper[19170]: E0313 01:46:12.971868 19170 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7ad73cd94faf0b9a1c94e6e2e18b473b70290d8191812e3554dd658df9e1991 is running failed: container process not found" containerID="a7ad73cd94faf0b9a1c94e6e2e18b473b70290d8191812e3554dd658df9e1991" cmd=["/bin/true"] Mar 13 01:46:12.973352 master-0 kubenswrapper[19170]: E0313 01:46:12.972969 19170 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7ad73cd94faf0b9a1c94e6e2e18b473b70290d8191812e3554dd658df9e1991 is running failed: container process not found" containerID="a7ad73cd94faf0b9a1c94e6e2e18b473b70290d8191812e3554dd658df9e1991" cmd=["/bin/true"] Mar 13 01:46:12.973352 master-0 kubenswrapper[19170]: E0313 01:46:12.973210 19170 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7ad73cd94faf0b9a1c94e6e2e18b473b70290d8191812e3554dd658df9e1991 is running failed: container process not found" containerID="a7ad73cd94faf0b9a1c94e6e2e18b473b70290d8191812e3554dd658df9e1991" 
cmd=["/bin/true"] Mar 13 01:46:12.973352 master-0 kubenswrapper[19170]: E0313 01:46:12.973231 19170 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7ad73cd94faf0b9a1c94e6e2e18b473b70290d8191812e3554dd658df9e1991 is running failed: container process not found" probeType="Liveness" pod="openstack/ironic-neutron-agent-84f79fc9fc-7t7j5" podUID="e18f1449-3fb3-43a8-98cd-2f8713bba98d" containerName="ironic-neutron-agent" Mar 13 01:46:13.251211 master-0 kubenswrapper[19170]: I0313 01:46:13.249575 19170 generic.go:334] "Generic (PLEG): container finished" podID="e18f1449-3fb3-43a8-98cd-2f8713bba98d" containerID="a7ad73cd94faf0b9a1c94e6e2e18b473b70290d8191812e3554dd658df9e1991" exitCode=1 Mar 13 01:46:13.251211 master-0 kubenswrapper[19170]: I0313 01:46:13.250151 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ironic-86cc47d7f5-mk8sb" podUID="bcadd2d8-9133-4e14-af9d-1c42095f91f5" containerName="ironic-api-log" containerID="cri-o://ae5da2d722ff861c6d25510514887033d97fa5b85ad1f2332b2df4d2991191b9" gracePeriod=60 Mar 13 01:46:13.251211 master-0 kubenswrapper[19170]: I0313 01:46:13.250452 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-84f79fc9fc-7t7j5" event={"ID":"e18f1449-3fb3-43a8-98cd-2f8713bba98d","Type":"ContainerDied","Data":"a7ad73cd94faf0b9a1c94e6e2e18b473b70290d8191812e3554dd658df9e1991"} Mar 13 01:46:13.251211 master-0 kubenswrapper[19170]: I0313 01:46:13.250490 19170 scope.go:117] "RemoveContainer" containerID="8fb4ea3c62ecb45c586bedb078fb9dbcd4289bb78ac351f9df89e9e1d574da1a" Mar 13 01:46:13.252551 master-0 kubenswrapper[19170]: I0313 01:46:13.251462 19170 scope.go:117] "RemoveContainer" containerID="a7ad73cd94faf0b9a1c94e6e2e18b473b70290d8191812e3554dd658df9e1991" Mar 13 01:46:13.252551 master-0 kubenswrapper[19170]: E0313 01:46:13.251723 19170 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"ironic-neutron-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-neutron-agent pod=ironic-neutron-agent-84f79fc9fc-7t7j5_openstack(e18f1449-3fb3-43a8-98cd-2f8713bba98d)\"" pod="openstack/ironic-neutron-agent-84f79fc9fc-7t7j5" podUID="e18f1449-3fb3-43a8-98cd-2f8713bba98d" Mar 13 01:46:13.360274 master-0 kubenswrapper[19170]: I0313 01:46:13.359961 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 13 01:46:13.525074 master-0 kubenswrapper[19170]: I0313 01:46:13.525016 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-sync-mkktw"] Mar 13 01:46:14.659418 master-0 kubenswrapper[19170]: I0313 01:46:14.659314 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-051b7-volume-lvm-iscsi-0" Mar 13 01:46:15.219709 master-0 kubenswrapper[19170]: I0313 01:46:15.218236 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5df95cbc76-9kdvv" Mar 13 01:46:15.219709 master-0 kubenswrapper[19170]: W0313 01:46:15.218780 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb89d6f8a_104e_40ec_bc5d_769918eea937.slice/crio-23bac7f8871101c4544cdbd10784a5508c3182b7ef4da441251b52e5f903646c WatchSource:0}: Error finding container 23bac7f8871101c4544cdbd10784a5508c3182b7ef4da441251b52e5f903646c: Status 404 returned error can't find the container with id 23bac7f8871101c4544cdbd10784a5508c3182b7ef4da441251b52e5f903646c Mar 13 01:46:15.361978 master-0 kubenswrapper[19170]: I0313 01:46:15.361939 19170 generic.go:334] "Generic (PLEG): container finished" podID="bcadd2d8-9133-4e14-af9d-1c42095f91f5" containerID="ae5da2d722ff861c6d25510514887033d97fa5b85ad1f2332b2df4d2991191b9" exitCode=143 Mar 13 01:46:15.362110 master-0 kubenswrapper[19170]: I0313 01:46:15.362091 19170 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-86cc47d7f5-mk8sb" event={"ID":"bcadd2d8-9133-4e14-af9d-1c42095f91f5","Type":"ContainerDied","Data":"ae5da2d722ff861c6d25510514887033d97fa5b85ad1f2332b2df4d2991191b9"} Mar 13 01:46:15.498716 master-0 kubenswrapper[19170]: I0313 01:46:15.490330 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"75b17b23-b02f-44af-96a0-6d0960662565","Type":"ContainerStarted","Data":"4069b8c7a17df40214fa49d58d99e0d413135d30b0006a7cecae4b53b6632a0e"} Mar 13 01:46:15.498716 master-0 kubenswrapper[19170]: I0313 01:46:15.490395 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-mkktw" event={"ID":"b89d6f8a-104e-40ec-bc5d-769918eea937","Type":"ContainerStarted","Data":"23bac7f8871101c4544cdbd10784a5508c3182b7ef4da441251b52e5f903646c"} Mar 13 01:46:15.910272 master-0 kubenswrapper[19170]: I0313 01:46:15.910169 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-86cc47d7f5-mk8sb" Mar 13 01:46:16.036117 master-0 kubenswrapper[19170]: I0313 01:46:16.036032 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcadd2d8-9133-4e14-af9d-1c42095f91f5-scripts\") pod \"bcadd2d8-9133-4e14-af9d-1c42095f91f5\" (UID: \"bcadd2d8-9133-4e14-af9d-1c42095f91f5\") " Mar 13 01:46:16.036117 master-0 kubenswrapper[19170]: I0313 01:46:16.036104 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcadd2d8-9133-4e14-af9d-1c42095f91f5-config-data\") pod \"bcadd2d8-9133-4e14-af9d-1c42095f91f5\" (UID: \"bcadd2d8-9133-4e14-af9d-1c42095f91f5\") " Mar 13 01:46:16.036391 master-0 kubenswrapper[19170]: I0313 01:46:16.036255 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/bcadd2d8-9133-4e14-af9d-1c42095f91f5-config-data-merged\") pod \"bcadd2d8-9133-4e14-af9d-1c42095f91f5\" (UID: \"bcadd2d8-9133-4e14-af9d-1c42095f91f5\") " Mar 13 01:46:16.036391 master-0 kubenswrapper[19170]: I0313 01:46:16.036379 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/bcadd2d8-9133-4e14-af9d-1c42095f91f5-etc-podinfo\") pod \"bcadd2d8-9133-4e14-af9d-1c42095f91f5\" (UID: \"bcadd2d8-9133-4e14-af9d-1c42095f91f5\") " Mar 13 01:46:16.036505 master-0 kubenswrapper[19170]: I0313 01:46:16.036475 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcadd2d8-9133-4e14-af9d-1c42095f91f5-combined-ca-bundle\") pod \"bcadd2d8-9133-4e14-af9d-1c42095f91f5\" (UID: \"bcadd2d8-9133-4e14-af9d-1c42095f91f5\") " Mar 13 01:46:16.036881 master-0 kubenswrapper[19170]: I0313 01:46:16.036660 19170 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bcadd2d8-9133-4e14-af9d-1c42095f91f5-config-data-custom\") pod \"bcadd2d8-9133-4e14-af9d-1c42095f91f5\" (UID: \"bcadd2d8-9133-4e14-af9d-1c42095f91f5\") " Mar 13 01:46:16.036881 master-0 kubenswrapper[19170]: I0313 01:46:16.036686 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fxxr\" (UniqueName: \"kubernetes.io/projected/bcadd2d8-9133-4e14-af9d-1c42095f91f5-kube-api-access-8fxxr\") pod \"bcadd2d8-9133-4e14-af9d-1c42095f91f5\" (UID: \"bcadd2d8-9133-4e14-af9d-1c42095f91f5\") " Mar 13 01:46:16.036881 master-0 kubenswrapper[19170]: I0313 01:46:16.036731 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcadd2d8-9133-4e14-af9d-1c42095f91f5-logs\") pod \"bcadd2d8-9133-4e14-af9d-1c42095f91f5\" (UID: \"bcadd2d8-9133-4e14-af9d-1c42095f91f5\") " Mar 13 01:46:16.039495 master-0 kubenswrapper[19170]: I0313 01:46:16.037556 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcadd2d8-9133-4e14-af9d-1c42095f91f5-logs" (OuterVolumeSpecName: "logs") pod "bcadd2d8-9133-4e14-af9d-1c42095f91f5" (UID: "bcadd2d8-9133-4e14-af9d-1c42095f91f5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 01:46:16.040613 master-0 kubenswrapper[19170]: I0313 01:46:16.040558 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcadd2d8-9133-4e14-af9d-1c42095f91f5-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "bcadd2d8-9133-4e14-af9d-1c42095f91f5" (UID: "bcadd2d8-9133-4e14-af9d-1c42095f91f5"). InnerVolumeSpecName "config-data-merged". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 01:46:16.048406 master-0 kubenswrapper[19170]: I0313 01:46:16.048340 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcadd2d8-9133-4e14-af9d-1c42095f91f5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bcadd2d8-9133-4e14-af9d-1c42095f91f5" (UID: "bcadd2d8-9133-4e14-af9d-1c42095f91f5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:46:16.048794 master-0 kubenswrapper[19170]: I0313 01:46:16.048477 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/bcadd2d8-9133-4e14-af9d-1c42095f91f5-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "bcadd2d8-9133-4e14-af9d-1c42095f91f5" (UID: "bcadd2d8-9133-4e14-af9d-1c42095f91f5"). InnerVolumeSpecName "etc-podinfo". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 13 01:46:16.048794 master-0 kubenswrapper[19170]: I0313 01:46:16.048572 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcadd2d8-9133-4e14-af9d-1c42095f91f5-scripts" (OuterVolumeSpecName: "scripts") pod "bcadd2d8-9133-4e14-af9d-1c42095f91f5" (UID: "bcadd2d8-9133-4e14-af9d-1c42095f91f5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:46:16.093013 master-0 kubenswrapper[19170]: I0313 01:46:16.091826 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcadd2d8-9133-4e14-af9d-1c42095f91f5-kube-api-access-8fxxr" (OuterVolumeSpecName: "kube-api-access-8fxxr") pod "bcadd2d8-9133-4e14-af9d-1c42095f91f5" (UID: "bcadd2d8-9133-4e14-af9d-1c42095f91f5"). InnerVolumeSpecName "kube-api-access-8fxxr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:46:16.145125 master-0 kubenswrapper[19170]: I0313 01:46:16.143599 19170 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/bcadd2d8-9133-4e14-af9d-1c42095f91f5-etc-podinfo\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:16.145125 master-0 kubenswrapper[19170]: I0313 01:46:16.143649 19170 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bcadd2d8-9133-4e14-af9d-1c42095f91f5-config-data-custom\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:16.145125 master-0 kubenswrapper[19170]: I0313 01:46:16.143661 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fxxr\" (UniqueName: \"kubernetes.io/projected/bcadd2d8-9133-4e14-af9d-1c42095f91f5-kube-api-access-8fxxr\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:16.145125 master-0 kubenswrapper[19170]: I0313 01:46:16.143672 19170 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcadd2d8-9133-4e14-af9d-1c42095f91f5-logs\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:16.145125 master-0 kubenswrapper[19170]: I0313 01:46:16.143685 19170 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcadd2d8-9133-4e14-af9d-1c42095f91f5-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:16.145125 master-0 kubenswrapper[19170]: I0313 01:46:16.143694 19170 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/bcadd2d8-9133-4e14-af9d-1c42095f91f5-config-data-merged\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:16.149700 master-0 kubenswrapper[19170]: I0313 01:46:16.149298 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcadd2d8-9133-4e14-af9d-1c42095f91f5-combined-ca-bundle" 
(OuterVolumeSpecName: "combined-ca-bundle") pod "bcadd2d8-9133-4e14-af9d-1c42095f91f5" (UID: "bcadd2d8-9133-4e14-af9d-1c42095f91f5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:46:16.158193 master-0 kubenswrapper[19170]: I0313 01:46:16.158123 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcadd2d8-9133-4e14-af9d-1c42095f91f5-config-data" (OuterVolumeSpecName: "config-data") pod "bcadd2d8-9133-4e14-af9d-1c42095f91f5" (UID: "bcadd2d8-9133-4e14-af9d-1c42095f91f5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:46:16.246383 master-0 kubenswrapper[19170]: I0313 01:46:16.246323 19170 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcadd2d8-9133-4e14-af9d-1c42095f91f5-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:16.246383 master-0 kubenswrapper[19170]: I0313 01:46:16.246373 19170 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcadd2d8-9133-4e14-af9d-1c42095f91f5-config-data\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:16.582453 master-0 kubenswrapper[19170]: I0313 01:46:16.582385 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-86cc47d7f5-mk8sb" event={"ID":"bcadd2d8-9133-4e14-af9d-1c42095f91f5","Type":"ContainerDied","Data":"4865a89b78ff8fbceda9b5950940d733b12f5cd684365b375f8d75f88f1e0833"} Mar 13 01:46:16.582453 master-0 kubenswrapper[19170]: I0313 01:46:16.582448 19170 scope.go:117] "RemoveContainer" containerID="125e3dd910cda4da2f92e475626fd3f40a1c069a390937a5a9ff087e9f529403" Mar 13 01:46:16.582823 master-0 kubenswrapper[19170]: I0313 01:46:16.582593 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-86cc47d7f5-mk8sb" Mar 13 01:46:16.621032 master-0 kubenswrapper[19170]: I0313 01:46:16.620797 19170 scope.go:117] "RemoveContainer" containerID="ae5da2d722ff861c6d25510514887033d97fa5b85ad1f2332b2df4d2991191b9" Mar 13 01:46:16.652553 master-0 kubenswrapper[19170]: I0313 01:46:16.650452 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-051b7-backup-0" Mar 13 01:46:16.660248 master-0 kubenswrapper[19170]: I0313 01:46:16.660084 19170 scope.go:117] "RemoveContainer" containerID="95e0b854bbd0ee6b426a81ebb0f5c3c0f0d159f00119b8a6ede5895cf4c8ab94" Mar 13 01:46:16.677920 master-0 kubenswrapper[19170]: I0313 01:46:16.677874 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-86cc47d7f5-mk8sb"] Mar 13 01:46:16.688969 master-0 kubenswrapper[19170]: I0313 01:46:16.688908 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-86cc47d7f5-mk8sb"] Mar 13 01:46:17.432989 master-0 kubenswrapper[19170]: I0313 01:46:17.432948 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcadd2d8-9133-4e14-af9d-1c42095f91f5" path="/var/lib/kubelet/pods/bcadd2d8-9133-4e14-af9d-1c42095f91f5/volumes" Mar 13 01:46:17.835774 master-0 kubenswrapper[19170]: I0313 01:46:17.835712 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-79fb9b7bb8-phv6c"] Mar 13 01:46:17.836321 master-0 kubenswrapper[19170]: E0313 01:46:17.836300 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcadd2d8-9133-4e14-af9d-1c42095f91f5" containerName="ironic-api-log" Mar 13 01:46:17.836321 master-0 kubenswrapper[19170]: I0313 01:46:17.836318 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcadd2d8-9133-4e14-af9d-1c42095f91f5" containerName="ironic-api-log" Mar 13 01:46:17.836431 master-0 kubenswrapper[19170]: E0313 01:46:17.836345 19170 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bcadd2d8-9133-4e14-af9d-1c42095f91f5" containerName="ironic-api" Mar 13 01:46:17.836431 master-0 kubenswrapper[19170]: I0313 01:46:17.836354 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcadd2d8-9133-4e14-af9d-1c42095f91f5" containerName="ironic-api" Mar 13 01:46:17.836431 master-0 kubenswrapper[19170]: E0313 01:46:17.836366 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcadd2d8-9133-4e14-af9d-1c42095f91f5" containerName="init" Mar 13 01:46:17.836431 master-0 kubenswrapper[19170]: I0313 01:46:17.836373 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcadd2d8-9133-4e14-af9d-1c42095f91f5" containerName="init" Mar 13 01:46:17.836431 master-0 kubenswrapper[19170]: E0313 01:46:17.836388 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcadd2d8-9133-4e14-af9d-1c42095f91f5" containerName="init" Mar 13 01:46:17.836431 master-0 kubenswrapper[19170]: I0313 01:46:17.836394 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcadd2d8-9133-4e14-af9d-1c42095f91f5" containerName="init" Mar 13 01:46:17.836685 master-0 kubenswrapper[19170]: I0313 01:46:17.836676 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcadd2d8-9133-4e14-af9d-1c42095f91f5" containerName="ironic-api" Mar 13 01:46:17.836724 master-0 kubenswrapper[19170]: I0313 01:46:17.836688 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcadd2d8-9133-4e14-af9d-1c42095f91f5" containerName="ironic-api-log" Mar 13 01:46:17.838070 master-0 kubenswrapper[19170]: E0313 01:46:17.836922 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcadd2d8-9133-4e14-af9d-1c42095f91f5" containerName="ironic-api" Mar 13 01:46:17.838070 master-0 kubenswrapper[19170]: I0313 01:46:17.836935 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcadd2d8-9133-4e14-af9d-1c42095f91f5" containerName="ironic-api" Mar 13 01:46:17.838070 master-0 kubenswrapper[19170]: I0313 01:46:17.837148 
19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcadd2d8-9133-4e14-af9d-1c42095f91f5" containerName="ironic-api" Mar 13 01:46:17.838070 master-0 kubenswrapper[19170]: I0313 01:46:17.837897 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-79fb9b7bb8-phv6c" Mar 13 01:46:17.842380 master-0 kubenswrapper[19170]: I0313 01:46:17.842338 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 13 01:46:17.842512 master-0 kubenswrapper[19170]: I0313 01:46:17.842493 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 13 01:46:17.842618 master-0 kubenswrapper[19170]: I0313 01:46:17.842601 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 13 01:46:17.869796 master-0 kubenswrapper[19170]: I0313 01:46:17.859218 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd97abab-893e-4fd6-a523-6a0f889dccbb-public-tls-certs\") pod \"swift-proxy-79fb9b7bb8-phv6c\" (UID: \"bd97abab-893e-4fd6-a523-6a0f889dccbb\") " pod="openstack/swift-proxy-79fb9b7bb8-phv6c" Mar 13 01:46:17.869796 master-0 kubenswrapper[19170]: I0313 01:46:17.859358 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd97abab-893e-4fd6-a523-6a0f889dccbb-run-httpd\") pod \"swift-proxy-79fb9b7bb8-phv6c\" (UID: \"bd97abab-893e-4fd6-a523-6a0f889dccbb\") " pod="openstack/swift-proxy-79fb9b7bb8-phv6c" Mar 13 01:46:17.869796 master-0 kubenswrapper[19170]: I0313 01:46:17.859419 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjb9x\" (UniqueName: 
\"kubernetes.io/projected/bd97abab-893e-4fd6-a523-6a0f889dccbb-kube-api-access-cjb9x\") pod \"swift-proxy-79fb9b7bb8-phv6c\" (UID: \"bd97abab-893e-4fd6-a523-6a0f889dccbb\") " pod="openstack/swift-proxy-79fb9b7bb8-phv6c" Mar 13 01:46:17.869796 master-0 kubenswrapper[19170]: I0313 01:46:17.859560 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd97abab-893e-4fd6-a523-6a0f889dccbb-internal-tls-certs\") pod \"swift-proxy-79fb9b7bb8-phv6c\" (UID: \"bd97abab-893e-4fd6-a523-6a0f889dccbb\") " pod="openstack/swift-proxy-79fb9b7bb8-phv6c" Mar 13 01:46:17.869796 master-0 kubenswrapper[19170]: I0313 01:46:17.859620 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd97abab-893e-4fd6-a523-6a0f889dccbb-config-data\") pod \"swift-proxy-79fb9b7bb8-phv6c\" (UID: \"bd97abab-893e-4fd6-a523-6a0f889dccbb\") " pod="openstack/swift-proxy-79fb9b7bb8-phv6c" Mar 13 01:46:17.869796 master-0 kubenswrapper[19170]: I0313 01:46:17.859741 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd97abab-893e-4fd6-a523-6a0f889dccbb-log-httpd\") pod \"swift-proxy-79fb9b7bb8-phv6c\" (UID: \"bd97abab-893e-4fd6-a523-6a0f889dccbb\") " pod="openstack/swift-proxy-79fb9b7bb8-phv6c" Mar 13 01:46:17.869796 master-0 kubenswrapper[19170]: I0313 01:46:17.859783 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd97abab-893e-4fd6-a523-6a0f889dccbb-combined-ca-bundle\") pod \"swift-proxy-79fb9b7bb8-phv6c\" (UID: \"bd97abab-893e-4fd6-a523-6a0f889dccbb\") " pod="openstack/swift-proxy-79fb9b7bb8-phv6c" Mar 13 01:46:17.869796 master-0 kubenswrapper[19170]: I0313 01:46:17.859871 19170 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bd97abab-893e-4fd6-a523-6a0f889dccbb-etc-swift\") pod \"swift-proxy-79fb9b7bb8-phv6c\" (UID: \"bd97abab-893e-4fd6-a523-6a0f889dccbb\") " pod="openstack/swift-proxy-79fb9b7bb8-phv6c" Mar 13 01:46:17.884733 master-0 kubenswrapper[19170]: I0313 01:46:17.871843 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-79fb9b7bb8-phv6c"] Mar 13 01:46:17.956763 master-0 kubenswrapper[19170]: I0313 01:46:17.956721 19170 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ironic-neutron-agent-84f79fc9fc-7t7j5" Mar 13 01:46:17.956926 master-0 kubenswrapper[19170]: I0313 01:46:17.956784 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-84f79fc9fc-7t7j5" Mar 13 01:46:17.957975 master-0 kubenswrapper[19170]: I0313 01:46:17.957954 19170 scope.go:117] "RemoveContainer" containerID="a7ad73cd94faf0b9a1c94e6e2e18b473b70290d8191812e3554dd658df9e1991" Mar 13 01:46:17.958390 master-0 kubenswrapper[19170]: E0313 01:46:17.958362 19170 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-neutron-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-neutron-agent pod=ironic-neutron-agent-84f79fc9fc-7t7j5_openstack(e18f1449-3fb3-43a8-98cd-2f8713bba98d)\"" pod="openstack/ironic-neutron-agent-84f79fc9fc-7t7j5" podUID="e18f1449-3fb3-43a8-98cd-2f8713bba98d" Mar 13 01:46:17.962821 master-0 kubenswrapper[19170]: I0313 01:46:17.962786 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd97abab-893e-4fd6-a523-6a0f889dccbb-internal-tls-certs\") pod \"swift-proxy-79fb9b7bb8-phv6c\" (UID: \"bd97abab-893e-4fd6-a523-6a0f889dccbb\") " pod="openstack/swift-proxy-79fb9b7bb8-phv6c" Mar 13 
01:46:17.962965 master-0 kubenswrapper[19170]: I0313 01:46:17.962951 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd97abab-893e-4fd6-a523-6a0f889dccbb-config-data\") pod \"swift-proxy-79fb9b7bb8-phv6c\" (UID: \"bd97abab-893e-4fd6-a523-6a0f889dccbb\") " pod="openstack/swift-proxy-79fb9b7bb8-phv6c" Mar 13 01:46:17.963124 master-0 kubenswrapper[19170]: I0313 01:46:17.963110 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd97abab-893e-4fd6-a523-6a0f889dccbb-log-httpd\") pod \"swift-proxy-79fb9b7bb8-phv6c\" (UID: \"bd97abab-893e-4fd6-a523-6a0f889dccbb\") " pod="openstack/swift-proxy-79fb9b7bb8-phv6c" Mar 13 01:46:17.963501 master-0 kubenswrapper[19170]: I0313 01:46:17.963451 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd97abab-893e-4fd6-a523-6a0f889dccbb-combined-ca-bundle\") pod \"swift-proxy-79fb9b7bb8-phv6c\" (UID: \"bd97abab-893e-4fd6-a523-6a0f889dccbb\") " pod="openstack/swift-proxy-79fb9b7bb8-phv6c" Mar 13 01:46:17.963762 master-0 kubenswrapper[19170]: I0313 01:46:17.963740 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bd97abab-893e-4fd6-a523-6a0f889dccbb-etc-swift\") pod \"swift-proxy-79fb9b7bb8-phv6c\" (UID: \"bd97abab-893e-4fd6-a523-6a0f889dccbb\") " pod="openstack/swift-proxy-79fb9b7bb8-phv6c" Mar 13 01:46:17.963988 master-0 kubenswrapper[19170]: I0313 01:46:17.963965 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd97abab-893e-4fd6-a523-6a0f889dccbb-public-tls-certs\") pod \"swift-proxy-79fb9b7bb8-phv6c\" (UID: \"bd97abab-893e-4fd6-a523-6a0f889dccbb\") " pod="openstack/swift-proxy-79fb9b7bb8-phv6c" Mar 13 
01:46:17.964080 master-0 kubenswrapper[19170]: I0313 01:46:17.964061 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd97abab-893e-4fd6-a523-6a0f889dccbb-run-httpd\") pod \"swift-proxy-79fb9b7bb8-phv6c\" (UID: \"bd97abab-893e-4fd6-a523-6a0f889dccbb\") " pod="openstack/swift-proxy-79fb9b7bb8-phv6c" Mar 13 01:46:17.964138 master-0 kubenswrapper[19170]: I0313 01:46:17.964120 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjb9x\" (UniqueName: \"kubernetes.io/projected/bd97abab-893e-4fd6-a523-6a0f889dccbb-kube-api-access-cjb9x\") pod \"swift-proxy-79fb9b7bb8-phv6c\" (UID: \"bd97abab-893e-4fd6-a523-6a0f889dccbb\") " pod="openstack/swift-proxy-79fb9b7bb8-phv6c" Mar 13 01:46:17.964303 master-0 kubenswrapper[19170]: I0313 01:46:17.964287 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd97abab-893e-4fd6-a523-6a0f889dccbb-log-httpd\") pod \"swift-proxy-79fb9b7bb8-phv6c\" (UID: \"bd97abab-893e-4fd6-a523-6a0f889dccbb\") " pod="openstack/swift-proxy-79fb9b7bb8-phv6c" Mar 13 01:46:17.970648 master-0 kubenswrapper[19170]: I0313 01:46:17.966994 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd97abab-893e-4fd6-a523-6a0f889dccbb-internal-tls-certs\") pod \"swift-proxy-79fb9b7bb8-phv6c\" (UID: \"bd97abab-893e-4fd6-a523-6a0f889dccbb\") " pod="openstack/swift-proxy-79fb9b7bb8-phv6c" Mar 13 01:46:17.980813 master-0 kubenswrapper[19170]: I0313 01:46:17.971647 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd97abab-893e-4fd6-a523-6a0f889dccbb-public-tls-certs\") pod \"swift-proxy-79fb9b7bb8-phv6c\" (UID: \"bd97abab-893e-4fd6-a523-6a0f889dccbb\") " pod="openstack/swift-proxy-79fb9b7bb8-phv6c" Mar 13 
01:46:17.980813 master-0 kubenswrapper[19170]: I0313 01:46:17.977013 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd97abab-893e-4fd6-a523-6a0f889dccbb-combined-ca-bundle\") pod \"swift-proxy-79fb9b7bb8-phv6c\" (UID: \"bd97abab-893e-4fd6-a523-6a0f889dccbb\") " pod="openstack/swift-proxy-79fb9b7bb8-phv6c" Mar 13 01:46:17.980813 master-0 kubenswrapper[19170]: I0313 01:46:17.979411 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bd97abab-893e-4fd6-a523-6a0f889dccbb-etc-swift\") pod \"swift-proxy-79fb9b7bb8-phv6c\" (UID: \"bd97abab-893e-4fd6-a523-6a0f889dccbb\") " pod="openstack/swift-proxy-79fb9b7bb8-phv6c" Mar 13 01:46:17.986654 master-0 kubenswrapper[19170]: I0313 01:46:17.984989 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd97abab-893e-4fd6-a523-6a0f889dccbb-run-httpd\") pod \"swift-proxy-79fb9b7bb8-phv6c\" (UID: \"bd97abab-893e-4fd6-a523-6a0f889dccbb\") " pod="openstack/swift-proxy-79fb9b7bb8-phv6c" Mar 13 01:46:18.000962 master-0 kubenswrapper[19170]: I0313 01:46:17.997687 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd97abab-893e-4fd6-a523-6a0f889dccbb-config-data\") pod \"swift-proxy-79fb9b7bb8-phv6c\" (UID: \"bd97abab-893e-4fd6-a523-6a0f889dccbb\") " pod="openstack/swift-proxy-79fb9b7bb8-phv6c" Mar 13 01:46:18.012796 master-0 kubenswrapper[19170]: I0313 01:46:18.008813 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjb9x\" (UniqueName: \"kubernetes.io/projected/bd97abab-893e-4fd6-a523-6a0f889dccbb-kube-api-access-cjb9x\") pod \"swift-proxy-79fb9b7bb8-phv6c\" (UID: \"bd97abab-893e-4fd6-a523-6a0f889dccbb\") " pod="openstack/swift-proxy-79fb9b7bb8-phv6c" Mar 13 01:46:18.175245 master-0 
kubenswrapper[19170]: I0313 01:46:18.174777 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-79fb9b7bb8-phv6c" Mar 13 01:46:18.621116 master-0 kubenswrapper[19170]: I0313 01:46:18.621059 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-mkktw" event={"ID":"b89d6f8a-104e-40ec-bc5d-769918eea937","Type":"ContainerStarted","Data":"f92375d029034361b9fb2a02e2fc6c02f013d38d7084e9604725550c1a4635cc"} Mar 13 01:46:18.832653 master-0 kubenswrapper[19170]: W0313 01:46:18.826994 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd97abab_893e_4fd6_a523_6a0f889dccbb.slice/crio-c2e16b3b8a5a820a5d91384a69dc715473d1e479e1033c14b697cbb39e30da1e WatchSource:0}: Error finding container c2e16b3b8a5a820a5d91384a69dc715473d1e479e1033c14b697cbb39e30da1e: Status 404 returned error can't find the container with id c2e16b3b8a5a820a5d91384a69dc715473d1e479e1033c14b697cbb39e30da1e Mar 13 01:46:18.841417 master-0 kubenswrapper[19170]: I0313 01:46:18.840935 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-79fb9b7bb8-phv6c"] Mar 13 01:46:18.860091 master-0 kubenswrapper[19170]: I0313 01:46:18.859992 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-inspector-db-sync-mkktw" podStartSLOduration=4.096314048 podStartE2EDuration="6.859964329s" podCreationTimestamp="2026-03-13 01:46:12 +0000 UTC" firstStartedPulling="2026-03-13 01:46:15.278059705 +0000 UTC m=+1636.086180665" lastFinishedPulling="2026-03-13 01:46:18.041709986 +0000 UTC m=+1638.849830946" observedRunningTime="2026-03-13 01:46:18.8057205 +0000 UTC m=+1639.613841470" watchObservedRunningTime="2026-03-13 01:46:18.859964329 +0000 UTC m=+1639.668085289" Mar 13 01:46:19.644014 master-0 kubenswrapper[19170]: I0313 01:46:19.640777 19170 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/swift-proxy-79fb9b7bb8-phv6c" event={"ID":"bd97abab-893e-4fd6-a523-6a0f889dccbb","Type":"ContainerStarted","Data":"6e76a6c50054f4065c5d864b317d61561395ee5f5ca891ac55ed9c43932d45bc"} Mar 13 01:46:19.644014 master-0 kubenswrapper[19170]: I0313 01:46:19.640856 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-79fb9b7bb8-phv6c" event={"ID":"bd97abab-893e-4fd6-a523-6a0f889dccbb","Type":"ContainerStarted","Data":"16e491ed620b70f4fecaf3206275be1eccd4dd69c006e30d9295d18e86e71253"} Mar 13 01:46:19.644014 master-0 kubenswrapper[19170]: I0313 01:46:19.640871 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-79fb9b7bb8-phv6c" event={"ID":"bd97abab-893e-4fd6-a523-6a0f889dccbb","Type":"ContainerStarted","Data":"c2e16b3b8a5a820a5d91384a69dc715473d1e479e1033c14b697cbb39e30da1e"} Mar 13 01:46:19.644014 master-0 kubenswrapper[19170]: I0313 01:46:19.641309 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-79fb9b7bb8-phv6c" Mar 13 01:46:19.644014 master-0 kubenswrapper[19170]: I0313 01:46:19.641426 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-79fb9b7bb8-phv6c" Mar 13 01:46:20.031581 master-0 kubenswrapper[19170]: I0313 01:46:20.031500 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-79fb9b7bb8-phv6c" podStartSLOduration=3.031480702 podStartE2EDuration="3.031480702s" podCreationTimestamp="2026-03-13 01:46:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:46:20.013006001 +0000 UTC m=+1640.821126961" watchObservedRunningTime="2026-03-13 01:46:20.031480702 +0000 UTC m=+1640.839601662" Mar 13 01:46:20.684649 master-0 kubenswrapper[19170]: I0313 01:46:20.681089 19170 generic.go:334] "Generic (PLEG): container finished" 
podID="b89d6f8a-104e-40ec-bc5d-769918eea937" containerID="f92375d029034361b9fb2a02e2fc6c02f013d38d7084e9604725550c1a4635cc" exitCode=0 Mar 13 01:46:20.684649 master-0 kubenswrapper[19170]: I0313 01:46:20.682202 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-mkktw" event={"ID":"b89d6f8a-104e-40ec-bc5d-769918eea937","Type":"ContainerDied","Data":"f92375d029034361b9fb2a02e2fc6c02f013d38d7084e9604725550c1a4635cc"} Mar 13 01:46:22.455713 master-0 kubenswrapper[19170]: I0313 01:46:22.455566 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-59fb7b67df-dqvjc" Mar 13 01:46:22.618256 master-0 kubenswrapper[19170]: I0313 01:46:22.618127 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5df95cbc76-9kdvv"] Mar 13 01:46:22.618554 master-0 kubenswrapper[19170]: I0313 01:46:22.618514 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5df95cbc76-9kdvv" podUID="28caadd7-b003-4cb7-83d7-508a641ad738" containerName="neutron-api" containerID="cri-o://35334b7bb9a526cd0fec9834e9c7d8199fbdf00ef6f89d2ad12408e67b450d4b" gracePeriod=30 Mar 13 01:46:22.619160 master-0 kubenswrapper[19170]: I0313 01:46:22.619113 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5df95cbc76-9kdvv" podUID="28caadd7-b003-4cb7-83d7-508a641ad738" containerName="neutron-httpd" containerID="cri-o://44a94b74db9eb44c8ae267432b09deff59745638400632437a0c42658756ff5c" gracePeriod=30 Mar 13 01:46:24.248670 master-0 kubenswrapper[19170]: I0313 01:46:24.246764 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b9844-default-external-api-0"] Mar 13 01:46:24.248670 master-0 kubenswrapper[19170]: I0313 01:46:24.247032 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-b9844-default-external-api-0" 
podUID="979e6bc4-2aa2-4326-b7f2-c45f50b41c28" containerName="glance-log" containerID="cri-o://5e01aa9d899d75b739283495ae642b747304de2ceb477f26cbcca8b0085e0c92" gracePeriod=30 Mar 13 01:46:24.248670 master-0 kubenswrapper[19170]: I0313 01:46:24.247560 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-b9844-default-external-api-0" podUID="979e6bc4-2aa2-4326-b7f2-c45f50b41c28" containerName="glance-httpd" containerID="cri-o://0c2095656099b3b45238a1bf4e00a44679bb9446b938e58a90f59baf2f39dfba" gracePeriod=30 Mar 13 01:46:25.994784 master-0 kubenswrapper[19170]: I0313 01:46:25.994741 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b9844-default-internal-api-0"] Mar 13 01:46:25.995624 master-0 kubenswrapper[19170]: I0313 01:46:25.995598 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-b9844-default-internal-api-0" podUID="af162e22-de53-4f80-a7a9-877bda3e9740" containerName="glance-log" containerID="cri-o://c632611af070ea18c1e344e383398eb132439553e422bc44171a9cf4e56b8ec1" gracePeriod=30 Mar 13 01:46:25.999934 master-0 kubenswrapper[19170]: I0313 01:46:25.996043 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-b9844-default-internal-api-0" podUID="af162e22-de53-4f80-a7a9-877bda3e9740" containerName="glance-httpd" containerID="cri-o://576b62567482197d09934570c36b691fedbd52b01043c1e3d7665d2eada20ed5" gracePeriod=30 Mar 13 01:46:26.034589 master-0 kubenswrapper[19170]: I0313 01:46:26.034213 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6d877f97b-4xkgq" Mar 13 01:46:26.064325 master-0 kubenswrapper[19170]: I0313 01:46:26.058214 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6d877f97b-4xkgq" Mar 13 01:46:26.193243 master-0 kubenswrapper[19170]: I0313 01:46:26.193174 19170 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/placement-769f4cdbc8-8mz4m"] Mar 13 01:46:26.194279 master-0 kubenswrapper[19170]: I0313 01:46:26.193561 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-769f4cdbc8-8mz4m" podUID="302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7" containerName="placement-log" containerID="cri-o://eada95478f376b51f72504d808ebcc6bb95b656dd0f9a7257ed10e59c9d8f2f5" gracePeriod=30 Mar 13 01:46:26.194279 master-0 kubenswrapper[19170]: I0313 01:46:26.193751 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-769f4cdbc8-8mz4m" podUID="302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7" containerName="placement-api" containerID="cri-o://c5e6417ef0054652c8fb4b74c37596118e7448530a224c9701dcdb46b0eaaeb8" gracePeriod=30 Mar 13 01:46:28.179778 master-0 kubenswrapper[19170]: I0313 01:46:28.179750 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-79fb9b7bb8-phv6c" Mar 13 01:46:28.182551 master-0 kubenswrapper[19170]: I0313 01:46:28.182346 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-79fb9b7bb8-phv6c" Mar 13 01:46:28.326742 master-0 kubenswrapper[19170]: I0313 01:46:28.326690 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-db-sync-mkktw" Mar 13 01:46:28.414460 master-0 kubenswrapper[19170]: I0313 01:46:28.413659 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/b89d6f8a-104e-40ec-bc5d-769918eea937-etc-podinfo\") pod \"b89d6f8a-104e-40ec-bc5d-769918eea937\" (UID: \"b89d6f8a-104e-40ec-bc5d-769918eea937\") " Mar 13 01:46:28.414460 master-0 kubenswrapper[19170]: I0313 01:46:28.413713 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b89d6f8a-104e-40ec-bc5d-769918eea937-config\") pod \"b89d6f8a-104e-40ec-bc5d-769918eea937\" (UID: \"b89d6f8a-104e-40ec-bc5d-769918eea937\") " Mar 13 01:46:28.414460 master-0 kubenswrapper[19170]: I0313 01:46:28.413795 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b89d6f8a-104e-40ec-bc5d-769918eea937-scripts\") pod \"b89d6f8a-104e-40ec-bc5d-769918eea937\" (UID: \"b89d6f8a-104e-40ec-bc5d-769918eea937\") " Mar 13 01:46:28.414460 master-0 kubenswrapper[19170]: I0313 01:46:28.413868 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8fl6\" (UniqueName: \"kubernetes.io/projected/b89d6f8a-104e-40ec-bc5d-769918eea937-kube-api-access-t8fl6\") pod \"b89d6f8a-104e-40ec-bc5d-769918eea937\" (UID: \"b89d6f8a-104e-40ec-bc5d-769918eea937\") " Mar 13 01:46:28.414460 master-0 kubenswrapper[19170]: I0313 01:46:28.413930 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/b89d6f8a-104e-40ec-bc5d-769918eea937-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"b89d6f8a-104e-40ec-bc5d-769918eea937\" (UID: \"b89d6f8a-104e-40ec-bc5d-769918eea937\") " Mar 13 01:46:28.414460 master-0 kubenswrapper[19170]: I0313 
01:46:28.414104 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b89d6f8a-104e-40ec-bc5d-769918eea937-combined-ca-bundle\") pod \"b89d6f8a-104e-40ec-bc5d-769918eea937\" (UID: \"b89d6f8a-104e-40ec-bc5d-769918eea937\") " Mar 13 01:46:28.414460 master-0 kubenswrapper[19170]: I0313 01:46:28.414130 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/b89d6f8a-104e-40ec-bc5d-769918eea937-var-lib-ironic\") pod \"b89d6f8a-104e-40ec-bc5d-769918eea937\" (UID: \"b89d6f8a-104e-40ec-bc5d-769918eea937\") " Mar 13 01:46:28.415886 master-0 kubenswrapper[19170]: I0313 01:46:28.415769 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b89d6f8a-104e-40ec-bc5d-769918eea937-var-lib-ironic" (OuterVolumeSpecName: "var-lib-ironic") pod "b89d6f8a-104e-40ec-bc5d-769918eea937" (UID: "b89d6f8a-104e-40ec-bc5d-769918eea937"). InnerVolumeSpecName "var-lib-ironic". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 01:46:28.418800 master-0 kubenswrapper[19170]: I0313 01:46:28.418661 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b89d6f8a-104e-40ec-bc5d-769918eea937-var-lib-ironic-inspector-dhcp-hostsdir" (OuterVolumeSpecName: "var-lib-ironic-inspector-dhcp-hostsdir") pod "b89d6f8a-104e-40ec-bc5d-769918eea937" (UID: "b89d6f8a-104e-40ec-bc5d-769918eea937"). InnerVolumeSpecName "var-lib-ironic-inspector-dhcp-hostsdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 01:46:28.422357 master-0 kubenswrapper[19170]: I0313 01:46:28.422027 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/b89d6f8a-104e-40ec-bc5d-769918eea937-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "b89d6f8a-104e-40ec-bc5d-769918eea937" (UID: "b89d6f8a-104e-40ec-bc5d-769918eea937"). InnerVolumeSpecName "etc-podinfo". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 13 01:46:28.422357 master-0 kubenswrapper[19170]: I0313 01:46:28.422268 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b89d6f8a-104e-40ec-bc5d-769918eea937-kube-api-access-t8fl6" (OuterVolumeSpecName: "kube-api-access-t8fl6") pod "b89d6f8a-104e-40ec-bc5d-769918eea937" (UID: "b89d6f8a-104e-40ec-bc5d-769918eea937"). InnerVolumeSpecName "kube-api-access-t8fl6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:46:28.438119 master-0 kubenswrapper[19170]: I0313 01:46:28.435895 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b89d6f8a-104e-40ec-bc5d-769918eea937-scripts" (OuterVolumeSpecName: "scripts") pod "b89d6f8a-104e-40ec-bc5d-769918eea937" (UID: "b89d6f8a-104e-40ec-bc5d-769918eea937"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:46:28.456706 master-0 kubenswrapper[19170]: I0313 01:46:28.455607 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b89d6f8a-104e-40ec-bc5d-769918eea937-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b89d6f8a-104e-40ec-bc5d-769918eea937" (UID: "b89d6f8a-104e-40ec-bc5d-769918eea937"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:46:28.479800 master-0 kubenswrapper[19170]: I0313 01:46:28.479743 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b89d6f8a-104e-40ec-bc5d-769918eea937-config" (OuterVolumeSpecName: "config") pod "b89d6f8a-104e-40ec-bc5d-769918eea937" (UID: "b89d6f8a-104e-40ec-bc5d-769918eea937"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:46:28.518374 master-0 kubenswrapper[19170]: I0313 01:46:28.517916 19170 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b89d6f8a-104e-40ec-bc5d-769918eea937-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:28.518374 master-0 kubenswrapper[19170]: I0313 01:46:28.517954 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8fl6\" (UniqueName: \"kubernetes.io/projected/b89d6f8a-104e-40ec-bc5d-769918eea937-kube-api-access-t8fl6\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:28.518374 master-0 kubenswrapper[19170]: I0313 01:46:28.517966 19170 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/b89d6f8a-104e-40ec-bc5d-769918eea937-var-lib-ironic-inspector-dhcp-hostsdir\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:28.518374 master-0 kubenswrapper[19170]: I0313 01:46:28.517977 19170 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b89d6f8a-104e-40ec-bc5d-769918eea937-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:28.518374 master-0 kubenswrapper[19170]: I0313 01:46:28.517988 19170 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/b89d6f8a-104e-40ec-bc5d-769918eea937-var-lib-ironic\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:28.518374 master-0 
kubenswrapper[19170]: I0313 01:46:28.517997 19170 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/b89d6f8a-104e-40ec-bc5d-769918eea937-etc-podinfo\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:28.518374 master-0 kubenswrapper[19170]: I0313 01:46:28.518008 19170 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b89d6f8a-104e-40ec-bc5d-769918eea937-config\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:28.825676 master-0 kubenswrapper[19170]: I0313 01:46:28.824649 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-sync-mkktw" Mar 13 01:46:28.825676 master-0 kubenswrapper[19170]: I0313 01:46:28.824609 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-mkktw" event={"ID":"b89d6f8a-104e-40ec-bc5d-769918eea937","Type":"ContainerDied","Data":"23bac7f8871101c4544cdbd10784a5508c3182b7ef4da441251b52e5f903646c"} Mar 13 01:46:28.825676 master-0 kubenswrapper[19170]: I0313 01:46:28.824808 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23bac7f8871101c4544cdbd10784a5508c3182b7ef4da441251b52e5f903646c" Mar 13 01:46:28.827515 master-0 kubenswrapper[19170]: I0313 01:46:28.827469 19170 generic.go:334] "Generic (PLEG): container finished" podID="979e6bc4-2aa2-4326-b7f2-c45f50b41c28" containerID="0c2095656099b3b45238a1bf4e00a44679bb9446b938e58a90f59baf2f39dfba" exitCode=0 Mar 13 01:46:28.827515 master-0 kubenswrapper[19170]: I0313 01:46:28.827507 19170 generic.go:334] "Generic (PLEG): container finished" podID="979e6bc4-2aa2-4326-b7f2-c45f50b41c28" containerID="5e01aa9d899d75b739283495ae642b747304de2ceb477f26cbcca8b0085e0c92" exitCode=143 Mar 13 01:46:28.827616 master-0 kubenswrapper[19170]: I0313 01:46:28.827546 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-b9844-default-external-api-0" event={"ID":"979e6bc4-2aa2-4326-b7f2-c45f50b41c28","Type":"ContainerDied","Data":"0c2095656099b3b45238a1bf4e00a44679bb9446b938e58a90f59baf2f39dfba"} Mar 13 01:46:28.827616 master-0 kubenswrapper[19170]: I0313 01:46:28.827575 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b9844-default-external-api-0" event={"ID":"979e6bc4-2aa2-4326-b7f2-c45f50b41c28","Type":"ContainerDied","Data":"5e01aa9d899d75b739283495ae642b747304de2ceb477f26cbcca8b0085e0c92"} Mar 13 01:46:28.833111 master-0 kubenswrapper[19170]: I0313 01:46:28.833078 19170 generic.go:334] "Generic (PLEG): container finished" podID="302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7" containerID="eada95478f376b51f72504d808ebcc6bb95b656dd0f9a7257ed10e59c9d8f2f5" exitCode=143 Mar 13 01:46:28.833210 master-0 kubenswrapper[19170]: I0313 01:46:28.833164 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-769f4cdbc8-8mz4m" event={"ID":"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7","Type":"ContainerDied","Data":"eada95478f376b51f72504d808ebcc6bb95b656dd0f9a7257ed10e59c9d8f2f5"} Mar 13 01:46:28.836330 master-0 kubenswrapper[19170]: I0313 01:46:28.836275 19170 generic.go:334] "Generic (PLEG): container finished" podID="28caadd7-b003-4cb7-83d7-508a641ad738" containerID="44a94b74db9eb44c8ae267432b09deff59745638400632437a0c42658756ff5c" exitCode=0 Mar 13 01:46:28.836330 master-0 kubenswrapper[19170]: I0313 01:46:28.836300 19170 generic.go:334] "Generic (PLEG): container finished" podID="28caadd7-b003-4cb7-83d7-508a641ad738" containerID="35334b7bb9a526cd0fec9834e9c7d8199fbdf00ef6f89d2ad12408e67b450d4b" exitCode=0 Mar 13 01:46:28.836433 master-0 kubenswrapper[19170]: I0313 01:46:28.836335 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5df95cbc76-9kdvv" event={"ID":"28caadd7-b003-4cb7-83d7-508a641ad738","Type":"ContainerDied","Data":"44a94b74db9eb44c8ae267432b09deff59745638400632437a0c42658756ff5c"} Mar 13 
01:46:28.836433 master-0 kubenswrapper[19170]: I0313 01:46:28.836367 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5df95cbc76-9kdvv" event={"ID":"28caadd7-b003-4cb7-83d7-508a641ad738","Type":"ContainerDied","Data":"35334b7bb9a526cd0fec9834e9c7d8199fbdf00ef6f89d2ad12408e67b450d4b"} Mar 13 01:46:28.839386 master-0 kubenswrapper[19170]: I0313 01:46:28.839359 19170 generic.go:334] "Generic (PLEG): container finished" podID="af162e22-de53-4f80-a7a9-877bda3e9740" containerID="c632611af070ea18c1e344e383398eb132439553e422bc44171a9cf4e56b8ec1" exitCode=143 Mar 13 01:46:28.840449 master-0 kubenswrapper[19170]: I0313 01:46:28.840424 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b9844-default-internal-api-0" event={"ID":"af162e22-de53-4f80-a7a9-877bda3e9740","Type":"ContainerDied","Data":"c632611af070ea18c1e344e383398eb132439553e422bc44171a9cf4e56b8ec1"} Mar 13 01:46:29.439955 master-0 kubenswrapper[19170]: I0313 01:46:29.439905 19170 scope.go:117] "RemoveContainer" containerID="a7ad73cd94faf0b9a1c94e6e2e18b473b70290d8191812e3554dd658df9e1991" Mar 13 01:46:30.967340 master-0 kubenswrapper[19170]: I0313 01:46:30.963358 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5cb659fff7-csvmp"] Mar 13 01:46:30.967340 master-0 kubenswrapper[19170]: E0313 01:46:30.964206 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b89d6f8a-104e-40ec-bc5d-769918eea937" containerName="ironic-inspector-db-sync" Mar 13 01:46:30.967340 master-0 kubenswrapper[19170]: I0313 01:46:30.964222 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="b89d6f8a-104e-40ec-bc5d-769918eea937" containerName="ironic-inspector-db-sync" Mar 13 01:46:30.967340 master-0 kubenswrapper[19170]: I0313 01:46:30.964534 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="b89d6f8a-104e-40ec-bc5d-769918eea937" containerName="ironic-inspector-db-sync" Mar 13 01:46:30.969958 master-0 
kubenswrapper[19170]: I0313 01:46:30.969670 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cb659fff7-csvmp" Mar 13 01:46:30.994415 master-0 kubenswrapper[19170]: I0313 01:46:30.994178 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cb659fff7-csvmp"] Mar 13 01:46:31.012270 master-0 kubenswrapper[19170]: I0313 01:46:31.010663 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ad5e2c1-f186-4654-ad97-73019827ec1f-config\") pod \"dnsmasq-dns-5cb659fff7-csvmp\" (UID: \"6ad5e2c1-f186-4654-ad97-73019827ec1f\") " pod="openstack/dnsmasq-dns-5cb659fff7-csvmp" Mar 13 01:46:31.012270 master-0 kubenswrapper[19170]: I0313 01:46:31.010808 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6ad5e2c1-f186-4654-ad97-73019827ec1f-dns-swift-storage-0\") pod \"dnsmasq-dns-5cb659fff7-csvmp\" (UID: \"6ad5e2c1-f186-4654-ad97-73019827ec1f\") " pod="openstack/dnsmasq-dns-5cb659fff7-csvmp" Mar 13 01:46:31.015459 master-0 kubenswrapper[19170]: I0313 01:46:31.013573 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ad5e2c1-f186-4654-ad97-73019827ec1f-ovsdbserver-sb\") pod \"dnsmasq-dns-5cb659fff7-csvmp\" (UID: \"6ad5e2c1-f186-4654-ad97-73019827ec1f\") " pod="openstack/dnsmasq-dns-5cb659fff7-csvmp" Mar 13 01:46:31.015459 master-0 kubenswrapper[19170]: I0313 01:46:31.013873 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9h24\" (UniqueName: \"kubernetes.io/projected/6ad5e2c1-f186-4654-ad97-73019827ec1f-kube-api-access-x9h24\") pod \"dnsmasq-dns-5cb659fff7-csvmp\" (UID: \"6ad5e2c1-f186-4654-ad97-73019827ec1f\") " 
pod="openstack/dnsmasq-dns-5cb659fff7-csvmp" Mar 13 01:46:31.015459 master-0 kubenswrapper[19170]: I0313 01:46:31.014034 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ad5e2c1-f186-4654-ad97-73019827ec1f-ovsdbserver-nb\") pod \"dnsmasq-dns-5cb659fff7-csvmp\" (UID: \"6ad5e2c1-f186-4654-ad97-73019827ec1f\") " pod="openstack/dnsmasq-dns-5cb659fff7-csvmp" Mar 13 01:46:31.015459 master-0 kubenswrapper[19170]: I0313 01:46:31.014257 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ad5e2c1-f186-4654-ad97-73019827ec1f-dns-svc\") pod \"dnsmasq-dns-5cb659fff7-csvmp\" (UID: \"6ad5e2c1-f186-4654-ad97-73019827ec1f\") " pod="openstack/dnsmasq-dns-5cb659fff7-csvmp" Mar 13 01:46:31.043756 master-0 kubenswrapper[19170]: I0313 01:46:31.043702 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-0"] Mar 13 01:46:31.058060 master-0 kubenswrapper[19170]: I0313 01:46:31.058011 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-0" Mar 13 01:46:31.060832 master-0 kubenswrapper[19170]: I0313 01:46:31.060785 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-transport-url-ironic-inspector-transport" Mar 13 01:46:31.061321 master-0 kubenswrapper[19170]: I0313 01:46:31.061105 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-config-data" Mar 13 01:46:31.061666 master-0 kubenswrapper[19170]: I0313 01:46:31.061646 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-scripts" Mar 13 01:46:31.070095 master-0 kubenswrapper[19170]: I0313 01:46:31.070054 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"] Mar 13 01:46:31.141590 master-0 kubenswrapper[19170]: I0313 01:46:31.141545 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c4801b65-cb4e-4393-9a89-3a29c3051310-config\") pod \"ironic-inspector-0\" (UID: \"c4801b65-cb4e-4393-9a89-3a29c3051310\") " pod="openstack/ironic-inspector-0" Mar 13 01:46:31.141839 master-0 kubenswrapper[19170]: I0313 01:46:31.141822 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd86r\" (UniqueName: \"kubernetes.io/projected/c4801b65-cb4e-4393-9a89-3a29c3051310-kube-api-access-vd86r\") pod \"ironic-inspector-0\" (UID: \"c4801b65-cb4e-4393-9a89-3a29c3051310\") " pod="openstack/ironic-inspector-0" Mar 13 01:46:31.141924 master-0 kubenswrapper[19170]: I0313 01:46:31.141911 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c4801b65-cb4e-4393-9a89-3a29c3051310-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"c4801b65-cb4e-4393-9a89-3a29c3051310\") " 
pod="openstack/ironic-inspector-0" Mar 13 01:46:31.142000 master-0 kubenswrapper[19170]: I0313 01:46:31.141988 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4801b65-cb4e-4393-9a89-3a29c3051310-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"c4801b65-cb4e-4393-9a89-3a29c3051310\") " pod="openstack/ironic-inspector-0" Mar 13 01:46:31.142073 master-0 kubenswrapper[19170]: I0313 01:46:31.142061 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4801b65-cb4e-4393-9a89-3a29c3051310-scripts\") pod \"ironic-inspector-0\" (UID: \"c4801b65-cb4e-4393-9a89-3a29c3051310\") " pod="openstack/ironic-inspector-0" Mar 13 01:46:31.142147 master-0 kubenswrapper[19170]: I0313 01:46:31.142132 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ad5e2c1-f186-4654-ad97-73019827ec1f-ovsdbserver-sb\") pod \"dnsmasq-dns-5cb659fff7-csvmp\" (UID: \"6ad5e2c1-f186-4654-ad97-73019827ec1f\") " pod="openstack/dnsmasq-dns-5cb659fff7-csvmp" Mar 13 01:46:31.142246 master-0 kubenswrapper[19170]: I0313 01:46:31.142233 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9h24\" (UniqueName: \"kubernetes.io/projected/6ad5e2c1-f186-4654-ad97-73019827ec1f-kube-api-access-x9h24\") pod \"dnsmasq-dns-5cb659fff7-csvmp\" (UID: \"6ad5e2c1-f186-4654-ad97-73019827ec1f\") " pod="openstack/dnsmasq-dns-5cb659fff7-csvmp" Mar 13 01:46:31.142329 master-0 kubenswrapper[19170]: I0313 01:46:31.142314 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/c4801b65-cb4e-4393-9a89-3a29c3051310-var-lib-ironic-inspector-dhcp-hostsdir\") pod 
\"ironic-inspector-0\" (UID: \"c4801b65-cb4e-4393-9a89-3a29c3051310\") " pod="openstack/ironic-inspector-0" Mar 13 01:46:31.142409 master-0 kubenswrapper[19170]: I0313 01:46:31.142397 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/c4801b65-cb4e-4393-9a89-3a29c3051310-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"c4801b65-cb4e-4393-9a89-3a29c3051310\") " pod="openstack/ironic-inspector-0" Mar 13 01:46:31.142485 master-0 kubenswrapper[19170]: I0313 01:46:31.142473 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ad5e2c1-f186-4654-ad97-73019827ec1f-ovsdbserver-nb\") pod \"dnsmasq-dns-5cb659fff7-csvmp\" (UID: \"6ad5e2c1-f186-4654-ad97-73019827ec1f\") " pod="openstack/dnsmasq-dns-5cb659fff7-csvmp" Mar 13 01:46:31.142589 master-0 kubenswrapper[19170]: I0313 01:46:31.142577 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ad5e2c1-f186-4654-ad97-73019827ec1f-dns-svc\") pod \"dnsmasq-dns-5cb659fff7-csvmp\" (UID: \"6ad5e2c1-f186-4654-ad97-73019827ec1f\") " pod="openstack/dnsmasq-dns-5cb659fff7-csvmp" Mar 13 01:46:31.142749 master-0 kubenswrapper[19170]: I0313 01:46:31.142735 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6ad5e2c1-f186-4654-ad97-73019827ec1f-dns-swift-storage-0\") pod \"dnsmasq-dns-5cb659fff7-csvmp\" (UID: \"6ad5e2c1-f186-4654-ad97-73019827ec1f\") " pod="openstack/dnsmasq-dns-5cb659fff7-csvmp" Mar 13 01:46:31.142823 master-0 kubenswrapper[19170]: I0313 01:46:31.142810 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ad5e2c1-f186-4654-ad97-73019827ec1f-config\") pod 
\"dnsmasq-dns-5cb659fff7-csvmp\" (UID: \"6ad5e2c1-f186-4654-ad97-73019827ec1f\") " pod="openstack/dnsmasq-dns-5cb659fff7-csvmp" Mar 13 01:46:31.143724 master-0 kubenswrapper[19170]: I0313 01:46:31.143707 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ad5e2c1-f186-4654-ad97-73019827ec1f-config\") pod \"dnsmasq-dns-5cb659fff7-csvmp\" (UID: \"6ad5e2c1-f186-4654-ad97-73019827ec1f\") " pod="openstack/dnsmasq-dns-5cb659fff7-csvmp" Mar 13 01:46:31.143992 master-0 kubenswrapper[19170]: I0313 01:46:31.143956 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ad5e2c1-f186-4654-ad97-73019827ec1f-ovsdbserver-sb\") pod \"dnsmasq-dns-5cb659fff7-csvmp\" (UID: \"6ad5e2c1-f186-4654-ad97-73019827ec1f\") " pod="openstack/dnsmasq-dns-5cb659fff7-csvmp" Mar 13 01:46:31.144734 master-0 kubenswrapper[19170]: I0313 01:46:31.144701 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ad5e2c1-f186-4654-ad97-73019827ec1f-dns-svc\") pod \"dnsmasq-dns-5cb659fff7-csvmp\" (UID: \"6ad5e2c1-f186-4654-ad97-73019827ec1f\") " pod="openstack/dnsmasq-dns-5cb659fff7-csvmp" Mar 13 01:46:31.145399 master-0 kubenswrapper[19170]: I0313 01:46:31.145382 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6ad5e2c1-f186-4654-ad97-73019827ec1f-dns-swift-storage-0\") pod \"dnsmasq-dns-5cb659fff7-csvmp\" (UID: \"6ad5e2c1-f186-4654-ad97-73019827ec1f\") " pod="openstack/dnsmasq-dns-5cb659fff7-csvmp" Mar 13 01:46:31.150663 master-0 kubenswrapper[19170]: I0313 01:46:31.150576 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ad5e2c1-f186-4654-ad97-73019827ec1f-ovsdbserver-nb\") pod \"dnsmasq-dns-5cb659fff7-csvmp\" (UID: 
\"6ad5e2c1-f186-4654-ad97-73019827ec1f\") " pod="openstack/dnsmasq-dns-5cb659fff7-csvmp" Mar 13 01:46:31.183684 master-0 kubenswrapper[19170]: I0313 01:46:31.182358 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9h24\" (UniqueName: \"kubernetes.io/projected/6ad5e2c1-f186-4654-ad97-73019827ec1f-kube-api-access-x9h24\") pod \"dnsmasq-dns-5cb659fff7-csvmp\" (UID: \"6ad5e2c1-f186-4654-ad97-73019827ec1f\") " pod="openstack/dnsmasq-dns-5cb659fff7-csvmp" Mar 13 01:46:31.245671 master-0 kubenswrapper[19170]: I0313 01:46:31.245493 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c4801b65-cb4e-4393-9a89-3a29c3051310-config\") pod \"ironic-inspector-0\" (UID: \"c4801b65-cb4e-4393-9a89-3a29c3051310\") " pod="openstack/ironic-inspector-0" Mar 13 01:46:31.245945 master-0 kubenswrapper[19170]: I0313 01:46:31.245921 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd86r\" (UniqueName: \"kubernetes.io/projected/c4801b65-cb4e-4393-9a89-3a29c3051310-kube-api-access-vd86r\") pod \"ironic-inspector-0\" (UID: \"c4801b65-cb4e-4393-9a89-3a29c3051310\") " pod="openstack/ironic-inspector-0" Mar 13 01:46:31.246076 master-0 kubenswrapper[19170]: I0313 01:46:31.246056 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c4801b65-cb4e-4393-9a89-3a29c3051310-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"c4801b65-cb4e-4393-9a89-3a29c3051310\") " pod="openstack/ironic-inspector-0" Mar 13 01:46:31.246312 master-0 kubenswrapper[19170]: I0313 01:46:31.246290 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4801b65-cb4e-4393-9a89-3a29c3051310-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"c4801b65-cb4e-4393-9a89-3a29c3051310\") " 
pod="openstack/ironic-inspector-0" Mar 13 01:46:31.246471 master-0 kubenswrapper[19170]: I0313 01:46:31.246420 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4801b65-cb4e-4393-9a89-3a29c3051310-scripts\") pod \"ironic-inspector-0\" (UID: \"c4801b65-cb4e-4393-9a89-3a29c3051310\") " pod="openstack/ironic-inspector-0" Mar 13 01:46:31.247106 master-0 kubenswrapper[19170]: I0313 01:46:31.247059 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/c4801b65-cb4e-4393-9a89-3a29c3051310-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"c4801b65-cb4e-4393-9a89-3a29c3051310\") " pod="openstack/ironic-inspector-0" Mar 13 01:46:31.259326 master-0 kubenswrapper[19170]: I0313 01:46:31.247149 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/c4801b65-cb4e-4393-9a89-3a29c3051310-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"c4801b65-cb4e-4393-9a89-3a29c3051310\") " pod="openstack/ironic-inspector-0" Mar 13 01:46:31.259326 master-0 kubenswrapper[19170]: I0313 01:46:31.247808 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/c4801b65-cb4e-4393-9a89-3a29c3051310-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"c4801b65-cb4e-4393-9a89-3a29c3051310\") " pod="openstack/ironic-inspector-0" Mar 13 01:46:31.259326 master-0 kubenswrapper[19170]: I0313 01:46:31.247989 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/c4801b65-cb4e-4393-9a89-3a29c3051310-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"c4801b65-cb4e-4393-9a89-3a29c3051310\") " 
pod="openstack/ironic-inspector-0" Mar 13 01:46:31.259326 master-0 kubenswrapper[19170]: I0313 01:46:31.250421 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4801b65-cb4e-4393-9a89-3a29c3051310-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"c4801b65-cb4e-4393-9a89-3a29c3051310\") " pod="openstack/ironic-inspector-0" Mar 13 01:46:31.259326 master-0 kubenswrapper[19170]: I0313 01:46:31.252588 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c4801b65-cb4e-4393-9a89-3a29c3051310-config\") pod \"ironic-inspector-0\" (UID: \"c4801b65-cb4e-4393-9a89-3a29c3051310\") " pod="openstack/ironic-inspector-0" Mar 13 01:46:31.259326 master-0 kubenswrapper[19170]: I0313 01:46:31.257656 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c4801b65-cb4e-4393-9a89-3a29c3051310-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"c4801b65-cb4e-4393-9a89-3a29c3051310\") " pod="openstack/ironic-inspector-0" Mar 13 01:46:31.262764 master-0 kubenswrapper[19170]: I0313 01:46:31.262731 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4801b65-cb4e-4393-9a89-3a29c3051310-scripts\") pod \"ironic-inspector-0\" (UID: \"c4801b65-cb4e-4393-9a89-3a29c3051310\") " pod="openstack/ironic-inspector-0" Mar 13 01:46:31.268475 master-0 kubenswrapper[19170]: I0313 01:46:31.266746 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd86r\" (UniqueName: \"kubernetes.io/projected/c4801b65-cb4e-4393-9a89-3a29c3051310-kube-api-access-vd86r\") pod \"ironic-inspector-0\" (UID: \"c4801b65-cb4e-4393-9a89-3a29c3051310\") " pod="openstack/ironic-inspector-0" Mar 13 01:46:31.328119 master-0 kubenswrapper[19170]: I0313 01:46:31.328061 19170 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cb659fff7-csvmp" Mar 13 01:46:31.394451 master-0 kubenswrapper[19170]: I0313 01:46:31.394405 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0" Mar 13 01:46:34.260359 master-0 kubenswrapper[19170]: I0313 01:46:34.260313 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:46:34.269939 master-0 kubenswrapper[19170]: I0313 01:46:34.269919 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5df95cbc76-9kdvv" Mar 13 01:46:34.359867 master-0 kubenswrapper[19170]: I0313 01:46:34.357210 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-httpd-run\") pod \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\" (UID: \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\") " Mar 13 01:46:34.359867 master-0 kubenswrapper[19170]: I0313 01:46:34.357445 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-public-tls-certs\") pod \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\" (UID: \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\") " Mar 13 01:46:34.359867 master-0 kubenswrapper[19170]: I0313 01:46:34.359701 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "979e6bc4-2aa2-4326-b7f2-c45f50b41c28" (UID: "979e6bc4-2aa2-4326-b7f2-c45f50b41c28"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 01:46:34.371983 master-0 kubenswrapper[19170]: I0313 01:46:34.371942 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^deb39bd4-fea6-43ec-aeb4-a117c91529eb\") pod \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\" (UID: \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\") " Mar 13 01:46:34.372072 master-0 kubenswrapper[19170]: I0313 01:46:34.372023 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cfhk\" (UniqueName: \"kubernetes.io/projected/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-kube-api-access-4cfhk\") pod \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\" (UID: \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\") " Mar 13 01:46:34.372072 master-0 kubenswrapper[19170]: I0313 01:46:34.372055 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-config-data\") pod \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\" (UID: \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\") " Mar 13 01:46:34.372147 master-0 kubenswrapper[19170]: I0313 01:46:34.372082 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/28caadd7-b003-4cb7-83d7-508a641ad738-httpd-config\") pod \"28caadd7-b003-4cb7-83d7-508a641ad738\" (UID: \"28caadd7-b003-4cb7-83d7-508a641ad738\") " Mar 13 01:46:34.372147 master-0 kubenswrapper[19170]: I0313 01:46:34.372103 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-combined-ca-bundle\") pod \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\" (UID: \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\") " Mar 13 01:46:34.372209 master-0 kubenswrapper[19170]: I0313 01:46:34.372179 19170 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-74m6d\" (UniqueName: \"kubernetes.io/projected/28caadd7-b003-4cb7-83d7-508a641ad738-kube-api-access-74m6d\") pod \"28caadd7-b003-4cb7-83d7-508a641ad738\" (UID: \"28caadd7-b003-4cb7-83d7-508a641ad738\") " Mar 13 01:46:34.373451 master-0 kubenswrapper[19170]: I0313 01:46:34.372252 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-logs\") pod \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\" (UID: \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\") " Mar 13 01:46:34.373451 master-0 kubenswrapper[19170]: I0313 01:46:34.372333 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/28caadd7-b003-4cb7-83d7-508a641ad738-ovndb-tls-certs\") pod \"28caadd7-b003-4cb7-83d7-508a641ad738\" (UID: \"28caadd7-b003-4cb7-83d7-508a641ad738\") " Mar 13 01:46:34.373451 master-0 kubenswrapper[19170]: I0313 01:46:34.372354 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28caadd7-b003-4cb7-83d7-508a641ad738-combined-ca-bundle\") pod \"28caadd7-b003-4cb7-83d7-508a641ad738\" (UID: \"28caadd7-b003-4cb7-83d7-508a641ad738\") " Mar 13 01:46:34.373451 master-0 kubenswrapper[19170]: I0313 01:46:34.372429 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-scripts\") pod \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\" (UID: \"979e6bc4-2aa2-4326-b7f2-c45f50b41c28\") " Mar 13 01:46:34.373451 master-0 kubenswrapper[19170]: I0313 01:46:34.372480 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/28caadd7-b003-4cb7-83d7-508a641ad738-config\") pod 
\"28caadd7-b003-4cb7-83d7-508a641ad738\" (UID: \"28caadd7-b003-4cb7-83d7-508a641ad738\") " Mar 13 01:46:34.373451 master-0 kubenswrapper[19170]: I0313 01:46:34.372706 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-logs" (OuterVolumeSpecName: "logs") pod "979e6bc4-2aa2-4326-b7f2-c45f50b41c28" (UID: "979e6bc4-2aa2-4326-b7f2-c45f50b41c28"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 01:46:34.383101 master-0 kubenswrapper[19170]: I0313 01:46:34.380890 19170 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-httpd-run\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:34.383101 master-0 kubenswrapper[19170]: I0313 01:46:34.380921 19170 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-logs\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:34.395647 master-0 kubenswrapper[19170]: I0313 01:46:34.393715 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28caadd7-b003-4cb7-83d7-508a641ad738-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "28caadd7-b003-4cb7-83d7-508a641ad738" (UID: "28caadd7-b003-4cb7-83d7-508a641ad738"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:46:34.402010 master-0 kubenswrapper[19170]: I0313 01:46:34.401896 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-kube-api-access-4cfhk" (OuterVolumeSpecName: "kube-api-access-4cfhk") pod "979e6bc4-2aa2-4326-b7f2-c45f50b41c28" (UID: "979e6bc4-2aa2-4326-b7f2-c45f50b41c28"). InnerVolumeSpecName "kube-api-access-4cfhk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:46:34.412928 master-0 kubenswrapper[19170]: I0313 01:46:34.403563 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-scripts" (OuterVolumeSpecName: "scripts") pod "979e6bc4-2aa2-4326-b7f2-c45f50b41c28" (UID: "979e6bc4-2aa2-4326-b7f2-c45f50b41c28"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:46:34.431960 master-0 kubenswrapper[19170]: I0313 01:46:34.431902 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28caadd7-b003-4cb7-83d7-508a641ad738-kube-api-access-74m6d" (OuterVolumeSpecName: "kube-api-access-74m6d") pod "28caadd7-b003-4cb7-83d7-508a641ad738" (UID: "28caadd7-b003-4cb7-83d7-508a641ad738"). InnerVolumeSpecName "kube-api-access-74m6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:46:34.440595 master-0 kubenswrapper[19170]: I0313 01:46:34.439568 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^deb39bd4-fea6-43ec-aeb4-a117c91529eb" (OuterVolumeSpecName: "glance") pod "979e6bc4-2aa2-4326-b7f2-c45f50b41c28" (UID: "979e6bc4-2aa2-4326-b7f2-c45f50b41c28"). InnerVolumeSpecName "pvc-60c39735-cd4e-4baf-8a95-3babad891e79". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 13 01:46:34.483784 master-0 kubenswrapper[19170]: I0313 01:46:34.483105 19170 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:34.483784 master-0 kubenswrapper[19170]: I0313 01:46:34.483163 19170 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-60c39735-cd4e-4baf-8a95-3babad891e79\" (UniqueName: \"kubernetes.io/csi/topolvm.io^deb39bd4-fea6-43ec-aeb4-a117c91529eb\") on node \"master-0\" " Mar 13 01:46:34.483784 master-0 kubenswrapper[19170]: I0313 01:46:34.483177 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cfhk\" (UniqueName: \"kubernetes.io/projected/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-kube-api-access-4cfhk\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:34.483784 master-0 kubenswrapper[19170]: I0313 01:46:34.483191 19170 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/28caadd7-b003-4cb7-83d7-508a641ad738-httpd-config\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:34.483784 master-0 kubenswrapper[19170]: I0313 01:46:34.483201 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74m6d\" (UniqueName: \"kubernetes.io/projected/28caadd7-b003-4cb7-83d7-508a641ad738-kube-api-access-74m6d\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:34.582503 master-0 kubenswrapper[19170]: I0313 01:46:34.582465 19170 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 13 01:46:34.582735 master-0 kubenswrapper[19170]: I0313 01:46:34.582612 19170 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-60c39735-cd4e-4baf-8a95-3babad891e79" (UniqueName: "kubernetes.io/csi/topolvm.io^deb39bd4-fea6-43ec-aeb4-a117c91529eb") on node "master-0" Mar 13 01:46:34.585232 master-0 kubenswrapper[19170]: I0313 01:46:34.585194 19170 reconciler_common.go:293] "Volume detached for volume \"pvc-60c39735-cd4e-4baf-8a95-3babad891e79\" (UniqueName: \"kubernetes.io/csi/topolvm.io^deb39bd4-fea6-43ec-aeb4-a117c91529eb\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:34.587786 master-0 kubenswrapper[19170]: I0313 01:46:34.587708 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "979e6bc4-2aa2-4326-b7f2-c45f50b41c28" (UID: "979e6bc4-2aa2-4326-b7f2-c45f50b41c28"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:46:34.588200 master-0 kubenswrapper[19170]: I0313 01:46:34.588135 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "979e6bc4-2aa2-4326-b7f2-c45f50b41c28" (UID: "979e6bc4-2aa2-4326-b7f2-c45f50b41c28"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:46:34.589574 master-0 kubenswrapper[19170]: I0313 01:46:34.589533 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28caadd7-b003-4cb7-83d7-508a641ad738-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28caadd7-b003-4cb7-83d7-508a641ad738" (UID: "28caadd7-b003-4cb7-83d7-508a641ad738"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:46:34.619890 master-0 kubenswrapper[19170]: I0313 01:46:34.619835 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-config-data" (OuterVolumeSpecName: "config-data") pod "979e6bc4-2aa2-4326-b7f2-c45f50b41c28" (UID: "979e6bc4-2aa2-4326-b7f2-c45f50b41c28"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:46:34.624824 master-0 kubenswrapper[19170]: I0313 01:46:34.623206 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28caadd7-b003-4cb7-83d7-508a641ad738-config" (OuterVolumeSpecName: "config") pod "28caadd7-b003-4cb7-83d7-508a641ad738" (UID: "28caadd7-b003-4cb7-83d7-508a641ad738"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:46:34.657965 master-0 kubenswrapper[19170]: I0313 01:46:34.654071 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28caadd7-b003-4cb7-83d7-508a641ad738-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "28caadd7-b003-4cb7-83d7-508a641ad738" (UID: "28caadd7-b003-4cb7-83d7-508a641ad738"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:46:34.688973 master-0 kubenswrapper[19170]: I0313 01:46:34.688905 19170 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-config-data\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:34.689190 master-0 kubenswrapper[19170]: I0313 01:46:34.688992 19170 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:34.689190 master-0 kubenswrapper[19170]: I0313 01:46:34.689030 19170 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/28caadd7-b003-4cb7-83d7-508a641ad738-ovndb-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:34.689190 master-0 kubenswrapper[19170]: I0313 01:46:34.689043 19170 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28caadd7-b003-4cb7-83d7-508a641ad738-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:34.689190 master-0 kubenswrapper[19170]: I0313 01:46:34.689083 19170 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/28caadd7-b003-4cb7-83d7-508a641ad738-config\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:34.689190 master-0 kubenswrapper[19170]: I0313 01:46:34.689097 19170 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/979e6bc4-2aa2-4326-b7f2-c45f50b41c28-public-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:34.939697 master-0 kubenswrapper[19170]: I0313 01:46:34.935519 19170 generic.go:334] "Generic (PLEG): container finished" podID="af162e22-de53-4f80-a7a9-877bda3e9740" 
containerID="576b62567482197d09934570c36b691fedbd52b01043c1e3d7665d2eada20ed5" exitCode=0 Mar 13 01:46:34.939697 master-0 kubenswrapper[19170]: I0313 01:46:34.935604 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b9844-default-internal-api-0" event={"ID":"af162e22-de53-4f80-a7a9-877bda3e9740","Type":"ContainerDied","Data":"576b62567482197d09934570c36b691fedbd52b01043c1e3d7665d2eada20ed5"} Mar 13 01:46:34.939697 master-0 kubenswrapper[19170]: I0313 01:46:34.938779 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-84f79fc9fc-7t7j5" event={"ID":"e18f1449-3fb3-43a8-98cd-2f8713bba98d","Type":"ContainerStarted","Data":"785df9d971ec307c40ad4f70ddb46d3513b6b5f203ee468f88f6ed2a7f33997b"} Mar 13 01:46:34.939697 master-0 kubenswrapper[19170]: I0313 01:46:34.939078 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-84f79fc9fc-7t7j5" Mar 13 01:46:34.941747 master-0 kubenswrapper[19170]: I0313 01:46:34.940998 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"75b17b23-b02f-44af-96a0-6d0960662565","Type":"ContainerStarted","Data":"d0cb8134e4f68fa21356681f59a226646638e9516522cf2d5c5510a6b16df204"} Mar 13 01:46:34.945001 master-0 kubenswrapper[19170]: I0313 01:46:34.944949 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b9844-default-external-api-0" event={"ID":"979e6bc4-2aa2-4326-b7f2-c45f50b41c28","Type":"ContainerDied","Data":"e02e3404f42e9542570be268f76aa082a1bc70e35e1e9b025dfba6bbec1ce0f7"} Mar 13 01:46:34.945132 master-0 kubenswrapper[19170]: I0313 01:46:34.945009 19170 scope.go:117] "RemoveContainer" containerID="0c2095656099b3b45238a1bf4e00a44679bb9446b938e58a90f59baf2f39dfba" Mar 13 01:46:34.945557 master-0 kubenswrapper[19170]: I0313 01:46:34.945526 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:46:34.948254 master-0 kubenswrapper[19170]: I0313 01:46:34.948223 19170 generic.go:334] "Generic (PLEG): container finished" podID="302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7" containerID="c5e6417ef0054652c8fb4b74c37596118e7448530a224c9701dcdb46b0eaaeb8" exitCode=0 Mar 13 01:46:34.948361 master-0 kubenswrapper[19170]: I0313 01:46:34.948314 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-769f4cdbc8-8mz4m" event={"ID":"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7","Type":"ContainerDied","Data":"c5e6417ef0054652c8fb4b74c37596118e7448530a224c9701dcdb46b0eaaeb8"} Mar 13 01:46:34.951420 master-0 kubenswrapper[19170]: I0313 01:46:34.951290 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5df95cbc76-9kdvv" event={"ID":"28caadd7-b003-4cb7-83d7-508a641ad738","Type":"ContainerDied","Data":"87572d1e75373b253b42c5ac1f13c35d9b32c2d624682ebc05df8a24aa9957f5"} Mar 13 01:46:34.951420 master-0 kubenswrapper[19170]: I0313 01:46:34.951379 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5df95cbc76-9kdvv" Mar 13 01:46:35.140491 master-0 kubenswrapper[19170]: I0313 01:46:35.109004 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-m9bfp"] Mar 13 01:46:35.140491 master-0 kubenswrapper[19170]: E0313 01:46:35.109756 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28caadd7-b003-4cb7-83d7-508a641ad738" containerName="neutron-httpd" Mar 13 01:46:35.140491 master-0 kubenswrapper[19170]: I0313 01:46:35.109773 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="28caadd7-b003-4cb7-83d7-508a641ad738" containerName="neutron-httpd" Mar 13 01:46:35.140491 master-0 kubenswrapper[19170]: E0313 01:46:35.109791 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="979e6bc4-2aa2-4326-b7f2-c45f50b41c28" containerName="glance-log" Mar 13 01:46:35.140491 master-0 kubenswrapper[19170]: I0313 01:46:35.109802 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="979e6bc4-2aa2-4326-b7f2-c45f50b41c28" containerName="glance-log" Mar 13 01:46:35.140491 master-0 kubenswrapper[19170]: E0313 01:46:35.109853 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28caadd7-b003-4cb7-83d7-508a641ad738" containerName="neutron-api" Mar 13 01:46:35.140491 master-0 kubenswrapper[19170]: I0313 01:46:35.109860 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="28caadd7-b003-4cb7-83d7-508a641ad738" containerName="neutron-api" Mar 13 01:46:35.140491 master-0 kubenswrapper[19170]: E0313 01:46:35.109870 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="979e6bc4-2aa2-4326-b7f2-c45f50b41c28" containerName="glance-httpd" Mar 13 01:46:35.140491 master-0 kubenswrapper[19170]: I0313 01:46:35.109878 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="979e6bc4-2aa2-4326-b7f2-c45f50b41c28" containerName="glance-httpd" Mar 13 01:46:35.140491 master-0 kubenswrapper[19170]: I0313 01:46:35.110143 19170 
memory_manager.go:354] "RemoveStaleState removing state" podUID="28caadd7-b003-4cb7-83d7-508a641ad738" containerName="neutron-httpd" Mar 13 01:46:35.140491 master-0 kubenswrapper[19170]: I0313 01:46:35.110180 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="28caadd7-b003-4cb7-83d7-508a641ad738" containerName="neutron-api" Mar 13 01:46:35.140491 master-0 kubenswrapper[19170]: I0313 01:46:35.110202 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="979e6bc4-2aa2-4326-b7f2-c45f50b41c28" containerName="glance-httpd" Mar 13 01:46:35.140491 master-0 kubenswrapper[19170]: I0313 01:46:35.110214 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="979e6bc4-2aa2-4326-b7f2-c45f50b41c28" containerName="glance-log" Mar 13 01:46:35.140491 master-0 kubenswrapper[19170]: I0313 01:46:35.111253 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-m9bfp" Mar 13 01:46:35.140491 master-0 kubenswrapper[19170]: I0313 01:46:35.139495 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cb659fff7-csvmp"] Mar 13 01:46:35.208739 master-0 kubenswrapper[19170]: I0313 01:46:35.207070 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4131c57a-5598-4c4f-b4d1-ee12368c9b85-operator-scripts\") pod \"nova-api-db-create-m9bfp\" (UID: \"4131c57a-5598-4c4f-b4d1-ee12368c9b85\") " pod="openstack/nova-api-db-create-m9bfp" Mar 13 01:46:35.208739 master-0 kubenswrapper[19170]: I0313 01:46:35.207428 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdjb8\" (UniqueName: \"kubernetes.io/projected/4131c57a-5598-4c4f-b4d1-ee12368c9b85-kube-api-access-pdjb8\") pod \"nova-api-db-create-m9bfp\" (UID: \"4131c57a-5598-4c4f-b4d1-ee12368c9b85\") " pod="openstack/nova-api-db-create-m9bfp" Mar 13 
01:46:35.309967 master-0 kubenswrapper[19170]: I0313 01:46:35.309890 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4131c57a-5598-4c4f-b4d1-ee12368c9b85-operator-scripts\") pod \"nova-api-db-create-m9bfp\" (UID: \"4131c57a-5598-4c4f-b4d1-ee12368c9b85\") " pod="openstack/nova-api-db-create-m9bfp"
Mar 13 01:46:35.310865 master-0 kubenswrapper[19170]: I0313 01:46:35.310044 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdjb8\" (UniqueName: \"kubernetes.io/projected/4131c57a-5598-4c4f-b4d1-ee12368c9b85-kube-api-access-pdjb8\") pod \"nova-api-db-create-m9bfp\" (UID: \"4131c57a-5598-4c4f-b4d1-ee12368c9b85\") " pod="openstack/nova-api-db-create-m9bfp"
Mar 13 01:46:35.310865 master-0 kubenswrapper[19170]: I0313 01:46:35.310576 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4131c57a-5598-4c4f-b4d1-ee12368c9b85-operator-scripts\") pod \"nova-api-db-create-m9bfp\" (UID: \"4131c57a-5598-4c4f-b4d1-ee12368c9b85\") " pod="openstack/nova-api-db-create-m9bfp"
Mar 13 01:46:35.376737 master-0 kubenswrapper[19170]: I0313 01:46:35.373861 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdjb8\" (UniqueName: \"kubernetes.io/projected/4131c57a-5598-4c4f-b4d1-ee12368c9b85-kube-api-access-pdjb8\") pod \"nova-api-db-create-m9bfp\" (UID: \"4131c57a-5598-4c4f-b4d1-ee12368c9b85\") " pod="openstack/nova-api-db-create-m9bfp"
Mar 13 01:46:35.390446 master-0 kubenswrapper[19170]: I0313 01:46:35.389259 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-m9bfp"]
Mar 13 01:46:35.471828 master-0 kubenswrapper[19170]: I0313 01:46:35.471749 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-m9bfp"
Mar 13 01:46:35.647586 master-0 kubenswrapper[19170]: I0313 01:46:35.645765 19170 scope.go:117] "RemoveContainer" containerID="5e01aa9d899d75b739283495ae642b747304de2ceb477f26cbcca8b0085e0c92"
Mar 13 01:46:35.660725 master-0 kubenswrapper[19170]: I0313 01:46:35.657270 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5df95cbc76-9kdvv"]
Mar 13 01:46:35.694774 master-0 kubenswrapper[19170]: I0313 01:46:35.683876 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5df95cbc76-9kdvv"]
Mar 13 01:46:35.728132 master-0 kubenswrapper[19170]: I0313 01:46:35.724622 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=4.697214207 podStartE2EDuration="23.72460205s" podCreationTimestamp="2026-03-13 01:46:12 +0000 UTC" firstStartedPulling="2026-03-13 01:46:15.20553218 +0000 UTC m=+1636.013653140" lastFinishedPulling="2026-03-13 01:46:34.232920023 +0000 UTC m=+1655.041040983" observedRunningTime="2026-03-13 01:46:35.709884195 +0000 UTC m=+1656.518005155" watchObservedRunningTime="2026-03-13 01:46:35.72460205 +0000 UTC m=+1656.532723010"
Mar 13 01:46:35.761721 master-0 kubenswrapper[19170]: I0313 01:46:35.759700 19170 scope.go:117] "RemoveContainer" containerID="44a94b74db9eb44c8ae267432b09deff59745638400632437a0c42658756ff5c"
Mar 13 01:46:35.761721 master-0 kubenswrapper[19170]: I0313 01:46:35.759993 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-769f4cdbc8-8mz4m"
Mar 13 01:46:35.838656 master-0 kubenswrapper[19170]: I0313 01:46:35.829597 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-combined-ca-bundle\") pod \"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7\" (UID: \"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7\") "
Mar 13 01:46:35.838656 master-0 kubenswrapper[19170]: I0313 01:46:35.830054 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-public-tls-certs\") pod \"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7\" (UID: \"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7\") "
Mar 13 01:46:35.838656 master-0 kubenswrapper[19170]: I0313 01:46:35.830152 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-logs\") pod \"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7\" (UID: \"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7\") "
Mar 13 01:46:35.838656 master-0 kubenswrapper[19170]: I0313 01:46:35.830192 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-config-data\") pod \"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7\" (UID: \"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7\") "
Mar 13 01:46:35.838656 master-0 kubenswrapper[19170]: I0313 01:46:35.830265 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-scripts\") pod \"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7\" (UID: \"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7\") "
Mar 13 01:46:35.838656 master-0 kubenswrapper[19170]: I0313 01:46:35.830304 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-internal-tls-certs\") pod \"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7\" (UID: \"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7\") "
Mar 13 01:46:35.838656 master-0 kubenswrapper[19170]: I0313 01:46:35.830381 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c426k\" (UniqueName: \"kubernetes.io/projected/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-kube-api-access-c426k\") pod \"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7\" (UID: \"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7\") "
Mar 13 01:46:35.838656 master-0 kubenswrapper[19170]: I0313 01:46:35.832337 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-logs" (OuterVolumeSpecName: "logs") pod "302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7" (UID: "302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 01:46:35.849713 master-0 kubenswrapper[19170]: I0313 01:46:35.842760 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-f778-account-create-update-pg7qf"]
Mar 13 01:46:35.849713 master-0 kubenswrapper[19170]: E0313 01:46:35.843344 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7" containerName="placement-api"
Mar 13 01:46:35.849713 master-0 kubenswrapper[19170]: I0313 01:46:35.843358 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7" containerName="placement-api"
Mar 13 01:46:35.849713 master-0 kubenswrapper[19170]: E0313 01:46:35.843390 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7" containerName="placement-log"
Mar 13 01:46:35.849713 master-0 kubenswrapper[19170]: I0313 01:46:35.843398 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7" containerName="placement-log"
Mar 13 01:46:35.849713 master-0 kubenswrapper[19170]: I0313 01:46:35.843654 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7" containerName="placement-log"
Mar 13 01:46:35.849713 master-0 kubenswrapper[19170]: I0313 01:46:35.843685 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7" containerName="placement-api"
Mar 13 01:46:35.849713 master-0 kubenswrapper[19170]: I0313 01:46:35.845896 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-kube-api-access-c426k" (OuterVolumeSpecName: "kube-api-access-c426k") pod "302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7" (UID: "302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7"). InnerVolumeSpecName "kube-api-access-c426k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 01:46:35.849713 master-0 kubenswrapper[19170]: I0313 01:46:35.848666 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-f778-account-create-update-pg7qf"
Mar 13 01:46:35.867654 master-0 kubenswrapper[19170]: I0313 01:46:35.855451 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-scripts" (OuterVolumeSpecName: "scripts") pod "302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7" (UID: "302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 01:46:35.867654 master-0 kubenswrapper[19170]: I0313 01:46:35.865014 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Mar 13 01:46:35.867654 master-0 kubenswrapper[19170]: I0313 01:46:35.865252 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-f778-account-create-update-pg7qf"]
Mar 13 01:46:35.891817 master-0 kubenswrapper[19170]: I0313 01:46:35.891293 19170 scope.go:117] "RemoveContainer" containerID="35334b7bb9a526cd0fec9834e9c7d8199fbdf00ef6f89d2ad12408e67b450d4b"
Mar 13 01:46:35.906659 master-0 kubenswrapper[19170]: I0313 01:46:35.903941 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b9844-default-external-api-0"]
Mar 13 01:46:35.918656 master-0 kubenswrapper[19170]: I0313 01:46:35.918534 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-b9844-default-external-api-0"]
Mar 13 01:46:35.942655 master-0 kubenswrapper[19170]: I0313 01:46:35.935844 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-b9844-default-external-api-0"]
Mar 13 01:46:35.942655 master-0 kubenswrapper[19170]: I0313 01:46:35.936211 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xfpt\" (UniqueName: \"kubernetes.io/projected/3a58ff59-00a5-4bda-9e4b-205d2628eb32-kube-api-access-9xfpt\") pod \"nova-api-f778-account-create-update-pg7qf\" (UID: \"3a58ff59-00a5-4bda-9e4b-205d2628eb32\") " pod="openstack/nova-api-f778-account-create-update-pg7qf"
Mar 13 01:46:35.942655 master-0 kubenswrapper[19170]: I0313 01:46:35.936402 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a58ff59-00a5-4bda-9e4b-205d2628eb32-operator-scripts\") pod \"nova-api-f778-account-create-update-pg7qf\" (UID: \"3a58ff59-00a5-4bda-9e4b-205d2628eb32\") " pod="openstack/nova-api-f778-account-create-update-pg7qf"
Mar 13 01:46:35.942655 master-0 kubenswrapper[19170]: I0313 01:46:35.936609 19170 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-logs\") on node \"master-0\" DevicePath \"\""
Mar 13 01:46:35.942655 master-0 kubenswrapper[19170]: I0313 01:46:35.936641 19170 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-scripts\") on node \"master-0\" DevicePath \"\""
Mar 13 01:46:35.942655 master-0 kubenswrapper[19170]: I0313 01:46:35.936654 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c426k\" (UniqueName: \"kubernetes.io/projected/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-kube-api-access-c426k\") on node \"master-0\" DevicePath \"\""
Mar 13 01:46:35.942655 master-0 kubenswrapper[19170]: I0313 01:46:35.938314 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b9844-default-external-api-0"
Mar 13 01:46:35.942655 master-0 kubenswrapper[19170]: I0313 01:46:35.942010 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-config-data" (OuterVolumeSpecName: "config-data") pod "302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7" (UID: "302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 01:46:35.958867 master-0 kubenswrapper[19170]: I0313 01:46:35.958010 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-b9844-default-external-config-data"
Mar 13 01:46:35.958867 master-0 kubenswrapper[19170]: I0313 01:46:35.958204 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 13 01:46:35.970402 master-0 kubenswrapper[19170]: I0313 01:46:35.969809 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-spbhw"]
Mar 13 01:46:35.971472 master-0 kubenswrapper[19170]: I0313 01:46:35.971433 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-spbhw"
Mar 13 01:46:35.999857 master-0 kubenswrapper[19170]: I0313 01:46:35.999320 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b9844-default-external-api-0"]
Mar 13 01:46:36.038450 master-0 kubenswrapper[19170]: I0313 01:46:36.038375 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a58ff59-00a5-4bda-9e4b-205d2628eb32-operator-scripts\") pod \"nova-api-f778-account-create-update-pg7qf\" (UID: \"3a58ff59-00a5-4bda-9e4b-205d2628eb32\") " pod="openstack/nova-api-f778-account-create-update-pg7qf"
Mar 13 01:46:36.039257 master-0 kubenswrapper[19170]: I0313 01:46:36.039223 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a58ff59-00a5-4bda-9e4b-205d2628eb32-operator-scripts\") pod \"nova-api-f778-account-create-update-pg7qf\" (UID: \"3a58ff59-00a5-4bda-9e4b-205d2628eb32\") " pod="openstack/nova-api-f778-account-create-update-pg7qf"
Mar 13 01:46:36.039312 master-0 kubenswrapper[19170]: I0313 01:46:36.039285 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xfpt\" (UniqueName: \"kubernetes.io/projected/3a58ff59-00a5-4bda-9e4b-205d2628eb32-kube-api-access-9xfpt\") pod \"nova-api-f778-account-create-update-pg7qf\" (UID: \"3a58ff59-00a5-4bda-9e4b-205d2628eb32\") " pod="openstack/nova-api-f778-account-create-update-pg7qf"
Mar 13 01:46:36.039888 master-0 kubenswrapper[19170]: I0313 01:46:36.039588 19170 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-config-data\") on node \"master-0\" DevicePath \"\""
Mar 13 01:46:36.042689 master-0 kubenswrapper[19170]: I0313 01:46:36.042650 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-spbhw"]
Mar 13 01:46:36.043014 master-0 kubenswrapper[19170]: I0313 01:46:36.042967 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cb659fff7-csvmp" event={"ID":"6ad5e2c1-f186-4654-ad97-73019827ec1f","Type":"ContainerStarted","Data":"a8e88a67773817de3a9147ecf71f7a97719d4078aae8a416902a7892695d25cc"}
Mar 13 01:46:36.058664 master-0 kubenswrapper[19170]: I0313 01:46:36.057946 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7" (UID: "302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 01:46:36.077706 master-0 kubenswrapper[19170]: I0313 01:46:36.068906 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xfpt\" (UniqueName: \"kubernetes.io/projected/3a58ff59-00a5-4bda-9e4b-205d2628eb32-kube-api-access-9xfpt\") pod \"nova-api-f778-account-create-update-pg7qf\" (UID: \"3a58ff59-00a5-4bda-9e4b-205d2628eb32\") " pod="openstack/nova-api-f778-account-create-update-pg7qf"
Mar 13 01:46:36.096734 master-0 kubenswrapper[19170]: I0313 01:46:36.096670 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-z7sjf"]
Mar 13 01:46:36.113525 master-0 kubenswrapper[19170]: I0313 01:46:36.098804 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-z7sjf"
Mar 13 01:46:36.145490 master-0 kubenswrapper[19170]: I0313 01:46:36.135252 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-z7sjf"]
Mar 13 01:46:36.145490 master-0 kubenswrapper[19170]: I0313 01:46:36.135648 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7" (UID: "302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 01:46:36.145490 master-0 kubenswrapper[19170]: I0313 01:46:36.139371 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-769f4cdbc8-8mz4m"
Mar 13 01:46:36.145490 master-0 kubenswrapper[19170]: I0313 01:46:36.140298 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-769f4cdbc8-8mz4m" event={"ID":"302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7","Type":"ContainerDied","Data":"41dae79279507ab750e879bdb1a3aed5eafacd96cfced3e8d249a548a1a296ad"}
Mar 13 01:46:36.145490 master-0 kubenswrapper[19170]: I0313 01:46:36.140342 19170 scope.go:117] "RemoveContainer" containerID="c5e6417ef0054652c8fb4b74c37596118e7448530a224c9701dcdb46b0eaaeb8"
Mar 13 01:46:36.145490 master-0 kubenswrapper[19170]: I0313 01:46:36.143270 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/34ddd7e3-732e-45c1-bbe1-c1193ef1887b-public-tls-certs\") pod \"glance-b9844-default-external-api-0\" (UID: \"34ddd7e3-732e-45c1-bbe1-c1193ef1887b\") " pod="openstack/glance-b9844-default-external-api-0"
Mar 13 01:46:36.145490 master-0 kubenswrapper[19170]: I0313 01:46:36.143386 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55081cb7-5148-4dd9-8d2c-14cb7a5b7a57-operator-scripts\") pod \"nova-cell0-db-create-spbhw\" (UID: \"55081cb7-5148-4dd9-8d2c-14cb7a5b7a57\") " pod="openstack/nova-cell0-db-create-spbhw"
Mar 13 01:46:36.145490 master-0 kubenswrapper[19170]: I0313 01:46:36.143437 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34ddd7e3-732e-45c1-bbe1-c1193ef1887b-logs\") pod \"glance-b9844-default-external-api-0\" (UID: \"34ddd7e3-732e-45c1-bbe1-c1193ef1887b\") " pod="openstack/glance-b9844-default-external-api-0"
Mar 13 01:46:36.145490 master-0 kubenswrapper[19170]: I0313 01:46:36.143463 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34ddd7e3-732e-45c1-bbe1-c1193ef1887b-combined-ca-bundle\") pod \"glance-b9844-default-external-api-0\" (UID: \"34ddd7e3-732e-45c1-bbe1-c1193ef1887b\") " pod="openstack/glance-b9844-default-external-api-0"
Mar 13 01:46:36.145490 master-0 kubenswrapper[19170]: I0313 01:46:36.143500 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-60c39735-cd4e-4baf-8a95-3babad891e79\" (UniqueName: \"kubernetes.io/csi/topolvm.io^deb39bd4-fea6-43ec-aeb4-a117c91529eb\") pod \"glance-b9844-default-external-api-0\" (UID: \"34ddd7e3-732e-45c1-bbe1-c1193ef1887b\") " pod="openstack/glance-b9844-default-external-api-0"
Mar 13 01:46:36.145490 master-0 kubenswrapper[19170]: I0313 01:46:36.143553 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34ddd7e3-732e-45c1-bbe1-c1193ef1887b-scripts\") pod \"glance-b9844-default-external-api-0\" (UID: \"34ddd7e3-732e-45c1-bbe1-c1193ef1887b\") " pod="openstack/glance-b9844-default-external-api-0"
Mar 13 01:46:36.145490 master-0 kubenswrapper[19170]: I0313 01:46:36.143589 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khmxs\" (UniqueName: \"kubernetes.io/projected/55081cb7-5148-4dd9-8d2c-14cb7a5b7a57-kube-api-access-khmxs\") pod \"nova-cell0-db-create-spbhw\" (UID: \"55081cb7-5148-4dd9-8d2c-14cb7a5b7a57\") " pod="openstack/nova-cell0-db-create-spbhw"
Mar 13 01:46:36.145490 master-0 kubenswrapper[19170]: I0313 01:46:36.143663 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34ddd7e3-732e-45c1-bbe1-c1193ef1887b-config-data\") pod \"glance-b9844-default-external-api-0\" (UID: \"34ddd7e3-732e-45c1-bbe1-c1193ef1887b\") " pod="openstack/glance-b9844-default-external-api-0"
Mar 13 01:46:36.145490 master-0 kubenswrapper[19170]: I0313 01:46:36.143712 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4kdc\" (UniqueName: \"kubernetes.io/projected/34ddd7e3-732e-45c1-bbe1-c1193ef1887b-kube-api-access-n4kdc\") pod \"glance-b9844-default-external-api-0\" (UID: \"34ddd7e3-732e-45c1-bbe1-c1193ef1887b\") " pod="openstack/glance-b9844-default-external-api-0"
Mar 13 01:46:36.145490 master-0 kubenswrapper[19170]: I0313 01:46:36.144066 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34ddd7e3-732e-45c1-bbe1-c1193ef1887b-httpd-run\") pod \"glance-b9844-default-external-api-0\" (UID: \"34ddd7e3-732e-45c1-bbe1-c1193ef1887b\") " pod="openstack/glance-b9844-default-external-api-0"
Mar 13 01:46:36.145490 master-0 kubenswrapper[19170]: I0313 01:46:36.144319 19170 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-internal-tls-certs\") on node \"master-0\" DevicePath \"\""
Mar 13 01:46:36.145490 master-0 kubenswrapper[19170]: I0313 01:46:36.144335 19170 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 13 01:46:36.181997 master-0 kubenswrapper[19170]: I0313 01:46:36.181033 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7" (UID: "302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 01:46:36.243080 master-0 kubenswrapper[19170]: I0313 01:46:36.221936 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-fa2f-account-create-update-7k47n"]
Mar 13 01:46:36.243080 master-0 kubenswrapper[19170]: I0313 01:46:36.226559 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-fa2f-account-create-update-7k47n"
Mar 13 01:46:36.243080 master-0 kubenswrapper[19170]: I0313 01:46:36.234956 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Mar 13 01:46:36.248414 master-0 kubenswrapper[19170]: I0313 01:46:36.247641 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/34ddd7e3-732e-45c1-bbe1-c1193ef1887b-public-tls-certs\") pod \"glance-b9844-default-external-api-0\" (UID: \"34ddd7e3-732e-45c1-bbe1-c1193ef1887b\") " pod="openstack/glance-b9844-default-external-api-0"
Mar 13 01:46:36.248414 master-0 kubenswrapper[19170]: I0313 01:46:36.247779 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55081cb7-5148-4dd9-8d2c-14cb7a5b7a57-operator-scripts\") pod \"nova-cell0-db-create-spbhw\" (UID: \"55081cb7-5148-4dd9-8d2c-14cb7a5b7a57\") " pod="openstack/nova-cell0-db-create-spbhw"
Mar 13 01:46:36.248414 master-0 kubenswrapper[19170]: I0313 01:46:36.247816 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9a8b821-9639-4744-8744-a38e694de2de-operator-scripts\") pod \"nova-cell1-db-create-z7sjf\" (UID: \"f9a8b821-9639-4744-8744-a38e694de2de\") " pod="openstack/nova-cell1-db-create-z7sjf"
Mar 13 01:46:36.248414 master-0 kubenswrapper[19170]: I0313 01:46:36.247857 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34ddd7e3-732e-45c1-bbe1-c1193ef1887b-combined-ca-bundle\") pod \"glance-b9844-default-external-api-0\" (UID: \"34ddd7e3-732e-45c1-bbe1-c1193ef1887b\") " pod="openstack/glance-b9844-default-external-api-0"
Mar 13 01:46:36.248414 master-0 kubenswrapper[19170]: I0313 01:46:36.247876 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34ddd7e3-732e-45c1-bbe1-c1193ef1887b-logs\") pod \"glance-b9844-default-external-api-0\" (UID: \"34ddd7e3-732e-45c1-bbe1-c1193ef1887b\") " pod="openstack/glance-b9844-default-external-api-0"
Mar 13 01:46:36.248414 master-0 kubenswrapper[19170]: I0313 01:46:36.247903 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-60c39735-cd4e-4baf-8a95-3babad891e79\" (UniqueName: \"kubernetes.io/csi/topolvm.io^deb39bd4-fea6-43ec-aeb4-a117c91529eb\") pod \"glance-b9844-default-external-api-0\" (UID: \"34ddd7e3-732e-45c1-bbe1-c1193ef1887b\") " pod="openstack/glance-b9844-default-external-api-0"
Mar 13 01:46:36.248414 master-0 kubenswrapper[19170]: I0313 01:46:36.247962 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34ddd7e3-732e-45c1-bbe1-c1193ef1887b-scripts\") pod \"glance-b9844-default-external-api-0\" (UID: \"34ddd7e3-732e-45c1-bbe1-c1193ef1887b\") " pod="openstack/glance-b9844-default-external-api-0"
Mar 13 01:46:36.248414 master-0 kubenswrapper[19170]: I0313 01:46:36.248000 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khmxs\" (UniqueName: \"kubernetes.io/projected/55081cb7-5148-4dd9-8d2c-14cb7a5b7a57-kube-api-access-khmxs\") pod \"nova-cell0-db-create-spbhw\" (UID: \"55081cb7-5148-4dd9-8d2c-14cb7a5b7a57\") " pod="openstack/nova-cell0-db-create-spbhw"
Mar 13 01:46:36.248414 master-0 kubenswrapper[19170]: I0313 01:46:36.248060 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34ddd7e3-732e-45c1-bbe1-c1193ef1887b-config-data\") pod \"glance-b9844-default-external-api-0\" (UID: \"34ddd7e3-732e-45c1-bbe1-c1193ef1887b\") " pod="openstack/glance-b9844-default-external-api-0"
Mar 13 01:46:36.248414 master-0 kubenswrapper[19170]: I0313 01:46:36.248111 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4kdc\" (UniqueName: \"kubernetes.io/projected/34ddd7e3-732e-45c1-bbe1-c1193ef1887b-kube-api-access-n4kdc\") pod \"glance-b9844-default-external-api-0\" (UID: \"34ddd7e3-732e-45c1-bbe1-c1193ef1887b\") " pod="openstack/glance-b9844-default-external-api-0"
Mar 13 01:46:36.248414 master-0 kubenswrapper[19170]: I0313 01:46:36.248241 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2xgk\" (UniqueName: \"kubernetes.io/projected/f9a8b821-9639-4744-8744-a38e694de2de-kube-api-access-m2xgk\") pod \"nova-cell1-db-create-z7sjf\" (UID: \"f9a8b821-9639-4744-8744-a38e694de2de\") " pod="openstack/nova-cell1-db-create-z7sjf"
Mar 13 01:46:36.248414 master-0 kubenswrapper[19170]: I0313 01:46:36.248273 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34ddd7e3-732e-45c1-bbe1-c1193ef1887b-httpd-run\") pod \"glance-b9844-default-external-api-0\" (UID: \"34ddd7e3-732e-45c1-bbe1-c1193ef1887b\") " pod="openstack/glance-b9844-default-external-api-0"
Mar 13 01:46:36.248414 master-0 kubenswrapper[19170]: I0313 01:46:36.248351 19170 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7-public-tls-certs\") on node \"master-0\" DevicePath \"\""
Mar 13 01:46:36.249037 master-0 kubenswrapper[19170]: I0313 01:46:36.249001 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34ddd7e3-732e-45c1-bbe1-c1193ef1887b-httpd-run\") pod \"glance-b9844-default-external-api-0\" (UID: \"34ddd7e3-732e-45c1-bbe1-c1193ef1887b\") " pod="openstack/glance-b9844-default-external-api-0"
Mar 13 01:46:36.264095 master-0 kubenswrapper[19170]: I0313 01:46:36.251892 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55081cb7-5148-4dd9-8d2c-14cb7a5b7a57-operator-scripts\") pod \"nova-cell0-db-create-spbhw\" (UID: \"55081cb7-5148-4dd9-8d2c-14cb7a5b7a57\") " pod="openstack/nova-cell0-db-create-spbhw"
Mar 13 01:46:36.264095 master-0 kubenswrapper[19170]: I0313 01:46:36.253079 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34ddd7e3-732e-45c1-bbe1-c1193ef1887b-scripts\") pod \"glance-b9844-default-external-api-0\" (UID: \"34ddd7e3-732e-45c1-bbe1-c1193ef1887b\") " pod="openstack/glance-b9844-default-external-api-0"
Mar 13 01:46:36.264095 master-0 kubenswrapper[19170]: I0313 01:46:36.253648 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34ddd7e3-732e-45c1-bbe1-c1193ef1887b-combined-ca-bundle\") pod \"glance-b9844-default-external-api-0\" (UID: \"34ddd7e3-732e-45c1-bbe1-c1193ef1887b\") " pod="openstack/glance-b9844-default-external-api-0"
Mar 13 01:46:36.264095 master-0 kubenswrapper[19170]: I0313 01:46:36.253890 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34ddd7e3-732e-45c1-bbe1-c1193ef1887b-logs\") pod \"glance-b9844-default-external-api-0\" (UID: \"34ddd7e3-732e-45c1-bbe1-c1193ef1887b\") " pod="openstack/glance-b9844-default-external-api-0"
Mar 13 01:46:36.264095 master-0 kubenswrapper[19170]: I0313 01:46:36.256614 19170 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 13 01:46:36.264095 master-0 kubenswrapper[19170]: I0313 01:46:36.256676 19170 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-60c39735-cd4e-4baf-8a95-3babad891e79\" (UniqueName: \"kubernetes.io/csi/topolvm.io^deb39bd4-fea6-43ec-aeb4-a117c91529eb\") pod \"glance-b9844-default-external-api-0\" (UID: \"34ddd7e3-732e-45c1-bbe1-c1193ef1887b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/dc901c00d5023825c7bc5ea32a35825a63b53b24b9433470d442fe02f9c29cd0/globalmount\"" pod="openstack/glance-b9844-default-external-api-0"
Mar 13 01:46:36.264095 master-0 kubenswrapper[19170]: I0313 01:46:36.258422 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34ddd7e3-732e-45c1-bbe1-c1193ef1887b-config-data\") pod \"glance-b9844-default-external-api-0\" (UID: \"34ddd7e3-732e-45c1-bbe1-c1193ef1887b\") " pod="openstack/glance-b9844-default-external-api-0"
Mar 13 01:46:36.274911 master-0 kubenswrapper[19170]: I0313 01:46:36.274877 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/34ddd7e3-732e-45c1-bbe1-c1193ef1887b-public-tls-certs\") pod \"glance-b9844-default-external-api-0\" (UID: \"34ddd7e3-732e-45c1-bbe1-c1193ef1887b\") " pod="openstack/glance-b9844-default-external-api-0"
Mar 13 01:46:36.278328 master-0 kubenswrapper[19170]: I0313 01:46:36.278275 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khmxs\" (UniqueName: \"kubernetes.io/projected/55081cb7-5148-4dd9-8d2c-14cb7a5b7a57-kube-api-access-khmxs\") pod \"nova-cell0-db-create-spbhw\" (UID: \"55081cb7-5148-4dd9-8d2c-14cb7a5b7a57\") " pod="openstack/nova-cell0-db-create-spbhw"
Mar 13 01:46:36.288935 master-0 kubenswrapper[19170]: I0313 01:46:36.288524 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4kdc\" (UniqueName: \"kubernetes.io/projected/34ddd7e3-732e-45c1-bbe1-c1193ef1887b-kube-api-access-n4kdc\") pod \"glance-b9844-default-external-api-0\" (UID: \"34ddd7e3-732e-45c1-bbe1-c1193ef1887b\") " pod="openstack/glance-b9844-default-external-api-0"
Mar 13 01:46:36.308657 master-0 kubenswrapper[19170]: I0313 01:46:36.308574 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-f778-account-create-update-pg7qf"
Mar 13 01:46:36.316268 master-0 kubenswrapper[19170]: I0313 01:46:36.314872 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-fa2f-account-create-update-7k47n"]
Mar 13 01:46:36.357798 master-0 kubenswrapper[19170]: I0313 01:46:36.348925 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"]
Mar 13 01:46:36.357798 master-0 kubenswrapper[19170]: I0313 01:46:36.350300 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grb4v\" (UniqueName: \"kubernetes.io/projected/efb9dbc0-c8c7-428a-905d-5023a309a3ac-kube-api-access-grb4v\") pod \"nova-cell0-fa2f-account-create-update-7k47n\" (UID: \"efb9dbc0-c8c7-428a-905d-5023a309a3ac\") " pod="openstack/nova-cell0-fa2f-account-create-update-7k47n"
Mar 13 01:46:36.357798 master-0 kubenswrapper[19170]: I0313 01:46:36.350364 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efb9dbc0-c8c7-428a-905d-5023a309a3ac-operator-scripts\") pod \"nova-cell0-fa2f-account-create-update-7k47n\" (UID: \"efb9dbc0-c8c7-428a-905d-5023a309a3ac\") " pod="openstack/nova-cell0-fa2f-account-create-update-7k47n"
Mar 13 01:46:36.357798 master-0 kubenswrapper[19170]: I0313 01:46:36.350457 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2xgk\" (UniqueName: \"kubernetes.io/projected/f9a8b821-9639-4744-8744-a38e694de2de-kube-api-access-m2xgk\") pod \"nova-cell1-db-create-z7sjf\" (UID: \"f9a8b821-9639-4744-8744-a38e694de2de\") " pod="openstack/nova-cell1-db-create-z7sjf"
Mar 13 01:46:36.357798 master-0 kubenswrapper[19170]: I0313 01:46:36.350619 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9a8b821-9639-4744-8744-a38e694de2de-operator-scripts\") pod \"nova-cell1-db-create-z7sjf\" (UID: \"f9a8b821-9639-4744-8744-a38e694de2de\") " pod="openstack/nova-cell1-db-create-z7sjf"
Mar 13 01:46:36.357798 master-0 kubenswrapper[19170]: I0313 01:46:36.352885 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9a8b821-9639-4744-8744-a38e694de2de-operator-scripts\") pod \"nova-cell1-db-create-z7sjf\" (UID: \"f9a8b821-9639-4744-8744-a38e694de2de\") " pod="openstack/nova-cell1-db-create-z7sjf"
Mar 13 01:46:36.379679 master-0 kubenswrapper[19170]: I0313 01:46:36.360589 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-spbhw"
Mar 13 01:46:36.379679 master-0 kubenswrapper[19170]: I0313 01:46:36.367134 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-1f87-account-create-update-c75rq"]
Mar 13 01:46:36.379679 master-0 kubenswrapper[19170]: I0313 01:46:36.368888 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2xgk\" (UniqueName: \"kubernetes.io/projected/f9a8b821-9639-4744-8744-a38e694de2de-kube-api-access-m2xgk\") pod \"nova-cell1-db-create-z7sjf\" (UID: \"f9a8b821-9639-4744-8744-a38e694de2de\") " pod="openstack/nova-cell1-db-create-z7sjf"
Mar 13 01:46:36.379679 master-0 kubenswrapper[19170]: I0313 01:46:36.369033 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1f87-account-create-update-c75rq"
Mar 13 01:46:36.379679 master-0 kubenswrapper[19170]: I0313 01:46:36.371554 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Mar 13 01:46:36.408656 master-0 kubenswrapper[19170]: I0313 01:46:36.406750 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-z7sjf"
Mar 13 01:46:36.408656 master-0 kubenswrapper[19170]: I0313 01:46:36.407210 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-1f87-account-create-update-c75rq"]
Mar 13 01:46:36.452477 master-0 kubenswrapper[19170]: I0313 01:46:36.452409 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/068e68e4-8b54-41da-b797-9eee422826b2-operator-scripts\") pod \"nova-cell1-1f87-account-create-update-c75rq\" (UID: \"068e68e4-8b54-41da-b797-9eee422826b2\") " pod="openstack/nova-cell1-1f87-account-create-update-c75rq"
Mar 13 01:46:36.452709 master-0 kubenswrapper[19170]: I0313 01:46:36.452558 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grb4v\" (UniqueName: \"kubernetes.io/projected/efb9dbc0-c8c7-428a-905d-5023a309a3ac-kube-api-access-grb4v\") pod \"nova-cell0-fa2f-account-create-update-7k47n\" (UID: \"efb9dbc0-c8c7-428a-905d-5023a309a3ac\") " pod="openstack/nova-cell0-fa2f-account-create-update-7k47n"
Mar 13 01:46:36.452709 master-0 kubenswrapper[19170]: I0313 01:46:36.452604 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efb9dbc0-c8c7-428a-905d-5023a309a3ac-operator-scripts\") pod \"nova-cell0-fa2f-account-create-update-7k47n\" (UID: \"efb9dbc0-c8c7-428a-905d-5023a309a3ac\") " pod="openstack/nova-cell0-fa2f-account-create-update-7k47n"
Mar 13
01:46:36.452709 master-0 kubenswrapper[19170]: I0313 01:46:36.452685 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwk99\" (UniqueName: \"kubernetes.io/projected/068e68e4-8b54-41da-b797-9eee422826b2-kube-api-access-jwk99\") pod \"nova-cell1-1f87-account-create-update-c75rq\" (UID: \"068e68e4-8b54-41da-b797-9eee422826b2\") " pod="openstack/nova-cell1-1f87-account-create-update-c75rq" Mar 13 01:46:36.461531 master-0 kubenswrapper[19170]: I0313 01:46:36.461308 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efb9dbc0-c8c7-428a-905d-5023a309a3ac-operator-scripts\") pod \"nova-cell0-fa2f-account-create-update-7k47n\" (UID: \"efb9dbc0-c8c7-428a-905d-5023a309a3ac\") " pod="openstack/nova-cell0-fa2f-account-create-update-7k47n" Mar 13 01:46:36.475257 master-0 kubenswrapper[19170]: I0313 01:46:36.475221 19170 scope.go:117] "RemoveContainer" containerID="eada95478f376b51f72504d808ebcc6bb95b656dd0f9a7257ed10e59c9d8f2f5" Mar 13 01:46:36.485228 master-0 kubenswrapper[19170]: I0313 01:46:36.484106 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:46:36.485830 master-0 kubenswrapper[19170]: I0313 01:46:36.485591 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grb4v\" (UniqueName: \"kubernetes.io/projected/efb9dbc0-c8c7-428a-905d-5023a309a3ac-kube-api-access-grb4v\") pod \"nova-cell0-fa2f-account-create-update-7k47n\" (UID: \"efb9dbc0-c8c7-428a-905d-5023a309a3ac\") " pod="openstack/nova-cell0-fa2f-account-create-update-7k47n" Mar 13 01:46:36.570930 master-0 kubenswrapper[19170]: I0313 01:46:36.568869 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwk99\" (UniqueName: \"kubernetes.io/projected/068e68e4-8b54-41da-b797-9eee422826b2-kube-api-access-jwk99\") pod \"nova-cell1-1f87-account-create-update-c75rq\" (UID: \"068e68e4-8b54-41da-b797-9eee422826b2\") " pod="openstack/nova-cell1-1f87-account-create-update-c75rq" Mar 13 01:46:36.570930 master-0 kubenswrapper[19170]: I0313 01:46:36.569103 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/068e68e4-8b54-41da-b797-9eee422826b2-operator-scripts\") pod \"nova-cell1-1f87-account-create-update-c75rq\" (UID: \"068e68e4-8b54-41da-b797-9eee422826b2\") " pod="openstack/nova-cell1-1f87-account-create-update-c75rq" Mar 13 01:46:36.574645 master-0 kubenswrapper[19170]: I0313 01:46:36.572263 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/068e68e4-8b54-41da-b797-9eee422826b2-operator-scripts\") pod \"nova-cell1-1f87-account-create-update-c75rq\" (UID: \"068e68e4-8b54-41da-b797-9eee422826b2\") " pod="openstack/nova-cell1-1f87-account-create-update-c75rq" Mar 13 01:46:36.613816 master-0 kubenswrapper[19170]: I0313 01:46:36.613276 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jwk99\" (UniqueName: \"kubernetes.io/projected/068e68e4-8b54-41da-b797-9eee422826b2-kube-api-access-jwk99\") pod \"nova-cell1-1f87-account-create-update-c75rq\" (UID: \"068e68e4-8b54-41da-b797-9eee422826b2\") " pod="openstack/nova-cell1-1f87-account-create-update-c75rq" Mar 13 01:46:36.693677 master-0 kubenswrapper[19170]: I0313 01:46:36.679589 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af162e22-de53-4f80-a7a9-877bda3e9740-config-data\") pod \"af162e22-de53-4f80-a7a9-877bda3e9740\" (UID: \"af162e22-de53-4f80-a7a9-877bda3e9740\") " Mar 13 01:46:36.693677 master-0 kubenswrapper[19170]: I0313 01:46:36.682106 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af162e22-de53-4f80-a7a9-877bda3e9740-scripts\") pod \"af162e22-de53-4f80-a7a9-877bda3e9740\" (UID: \"af162e22-de53-4f80-a7a9-877bda3e9740\") " Mar 13 01:46:36.693677 master-0 kubenswrapper[19170]: I0313 01:46:36.686230 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^82ad30d6-a170-4ee2-bd43-8bf3a311ef5c\") pod \"af162e22-de53-4f80-a7a9-877bda3e9740\" (UID: \"af162e22-de53-4f80-a7a9-877bda3e9740\") " Mar 13 01:46:36.693677 master-0 kubenswrapper[19170]: I0313 01:46:36.686371 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af162e22-de53-4f80-a7a9-877bda3e9740-internal-tls-certs\") pod \"af162e22-de53-4f80-a7a9-877bda3e9740\" (UID: \"af162e22-de53-4f80-a7a9-877bda3e9740\") " Mar 13 01:46:36.693677 master-0 kubenswrapper[19170]: I0313 01:46:36.686404 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/af162e22-de53-4f80-a7a9-877bda3e9740-httpd-run\") pod 
\"af162e22-de53-4f80-a7a9-877bda3e9740\" (UID: \"af162e22-de53-4f80-a7a9-877bda3e9740\") " Mar 13 01:46:36.693677 master-0 kubenswrapper[19170]: I0313 01:46:36.686517 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af162e22-de53-4f80-a7a9-877bda3e9740-combined-ca-bundle\") pod \"af162e22-de53-4f80-a7a9-877bda3e9740\" (UID: \"af162e22-de53-4f80-a7a9-877bda3e9740\") " Mar 13 01:46:36.693677 master-0 kubenswrapper[19170]: I0313 01:46:36.686542 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s62hg\" (UniqueName: \"kubernetes.io/projected/af162e22-de53-4f80-a7a9-877bda3e9740-kube-api-access-s62hg\") pod \"af162e22-de53-4f80-a7a9-877bda3e9740\" (UID: \"af162e22-de53-4f80-a7a9-877bda3e9740\") " Mar 13 01:46:36.693677 master-0 kubenswrapper[19170]: I0313 01:46:36.686567 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af162e22-de53-4f80-a7a9-877bda3e9740-logs\") pod \"af162e22-de53-4f80-a7a9-877bda3e9740\" (UID: \"af162e22-de53-4f80-a7a9-877bda3e9740\") " Mar 13 01:46:36.693677 master-0 kubenswrapper[19170]: I0313 01:46:36.687248 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af162e22-de53-4f80-a7a9-877bda3e9740-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "af162e22-de53-4f80-a7a9-877bda3e9740" (UID: "af162e22-de53-4f80-a7a9-877bda3e9740"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 01:46:36.693677 master-0 kubenswrapper[19170]: I0313 01:46:36.688438 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af162e22-de53-4f80-a7a9-877bda3e9740-logs" (OuterVolumeSpecName: "logs") pod "af162e22-de53-4f80-a7a9-877bda3e9740" (UID: "af162e22-de53-4f80-a7a9-877bda3e9740"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 01:46:36.693677 master-0 kubenswrapper[19170]: I0313 01:46:36.689439 19170 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af162e22-de53-4f80-a7a9-877bda3e9740-logs\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:36.693677 master-0 kubenswrapper[19170]: I0313 01:46:36.689456 19170 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/af162e22-de53-4f80-a7a9-877bda3e9740-httpd-run\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:36.711658 master-0 kubenswrapper[19170]: I0313 01:46:36.699899 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-m9bfp"] Mar 13 01:46:36.740357 master-0 kubenswrapper[19170]: I0313 01:46:36.712078 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af162e22-de53-4f80-a7a9-877bda3e9740-scripts" (OuterVolumeSpecName: "scripts") pod "af162e22-de53-4f80-a7a9-877bda3e9740" (UID: "af162e22-de53-4f80-a7a9-877bda3e9740"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:46:36.740357 master-0 kubenswrapper[19170]: I0313 01:46:36.712247 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af162e22-de53-4f80-a7a9-877bda3e9740-kube-api-access-s62hg" (OuterVolumeSpecName: "kube-api-access-s62hg") pod "af162e22-de53-4f80-a7a9-877bda3e9740" (UID: "af162e22-de53-4f80-a7a9-877bda3e9740"). InnerVolumeSpecName "kube-api-access-s62hg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:46:36.740357 master-0 kubenswrapper[19170]: I0313 01:46:36.720863 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-769f4cdbc8-8mz4m"] Mar 13 01:46:36.740611 master-0 kubenswrapper[19170]: I0313 01:46:36.740466 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-fa2f-account-create-update-7k47n" Mar 13 01:46:36.745659 master-0 kubenswrapper[19170]: I0313 01:46:36.741149 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-769f4cdbc8-8mz4m"] Mar 13 01:46:36.775668 master-0 kubenswrapper[19170]: I0313 01:46:36.774726 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af162e22-de53-4f80-a7a9-877bda3e9740-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af162e22-de53-4f80-a7a9-877bda3e9740" (UID: "af162e22-de53-4f80-a7a9-877bda3e9740"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:46:36.794129 master-0 kubenswrapper[19170]: I0313 01:46:36.792428 19170 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af162e22-de53-4f80-a7a9-877bda3e9740-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:36.794129 master-0 kubenswrapper[19170]: I0313 01:46:36.792519 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s62hg\" (UniqueName: \"kubernetes.io/projected/af162e22-de53-4f80-a7a9-877bda3e9740-kube-api-access-s62hg\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:36.794129 master-0 kubenswrapper[19170]: I0313 01:46:36.792531 19170 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af162e22-de53-4f80-a7a9-877bda3e9740-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:36.794129 master-0 kubenswrapper[19170]: I0313 01:46:36.792744 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1f87-account-create-update-c75rq" Mar 13 01:46:36.798647 master-0 kubenswrapper[19170]: I0313 01:46:36.797901 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af162e22-de53-4f80-a7a9-877bda3e9740-config-data" (OuterVolumeSpecName: "config-data") pod "af162e22-de53-4f80-a7a9-877bda3e9740" (UID: "af162e22-de53-4f80-a7a9-877bda3e9740"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:46:36.851369 master-0 kubenswrapper[19170]: I0313 01:46:36.849737 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af162e22-de53-4f80-a7a9-877bda3e9740-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "af162e22-de53-4f80-a7a9-877bda3e9740" (UID: "af162e22-de53-4f80-a7a9-877bda3e9740"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:46:36.897306 master-0 kubenswrapper[19170]: I0313 01:46:36.896614 19170 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af162e22-de53-4f80-a7a9-877bda3e9740-internal-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:36.897306 master-0 kubenswrapper[19170]: I0313 01:46:36.896666 19170 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af162e22-de53-4f80-a7a9-877bda3e9740-config-data\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:37.225791 master-0 kubenswrapper[19170]: I0313 01:46:37.224883 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b9844-default-internal-api-0" event={"ID":"af162e22-de53-4f80-a7a9-877bda3e9740","Type":"ContainerDied","Data":"8ed4b668ae7343ab5625ddc5e12a1073ba8b3211e8548abb39d41a344a4a0cd8"} Mar 13 01:46:37.225791 master-0 kubenswrapper[19170]: I0313 01:46:37.224966 19170 scope.go:117] "RemoveContainer" containerID="576b62567482197d09934570c36b691fedbd52b01043c1e3d7665d2eada20ed5" Mar 13 01:46:37.225791 master-0 kubenswrapper[19170]: I0313 01:46:37.225127 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:46:37.237809 master-0 kubenswrapper[19170]: I0313 01:46:37.237260 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-f778-account-create-update-pg7qf"] Mar 13 01:46:37.250340 master-0 kubenswrapper[19170]: I0313 01:46:37.249602 19170 generic.go:334] "Generic (PLEG): container finished" podID="6ad5e2c1-f186-4654-ad97-73019827ec1f" containerID="3d3cb55ea8897ea04d998738f71d02d274206c0fa60c7113f1c0b74c3a912f1f" exitCode=0 Mar 13 01:46:37.250340 master-0 kubenswrapper[19170]: I0313 01:46:37.249673 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cb659fff7-csvmp" event={"ID":"6ad5e2c1-f186-4654-ad97-73019827ec1f","Type":"ContainerDied","Data":"3d3cb55ea8897ea04d998738f71d02d274206c0fa60c7113f1c0b74c3a912f1f"} Mar 13 01:46:37.257981 master-0 kubenswrapper[19170]: I0313 01:46:37.257691 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"3785df35-68b7-4d28-8b4a-39c3136ce823","Type":"ContainerStarted","Data":"bf29f6df81130ede3f82b51b9d90996b98a4c3ab96fba8c55db9abbe16dc7d4b"} Mar 13 01:46:37.264755 master-0 kubenswrapper[19170]: I0313 01:46:37.264699 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"c4801b65-cb4e-4393-9a89-3a29c3051310","Type":"ContainerStarted","Data":"2a302803233523a41c50fbd6d880168b1ee5a451cb78b5ee8b49cf2b6c918c8f"} Mar 13 01:46:37.275974 master-0 kubenswrapper[19170]: I0313 01:46:37.273221 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-m9bfp" event={"ID":"4131c57a-5598-4c4f-b4d1-ee12368c9b85","Type":"ContainerStarted","Data":"f5fbf07889b794113240aa690148fe4458e66f3eaa6be3a193a1e7307d585b62"} Mar 13 01:46:37.429043 master-0 kubenswrapper[19170]: I0313 01:46:37.426597 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/csi/topolvm.io^82ad30d6-a170-4ee2-bd43-8bf3a311ef5c" (OuterVolumeSpecName: "glance") pod "af162e22-de53-4f80-a7a9-877bda3e9740" (UID: "af162e22-de53-4f80-a7a9-877bda3e9740"). InnerVolumeSpecName "pvc-0423f75b-fda4-4619-84ea-2ae13248e4cb". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 13 01:46:37.457484 master-0 kubenswrapper[19170]: I0313 01:46:37.457202 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28caadd7-b003-4cb7-83d7-508a641ad738" path="/var/lib/kubelet/pods/28caadd7-b003-4cb7-83d7-508a641ad738/volumes" Mar 13 01:46:37.458639 master-0 kubenswrapper[19170]: I0313 01:46:37.457863 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7" path="/var/lib/kubelet/pods/302c0aa3-a4a7-4e8f-85d6-cbd7fd4580b7/volumes" Mar 13 01:46:37.459671 master-0 kubenswrapper[19170]: I0313 01:46:37.458895 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="979e6bc4-2aa2-4326-b7f2-c45f50b41c28" path="/var/lib/kubelet/pods/979e6bc4-2aa2-4326-b7f2-c45f50b41c28/volumes" Mar 13 01:46:37.506578 master-0 kubenswrapper[19170]: I0313 01:46:37.506535 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-60c39735-cd4e-4baf-8a95-3babad891e79\" (UniqueName: \"kubernetes.io/csi/topolvm.io^deb39bd4-fea6-43ec-aeb4-a117c91529eb\") pod \"glance-b9844-default-external-api-0\" (UID: \"34ddd7e3-732e-45c1-bbe1-c1193ef1887b\") " pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:46:37.521903 master-0 kubenswrapper[19170]: I0313 01:46:37.520572 19170 scope.go:117] "RemoveContainer" containerID="c632611af070ea18c1e344e383398eb132439553e422bc44171a9cf4e56b8ec1" Mar 13 01:46:37.523025 master-0 kubenswrapper[19170]: I0313 01:46:37.522991 19170 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-0423f75b-fda4-4619-84ea-2ae13248e4cb\" (UniqueName: 
\"kubernetes.io/csi/topolvm.io^82ad30d6-a170-4ee2-bd43-8bf3a311ef5c\") on node \"master-0\" " Mar 13 01:46:37.543686 master-0 kubenswrapper[19170]: I0313 01:46:37.542898 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:46:37.593655 master-0 kubenswrapper[19170]: I0313 01:46:37.584592 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-0"] Mar 13 01:46:37.636652 master-0 kubenswrapper[19170]: I0313 01:46:37.628154 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b9844-default-internal-api-0"] Mar 13 01:46:37.639898 master-0 kubenswrapper[19170]: I0313 01:46:37.639586 19170 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 13 01:46:37.639898 master-0 kubenswrapper[19170]: I0313 01:46:37.639831 19170 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-0423f75b-fda4-4619-84ea-2ae13248e4cb" (UniqueName: "kubernetes.io/csi/topolvm.io^82ad30d6-a170-4ee2-bd43-8bf3a311ef5c") on node "master-0" Mar 13 01:46:37.669503 master-0 kubenswrapper[19170]: I0313 01:46:37.662176 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-b9844-default-internal-api-0"] Mar 13 01:46:37.707417 master-0 kubenswrapper[19170]: I0313 01:46:37.707149 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-b9844-default-internal-api-0"] Mar 13 01:46:37.711662 master-0 kubenswrapper[19170]: E0313 01:46:37.707979 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af162e22-de53-4f80-a7a9-877bda3e9740" containerName="glance-httpd" Mar 13 01:46:37.711662 master-0 kubenswrapper[19170]: I0313 01:46:37.708007 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="af162e22-de53-4f80-a7a9-877bda3e9740" containerName="glance-httpd" Mar 13 01:46:37.711662 master-0 kubenswrapper[19170]: E0313 
01:46:37.708023 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af162e22-de53-4f80-a7a9-877bda3e9740" containerName="glance-log" Mar 13 01:46:37.711662 master-0 kubenswrapper[19170]: I0313 01:46:37.708032 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="af162e22-de53-4f80-a7a9-877bda3e9740" containerName="glance-log" Mar 13 01:46:37.711662 master-0 kubenswrapper[19170]: I0313 01:46:37.708365 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="af162e22-de53-4f80-a7a9-877bda3e9740" containerName="glance-log" Mar 13 01:46:37.711662 master-0 kubenswrapper[19170]: I0313 01:46:37.708425 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="af162e22-de53-4f80-a7a9-877bda3e9740" containerName="glance-httpd" Mar 13 01:46:37.711662 master-0 kubenswrapper[19170]: I0313 01:46:37.710748 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:46:37.715298 master-0 kubenswrapper[19170]: I0313 01:46:37.713448 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 13 01:46:37.715298 master-0 kubenswrapper[19170]: I0313 01:46:37.713785 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-b9844-default-internal-config-data" Mar 13 01:46:37.737404 master-0 kubenswrapper[19170]: I0313 01:46:37.736457 19170 reconciler_common.go:293] "Volume detached for volume \"pvc-0423f75b-fda4-4619-84ea-2ae13248e4cb\" (UniqueName: \"kubernetes.io/csi/topolvm.io^82ad30d6-a170-4ee2-bd43-8bf3a311ef5c\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:37.802410 master-0 kubenswrapper[19170]: I0313 01:46:37.802368 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b9844-default-internal-api-0"] Mar 13 01:46:37.829274 master-0 kubenswrapper[19170]: I0313 01:46:37.829227 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell1-db-create-z7sjf"] Mar 13 01:46:37.862522 master-0 kubenswrapper[19170]: I0313 01:46:37.862450 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-spbhw"] Mar 13 01:46:37.885576 master-0 kubenswrapper[19170]: I0313 01:46:37.885528 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bccd73a-0337-40f3-847d-0953889fee13-scripts\") pod \"glance-b9844-default-internal-api-0\" (UID: \"4bccd73a-0337-40f3-847d-0953889fee13\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:46:37.886002 master-0 kubenswrapper[19170]: I0313 01:46:37.885963 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bccd73a-0337-40f3-847d-0953889fee13-config-data\") pod \"glance-b9844-default-internal-api-0\" (UID: \"4bccd73a-0337-40f3-847d-0953889fee13\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:46:37.886163 master-0 kubenswrapper[19170]: I0313 01:46:37.886139 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42ptm\" (UniqueName: \"kubernetes.io/projected/4bccd73a-0337-40f3-847d-0953889fee13-kube-api-access-42ptm\") pod \"glance-b9844-default-internal-api-0\" (UID: \"4bccd73a-0337-40f3-847d-0953889fee13\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:46:37.886320 master-0 kubenswrapper[19170]: I0313 01:46:37.886300 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0423f75b-fda4-4619-84ea-2ae13248e4cb\" (UniqueName: \"kubernetes.io/csi/topolvm.io^82ad30d6-a170-4ee2-bd43-8bf3a311ef5c\") pod \"glance-b9844-default-internal-api-0\" (UID: \"4bccd73a-0337-40f3-847d-0953889fee13\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:46:37.886467 
master-0 kubenswrapper[19170]: I0313 01:46:37.886445 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bccd73a-0337-40f3-847d-0953889fee13-combined-ca-bundle\") pod \"glance-b9844-default-internal-api-0\" (UID: \"4bccd73a-0337-40f3-847d-0953889fee13\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:46:37.886549 master-0 kubenswrapper[19170]: I0313 01:46:37.886528 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bccd73a-0337-40f3-847d-0953889fee13-logs\") pod \"glance-b9844-default-internal-api-0\" (UID: \"4bccd73a-0337-40f3-847d-0953889fee13\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:46:37.886599 master-0 kubenswrapper[19170]: I0313 01:46:37.886588 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4bccd73a-0337-40f3-847d-0953889fee13-internal-tls-certs\") pod \"glance-b9844-default-internal-api-0\" (UID: \"4bccd73a-0337-40f3-847d-0953889fee13\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:46:37.886865 master-0 kubenswrapper[19170]: I0313 01:46:37.886622 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4bccd73a-0337-40f3-847d-0953889fee13-httpd-run\") pod \"glance-b9844-default-internal-api-0\" (UID: \"4bccd73a-0337-40f3-847d-0953889fee13\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:46:37.917028 master-0 kubenswrapper[19170]: I0313 01:46:37.914738 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-fa2f-account-create-update-7k47n"] Mar 13 01:46:37.956044 master-0 kubenswrapper[19170]: I0313 01:46:37.955756 19170 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-1f87-account-create-update-c75rq"] Mar 13 01:46:37.989241 master-0 kubenswrapper[19170]: I0313 01:46:37.989196 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bccd73a-0337-40f3-847d-0953889fee13-logs\") pod \"glance-b9844-default-internal-api-0\" (UID: \"4bccd73a-0337-40f3-847d-0953889fee13\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:46:37.989486 master-0 kubenswrapper[19170]: I0313 01:46:37.989254 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4bccd73a-0337-40f3-847d-0953889fee13-internal-tls-certs\") pod \"glance-b9844-default-internal-api-0\" (UID: \"4bccd73a-0337-40f3-847d-0953889fee13\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:46:37.989486 master-0 kubenswrapper[19170]: I0313 01:46:37.989279 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4bccd73a-0337-40f3-847d-0953889fee13-httpd-run\") pod \"glance-b9844-default-internal-api-0\" (UID: \"4bccd73a-0337-40f3-847d-0953889fee13\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:46:37.989486 master-0 kubenswrapper[19170]: I0313 01:46:37.989379 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bccd73a-0337-40f3-847d-0953889fee13-scripts\") pod \"glance-b9844-default-internal-api-0\" (UID: \"4bccd73a-0337-40f3-847d-0953889fee13\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:46:37.989486 master-0 kubenswrapper[19170]: I0313 01:46:37.989436 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bccd73a-0337-40f3-847d-0953889fee13-config-data\") pod 
\"glance-b9844-default-internal-api-0\" (UID: \"4bccd73a-0337-40f3-847d-0953889fee13\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:46:37.989486 master-0 kubenswrapper[19170]: I0313 01:46:37.989468 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42ptm\" (UniqueName: \"kubernetes.io/projected/4bccd73a-0337-40f3-847d-0953889fee13-kube-api-access-42ptm\") pod \"glance-b9844-default-internal-api-0\" (UID: \"4bccd73a-0337-40f3-847d-0953889fee13\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:46:37.989486 master-0 kubenswrapper[19170]: I0313 01:46:37.989491 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0423f75b-fda4-4619-84ea-2ae13248e4cb\" (UniqueName: \"kubernetes.io/csi/topolvm.io^82ad30d6-a170-4ee2-bd43-8bf3a311ef5c\") pod \"glance-b9844-default-internal-api-0\" (UID: \"4bccd73a-0337-40f3-847d-0953889fee13\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:46:37.989714 master-0 kubenswrapper[19170]: I0313 01:46:37.989538 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bccd73a-0337-40f3-847d-0953889fee13-combined-ca-bundle\") pod \"glance-b9844-default-internal-api-0\" (UID: \"4bccd73a-0337-40f3-847d-0953889fee13\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:46:37.992175 master-0 kubenswrapper[19170]: I0313 01:46:37.989740 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bccd73a-0337-40f3-847d-0953889fee13-logs\") pod \"glance-b9844-default-internal-api-0\" (UID: \"4bccd73a-0337-40f3-847d-0953889fee13\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:46:37.996034 master-0 kubenswrapper[19170]: I0313 01:46:37.995931 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/4bccd73a-0337-40f3-847d-0953889fee13-httpd-run\") pod \"glance-b9844-default-internal-api-0\" (UID: \"4bccd73a-0337-40f3-847d-0953889fee13\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:46:37.997446 master-0 kubenswrapper[19170]: I0313 01:46:37.997396 19170 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 01:46:37.997502 master-0 kubenswrapper[19170]: I0313 01:46:37.997444 19170 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0423f75b-fda4-4619-84ea-2ae13248e4cb\" (UniqueName: \"kubernetes.io/csi/topolvm.io^82ad30d6-a170-4ee2-bd43-8bf3a311ef5c\") pod \"glance-b9844-default-internal-api-0\" (UID: \"4bccd73a-0337-40f3-847d-0953889fee13\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/34cbbf4fc438e2d42966b02d86ef558a7de843a3eec70be4281d747be1ba2c15/globalmount\"" pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:46:38.000016 master-0 kubenswrapper[19170]: E0313 01:46:37.999702 19170 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4131c57a_5598_4c4f_b4d1_ee12368c9b85.slice/crio-db8e69d331dd9dc3092ae31dd4ce594b248b781c06c225dff5ef3d1783eadde4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4131c57a_5598_4c4f_b4d1_ee12368c9b85.slice/crio-conmon-db8e69d331dd9dc3092ae31dd4ce594b248b781c06c225dff5ef3d1783eadde4.scope\": RecentStats: unable to find data in memory cache]" Mar 13 01:46:38.006564 master-0 kubenswrapper[19170]: I0313 01:46:38.006527 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bccd73a-0337-40f3-847d-0953889fee13-combined-ca-bundle\") pod 
\"glance-b9844-default-internal-api-0\" (UID: \"4bccd73a-0337-40f3-847d-0953889fee13\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:46:38.052789 master-0 kubenswrapper[19170]: I0313 01:46:38.052609 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bccd73a-0337-40f3-847d-0953889fee13-config-data\") pod \"glance-b9844-default-internal-api-0\" (UID: \"4bccd73a-0337-40f3-847d-0953889fee13\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:46:38.057063 master-0 kubenswrapper[19170]: I0313 01:46:38.056274 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4bccd73a-0337-40f3-847d-0953889fee13-internal-tls-certs\") pod \"glance-b9844-default-internal-api-0\" (UID: \"4bccd73a-0337-40f3-847d-0953889fee13\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:46:38.059888 master-0 kubenswrapper[19170]: I0313 01:46:38.059852 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42ptm\" (UniqueName: \"kubernetes.io/projected/4bccd73a-0337-40f3-847d-0953889fee13-kube-api-access-42ptm\") pod \"glance-b9844-default-internal-api-0\" (UID: \"4bccd73a-0337-40f3-847d-0953889fee13\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:46:38.061355 master-0 kubenswrapper[19170]: I0313 01:46:38.061237 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bccd73a-0337-40f3-847d-0953889fee13-scripts\") pod \"glance-b9844-default-internal-api-0\" (UID: \"4bccd73a-0337-40f3-847d-0953889fee13\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:46:38.083143 master-0 kubenswrapper[19170]: I0313 01:46:38.083033 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-neutron-agent-84f79fc9fc-7t7j5" Mar 13 01:46:38.418650 
master-0 kubenswrapper[19170]: I0313 01:46:38.414888 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-spbhw" event={"ID":"55081cb7-5148-4dd9-8d2c-14cb7a5b7a57","Type":"ContainerStarted","Data":"be63febd927fea4680b451a8523fa9d6768ea86ec0157001baf703de4aaa1b05"} Mar 13 01:46:38.418650 master-0 kubenswrapper[19170]: I0313 01:46:38.414950 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-spbhw" event={"ID":"55081cb7-5148-4dd9-8d2c-14cb7a5b7a57","Type":"ContainerStarted","Data":"86fc317be8e884a97a748ceee2606eab4def89ba78f530cffc6c1afa0cef0d75"} Mar 13 01:46:38.439809 master-0 kubenswrapper[19170]: I0313 01:46:38.439777 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b9844-default-external-api-0"] Mar 13 01:46:38.440886 master-0 kubenswrapper[19170]: I0313 01:46:38.440601 19170 generic.go:334] "Generic (PLEG): container finished" podID="4131c57a-5598-4c4f-b4d1-ee12368c9b85" containerID="db8e69d331dd9dc3092ae31dd4ce594b248b781c06c225dff5ef3d1783eadde4" exitCode=0 Mar 13 01:46:38.440886 master-0 kubenswrapper[19170]: I0313 01:46:38.440668 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-m9bfp" event={"ID":"4131c57a-5598-4c4f-b4d1-ee12368c9b85","Type":"ContainerDied","Data":"db8e69d331dd9dc3092ae31dd4ce594b248b781c06c225dff5ef3d1783eadde4"} Mar 13 01:46:38.460377 master-0 kubenswrapper[19170]: I0313 01:46:38.460311 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-spbhw" podStartSLOduration=3.460291382 podStartE2EDuration="3.460291382s" podCreationTimestamp="2026-03-13 01:46:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:46:38.454155599 +0000 UTC m=+1659.262276549" watchObservedRunningTime="2026-03-13 01:46:38.460291382 +0000 UTC m=+1659.268412342" Mar 13 
01:46:38.476829 master-0 kubenswrapper[19170]: I0313 01:46:38.476784 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cb659fff7-csvmp" event={"ID":"6ad5e2c1-f186-4654-ad97-73019827ec1f","Type":"ContainerStarted","Data":"6ffb84adba1ceb53c3022fea83e3bf4e46f9cc99955912570a146eefc16114f5"} Mar 13 01:46:38.477498 master-0 kubenswrapper[19170]: I0313 01:46:38.477464 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5cb659fff7-csvmp" Mar 13 01:46:38.512986 master-0 kubenswrapper[19170]: I0313 01:46:38.512832 19170 generic.go:334] "Generic (PLEG): container finished" podID="3a58ff59-00a5-4bda-9e4b-205d2628eb32" containerID="73d0787aee7ff647ced5382f67af8b6d3ad506cc12c42039a2d8c98656fb212e" exitCode=0 Mar 13 01:46:38.512986 master-0 kubenswrapper[19170]: I0313 01:46:38.512939 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f778-account-create-update-pg7qf" event={"ID":"3a58ff59-00a5-4bda-9e4b-205d2628eb32","Type":"ContainerDied","Data":"73d0787aee7ff647ced5382f67af8b6d3ad506cc12c42039a2d8c98656fb212e"} Mar 13 01:46:38.513730 master-0 kubenswrapper[19170]: I0313 01:46:38.513281 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f778-account-create-update-pg7qf" event={"ID":"3a58ff59-00a5-4bda-9e4b-205d2628eb32","Type":"ContainerStarted","Data":"7278222655ace93d39943257478d7bfb10a74ff92671b74d897ee0ae691bd814"} Mar 13 01:46:38.522401 master-0 kubenswrapper[19170]: I0313 01:46:38.522269 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-fa2f-account-create-update-7k47n" event={"ID":"efb9dbc0-c8c7-428a-905d-5023a309a3ac","Type":"ContainerStarted","Data":"0c6bd1561f1d5d67fb94494630dcca7ed02cd8efece751aa961a13d56c9d805e"} Mar 13 01:46:38.522778 master-0 kubenswrapper[19170]: I0313 01:46:38.522608 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-fa2f-account-create-update-7k47n" 
event={"ID":"efb9dbc0-c8c7-428a-905d-5023a309a3ac","Type":"ContainerStarted","Data":"3e44ccd5a5cc5f6075661745d60c87c3bf61f58b05a434f8f518d896fef4b0c3"} Mar 13 01:46:38.526854 master-0 kubenswrapper[19170]: I0313 01:46:38.526784 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5cb659fff7-csvmp" podStartSLOduration=8.526764635 podStartE2EDuration="8.526764635s" podCreationTimestamp="2026-03-13 01:46:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:46:38.510517547 +0000 UTC m=+1659.318638507" watchObservedRunningTime="2026-03-13 01:46:38.526764635 +0000 UTC m=+1659.334885595" Mar 13 01:46:38.533948 master-0 kubenswrapper[19170]: I0313 01:46:38.532831 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1f87-account-create-update-c75rq" event={"ID":"068e68e4-8b54-41da-b797-9eee422826b2","Type":"ContainerStarted","Data":"e433856d69b11be57cbd3104d776240d6e8666dcb89493f7403ccde45d50fd52"} Mar 13 01:46:38.533948 master-0 kubenswrapper[19170]: I0313 01:46:38.532879 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1f87-account-create-update-c75rq" event={"ID":"068e68e4-8b54-41da-b797-9eee422826b2","Type":"ContainerStarted","Data":"ac87252d7bd0d8ff268b07b79a359a6faffa609a350205ba15907915fd6f2c0e"} Mar 13 01:46:38.540934 master-0 kubenswrapper[19170]: I0313 01:46:38.540807 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-z7sjf" event={"ID":"f9a8b821-9639-4744-8744-a38e694de2de","Type":"ContainerStarted","Data":"64d4dfb3566c7246daa255e99c5dbedbd5a1c688f068e92b3a553081bc65c14e"} Mar 13 01:46:38.542237 master-0 kubenswrapper[19170]: I0313 01:46:38.542187 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-z7sjf" 
event={"ID":"f9a8b821-9639-4744-8744-a38e694de2de","Type":"ContainerStarted","Data":"15de64c2db3f7c036f47fd2dfd0abf0001c86a4e2bbe1de243a49bc2a49487eb"} Mar 13 01:46:38.556519 master-0 kubenswrapper[19170]: I0313 01:46:38.556469 19170 generic.go:334] "Generic (PLEG): container finished" podID="c4801b65-cb4e-4393-9a89-3a29c3051310" containerID="2c8e7613ad4108967d5042918beb6ead4ad18d464f984c4396b4f3e5e290de5c" exitCode=0 Mar 13 01:46:38.556778 master-0 kubenswrapper[19170]: I0313 01:46:38.556529 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"c4801b65-cb4e-4393-9a89-3a29c3051310","Type":"ContainerDied","Data":"2c8e7613ad4108967d5042918beb6ead4ad18d464f984c4396b4f3e5e290de5c"} Mar 13 01:46:38.589553 master-0 kubenswrapper[19170]: I0313 01:46:38.589239 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-fa2f-account-create-update-7k47n" podStartSLOduration=3.589220466 podStartE2EDuration="3.589220466s" podCreationTimestamp="2026-03-13 01:46:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:46:38.558670405 +0000 UTC m=+1659.366791365" watchObservedRunningTime="2026-03-13 01:46:38.589220466 +0000 UTC m=+1659.397341426" Mar 13 01:46:38.621004 master-0 kubenswrapper[19170]: I0313 01:46:38.620900 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-1f87-account-create-update-c75rq" podStartSLOduration=2.620880278 podStartE2EDuration="2.620880278s" podCreationTimestamp="2026-03-13 01:46:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:46:38.575649033 +0000 UTC m=+1659.383769993" watchObservedRunningTime="2026-03-13 01:46:38.620880278 +0000 UTC m=+1659.429001238" Mar 13 01:46:38.628288 master-0 kubenswrapper[19170]: I0313 
01:46:38.628230 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-z7sjf" podStartSLOduration=3.628212645 podStartE2EDuration="3.628212645s" podCreationTimestamp="2026-03-13 01:46:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:46:38.624003036 +0000 UTC m=+1659.432123996" watchObservedRunningTime="2026-03-13 01:46:38.628212645 +0000 UTC m=+1659.436333605" Mar 13 01:46:39.180845 master-0 kubenswrapper[19170]: I0313 01:46:39.180796 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0423f75b-fda4-4619-84ea-2ae13248e4cb\" (UniqueName: \"kubernetes.io/csi/topolvm.io^82ad30d6-a170-4ee2-bd43-8bf3a311ef5c\") pod \"glance-b9844-default-internal-api-0\" (UID: \"4bccd73a-0337-40f3-847d-0953889fee13\") " pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:46:39.444713 master-0 kubenswrapper[19170]: I0313 01:46:39.441091 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af162e22-de53-4f80-a7a9-877bda3e9740" path="/var/lib/kubelet/pods/af162e22-de53-4f80-a7a9-877bda3e9740/volumes" Mar 13 01:46:39.480336 master-0 kubenswrapper[19170]: I0313 01:46:39.480288 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:46:39.591458 master-0 kubenswrapper[19170]: I0313 01:46:39.590850 19170 generic.go:334] "Generic (PLEG): container finished" podID="f9a8b821-9639-4744-8744-a38e694de2de" containerID="64d4dfb3566c7246daa255e99c5dbedbd5a1c688f068e92b3a553081bc65c14e" exitCode=0 Mar 13 01:46:39.591458 master-0 kubenswrapper[19170]: I0313 01:46:39.590918 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-z7sjf" event={"ID":"f9a8b821-9639-4744-8744-a38e694de2de","Type":"ContainerDied","Data":"64d4dfb3566c7246daa255e99c5dbedbd5a1c688f068e92b3a553081bc65c14e"} Mar 13 01:46:39.602869 master-0 kubenswrapper[19170]: I0313 01:46:39.595876 19170 generic.go:334] "Generic (PLEG): container finished" podID="efb9dbc0-c8c7-428a-905d-5023a309a3ac" containerID="0c6bd1561f1d5d67fb94494630dcca7ed02cd8efece751aa961a13d56c9d805e" exitCode=0 Mar 13 01:46:39.602869 master-0 kubenswrapper[19170]: I0313 01:46:39.595958 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-fa2f-account-create-update-7k47n" event={"ID":"efb9dbc0-c8c7-428a-905d-5023a309a3ac","Type":"ContainerDied","Data":"0c6bd1561f1d5d67fb94494630dcca7ed02cd8efece751aa961a13d56c9d805e"} Mar 13 01:46:39.602869 master-0 kubenswrapper[19170]: I0313 01:46:39.599000 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b9844-default-external-api-0" event={"ID":"34ddd7e3-732e-45c1-bbe1-c1193ef1887b","Type":"ContainerStarted","Data":"e0a3092178172e75b9cce31c97c00a2ede9b72fbaf03ce5b6190cc484f0d3cd0"} Mar 13 01:46:39.602869 master-0 kubenswrapper[19170]: I0313 01:46:39.599048 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b9844-default-external-api-0" event={"ID":"34ddd7e3-732e-45c1-bbe1-c1193ef1887b","Type":"ContainerStarted","Data":"d50125f084ee52e7330b858304c2c8e134cc15bb7967eeb839d62bad57421a44"} Mar 13 01:46:39.602869 master-0 
kubenswrapper[19170]: I0313 01:46:39.600872 19170 generic.go:334] "Generic (PLEG): container finished" podID="55081cb7-5148-4dd9-8d2c-14cb7a5b7a57" containerID="be63febd927fea4680b451a8523fa9d6768ea86ec0157001baf703de4aaa1b05" exitCode=0 Mar 13 01:46:39.602869 master-0 kubenswrapper[19170]: I0313 01:46:39.600926 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-spbhw" event={"ID":"55081cb7-5148-4dd9-8d2c-14cb7a5b7a57","Type":"ContainerDied","Data":"be63febd927fea4680b451a8523fa9d6768ea86ec0157001baf703de4aaa1b05"} Mar 13 01:46:39.602869 master-0 kubenswrapper[19170]: I0313 01:46:39.602373 19170 generic.go:334] "Generic (PLEG): container finished" podID="068e68e4-8b54-41da-b797-9eee422826b2" containerID="e433856d69b11be57cbd3104d776240d6e8666dcb89493f7403ccde45d50fd52" exitCode=0 Mar 13 01:46:39.602869 master-0 kubenswrapper[19170]: I0313 01:46:39.602564 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1f87-account-create-update-c75rq" event={"ID":"068e68e4-8b54-41da-b797-9eee422826b2","Type":"ContainerDied","Data":"e433856d69b11be57cbd3104d776240d6e8666dcb89493f7403ccde45d50fd52"} Mar 13 01:46:40.324455 master-0 kubenswrapper[19170]: I0313 01:46:40.324320 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-f778-account-create-update-pg7qf" Mar 13 01:46:40.350923 master-0 kubenswrapper[19170]: I0313 01:46:40.350618 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b9844-default-internal-api-0"] Mar 13 01:46:40.362697 master-0 kubenswrapper[19170]: W0313 01:46:40.360814 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4bccd73a_0337_40f3_847d_0953889fee13.slice/crio-0e06e108925296cc53d9bd5d27840f2a293c43cf0c6148c5797efa6861d92083 WatchSource:0}: Error finding container 0e06e108925296cc53d9bd5d27840f2a293c43cf0c6148c5797efa6861d92083: Status 404 returned error can't find the container with id 0e06e108925296cc53d9bd5d27840f2a293c43cf0c6148c5797efa6861d92083 Mar 13 01:46:40.368328 master-0 kubenswrapper[19170]: I0313 01:46:40.368287 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-m9bfp" Mar 13 01:46:40.422714 master-0 kubenswrapper[19170]: I0313 01:46:40.422127 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4131c57a-5598-4c4f-b4d1-ee12368c9b85-operator-scripts\") pod \"4131c57a-5598-4c4f-b4d1-ee12368c9b85\" (UID: \"4131c57a-5598-4c4f-b4d1-ee12368c9b85\") " Mar 13 01:46:40.422714 master-0 kubenswrapper[19170]: I0313 01:46:40.422295 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdjb8\" (UniqueName: \"kubernetes.io/projected/4131c57a-5598-4c4f-b4d1-ee12368c9b85-kube-api-access-pdjb8\") pod \"4131c57a-5598-4c4f-b4d1-ee12368c9b85\" (UID: \"4131c57a-5598-4c4f-b4d1-ee12368c9b85\") " Mar 13 01:46:40.422714 master-0 kubenswrapper[19170]: I0313 01:46:40.422563 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/4131c57a-5598-4c4f-b4d1-ee12368c9b85-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4131c57a-5598-4c4f-b4d1-ee12368c9b85" (UID: "4131c57a-5598-4c4f-b4d1-ee12368c9b85"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:46:40.422714 master-0 kubenswrapper[19170]: I0313 01:46:40.422715 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a58ff59-00a5-4bda-9e4b-205d2628eb32-operator-scripts\") pod \"3a58ff59-00a5-4bda-9e4b-205d2628eb32\" (UID: \"3a58ff59-00a5-4bda-9e4b-205d2628eb32\") " Mar 13 01:46:40.423027 master-0 kubenswrapper[19170]: I0313 01:46:40.422821 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfpt\" (UniqueName: \"kubernetes.io/projected/3a58ff59-00a5-4bda-9e4b-205d2628eb32-kube-api-access-9xfpt\") pod \"3a58ff59-00a5-4bda-9e4b-205d2628eb32\" (UID: \"3a58ff59-00a5-4bda-9e4b-205d2628eb32\") " Mar 13 01:46:40.423146 master-0 kubenswrapper[19170]: I0313 01:46:40.423107 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a58ff59-00a5-4bda-9e4b-205d2628eb32-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3a58ff59-00a5-4bda-9e4b-205d2628eb32" (UID: "3a58ff59-00a5-4bda-9e4b-205d2628eb32"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:46:40.443691 master-0 kubenswrapper[19170]: I0313 01:46:40.426874 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4131c57a-5598-4c4f-b4d1-ee12368c9b85-kube-api-access-pdjb8" (OuterVolumeSpecName: "kube-api-access-pdjb8") pod "4131c57a-5598-4c4f-b4d1-ee12368c9b85" (UID: "4131c57a-5598-4c4f-b4d1-ee12368c9b85"). InnerVolumeSpecName "kube-api-access-pdjb8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:46:40.443691 master-0 kubenswrapper[19170]: I0313 01:46:40.435800 19170 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4131c57a-5598-4c4f-b4d1-ee12368c9b85-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:40.443691 master-0 kubenswrapper[19170]: I0313 01:46:40.435881 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdjb8\" (UniqueName: \"kubernetes.io/projected/4131c57a-5598-4c4f-b4d1-ee12368c9b85-kube-api-access-pdjb8\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:40.443691 master-0 kubenswrapper[19170]: I0313 01:46:40.435895 19170 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a58ff59-00a5-4bda-9e4b-205d2628eb32-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:40.447146 master-0 kubenswrapper[19170]: I0313 01:46:40.447073 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a58ff59-00a5-4bda-9e4b-205d2628eb32-kube-api-access-9xfpt" (OuterVolumeSpecName: "kube-api-access-9xfpt") pod "3a58ff59-00a5-4bda-9e4b-205d2628eb32" (UID: "3a58ff59-00a5-4bda-9e4b-205d2628eb32"). InnerVolumeSpecName "kube-api-access-9xfpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:46:40.538737 master-0 kubenswrapper[19170]: I0313 01:46:40.538066 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfpt\" (UniqueName: \"kubernetes.io/projected/3a58ff59-00a5-4bda-9e4b-205d2628eb32-kube-api-access-9xfpt\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:40.643128 master-0 kubenswrapper[19170]: I0313 01:46:40.613809 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-m9bfp" Mar 13 01:46:40.643128 master-0 kubenswrapper[19170]: I0313 01:46:40.613807 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-m9bfp" event={"ID":"4131c57a-5598-4c4f-b4d1-ee12368c9b85","Type":"ContainerDied","Data":"f5fbf07889b794113240aa690148fe4458e66f3eaa6be3a193a1e7307d585b62"} Mar 13 01:46:40.643128 master-0 kubenswrapper[19170]: I0313 01:46:40.614016 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5fbf07889b794113240aa690148fe4458e66f3eaa6be3a193a1e7307d585b62" Mar 13 01:46:40.643128 master-0 kubenswrapper[19170]: I0313 01:46:40.615448 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f778-account-create-update-pg7qf" event={"ID":"3a58ff59-00a5-4bda-9e4b-205d2628eb32","Type":"ContainerDied","Data":"7278222655ace93d39943257478d7bfb10a74ff92671b74d897ee0ae691bd814"} Mar 13 01:46:40.643128 master-0 kubenswrapper[19170]: I0313 01:46:40.615469 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7278222655ace93d39943257478d7bfb10a74ff92671b74d897ee0ae691bd814" Mar 13 01:46:40.643128 master-0 kubenswrapper[19170]: I0313 01:46:40.615521 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-f778-account-create-update-pg7qf" Mar 13 01:46:40.643128 master-0 kubenswrapper[19170]: I0313 01:46:40.621010 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b9844-default-internal-api-0" event={"ID":"4bccd73a-0337-40f3-847d-0953889fee13","Type":"ContainerStarted","Data":"0e06e108925296cc53d9bd5d27840f2a293c43cf0c6148c5797efa6861d92083"} Mar 13 01:46:40.643128 master-0 kubenswrapper[19170]: I0313 01:46:40.624716 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b9844-default-external-api-0" event={"ID":"34ddd7e3-732e-45c1-bbe1-c1193ef1887b","Type":"ContainerStarted","Data":"017bfc44ad2263432fa2dc6dd74e15d11fb1c61d466e909553cdbae8a723105c"} Mar 13 01:46:40.685688 master-0 kubenswrapper[19170]: I0313 01:46:40.682440 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-b9844-default-external-api-0" podStartSLOduration=5.682421637 podStartE2EDuration="5.682421637s" podCreationTimestamp="2026-03-13 01:46:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:46:40.647774351 +0000 UTC m=+1661.455895311" watchObservedRunningTime="2026-03-13 01:46:40.682421637 +0000 UTC m=+1661.490542587" Mar 13 01:46:41.179806 master-0 kubenswrapper[19170]: I0313 01:46:41.179767 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-1f87-account-create-update-c75rq" Mar 13 01:46:41.372651 master-0 kubenswrapper[19170]: I0313 01:46:41.366578 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwk99\" (UniqueName: \"kubernetes.io/projected/068e68e4-8b54-41da-b797-9eee422826b2-kube-api-access-jwk99\") pod \"068e68e4-8b54-41da-b797-9eee422826b2\" (UID: \"068e68e4-8b54-41da-b797-9eee422826b2\") " Mar 13 01:46:41.372651 master-0 kubenswrapper[19170]: I0313 01:46:41.366709 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/068e68e4-8b54-41da-b797-9eee422826b2-operator-scripts\") pod \"068e68e4-8b54-41da-b797-9eee422826b2\" (UID: \"068e68e4-8b54-41da-b797-9eee422826b2\") " Mar 13 01:46:41.372651 master-0 kubenswrapper[19170]: I0313 01:46:41.367530 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/068e68e4-8b54-41da-b797-9eee422826b2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "068e68e4-8b54-41da-b797-9eee422826b2" (UID: "068e68e4-8b54-41da-b797-9eee422826b2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:46:41.418284 master-0 kubenswrapper[19170]: I0313 01:46:41.418225 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/068e68e4-8b54-41da-b797-9eee422826b2-kube-api-access-jwk99" (OuterVolumeSpecName: "kube-api-access-jwk99") pod "068e68e4-8b54-41da-b797-9eee422826b2" (UID: "068e68e4-8b54-41da-b797-9eee422826b2"). InnerVolumeSpecName "kube-api-access-jwk99". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:46:41.451739 master-0 kubenswrapper[19170]: I0313 01:46:41.451238 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-z7sjf" Mar 13 01:46:41.459659 master-0 kubenswrapper[19170]: I0313 01:46:41.459592 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-spbhw" Mar 13 01:46:41.472270 master-0 kubenswrapper[19170]: I0313 01:46:41.472220 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwk99\" (UniqueName: \"kubernetes.io/projected/068e68e4-8b54-41da-b797-9eee422826b2-kube-api-access-jwk99\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:41.472270 master-0 kubenswrapper[19170]: I0313 01:46:41.472263 19170 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/068e68e4-8b54-41da-b797-9eee422826b2-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:41.509521 master-0 kubenswrapper[19170]: I0313 01:46:41.509488 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-fa2f-account-create-update-7k47n" Mar 13 01:46:41.573901 master-0 kubenswrapper[19170]: I0313 01:46:41.573848 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9a8b821-9639-4744-8744-a38e694de2de-operator-scripts\") pod \"f9a8b821-9639-4744-8744-a38e694de2de\" (UID: \"f9a8b821-9639-4744-8744-a38e694de2de\") " Mar 13 01:46:41.574095 master-0 kubenswrapper[19170]: I0313 01:46:41.573920 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55081cb7-5148-4dd9-8d2c-14cb7a5b7a57-operator-scripts\") pod \"55081cb7-5148-4dd9-8d2c-14cb7a5b7a57\" (UID: \"55081cb7-5148-4dd9-8d2c-14cb7a5b7a57\") " Mar 13 01:46:41.574095 master-0 kubenswrapper[19170]: I0313 01:46:41.573979 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-m2xgk\" (UniqueName: \"kubernetes.io/projected/f9a8b821-9639-4744-8744-a38e694de2de-kube-api-access-m2xgk\") pod \"f9a8b821-9639-4744-8744-a38e694de2de\" (UID: \"f9a8b821-9639-4744-8744-a38e694de2de\") " Mar 13 01:46:41.574174 master-0 kubenswrapper[19170]: I0313 01:46:41.574148 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khmxs\" (UniqueName: \"kubernetes.io/projected/55081cb7-5148-4dd9-8d2c-14cb7a5b7a57-kube-api-access-khmxs\") pod \"55081cb7-5148-4dd9-8d2c-14cb7a5b7a57\" (UID: \"55081cb7-5148-4dd9-8d2c-14cb7a5b7a57\") " Mar 13 01:46:41.574588 master-0 kubenswrapper[19170]: I0313 01:46:41.574426 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55081cb7-5148-4dd9-8d2c-14cb7a5b7a57-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "55081cb7-5148-4dd9-8d2c-14cb7a5b7a57" (UID: "55081cb7-5148-4dd9-8d2c-14cb7a5b7a57"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:46:41.574828 master-0 kubenswrapper[19170]: I0313 01:46:41.574786 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9a8b821-9639-4744-8744-a38e694de2de-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f9a8b821-9639-4744-8744-a38e694de2de" (UID: "f9a8b821-9639-4744-8744-a38e694de2de"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 01:46:41.575595 master-0 kubenswrapper[19170]: I0313 01:46:41.575534 19170 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9a8b821-9639-4744-8744-a38e694de2de-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 13 01:46:41.575766 master-0 kubenswrapper[19170]: I0313 01:46:41.575597 19170 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55081cb7-5148-4dd9-8d2c-14cb7a5b7a57-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 13 01:46:41.581937 master-0 kubenswrapper[19170]: I0313 01:46:41.581894 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9a8b821-9639-4744-8744-a38e694de2de-kube-api-access-m2xgk" (OuterVolumeSpecName: "kube-api-access-m2xgk") pod "f9a8b821-9639-4744-8744-a38e694de2de" (UID: "f9a8b821-9639-4744-8744-a38e694de2de"). InnerVolumeSpecName "kube-api-access-m2xgk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 01:46:41.616063 master-0 kubenswrapper[19170]: I0313 01:46:41.616002 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55081cb7-5148-4dd9-8d2c-14cb7a5b7a57-kube-api-access-khmxs" (OuterVolumeSpecName: "kube-api-access-khmxs") pod "55081cb7-5148-4dd9-8d2c-14cb7a5b7a57" (UID: "55081cb7-5148-4dd9-8d2c-14cb7a5b7a57"). InnerVolumeSpecName "kube-api-access-khmxs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 01:46:41.656139 master-0 kubenswrapper[19170]: I0313 01:46:41.656086 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-spbhw" event={"ID":"55081cb7-5148-4dd9-8d2c-14cb7a5b7a57","Type":"ContainerDied","Data":"86fc317be8e884a97a748ceee2606eab4def89ba78f530cffc6c1afa0cef0d75"}
Mar 13 01:46:41.656139 master-0 kubenswrapper[19170]: I0313 01:46:41.656136 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86fc317be8e884a97a748ceee2606eab4def89ba78f530cffc6c1afa0cef0d75"
Mar 13 01:46:41.656364 master-0 kubenswrapper[19170]: I0313 01:46:41.656209 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-spbhw"
Mar 13 01:46:41.661878 master-0 kubenswrapper[19170]: I0313 01:46:41.661815 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1f87-account-create-update-c75rq" event={"ID":"068e68e4-8b54-41da-b797-9eee422826b2","Type":"ContainerDied","Data":"ac87252d7bd0d8ff268b07b79a359a6faffa609a350205ba15907915fd6f2c0e"}
Mar 13 01:46:41.661878 master-0 kubenswrapper[19170]: I0313 01:46:41.661875 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac87252d7bd0d8ff268b07b79a359a6faffa609a350205ba15907915fd6f2c0e"
Mar 13 01:46:41.662029 master-0 kubenswrapper[19170]: I0313 01:46:41.661936 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1f87-account-create-update-c75rq"
Mar 13 01:46:41.674410 master-0 kubenswrapper[19170]: I0313 01:46:41.674362 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-z7sjf" event={"ID":"f9a8b821-9639-4744-8744-a38e694de2de","Type":"ContainerDied","Data":"15de64c2db3f7c036f47fd2dfd0abf0001c86a4e2bbe1de243a49bc2a49487eb"}
Mar 13 01:46:41.674410 master-0 kubenswrapper[19170]: I0313 01:46:41.674407 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15de64c2db3f7c036f47fd2dfd0abf0001c86a4e2bbe1de243a49bc2a49487eb"
Mar 13 01:46:41.674537 master-0 kubenswrapper[19170]: I0313 01:46:41.674487 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-z7sjf"
Mar 13 01:46:41.678459 master-0 kubenswrapper[19170]: I0313 01:46:41.678314 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efb9dbc0-c8c7-428a-905d-5023a309a3ac-operator-scripts\") pod \"efb9dbc0-c8c7-428a-905d-5023a309a3ac\" (UID: \"efb9dbc0-c8c7-428a-905d-5023a309a3ac\") "
Mar 13 01:46:41.678459 master-0 kubenswrapper[19170]: I0313 01:46:41.678415 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grb4v\" (UniqueName: \"kubernetes.io/projected/efb9dbc0-c8c7-428a-905d-5023a309a3ac-kube-api-access-grb4v\") pod \"efb9dbc0-c8c7-428a-905d-5023a309a3ac\" (UID: \"efb9dbc0-c8c7-428a-905d-5023a309a3ac\") "
Mar 13 01:46:41.679111 master-0 kubenswrapper[19170]: I0313 01:46:41.679078 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efb9dbc0-c8c7-428a-905d-5023a309a3ac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "efb9dbc0-c8c7-428a-905d-5023a309a3ac" (UID: "efb9dbc0-c8c7-428a-905d-5023a309a3ac"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 01:46:41.679482 master-0 kubenswrapper[19170]: I0313 01:46:41.679210 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2xgk\" (UniqueName: \"kubernetes.io/projected/f9a8b821-9639-4744-8744-a38e694de2de-kube-api-access-m2xgk\") on node \"master-0\" DevicePath \"\""
Mar 13 01:46:41.679580 master-0 kubenswrapper[19170]: I0313 01:46:41.679566 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khmxs\" (UniqueName: \"kubernetes.io/projected/55081cb7-5148-4dd9-8d2c-14cb7a5b7a57-kube-api-access-khmxs\") on node \"master-0\" DevicePath \"\""
Mar 13 01:46:41.682741 master-0 kubenswrapper[19170]: I0313 01:46:41.682699 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b9844-default-internal-api-0" event={"ID":"4bccd73a-0337-40f3-847d-0953889fee13","Type":"ContainerStarted","Data":"5f456b272e20b6b9a7515350f18f1d98804ec362ff8328aac0b8bd9c86be3be2"}
Mar 13 01:46:41.683597 master-0 kubenswrapper[19170]: I0313 01:46:41.683564 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efb9dbc0-c8c7-428a-905d-5023a309a3ac-kube-api-access-grb4v" (OuterVolumeSpecName: "kube-api-access-grb4v") pod "efb9dbc0-c8c7-428a-905d-5023a309a3ac" (UID: "efb9dbc0-c8c7-428a-905d-5023a309a3ac"). InnerVolumeSpecName "kube-api-access-grb4v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 01:46:41.684585 master-0 kubenswrapper[19170]: I0313 01:46:41.684572 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-fa2f-account-create-update-7k47n"
Mar 13 01:46:41.685438 master-0 kubenswrapper[19170]: I0313 01:46:41.684693 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-fa2f-account-create-update-7k47n" event={"ID":"efb9dbc0-c8c7-428a-905d-5023a309a3ac","Type":"ContainerDied","Data":"3e44ccd5a5cc5f6075661745d60c87c3bf61f58b05a434f8f518d896fef4b0c3"}
Mar 13 01:46:41.685501 master-0 kubenswrapper[19170]: I0313 01:46:41.685442 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e44ccd5a5cc5f6075661745d60c87c3bf61f58b05a434f8f518d896fef4b0c3"
Mar 13 01:46:41.782593 master-0 kubenswrapper[19170]: I0313 01:46:41.782407 19170 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efb9dbc0-c8c7-428a-905d-5023a309a3ac-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 13 01:46:41.782593 master-0 kubenswrapper[19170]: I0313 01:46:41.782442 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grb4v\" (UniqueName: \"kubernetes.io/projected/efb9dbc0-c8c7-428a-905d-5023a309a3ac-kube-api-access-grb4v\") on node \"master-0\" DevicePath \"\""
Mar 13 01:46:42.728337 master-0 kubenswrapper[19170]: I0313 01:46:42.728277 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b9844-default-internal-api-0" event={"ID":"4bccd73a-0337-40f3-847d-0953889fee13","Type":"ContainerStarted","Data":"b0403327510724a187ba792a54c9b84e73ecca476ab8ee676f8f5f645274fc7f"}
Mar 13 01:46:42.770145 master-0 kubenswrapper[19170]: I0313 01:46:42.763866 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-b9844-default-internal-api-0" podStartSLOduration=5.763847178 podStartE2EDuration="5.763847178s" podCreationTimestamp="2026-03-13 01:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:46:42.758504337 +0000 UTC m=+1663.566625297" watchObservedRunningTime="2026-03-13 01:46:42.763847178 +0000 UTC m=+1663.571968138"
Mar 13 01:46:44.763227 master-0 kubenswrapper[19170]: I0313 01:46:44.763101 19170 generic.go:334] "Generic (PLEG): container finished" podID="3785df35-68b7-4d28-8b4a-39c3136ce823" containerID="bf29f6df81130ede3f82b51b9d90996b98a4c3ab96fba8c55db9abbe16dc7d4b" exitCode=0
Mar 13 01:46:44.763796 master-0 kubenswrapper[19170]: I0313 01:46:44.763307 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"3785df35-68b7-4d28-8b4a-39c3136ce823","Type":"ContainerDied","Data":"bf29f6df81130ede3f82b51b9d90996b98a4c3ab96fba8c55db9abbe16dc7d4b"}
Mar 13 01:46:44.768093 master-0 kubenswrapper[19170]: I0313 01:46:44.765883 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"c4801b65-cb4e-4393-9a89-3a29c3051310","Type":"ContainerStarted","Data":"661f5f9d721b829903cc19e4366ef0060e13bf9b0d6056b051813f30c4a72a98"}
Mar 13 01:46:44.768093 master-0 kubenswrapper[19170]: I0313 01:46:44.766239 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ironic-inspector-0" podUID="c4801b65-cb4e-4393-9a89-3a29c3051310" containerName="inspector-pxe-init" containerID="cri-o://661f5f9d721b829903cc19e4366ef0060e13bf9b0d6056b051813f30c4a72a98" gracePeriod=60
Mar 13 01:46:45.633942 master-0 kubenswrapper[19170]: I0313 01:46:45.633891 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0"
Mar 13 01:46:45.717236 master-0 kubenswrapper[19170]: I0313 01:46:45.717181 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c4801b65-cb4e-4393-9a89-3a29c3051310-config\") pod \"c4801b65-cb4e-4393-9a89-3a29c3051310\" (UID: \"c4801b65-cb4e-4393-9a89-3a29c3051310\") "
Mar 13 01:46:45.717422 master-0 kubenswrapper[19170]: I0313 01:46:45.717256 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vd86r\" (UniqueName: \"kubernetes.io/projected/c4801b65-cb4e-4393-9a89-3a29c3051310-kube-api-access-vd86r\") pod \"c4801b65-cb4e-4393-9a89-3a29c3051310\" (UID: \"c4801b65-cb4e-4393-9a89-3a29c3051310\") "
Mar 13 01:46:45.717458 master-0 kubenswrapper[19170]: I0313 01:46:45.717441 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4801b65-cb4e-4393-9a89-3a29c3051310-scripts\") pod \"c4801b65-cb4e-4393-9a89-3a29c3051310\" (UID: \"c4801b65-cb4e-4393-9a89-3a29c3051310\") "
Mar 13 01:46:45.717527 master-0 kubenswrapper[19170]: I0313 01:46:45.717505 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/c4801b65-cb4e-4393-9a89-3a29c3051310-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"c4801b65-cb4e-4393-9a89-3a29c3051310\" (UID: \"c4801b65-cb4e-4393-9a89-3a29c3051310\") "
Mar 13 01:46:45.717570 master-0 kubenswrapper[19170]: I0313 01:46:45.717558 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/c4801b65-cb4e-4393-9a89-3a29c3051310-var-lib-ironic\") pod \"c4801b65-cb4e-4393-9a89-3a29c3051310\" (UID: \"c4801b65-cb4e-4393-9a89-3a29c3051310\") "
Mar 13 01:46:45.717885 master-0 kubenswrapper[19170]: I0313 01:46:45.717856 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4801b65-cb4e-4393-9a89-3a29c3051310-var-lib-ironic-inspector-dhcp-hostsdir" (OuterVolumeSpecName: "var-lib-ironic-inspector-dhcp-hostsdir") pod "c4801b65-cb4e-4393-9a89-3a29c3051310" (UID: "c4801b65-cb4e-4393-9a89-3a29c3051310"). InnerVolumeSpecName "var-lib-ironic-inspector-dhcp-hostsdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 01:46:45.717943 master-0 kubenswrapper[19170]: I0313 01:46:45.717911 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c4801b65-cb4e-4393-9a89-3a29c3051310-etc-podinfo\") pod \"c4801b65-cb4e-4393-9a89-3a29c3051310\" (UID: \"c4801b65-cb4e-4393-9a89-3a29c3051310\") "
Mar 13 01:46:45.718056 master-0 kubenswrapper[19170]: I0313 01:46:45.718032 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4801b65-cb4e-4393-9a89-3a29c3051310-combined-ca-bundle\") pod \"c4801b65-cb4e-4393-9a89-3a29c3051310\" (UID: \"c4801b65-cb4e-4393-9a89-3a29c3051310\") "
Mar 13 01:46:45.718743 master-0 kubenswrapper[19170]: I0313 01:46:45.718720 19170 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/c4801b65-cb4e-4393-9a89-3a29c3051310-var-lib-ironic-inspector-dhcp-hostsdir\") on node \"master-0\" DevicePath \"\""
Mar 13 01:46:45.721540 master-0 kubenswrapper[19170]: I0313 01:46:45.721489 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/c4801b65-cb4e-4393-9a89-3a29c3051310-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "c4801b65-cb4e-4393-9a89-3a29c3051310" (UID: "c4801b65-cb4e-4393-9a89-3a29c3051310"). InnerVolumeSpecName "etc-podinfo". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Mar 13 01:46:45.721852 master-0 kubenswrapper[19170]: I0313 01:46:45.721567 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4801b65-cb4e-4393-9a89-3a29c3051310-kube-api-access-vd86r" (OuterVolumeSpecName: "kube-api-access-vd86r") pod "c4801b65-cb4e-4393-9a89-3a29c3051310" (UID: "c4801b65-cb4e-4393-9a89-3a29c3051310"). InnerVolumeSpecName "kube-api-access-vd86r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 01:46:45.722353 master-0 kubenswrapper[19170]: I0313 01:46:45.722298 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4801b65-cb4e-4393-9a89-3a29c3051310-var-lib-ironic" (OuterVolumeSpecName: "var-lib-ironic") pod "c4801b65-cb4e-4393-9a89-3a29c3051310" (UID: "c4801b65-cb4e-4393-9a89-3a29c3051310"). InnerVolumeSpecName "var-lib-ironic". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 01:46:45.722697 master-0 kubenswrapper[19170]: I0313 01:46:45.722661 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4801b65-cb4e-4393-9a89-3a29c3051310-config" (OuterVolumeSpecName: "config") pod "c4801b65-cb4e-4393-9a89-3a29c3051310" (UID: "c4801b65-cb4e-4393-9a89-3a29c3051310"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 01:46:45.725468 master-0 kubenswrapper[19170]: I0313 01:46:45.725413 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4801b65-cb4e-4393-9a89-3a29c3051310-scripts" (OuterVolumeSpecName: "scripts") pod "c4801b65-cb4e-4393-9a89-3a29c3051310" (UID: "c4801b65-cb4e-4393-9a89-3a29c3051310"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 01:46:45.777916 master-0 kubenswrapper[19170]: I0313 01:46:45.777868 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"3785df35-68b7-4d28-8b4a-39c3136ce823","Type":"ContainerStarted","Data":"d319028a0b7715619e67bfebe10050b21f6cad8979aacfb32242dc7f9cd75781"}
Mar 13 01:46:45.781347 master-0 kubenswrapper[19170]: I0313 01:46:45.781318 19170 generic.go:334] "Generic (PLEG): container finished" podID="c4801b65-cb4e-4393-9a89-3a29c3051310" containerID="661f5f9d721b829903cc19e4366ef0060e13bf9b0d6056b051813f30c4a72a98" exitCode=0
Mar 13 01:46:45.781441 master-0 kubenswrapper[19170]: I0313 01:46:45.781352 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"c4801b65-cb4e-4393-9a89-3a29c3051310","Type":"ContainerDied","Data":"661f5f9d721b829903cc19e4366ef0060e13bf9b0d6056b051813f30c4a72a98"}
Mar 13 01:46:45.781441 master-0 kubenswrapper[19170]: I0313 01:46:45.781373 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"c4801b65-cb4e-4393-9a89-3a29c3051310","Type":"ContainerDied","Data":"2a302803233523a41c50fbd6d880168b1ee5a451cb78b5ee8b49cf2b6c918c8f"}
Mar 13 01:46:45.781441 master-0 kubenswrapper[19170]: I0313 01:46:45.781373 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4801b65-cb4e-4393-9a89-3a29c3051310-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c4801b65-cb4e-4393-9a89-3a29c3051310" (UID: "c4801b65-cb4e-4393-9a89-3a29c3051310"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 01:46:45.781441 master-0 kubenswrapper[19170]: I0313 01:46:45.781392 19170 scope.go:117] "RemoveContainer" containerID="661f5f9d721b829903cc19e4366ef0060e13bf9b0d6056b051813f30c4a72a98"
Mar 13 01:46:45.781741 master-0 kubenswrapper[19170]: I0313 01:46:45.781721 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0"
Mar 13 01:46:45.825449 master-0 kubenswrapper[19170]: I0313 01:46:45.824652 19170 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4801b65-cb4e-4393-9a89-3a29c3051310-scripts\") on node \"master-0\" DevicePath \"\""
Mar 13 01:46:45.825449 master-0 kubenswrapper[19170]: I0313 01:46:45.824698 19170 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/c4801b65-cb4e-4393-9a89-3a29c3051310-var-lib-ironic\") on node \"master-0\" DevicePath \"\""
Mar 13 01:46:45.825449 master-0 kubenswrapper[19170]: I0313 01:46:45.824715 19170 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c4801b65-cb4e-4393-9a89-3a29c3051310-etc-podinfo\") on node \"master-0\" DevicePath \"\""
Mar 13 01:46:45.825449 master-0 kubenswrapper[19170]: I0313 01:46:45.824728 19170 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4801b65-cb4e-4393-9a89-3a29c3051310-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 13 01:46:45.825449 master-0 kubenswrapper[19170]: I0313 01:46:45.824741 19170 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c4801b65-cb4e-4393-9a89-3a29c3051310-config\") on node \"master-0\" DevicePath \"\""
Mar 13 01:46:45.825449 master-0 kubenswrapper[19170]: I0313 01:46:45.824759 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vd86r\" (UniqueName: \"kubernetes.io/projected/c4801b65-cb4e-4393-9a89-3a29c3051310-kube-api-access-vd86r\") on node \"master-0\" DevicePath \"\""
Mar 13 01:46:45.841255 master-0 kubenswrapper[19170]: I0313 01:46:45.841164 19170 scope.go:117] "RemoveContainer" containerID="2c8e7613ad4108967d5042918beb6ead4ad18d464f984c4396b4f3e5e290de5c"
Mar 13 01:46:45.882764 master-0 kubenswrapper[19170]: I0313 01:46:45.882724 19170 scope.go:117] "RemoveContainer" containerID="661f5f9d721b829903cc19e4366ef0060e13bf9b0d6056b051813f30c4a72a98"
Mar 13 01:46:45.885953 master-0 kubenswrapper[19170]: E0313 01:46:45.885918 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"661f5f9d721b829903cc19e4366ef0060e13bf9b0d6056b051813f30c4a72a98\": container with ID starting with 661f5f9d721b829903cc19e4366ef0060e13bf9b0d6056b051813f30c4a72a98 not found: ID does not exist" containerID="661f5f9d721b829903cc19e4366ef0060e13bf9b0d6056b051813f30c4a72a98"
Mar 13 01:46:45.886092 master-0 kubenswrapper[19170]: I0313 01:46:45.886043 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"661f5f9d721b829903cc19e4366ef0060e13bf9b0d6056b051813f30c4a72a98"} err="failed to get container status \"661f5f9d721b829903cc19e4366ef0060e13bf9b0d6056b051813f30c4a72a98\": rpc error: code = NotFound desc = could not find container \"661f5f9d721b829903cc19e4366ef0060e13bf9b0d6056b051813f30c4a72a98\": container with ID starting with 661f5f9d721b829903cc19e4366ef0060e13bf9b0d6056b051813f30c4a72a98 not found: ID does not exist"
Mar 13 01:46:45.886181 master-0 kubenswrapper[19170]: I0313 01:46:45.886169 19170 scope.go:117] "RemoveContainer" containerID="2c8e7613ad4108967d5042918beb6ead4ad18d464f984c4396b4f3e5e290de5c"
Mar 13 01:46:45.888686 master-0 kubenswrapper[19170]: E0313 01:46:45.887354 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c8e7613ad4108967d5042918beb6ead4ad18d464f984c4396b4f3e5e290de5c\": container with ID starting with 2c8e7613ad4108967d5042918beb6ead4ad18d464f984c4396b4f3e5e290de5c not found: ID does not exist" containerID="2c8e7613ad4108967d5042918beb6ead4ad18d464f984c4396b4f3e5e290de5c"
Mar 13 01:46:45.888686 master-0 kubenswrapper[19170]: I0313 01:46:45.887413 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c8e7613ad4108967d5042918beb6ead4ad18d464f984c4396b4f3e5e290de5c"} err="failed to get container status \"2c8e7613ad4108967d5042918beb6ead4ad18d464f984c4396b4f3e5e290de5c\": rpc error: code = NotFound desc = could not find container \"2c8e7613ad4108967d5042918beb6ead4ad18d464f984c4396b4f3e5e290de5c\": container with ID starting with 2c8e7613ad4108967d5042918beb6ead4ad18d464f984c4396b4f3e5e290de5c not found: ID does not exist"
Mar 13 01:46:45.923651 master-0 kubenswrapper[19170]: I0313 01:46:45.918864 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-0"]
Mar 13 01:46:45.939977 master-0 kubenswrapper[19170]: I0313 01:46:45.935065 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-inspector-0"]
Mar 13 01:46:45.944002 master-0 kubenswrapper[19170]: I0313 01:46:45.942233 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-0"]
Mar 13 01:46:45.944002 master-0 kubenswrapper[19170]: E0313 01:46:45.943565 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a58ff59-00a5-4bda-9e4b-205d2628eb32" containerName="mariadb-account-create-update"
Mar 13 01:46:45.944002 master-0 kubenswrapper[19170]: I0313 01:46:45.943583 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a58ff59-00a5-4bda-9e4b-205d2628eb32" containerName="mariadb-account-create-update"
Mar 13 01:46:45.944002 master-0 kubenswrapper[19170]: E0313 01:46:45.943597 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9a8b821-9639-4744-8744-a38e694de2de" containerName="mariadb-database-create"
Mar 13 01:46:45.944002 master-0 kubenswrapper[19170]: I0313 01:46:45.943604 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9a8b821-9639-4744-8744-a38e694de2de" containerName="mariadb-database-create"
Mar 13 01:46:45.944002 master-0 kubenswrapper[19170]: E0313 01:46:45.943645 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4801b65-cb4e-4393-9a89-3a29c3051310" containerName="ironic-python-agent-init"
Mar 13 01:46:45.944002 master-0 kubenswrapper[19170]: I0313 01:46:45.943654 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4801b65-cb4e-4393-9a89-3a29c3051310" containerName="ironic-python-agent-init"
Mar 13 01:46:45.944002 master-0 kubenswrapper[19170]: E0313 01:46:45.943685 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="068e68e4-8b54-41da-b797-9eee422826b2" containerName="mariadb-account-create-update"
Mar 13 01:46:45.944002 master-0 kubenswrapper[19170]: I0313 01:46:45.943691 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="068e68e4-8b54-41da-b797-9eee422826b2" containerName="mariadb-account-create-update"
Mar 13 01:46:45.944002 master-0 kubenswrapper[19170]: E0313 01:46:45.943705 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb9dbc0-c8c7-428a-905d-5023a309a3ac" containerName="mariadb-account-create-update"
Mar 13 01:46:45.944002 master-0 kubenswrapper[19170]: I0313 01:46:45.943713 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb9dbc0-c8c7-428a-905d-5023a309a3ac" containerName="mariadb-account-create-update"
Mar 13 01:46:45.944002 master-0 kubenswrapper[19170]: E0313 01:46:45.943728 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4801b65-cb4e-4393-9a89-3a29c3051310" containerName="inspector-pxe-init"
Mar 13 01:46:45.944002 master-0 kubenswrapper[19170]: I0313 01:46:45.943735 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4801b65-cb4e-4393-9a89-3a29c3051310" containerName="inspector-pxe-init"
Mar 13 01:46:45.944002 master-0 kubenswrapper[19170]: E0313 01:46:45.943749 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55081cb7-5148-4dd9-8d2c-14cb7a5b7a57" containerName="mariadb-database-create"
Mar 13 01:46:45.944002 master-0 kubenswrapper[19170]: I0313 01:46:45.943757 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="55081cb7-5148-4dd9-8d2c-14cb7a5b7a57" containerName="mariadb-database-create"
Mar 13 01:46:45.944002 master-0 kubenswrapper[19170]: E0313 01:46:45.943770 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4131c57a-5598-4c4f-b4d1-ee12368c9b85" containerName="mariadb-database-create"
Mar 13 01:46:45.944002 master-0 kubenswrapper[19170]: I0313 01:46:45.943776 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="4131c57a-5598-4c4f-b4d1-ee12368c9b85" containerName="mariadb-database-create"
Mar 13 01:46:45.944002 master-0 kubenswrapper[19170]: I0313 01:46:45.944021 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a58ff59-00a5-4bda-9e4b-205d2628eb32" containerName="mariadb-account-create-update"
Mar 13 01:46:45.944660 master-0 kubenswrapper[19170]: I0313 01:46:45.944038 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9a8b821-9639-4744-8744-a38e694de2de" containerName="mariadb-database-create"
Mar 13 01:46:45.944660 master-0 kubenswrapper[19170]: I0313 01:46:45.944057 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4801b65-cb4e-4393-9a89-3a29c3051310" containerName="inspector-pxe-init"
Mar 13 01:46:45.944660 master-0 kubenswrapper[19170]: I0313 01:46:45.944078 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="068e68e4-8b54-41da-b797-9eee422826b2" containerName="mariadb-account-create-update"
Mar 13 01:46:45.944660 master-0 kubenswrapper[19170]: I0313 01:46:45.944089 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="55081cb7-5148-4dd9-8d2c-14cb7a5b7a57" containerName="mariadb-database-create"
Mar 13 01:46:45.944660 master-0 kubenswrapper[19170]: I0313 01:46:45.944101 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="4131c57a-5598-4c4f-b4d1-ee12368c9b85" containerName="mariadb-database-create"
Mar 13 01:46:45.944660 master-0 kubenswrapper[19170]: I0313 01:46:45.944114 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="efb9dbc0-c8c7-428a-905d-5023a309a3ac" containerName="mariadb-account-create-update"
Mar 13 01:46:45.951647 master-0 kubenswrapper[19170]: I0313 01:46:45.948139 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0"
Mar 13 01:46:45.958650 master-0 kubenswrapper[19170]: I0313 01:46:45.953872 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-transport-url-ironic-inspector-transport"
Mar 13 01:46:45.958650 master-0 kubenswrapper[19170]: I0313 01:46:45.954218 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-scripts"
Mar 13 01:46:45.958650 master-0 kubenswrapper[19170]: I0313 01:46:45.954345 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-inspector-internal-svc"
Mar 13 01:46:45.958650 master-0 kubenswrapper[19170]: I0313 01:46:45.954444 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-inspector-public-svc"
Mar 13 01:46:45.958650 master-0 kubenswrapper[19170]: I0313 01:46:45.954556 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-config-data"
Mar 13 01:46:45.988680 master-0 kubenswrapper[19170]: I0313 01:46:45.974873 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"]
Mar 13 01:46:46.029823 master-0 kubenswrapper[19170]: I0313 01:46:46.029372 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd7ac114-df97-491b-bd70-4e731d8944ef-public-tls-certs\") pod \"ironic-inspector-0\" (UID: \"cd7ac114-df97-491b-bd70-4e731d8944ef\") " pod="openstack/ironic-inspector-0"
Mar 13 01:46:46.029823 master-0 kubenswrapper[19170]: I0313 01:46:46.029438 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cd7ac114-df97-491b-bd70-4e731d8944ef-config\") pod \"ironic-inspector-0\" (UID: \"cd7ac114-df97-491b-bd70-4e731d8944ef\") " pod="openstack/ironic-inspector-0"
Mar 13 01:46:46.029823 master-0 kubenswrapper[19170]: I0313 01:46:46.029457 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd7ac114-df97-491b-bd70-4e731d8944ef-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"cd7ac114-df97-491b-bd70-4e731d8944ef\") " pod="openstack/ironic-inspector-0"
Mar 13 01:46:46.029823 master-0 kubenswrapper[19170]: I0313 01:46:46.029476 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/cd7ac114-df97-491b-bd70-4e731d8944ef-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"cd7ac114-df97-491b-bd70-4e731d8944ef\") " pod="openstack/ironic-inspector-0"
Mar 13 01:46:46.029823 master-0 kubenswrapper[19170]: I0313 01:46:46.029494 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rt86\" (UniqueName: \"kubernetes.io/projected/cd7ac114-df97-491b-bd70-4e731d8944ef-kube-api-access-6rt86\") pod \"ironic-inspector-0\" (UID: \"cd7ac114-df97-491b-bd70-4e731d8944ef\") " pod="openstack/ironic-inspector-0"
Mar 13 01:46:46.029823 master-0 kubenswrapper[19170]: I0313 01:46:46.029526 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd7ac114-df97-491b-bd70-4e731d8944ef-internal-tls-certs\") pod \"ironic-inspector-0\" (UID: \"cd7ac114-df97-491b-bd70-4e731d8944ef\") " pod="openstack/ironic-inspector-0"
Mar 13 01:46:46.029823 master-0 kubenswrapper[19170]: I0313 01:46:46.029545 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd7ac114-df97-491b-bd70-4e731d8944ef-scripts\") pod \"ironic-inspector-0\" (UID: \"cd7ac114-df97-491b-bd70-4e731d8944ef\") " pod="openstack/ironic-inspector-0"
Mar 13 01:46:46.029823 master-0 kubenswrapper[19170]: I0313 01:46:46.029675 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/cd7ac114-df97-491b-bd70-4e731d8944ef-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"cd7ac114-df97-491b-bd70-4e731d8944ef\") " pod="openstack/ironic-inspector-0"
Mar 13 01:46:46.029823 master-0 kubenswrapper[19170]: I0313 01:46:46.029706 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/cd7ac114-df97-491b-bd70-4e731d8944ef-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"cd7ac114-df97-491b-bd70-4e731d8944ef\") " pod="openstack/ironic-inspector-0"
Mar 13 01:46:46.133332 master-0 kubenswrapper[19170]: I0313 01:46:46.133235 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd7ac114-df97-491b-bd70-4e731d8944ef-public-tls-certs\") pod \"ironic-inspector-0\" (UID: \"cd7ac114-df97-491b-bd70-4e731d8944ef\") " pod="openstack/ironic-inspector-0"
Mar 13 01:46:46.133566 master-0 kubenswrapper[19170]: I0313 01:46:46.133344 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cd7ac114-df97-491b-bd70-4e731d8944ef-config\") pod \"ironic-inspector-0\" (UID: \"cd7ac114-df97-491b-bd70-4e731d8944ef\") " pod="openstack/ironic-inspector-0"
Mar 13 01:46:46.134511 master-0 kubenswrapper[19170]: I0313 01:46:46.134468 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd7ac114-df97-491b-bd70-4e731d8944ef-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"cd7ac114-df97-491b-bd70-4e731d8944ef\") " pod="openstack/ironic-inspector-0"
Mar 13 01:46:46.134694 master-0 kubenswrapper[19170]: I0313 01:46:46.134661 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/cd7ac114-df97-491b-bd70-4e731d8944ef-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"cd7ac114-df97-491b-bd70-4e731d8944ef\") " pod="openstack/ironic-inspector-0"
Mar 13 01:46:46.134775 master-0 kubenswrapper[19170]: I0313 01:46:46.134717 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rt86\" (UniqueName: \"kubernetes.io/projected/cd7ac114-df97-491b-bd70-4e731d8944ef-kube-api-access-6rt86\") pod \"ironic-inspector-0\" (UID: \"cd7ac114-df97-491b-bd70-4e731d8944ef\") " pod="openstack/ironic-inspector-0"
Mar 13 01:46:46.134826 master-0 kubenswrapper[19170]: I0313 01:46:46.134781 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd7ac114-df97-491b-bd70-4e731d8944ef-internal-tls-certs\") pod \"ironic-inspector-0\" (UID: \"cd7ac114-df97-491b-bd70-4e731d8944ef\") " pod="openstack/ironic-inspector-0"
Mar 13 01:46:46.134907 master-0 kubenswrapper[19170]: I0313 01:46:46.134877 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd7ac114-df97-491b-bd70-4e731d8944ef-scripts\") pod \"ironic-inspector-0\" (UID: \"cd7ac114-df97-491b-bd70-4e731d8944ef\") " pod="openstack/ironic-inspector-0"
Mar 13 01:46:46.135198 master-0 kubenswrapper[19170]: I0313 01:46:46.135154 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/cd7ac114-df97-491b-bd70-4e731d8944ef-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"cd7ac114-df97-491b-bd70-4e731d8944ef\") " pod="openstack/ironic-inspector-0"
Mar 13 01:46:46.135267 master-0 kubenswrapper[19170]: I0313 01:46:46.135218 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/cd7ac114-df97-491b-bd70-4e731d8944ef-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"cd7ac114-df97-491b-bd70-4e731d8944ef\") " pod="openstack/ironic-inspector-0"
Mar 13 01:46:46.136598 master-0 kubenswrapper[19170]: I0313 01:46:46.136539 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/cd7ac114-df97-491b-bd70-4e731d8944ef-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"cd7ac114-df97-491b-bd70-4e731d8944ef\") " pod="openstack/ironic-inspector-0"
Mar 13 01:46:46.136867 master-0 kubenswrapper[19170]: I0313 01:46:46.136838 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/cd7ac114-df97-491b-bd70-4e731d8944ef-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"cd7ac114-df97-491b-bd70-4e731d8944ef\") " pod="openstack/ironic-inspector-0"
Mar 13 01:46:46.137307 master-0 kubenswrapper[19170]: I0313 01:46:46.137277 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd7ac114-df97-491b-bd70-4e731d8944ef-public-tls-certs\") pod \"ironic-inspector-0\" (UID: \"cd7ac114-df97-491b-bd70-4e731d8944ef\") " pod="openstack/ironic-inspector-0"
Mar 13 01:46:46.137479 master-0 kubenswrapper[19170]: I0313 01:46:46.137436 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd7ac114-df97-491b-bd70-4e731d8944ef-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"cd7ac114-df97-491b-bd70-4e731d8944ef\") " pod="openstack/ironic-inspector-0"
Mar 13 01:46:46.140913 master-0 kubenswrapper[19170]: I0313 01:46:46.140850 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/cd7ac114-df97-491b-bd70-4e731d8944ef-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"cd7ac114-df97-491b-bd70-4e731d8944ef\") " pod="openstack/ironic-inspector-0"
Mar 13 01:46:46.141895 master-0 kubenswrapper[19170]: I0313 01:46:46.141795 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cd7ac114-df97-491b-bd70-4e731d8944ef-config\") pod \"ironic-inspector-0\" (UID: \"cd7ac114-df97-491b-bd70-4e731d8944ef\") " pod="openstack/ironic-inspector-0"
Mar 13 01:46:46.144238 master-0 kubenswrapper[19170]: I0313 01:46:46.143885 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd7ac114-df97-491b-bd70-4e731d8944ef-scripts\") pod \"ironic-inspector-0\" (UID: \"cd7ac114-df97-491b-bd70-4e731d8944ef\") " pod="openstack/ironic-inspector-0"
Mar 13 01:46:46.145311 master-0 kubenswrapper[19170]: I0313 01:46:46.145208 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd7ac114-df97-491b-bd70-4e731d8944ef-internal-tls-certs\") pod
\"ironic-inspector-0\" (UID: \"cd7ac114-df97-491b-bd70-4e731d8944ef\") " pod="openstack/ironic-inspector-0" Mar 13 01:46:46.182710 master-0 kubenswrapper[19170]: I0313 01:46:46.180591 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rt86\" (UniqueName: \"kubernetes.io/projected/cd7ac114-df97-491b-bd70-4e731d8944ef-kube-api-access-6rt86\") pod \"ironic-inspector-0\" (UID: \"cd7ac114-df97-491b-bd70-4e731d8944ef\") " pod="openstack/ironic-inspector-0" Mar 13 01:46:46.276348 master-0 kubenswrapper[19170]: I0313 01:46:46.276284 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0" Mar 13 01:46:46.337623 master-0 kubenswrapper[19170]: I0313 01:46:46.332802 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5cb659fff7-csvmp" Mar 13 01:46:46.470371 master-0 kubenswrapper[19170]: I0313 01:46:46.468361 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pmp2r"] Mar 13 01:46:46.472522 master-0 kubenswrapper[19170]: I0313 01:46:46.472485 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-pmp2r" Mar 13 01:46:46.477076 master-0 kubenswrapper[19170]: I0313 01:46:46.476493 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 13 01:46:46.477076 master-0 kubenswrapper[19170]: I0313 01:46:46.476950 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 13 01:46:46.498658 master-0 kubenswrapper[19170]: I0313 01:46:46.498591 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pmp2r"] Mar 13 01:46:46.509615 master-0 kubenswrapper[19170]: I0313 01:46:46.501797 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9c57cd77c-64hkx"] Mar 13 01:46:46.509615 master-0 kubenswrapper[19170]: I0313 01:46:46.502002 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9c57cd77c-64hkx" podUID="97b91c16-ada4-4c3a-8e56-bd4e6417ded3" containerName="dnsmasq-dns" containerID="cri-o://6a0c643881c9cd9e4fca984b71071c1158db3c62332f3e5c8f45ac5978ca7e5d" gracePeriod=10 Mar 13 01:46:46.580686 master-0 kubenswrapper[19170]: I0313 01:46:46.567962 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/586e6b82-eae1-44a6-a56a-c3be9f7856cb-scripts\") pod \"nova-cell0-conductor-db-sync-pmp2r\" (UID: \"586e6b82-eae1-44a6-a56a-c3be9f7856cb\") " pod="openstack/nova-cell0-conductor-db-sync-pmp2r" Mar 13 01:46:46.580686 master-0 kubenswrapper[19170]: I0313 01:46:46.568027 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tws4c\" (UniqueName: \"kubernetes.io/projected/586e6b82-eae1-44a6-a56a-c3be9f7856cb-kube-api-access-tws4c\") pod \"nova-cell0-conductor-db-sync-pmp2r\" (UID: \"586e6b82-eae1-44a6-a56a-c3be9f7856cb\") 
" pod="openstack/nova-cell0-conductor-db-sync-pmp2r" Mar 13 01:46:46.580686 master-0 kubenswrapper[19170]: I0313 01:46:46.568144 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/586e6b82-eae1-44a6-a56a-c3be9f7856cb-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-pmp2r\" (UID: \"586e6b82-eae1-44a6-a56a-c3be9f7856cb\") " pod="openstack/nova-cell0-conductor-db-sync-pmp2r" Mar 13 01:46:46.580686 master-0 kubenswrapper[19170]: I0313 01:46:46.568501 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/586e6b82-eae1-44a6-a56a-c3be9f7856cb-config-data\") pod \"nova-cell0-conductor-db-sync-pmp2r\" (UID: \"586e6b82-eae1-44a6-a56a-c3be9f7856cb\") " pod="openstack/nova-cell0-conductor-db-sync-pmp2r" Mar 13 01:46:46.672933 master-0 kubenswrapper[19170]: I0313 01:46:46.672364 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/586e6b82-eae1-44a6-a56a-c3be9f7856cb-config-data\") pod \"nova-cell0-conductor-db-sync-pmp2r\" (UID: \"586e6b82-eae1-44a6-a56a-c3be9f7856cb\") " pod="openstack/nova-cell0-conductor-db-sync-pmp2r" Mar 13 01:46:46.672933 master-0 kubenswrapper[19170]: I0313 01:46:46.672575 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/586e6b82-eae1-44a6-a56a-c3be9f7856cb-scripts\") pod \"nova-cell0-conductor-db-sync-pmp2r\" (UID: \"586e6b82-eae1-44a6-a56a-c3be9f7856cb\") " pod="openstack/nova-cell0-conductor-db-sync-pmp2r" Mar 13 01:46:46.672933 master-0 kubenswrapper[19170]: I0313 01:46:46.672605 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tws4c\" (UniqueName: 
\"kubernetes.io/projected/586e6b82-eae1-44a6-a56a-c3be9f7856cb-kube-api-access-tws4c\") pod \"nova-cell0-conductor-db-sync-pmp2r\" (UID: \"586e6b82-eae1-44a6-a56a-c3be9f7856cb\") " pod="openstack/nova-cell0-conductor-db-sync-pmp2r" Mar 13 01:46:46.672933 master-0 kubenswrapper[19170]: I0313 01:46:46.672690 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/586e6b82-eae1-44a6-a56a-c3be9f7856cb-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-pmp2r\" (UID: \"586e6b82-eae1-44a6-a56a-c3be9f7856cb\") " pod="openstack/nova-cell0-conductor-db-sync-pmp2r" Mar 13 01:46:46.678052 master-0 kubenswrapper[19170]: I0313 01:46:46.677960 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/586e6b82-eae1-44a6-a56a-c3be9f7856cb-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-pmp2r\" (UID: \"586e6b82-eae1-44a6-a56a-c3be9f7856cb\") " pod="openstack/nova-cell0-conductor-db-sync-pmp2r" Mar 13 01:46:46.679167 master-0 kubenswrapper[19170]: I0313 01:46:46.679142 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/586e6b82-eae1-44a6-a56a-c3be9f7856cb-config-data\") pod \"nova-cell0-conductor-db-sync-pmp2r\" (UID: \"586e6b82-eae1-44a6-a56a-c3be9f7856cb\") " pod="openstack/nova-cell0-conductor-db-sync-pmp2r" Mar 13 01:46:46.680848 master-0 kubenswrapper[19170]: I0313 01:46:46.679870 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/586e6b82-eae1-44a6-a56a-c3be9f7856cb-scripts\") pod \"nova-cell0-conductor-db-sync-pmp2r\" (UID: \"586e6b82-eae1-44a6-a56a-c3be9f7856cb\") " pod="openstack/nova-cell0-conductor-db-sync-pmp2r" Mar 13 01:46:46.730040 master-0 kubenswrapper[19170]: I0313 01:46:46.729684 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-tws4c\" (UniqueName: \"kubernetes.io/projected/586e6b82-eae1-44a6-a56a-c3be9f7856cb-kube-api-access-tws4c\") pod \"nova-cell0-conductor-db-sync-pmp2r\" (UID: \"586e6b82-eae1-44a6-a56a-c3be9f7856cb\") " pod="openstack/nova-cell0-conductor-db-sync-pmp2r" Mar 13 01:46:46.799462 master-0 kubenswrapper[19170]: I0313 01:46:46.799418 19170 generic.go:334] "Generic (PLEG): container finished" podID="97b91c16-ada4-4c3a-8e56-bd4e6417ded3" containerID="6a0c643881c9cd9e4fca984b71071c1158db3c62332f3e5c8f45ac5978ca7e5d" exitCode=0 Mar 13 01:46:46.800088 master-0 kubenswrapper[19170]: I0313 01:46:46.799484 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9c57cd77c-64hkx" event={"ID":"97b91c16-ada4-4c3a-8e56-bd4e6417ded3","Type":"ContainerDied","Data":"6a0c643881c9cd9e4fca984b71071c1158db3c62332f3e5c8f45ac5978ca7e5d"} Mar 13 01:46:46.904475 master-0 kubenswrapper[19170]: I0313 01:46:46.904305 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-pmp2r" Mar 13 01:46:47.097833 master-0 kubenswrapper[19170]: I0313 01:46:47.097782 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"] Mar 13 01:46:47.298116 master-0 kubenswrapper[19170]: I0313 01:46:47.298063 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9c57cd77c-64hkx" Mar 13 01:46:47.414245 master-0 kubenswrapper[19170]: I0313 01:46:47.414194 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97b91c16-ada4-4c3a-8e56-bd4e6417ded3-ovsdbserver-nb\") pod \"97b91c16-ada4-4c3a-8e56-bd4e6417ded3\" (UID: \"97b91c16-ada4-4c3a-8e56-bd4e6417ded3\") " Mar 13 01:46:47.414432 master-0 kubenswrapper[19170]: I0313 01:46:47.414350 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97b91c16-ada4-4c3a-8e56-bd4e6417ded3-config\") pod \"97b91c16-ada4-4c3a-8e56-bd4e6417ded3\" (UID: \"97b91c16-ada4-4c3a-8e56-bd4e6417ded3\") " Mar 13 01:46:47.414471 master-0 kubenswrapper[19170]: I0313 01:46:47.414437 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shwcn\" (UniqueName: \"kubernetes.io/projected/97b91c16-ada4-4c3a-8e56-bd4e6417ded3-kube-api-access-shwcn\") pod \"97b91c16-ada4-4c3a-8e56-bd4e6417ded3\" (UID: \"97b91c16-ada4-4c3a-8e56-bd4e6417ded3\") " Mar 13 01:46:47.414509 master-0 kubenswrapper[19170]: I0313 01:46:47.414472 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/97b91c16-ada4-4c3a-8e56-bd4e6417ded3-dns-swift-storage-0\") pod \"97b91c16-ada4-4c3a-8e56-bd4e6417ded3\" (UID: \"97b91c16-ada4-4c3a-8e56-bd4e6417ded3\") " Mar 13 01:46:47.414509 master-0 kubenswrapper[19170]: I0313 01:46:47.414505 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/97b91c16-ada4-4c3a-8e56-bd4e6417ded3-ovsdbserver-sb\") pod \"97b91c16-ada4-4c3a-8e56-bd4e6417ded3\" (UID: \"97b91c16-ada4-4c3a-8e56-bd4e6417ded3\") " Mar 13 01:46:47.414580 master-0 kubenswrapper[19170]: I0313 01:46:47.414543 19170 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97b91c16-ada4-4c3a-8e56-bd4e6417ded3-dns-svc\") pod \"97b91c16-ada4-4c3a-8e56-bd4e6417ded3\" (UID: \"97b91c16-ada4-4c3a-8e56-bd4e6417ded3\") " Mar 13 01:46:47.433872 master-0 kubenswrapper[19170]: I0313 01:46:47.428839 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97b91c16-ada4-4c3a-8e56-bd4e6417ded3-kube-api-access-shwcn" (OuterVolumeSpecName: "kube-api-access-shwcn") pod "97b91c16-ada4-4c3a-8e56-bd4e6417ded3" (UID: "97b91c16-ada4-4c3a-8e56-bd4e6417ded3"). InnerVolumeSpecName "kube-api-access-shwcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:46:47.451779 master-0 kubenswrapper[19170]: I0313 01:46:47.451661 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4801b65-cb4e-4393-9a89-3a29c3051310" path="/var/lib/kubelet/pods/c4801b65-cb4e-4393-9a89-3a29c3051310/volumes" Mar 13 01:46:47.472104 master-0 kubenswrapper[19170]: W0313 01:46:47.471805 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod586e6b82_eae1_44a6_a56a_c3be9f7856cb.slice/crio-2a2d24ad6e49efa5d983c1e61d9198c11bafd7cc63a3e2a754f5c45642a7fb0c WatchSource:0}: Error finding container 2a2d24ad6e49efa5d983c1e61d9198c11bafd7cc63a3e2a754f5c45642a7fb0c: Status 404 returned error can't find the container with id 2a2d24ad6e49efa5d983c1e61d9198c11bafd7cc63a3e2a754f5c45642a7fb0c Mar 13 01:46:47.490652 master-0 kubenswrapper[19170]: I0313 01:46:47.487613 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pmp2r"] Mar 13 01:46:47.519342 master-0 kubenswrapper[19170]: I0313 01:46:47.519231 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shwcn\" (UniqueName: 
\"kubernetes.io/projected/97b91c16-ada4-4c3a-8e56-bd4e6417ded3-kube-api-access-shwcn\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:47.519786 master-0 kubenswrapper[19170]: I0313 01:46:47.519765 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97b91c16-ada4-4c3a-8e56-bd4e6417ded3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "97b91c16-ada4-4c3a-8e56-bd4e6417ded3" (UID: "97b91c16-ada4-4c3a-8e56-bd4e6417ded3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:46:47.544278 master-0 kubenswrapper[19170]: I0313 01:46:47.543966 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:46:47.544278 master-0 kubenswrapper[19170]: I0313 01:46:47.544022 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:46:47.578866 master-0 kubenswrapper[19170]: I0313 01:46:47.578768 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97b91c16-ada4-4c3a-8e56-bd4e6417ded3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "97b91c16-ada4-4c3a-8e56-bd4e6417ded3" (UID: "97b91c16-ada4-4c3a-8e56-bd4e6417ded3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:46:47.597372 master-0 kubenswrapper[19170]: I0313 01:46:47.597325 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97b91c16-ada4-4c3a-8e56-bd4e6417ded3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "97b91c16-ada4-4c3a-8e56-bd4e6417ded3" (UID: "97b91c16-ada4-4c3a-8e56-bd4e6417ded3"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:46:47.603979 master-0 kubenswrapper[19170]: I0313 01:46:47.603945 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:46:47.607120 master-0 kubenswrapper[19170]: I0313 01:46:47.607082 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97b91c16-ada4-4c3a-8e56-bd4e6417ded3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "97b91c16-ada4-4c3a-8e56-bd4e6417ded3" (UID: "97b91c16-ada4-4c3a-8e56-bd4e6417ded3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:46:47.608725 master-0 kubenswrapper[19170]: I0313 01:46:47.608627 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97b91c16-ada4-4c3a-8e56-bd4e6417ded3-config" (OuterVolumeSpecName: "config") pod "97b91c16-ada4-4c3a-8e56-bd4e6417ded3" (UID: "97b91c16-ada4-4c3a-8e56-bd4e6417ded3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:46:47.622238 master-0 kubenswrapper[19170]: I0313 01:46:47.622173 19170 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97b91c16-ada4-4c3a-8e56-bd4e6417ded3-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:47.622238 master-0 kubenswrapper[19170]: I0313 01:46:47.622211 19170 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97b91c16-ada4-4c3a-8e56-bd4e6417ded3-config\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:47.622238 master-0 kubenswrapper[19170]: I0313 01:46:47.622222 19170 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/97b91c16-ada4-4c3a-8e56-bd4e6417ded3-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:47.622362 master-0 kubenswrapper[19170]: I0313 01:46:47.622243 19170 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/97b91c16-ada4-4c3a-8e56-bd4e6417ded3-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:47.622362 master-0 kubenswrapper[19170]: I0313 01:46:47.622253 19170 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97b91c16-ada4-4c3a-8e56-bd4e6417ded3-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 13 01:46:47.626127 master-0 kubenswrapper[19170]: I0313 01:46:47.626076 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:46:47.820918 master-0 kubenswrapper[19170]: I0313 01:46:47.820813 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9c57cd77c-64hkx" event={"ID":"97b91c16-ada4-4c3a-8e56-bd4e6417ded3","Type":"ContainerDied","Data":"ec84eb6576222a3979a88aeb20e70b776c6615129b337bcac33878d99776c0b4"} 
Mar 13 01:46:47.820918 master-0 kubenswrapper[19170]: I0313 01:46:47.820876 19170 scope.go:117] "RemoveContainer" containerID="6a0c643881c9cd9e4fca984b71071c1158db3c62332f3e5c8f45ac5978ca7e5d" Mar 13 01:46:47.821431 master-0 kubenswrapper[19170]: I0313 01:46:47.820989 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9c57cd77c-64hkx" Mar 13 01:46:47.840134 master-0 kubenswrapper[19170]: I0313 01:46:47.840073 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-pmp2r" event={"ID":"586e6b82-eae1-44a6-a56a-c3be9f7856cb","Type":"ContainerStarted","Data":"2a2d24ad6e49efa5d983c1e61d9198c11bafd7cc63a3e2a754f5c45642a7fb0c"} Mar 13 01:46:47.843210 master-0 kubenswrapper[19170]: I0313 01:46:47.842684 19170 generic.go:334] "Generic (PLEG): container finished" podID="cd7ac114-df97-491b-bd70-4e731d8944ef" containerID="640715dcd3d7968ca8324bdacb591118de38c6c05546e98c0df1d233865281cc" exitCode=0 Mar 13 01:46:47.843210 master-0 kubenswrapper[19170]: I0313 01:46:47.842828 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"cd7ac114-df97-491b-bd70-4e731d8944ef","Type":"ContainerDied","Data":"640715dcd3d7968ca8324bdacb591118de38c6c05546e98c0df1d233865281cc"} Mar 13 01:46:47.843210 master-0 kubenswrapper[19170]: I0313 01:46:47.842933 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"cd7ac114-df97-491b-bd70-4e731d8944ef","Type":"ContainerStarted","Data":"43d39d62fb0264120d0a106d488ddc8bfe94ede59a45f0b61953e91e39e74d98"} Mar 13 01:46:47.844017 master-0 kubenswrapper[19170]: I0313 01:46:47.843505 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:46:47.844017 master-0 kubenswrapper[19170]: I0313 01:46:47.843532 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:46:47.869754 master-0 kubenswrapper[19170]: I0313 01:46:47.869721 19170 scope.go:117] "RemoveContainer" containerID="35c745adaebf698d13d18c92ad1db570ace63e623fda99191e11659ac373fb96" Mar 13 01:46:47.935139 master-0 kubenswrapper[19170]: I0313 01:46:47.933122 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9c57cd77c-64hkx"] Mar 13 01:46:47.942658 master-0 kubenswrapper[19170]: I0313 01:46:47.941859 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9c57cd77c-64hkx"] Mar 13 01:46:48.859586 master-0 kubenswrapper[19170]: I0313 01:46:48.859513 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"cd7ac114-df97-491b-bd70-4e731d8944ef","Type":"ContainerStarted","Data":"ebe985054d6f5998683600e1102d97760a776fc8f504fb0834877fb6566833ed"} Mar 13 01:46:49.440670 master-0 kubenswrapper[19170]: I0313 01:46:49.440597 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97b91c16-ada4-4c3a-8e56-bd4e6417ded3" path="/var/lib/kubelet/pods/97b91c16-ada4-4c3a-8e56-bd4e6417ded3/volumes" Mar 13 01:46:49.486449 master-0 kubenswrapper[19170]: I0313 01:46:49.481844 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:46:49.486449 master-0 kubenswrapper[19170]: I0313 01:46:49.481889 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:46:49.526104 master-0 kubenswrapper[19170]: I0313 01:46:49.525159 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:46:49.527346 master-0 kubenswrapper[19170]: I0313 01:46:49.527303 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-b9844-default-internal-api-0" 
Mar 13 01:46:49.880547 master-0 kubenswrapper[19170]: I0313 01:46:49.879734 19170 generic.go:334] "Generic (PLEG): container finished" podID="cd7ac114-df97-491b-bd70-4e731d8944ef" containerID="ebe985054d6f5998683600e1102d97760a776fc8f504fb0834877fb6566833ed" exitCode=0 Mar 13 01:46:49.880547 master-0 kubenswrapper[19170]: I0313 01:46:49.880181 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"cd7ac114-df97-491b-bd70-4e731d8944ef","Type":"ContainerDied","Data":"ebe985054d6f5998683600e1102d97760a776fc8f504fb0834877fb6566833ed"} Mar 13 01:46:49.881248 master-0 kubenswrapper[19170]: I0313 01:46:49.880572 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:46:49.881248 master-0 kubenswrapper[19170]: I0313 01:46:49.880913 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:46:50.908168 master-0 kubenswrapper[19170]: I0313 01:46:50.906153 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"cd7ac114-df97-491b-bd70-4e731d8944ef","Type":"ContainerStarted","Data":"782849cecd7fe379b525bbd32b3c3225ca68a2dcb83b5b0653d4564797715d6f"} Mar 13 01:46:51.623663 master-0 kubenswrapper[19170]: I0313 01:46:51.619683 19170 trace.go:236] Trace[1810996037]: "Calculate volume metrics of cache for pod openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2wh5w" (13-Mar-2026 01:46:50.576) (total time: 1043ms): Mar 13 01:46:51.623663 master-0 kubenswrapper[19170]: Trace[1810996037]: [1.043193794s] [1.043193794s] END Mar 13 01:46:51.929613 master-0 kubenswrapper[19170]: I0313 01:46:51.929558 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" 
event={"ID":"cd7ac114-df97-491b-bd70-4e731d8944ef","Type":"ContainerStarted","Data":"e5c714f25a798904418ad8b02dfe396157a70868ace5d72c91e79bba7440cce7"} Mar 13 01:46:52.072599 master-0 kubenswrapper[19170]: I0313 01:46:52.072299 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:46:52.072810 master-0 kubenswrapper[19170]: I0313 01:46:52.072610 19170 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 01:46:52.074959 master-0 kubenswrapper[19170]: I0313 01:46:52.074894 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-b9844-default-external-api-0" Mar 13 01:46:52.105270 master-0 kubenswrapper[19170]: I0313 01:46:52.105082 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:46:52.105270 master-0 kubenswrapper[19170]: I0313 01:46:52.105228 19170 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 01:46:52.107788 master-0 kubenswrapper[19170]: I0313 01:46:52.107754 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-b9844-default-internal-api-0" Mar 13 01:46:58.036155 master-0 kubenswrapper[19170]: I0313 01:46:58.036008 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-pmp2r" event={"ID":"586e6b82-eae1-44a6-a56a-c3be9f7856cb","Type":"ContainerStarted","Data":"b5cd857076505fffd99040f0c352c10e03350c949f4c76b4f8b776e06ecb5856"} Mar 13 01:46:58.062661 master-0 kubenswrapper[19170]: I0313 01:46:58.053723 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"cd7ac114-df97-491b-bd70-4e731d8944ef","Type":"ContainerStarted","Data":"554c1aad46e2ba311b427368d02e9370992ec0b4064aff374a18ea800647b50e"} Mar 13 01:46:58.077655 master-0 kubenswrapper[19170]: I0313 
01:46:58.077063 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-pmp2r" podStartSLOduration=2.230128613 podStartE2EDuration="12.077043449s" podCreationTimestamp="2026-03-13 01:46:46 +0000 UTC" firstStartedPulling="2026-03-13 01:46:47.480499941 +0000 UTC m=+1668.288620891" lastFinishedPulling="2026-03-13 01:46:57.327414777 +0000 UTC m=+1678.135535727" observedRunningTime="2026-03-13 01:46:58.067461662 +0000 UTC m=+1678.875582622" watchObservedRunningTime="2026-03-13 01:46:58.077043449 +0000 UTC m=+1678.885164409" Mar 13 01:46:59.075159 master-0 kubenswrapper[19170]: I0313 01:46:59.074580 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"cd7ac114-df97-491b-bd70-4e731d8944ef","Type":"ContainerStarted","Data":"39fcb5a89f27d5d04c40fefa71da5c8a9f09974bdc3ddc9b65290bca23126386"} Mar 13 01:46:59.075159 master-0 kubenswrapper[19170]: I0313 01:46:59.074682 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"cd7ac114-df97-491b-bd70-4e731d8944ef","Type":"ContainerStarted","Data":"16d950965bdf7c9502ea4c6f5052e9879050642104a1e7f270f67d170d609195"} Mar 13 01:47:00.089063 master-0 kubenswrapper[19170]: I0313 01:47:00.089015 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0" Mar 13 01:47:00.089063 master-0 kubenswrapper[19170]: I0313 01:47:00.089071 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0" Mar 13 01:47:01.144704 master-0 kubenswrapper[19170]: I0313 01:47:01.144616 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0" Mar 13 01:47:01.199249 master-0 kubenswrapper[19170]: I0313 01:47:01.198993 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-inspector-0" podStartSLOduration=16.198969471 
podStartE2EDuration="16.198969471s" podCreationTimestamp="2026-03-13 01:46:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:46:59.299546964 +0000 UTC m=+1680.107667924" watchObservedRunningTime="2026-03-13 01:47:01.198969471 +0000 UTC m=+1682.007090441" Mar 13 01:47:01.276602 master-0 kubenswrapper[19170]: I0313 01:47:01.276545 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0" Mar 13 01:47:01.276967 master-0 kubenswrapper[19170]: I0313 01:47:01.276941 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0" Mar 13 01:47:02.123387 master-0 kubenswrapper[19170]: I0313 01:47:02.123262 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0" Mar 13 01:47:03.007506 master-0 kubenswrapper[19170]: I0313 01:47:03.007431 19170 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-b9844-default-external-api-0" podUID="979e6bc4-2aa2-4326-b7f2-c45f50b41c28" containerName="glance-log" probeResult="failure" output="Get \"https://10.128.0.225:9292/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 01:47:03.008049 master-0 kubenswrapper[19170]: I0313 01:47:03.007502 19170 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-b9844-default-external-api-0" podUID="979e6bc4-2aa2-4326-b7f2-c45f50b41c28" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.128.0.225:9292/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 01:47:06.277287 master-0 kubenswrapper[19170]: I0313 01:47:06.277209 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ironic-inspector-0" Mar 13 01:47:06.277287 master-0 
kubenswrapper[19170]: I0313 01:47:06.277303 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ironic-inspector-0" Mar 13 01:47:06.297470 master-0 kubenswrapper[19170]: I0313 01:47:06.297415 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ironic-inspector-0" Mar 13 01:47:06.300019 master-0 kubenswrapper[19170]: I0313 01:47:06.299974 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ironic-inspector-0" Mar 13 01:47:07.199495 master-0 kubenswrapper[19170]: I0313 01:47:07.199391 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0" Mar 13 01:47:07.202541 master-0 kubenswrapper[19170]: I0313 01:47:07.202461 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0" Mar 13 01:47:14.288792 master-0 kubenswrapper[19170]: I0313 01:47:14.288739 19170 generic.go:334] "Generic (PLEG): container finished" podID="586e6b82-eae1-44a6-a56a-c3be9f7856cb" containerID="b5cd857076505fffd99040f0c352c10e03350c949f4c76b4f8b776e06ecb5856" exitCode=0 Mar 13 01:47:14.288792 master-0 kubenswrapper[19170]: I0313 01:47:14.288784 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-pmp2r" event={"ID":"586e6b82-eae1-44a6-a56a-c3be9f7856cb","Type":"ContainerDied","Data":"b5cd857076505fffd99040f0c352c10e03350c949f4c76b4f8b776e06ecb5856"} Mar 13 01:47:15.754595 master-0 kubenswrapper[19170]: I0313 01:47:15.754551 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-pmp2r" Mar 13 01:47:15.885991 master-0 kubenswrapper[19170]: I0313 01:47:15.885915 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/586e6b82-eae1-44a6-a56a-c3be9f7856cb-combined-ca-bundle\") pod \"586e6b82-eae1-44a6-a56a-c3be9f7856cb\" (UID: \"586e6b82-eae1-44a6-a56a-c3be9f7856cb\") " Mar 13 01:47:15.886207 master-0 kubenswrapper[19170]: I0313 01:47:15.886192 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/586e6b82-eae1-44a6-a56a-c3be9f7856cb-scripts\") pod \"586e6b82-eae1-44a6-a56a-c3be9f7856cb\" (UID: \"586e6b82-eae1-44a6-a56a-c3be9f7856cb\") " Mar 13 01:47:15.886493 master-0 kubenswrapper[19170]: I0313 01:47:15.886466 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tws4c\" (UniqueName: \"kubernetes.io/projected/586e6b82-eae1-44a6-a56a-c3be9f7856cb-kube-api-access-tws4c\") pod \"586e6b82-eae1-44a6-a56a-c3be9f7856cb\" (UID: \"586e6b82-eae1-44a6-a56a-c3be9f7856cb\") " Mar 13 01:47:15.886540 master-0 kubenswrapper[19170]: I0313 01:47:15.886526 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/586e6b82-eae1-44a6-a56a-c3be9f7856cb-config-data\") pod \"586e6b82-eae1-44a6-a56a-c3be9f7856cb\" (UID: \"586e6b82-eae1-44a6-a56a-c3be9f7856cb\") " Mar 13 01:47:15.889883 master-0 kubenswrapper[19170]: I0313 01:47:15.889359 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/586e6b82-eae1-44a6-a56a-c3be9f7856cb-scripts" (OuterVolumeSpecName: "scripts") pod "586e6b82-eae1-44a6-a56a-c3be9f7856cb" (UID: "586e6b82-eae1-44a6-a56a-c3be9f7856cb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:47:15.892876 master-0 kubenswrapper[19170]: I0313 01:47:15.892786 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/586e6b82-eae1-44a6-a56a-c3be9f7856cb-kube-api-access-tws4c" (OuterVolumeSpecName: "kube-api-access-tws4c") pod "586e6b82-eae1-44a6-a56a-c3be9f7856cb" (UID: "586e6b82-eae1-44a6-a56a-c3be9f7856cb"). InnerVolumeSpecName "kube-api-access-tws4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:47:15.927293 master-0 kubenswrapper[19170]: I0313 01:47:15.927221 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/586e6b82-eae1-44a6-a56a-c3be9f7856cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "586e6b82-eae1-44a6-a56a-c3be9f7856cb" (UID: "586e6b82-eae1-44a6-a56a-c3be9f7856cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:47:15.932353 master-0 kubenswrapper[19170]: I0313 01:47:15.932298 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/586e6b82-eae1-44a6-a56a-c3be9f7856cb-config-data" (OuterVolumeSpecName: "config-data") pod "586e6b82-eae1-44a6-a56a-c3be9f7856cb" (UID: "586e6b82-eae1-44a6-a56a-c3be9f7856cb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:47:15.996382 master-0 kubenswrapper[19170]: I0313 01:47:15.996293 19170 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/586e6b82-eae1-44a6-a56a-c3be9f7856cb-config-data\") on node \"master-0\" DevicePath \"\"" Mar 13 01:47:15.996382 master-0 kubenswrapper[19170]: I0313 01:47:15.996367 19170 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/586e6b82-eae1-44a6-a56a-c3be9f7856cb-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:47:15.996382 master-0 kubenswrapper[19170]: I0313 01:47:15.996380 19170 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/586e6b82-eae1-44a6-a56a-c3be9f7856cb-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:47:15.996382 master-0 kubenswrapper[19170]: I0313 01:47:15.996390 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tws4c\" (UniqueName: \"kubernetes.io/projected/586e6b82-eae1-44a6-a56a-c3be9f7856cb-kube-api-access-tws4c\") on node \"master-0\" DevicePath \"\"" Mar 13 01:47:16.326009 master-0 kubenswrapper[19170]: I0313 01:47:16.325939 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-pmp2r" event={"ID":"586e6b82-eae1-44a6-a56a-c3be9f7856cb","Type":"ContainerDied","Data":"2a2d24ad6e49efa5d983c1e61d9198c11bafd7cc63a3e2a754f5c45642a7fb0c"} Mar 13 01:47:16.326368 master-0 kubenswrapper[19170]: I0313 01:47:16.326338 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a2d24ad6e49efa5d983c1e61d9198c11bafd7cc63a3e2a754f5c45642a7fb0c" Mar 13 01:47:16.326505 master-0 kubenswrapper[19170]: I0313 01:47:16.326060 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-pmp2r" Mar 13 01:47:16.533116 master-0 kubenswrapper[19170]: I0313 01:47:16.533031 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 13 01:47:16.533727 master-0 kubenswrapper[19170]: E0313 01:47:16.533704 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97b91c16-ada4-4c3a-8e56-bd4e6417ded3" containerName="init" Mar 13 01:47:16.533727 master-0 kubenswrapper[19170]: I0313 01:47:16.533728 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="97b91c16-ada4-4c3a-8e56-bd4e6417ded3" containerName="init" Mar 13 01:47:16.533856 master-0 kubenswrapper[19170]: E0313 01:47:16.533754 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="586e6b82-eae1-44a6-a56a-c3be9f7856cb" containerName="nova-cell0-conductor-db-sync" Mar 13 01:47:16.533856 master-0 kubenswrapper[19170]: I0313 01:47:16.533765 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="586e6b82-eae1-44a6-a56a-c3be9f7856cb" containerName="nova-cell0-conductor-db-sync" Mar 13 01:47:16.533856 master-0 kubenswrapper[19170]: E0313 01:47:16.533791 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97b91c16-ada4-4c3a-8e56-bd4e6417ded3" containerName="dnsmasq-dns" Mar 13 01:47:16.533856 master-0 kubenswrapper[19170]: I0313 01:47:16.533800 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="97b91c16-ada4-4c3a-8e56-bd4e6417ded3" containerName="dnsmasq-dns" Mar 13 01:47:16.534221 master-0 kubenswrapper[19170]: I0313 01:47:16.534169 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="97b91c16-ada4-4c3a-8e56-bd4e6417ded3" containerName="dnsmasq-dns" Mar 13 01:47:16.534305 master-0 kubenswrapper[19170]: I0313 01:47:16.534242 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="586e6b82-eae1-44a6-a56a-c3be9f7856cb" containerName="nova-cell0-conductor-db-sync" Mar 13 01:47:16.535379 master-0 
kubenswrapper[19170]: I0313 01:47:16.535350 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 13 01:47:16.543117 master-0 kubenswrapper[19170]: I0313 01:47:16.543055 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 13 01:47:16.548573 master-0 kubenswrapper[19170]: I0313 01:47:16.548503 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 13 01:47:16.720602 master-0 kubenswrapper[19170]: I0313 01:47:16.720407 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7sjb\" (UniqueName: \"kubernetes.io/projected/ab891161-7265-415a-8a9c-0faf72f5cea0-kube-api-access-l7sjb\") pod \"nova-cell0-conductor-0\" (UID: \"ab891161-7265-415a-8a9c-0faf72f5cea0\") " pod="openstack/nova-cell0-conductor-0" Mar 13 01:47:16.720835 master-0 kubenswrapper[19170]: I0313 01:47:16.720792 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab891161-7265-415a-8a9c-0faf72f5cea0-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ab891161-7265-415a-8a9c-0faf72f5cea0\") " pod="openstack/nova-cell0-conductor-0" Mar 13 01:47:16.721100 master-0 kubenswrapper[19170]: I0313 01:47:16.721080 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab891161-7265-415a-8a9c-0faf72f5cea0-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ab891161-7265-415a-8a9c-0faf72f5cea0\") " pod="openstack/nova-cell0-conductor-0" Mar 13 01:47:16.823699 master-0 kubenswrapper[19170]: I0313 01:47:16.823587 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ab891161-7265-415a-8a9c-0faf72f5cea0-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ab891161-7265-415a-8a9c-0faf72f5cea0\") " pod="openstack/nova-cell0-conductor-0" Mar 13 01:47:16.824258 master-0 kubenswrapper[19170]: I0313 01:47:16.823843 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7sjb\" (UniqueName: \"kubernetes.io/projected/ab891161-7265-415a-8a9c-0faf72f5cea0-kube-api-access-l7sjb\") pod \"nova-cell0-conductor-0\" (UID: \"ab891161-7265-415a-8a9c-0faf72f5cea0\") " pod="openstack/nova-cell0-conductor-0" Mar 13 01:47:16.824258 master-0 kubenswrapper[19170]: I0313 01:47:16.824077 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab891161-7265-415a-8a9c-0faf72f5cea0-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ab891161-7265-415a-8a9c-0faf72f5cea0\") " pod="openstack/nova-cell0-conductor-0" Mar 13 01:47:16.828247 master-0 kubenswrapper[19170]: I0313 01:47:16.828191 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab891161-7265-415a-8a9c-0faf72f5cea0-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ab891161-7265-415a-8a9c-0faf72f5cea0\") " pod="openstack/nova-cell0-conductor-0" Mar 13 01:47:16.829100 master-0 kubenswrapper[19170]: I0313 01:47:16.829073 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab891161-7265-415a-8a9c-0faf72f5cea0-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ab891161-7265-415a-8a9c-0faf72f5cea0\") " pod="openstack/nova-cell0-conductor-0" Mar 13 01:47:16.852200 master-0 kubenswrapper[19170]: I0313 01:47:16.852069 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7sjb\" (UniqueName: 
\"kubernetes.io/projected/ab891161-7265-415a-8a9c-0faf72f5cea0-kube-api-access-l7sjb\") pod \"nova-cell0-conductor-0\" (UID: \"ab891161-7265-415a-8a9c-0faf72f5cea0\") " pod="openstack/nova-cell0-conductor-0" Mar 13 01:47:16.896728 master-0 kubenswrapper[19170]: I0313 01:47:16.896620 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 13 01:47:17.443587 master-0 kubenswrapper[19170]: W0313 01:47:17.443513 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab891161_7265_415a_8a9c_0faf72f5cea0.slice/crio-29f4e3923bc7775c5a27dc9f675290ba5f3255f29c4dabde25f2c4086e05d0bc WatchSource:0}: Error finding container 29f4e3923bc7775c5a27dc9f675290ba5f3255f29c4dabde25f2c4086e05d0bc: Status 404 returned error can't find the container with id 29f4e3923bc7775c5a27dc9f675290ba5f3255f29c4dabde25f2c4086e05d0bc Mar 13 01:47:17.450573 master-0 kubenswrapper[19170]: I0313 01:47:17.449926 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 13 01:47:18.353771 master-0 kubenswrapper[19170]: I0313 01:47:18.353706 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ab891161-7265-415a-8a9c-0faf72f5cea0","Type":"ContainerStarted","Data":"54524e8e0ddcc068296164d114b411980e67eeeb8ba171383e097fded608b701"} Mar 13 01:47:18.353771 master-0 kubenswrapper[19170]: I0313 01:47:18.353766 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ab891161-7265-415a-8a9c-0faf72f5cea0","Type":"ContainerStarted","Data":"29f4e3923bc7775c5a27dc9f675290ba5f3255f29c4dabde25f2c4086e05d0bc"} Mar 13 01:47:18.355277 master-0 kubenswrapper[19170]: I0313 01:47:18.355220 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 13 01:47:18.382053 master-0 
kubenswrapper[19170]: I0313 01:47:18.381964 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.381944434 podStartE2EDuration="2.381944434s" podCreationTimestamp="2026-03-13 01:47:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:47:18.377204831 +0000 UTC m=+1699.185325801" watchObservedRunningTime="2026-03-13 01:47:18.381944434 +0000 UTC m=+1699.190065404" Mar 13 01:47:26.944245 master-0 kubenswrapper[19170]: I0313 01:47:26.944166 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 13 01:47:27.569183 master-0 kubenswrapper[19170]: I0313 01:47:27.569089 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-sjvgp"] Mar 13 01:47:27.570768 master-0 kubenswrapper[19170]: I0313 01:47:27.570721 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-sjvgp" Mar 13 01:47:27.585893 master-0 kubenswrapper[19170]: I0313 01:47:27.578051 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 13 01:47:27.585893 master-0 kubenswrapper[19170]: I0313 01:47:27.578258 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 13 01:47:27.585893 master-0 kubenswrapper[19170]: I0313 01:47:27.585354 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-sjvgp"] Mar 13 01:47:27.647401 master-0 kubenswrapper[19170]: I0313 01:47:27.647343 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-compute-ironic-compute-0"] Mar 13 01:47:27.649364 master-0 kubenswrapper[19170]: I0313 01:47:27.649330 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 13 01:47:27.652049 master-0 kubenswrapper[19170]: I0313 01:47:27.651987 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-ironic-compute-config-data" Mar 13 01:47:27.660669 master-0 kubenswrapper[19170]: I0313 01:47:27.658811 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-768lh\" (UniqueName: \"kubernetes.io/projected/97438ca6-ad0a-438f-9409-317d90cd7dbb-kube-api-access-768lh\") pod \"nova-cell0-cell-mapping-sjvgp\" (UID: \"97438ca6-ad0a-438f-9409-317d90cd7dbb\") " pod="openstack/nova-cell0-cell-mapping-sjvgp" Mar 13 01:47:27.660669 master-0 kubenswrapper[19170]: I0313 01:47:27.658990 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97438ca6-ad0a-438f-9409-317d90cd7dbb-config-data\") pod \"nova-cell0-cell-mapping-sjvgp\" (UID: \"97438ca6-ad0a-438f-9409-317d90cd7dbb\") " pod="openstack/nova-cell0-cell-mapping-sjvgp" Mar 13 01:47:27.660669 master-0 kubenswrapper[19170]: I0313 01:47:27.659054 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97438ca6-ad0a-438f-9409-317d90cd7dbb-scripts\") pod \"nova-cell0-cell-mapping-sjvgp\" (UID: \"97438ca6-ad0a-438f-9409-317d90cd7dbb\") " pod="openstack/nova-cell0-cell-mapping-sjvgp" Mar 13 01:47:27.660669 master-0 kubenswrapper[19170]: I0313 01:47:27.659169 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97438ca6-ad0a-438f-9409-317d90cd7dbb-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-sjvgp\" (UID: \"97438ca6-ad0a-438f-9409-317d90cd7dbb\") " pod="openstack/nova-cell0-cell-mapping-sjvgp" Mar 13 01:47:27.667611 
master-0 kubenswrapper[19170]: I0313 01:47:27.667507 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-compute-ironic-compute-0"] Mar 13 01:47:27.767662 master-0 kubenswrapper[19170]: I0313 01:47:27.765132 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97438ca6-ad0a-438f-9409-317d90cd7dbb-scripts\") pod \"nova-cell0-cell-mapping-sjvgp\" (UID: \"97438ca6-ad0a-438f-9409-317d90cd7dbb\") " pod="openstack/nova-cell0-cell-mapping-sjvgp" Mar 13 01:47:27.767662 master-0 kubenswrapper[19170]: I0313 01:47:27.765330 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 01:47:27.772959 master-0 kubenswrapper[19170]: I0313 01:47:27.770242 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 01:47:27.773161 master-0 kubenswrapper[19170]: I0313 01:47:27.773095 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e34869a6-9527-4873-9f0e-8b0e88e5d756-config-data\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"e34869a6-9527-4873-9f0e-8b0e88e5d756\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 13 01:47:27.773658 master-0 kubenswrapper[19170]: I0313 01:47:27.773220 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97438ca6-ad0a-438f-9409-317d90cd7dbb-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-sjvgp\" (UID: \"97438ca6-ad0a-438f-9409-317d90cd7dbb\") " pod="openstack/nova-cell0-cell-mapping-sjvgp" Mar 13 01:47:27.778822 master-0 kubenswrapper[19170]: I0313 01:47:27.778113 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-768lh\" (UniqueName: 
\"kubernetes.io/projected/97438ca6-ad0a-438f-9409-317d90cd7dbb-kube-api-access-768lh\") pod \"nova-cell0-cell-mapping-sjvgp\" (UID: \"97438ca6-ad0a-438f-9409-317d90cd7dbb\") " pod="openstack/nova-cell0-cell-mapping-sjvgp" Mar 13 01:47:27.778822 master-0 kubenswrapper[19170]: I0313 01:47:27.778561 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tmdx\" (UniqueName: \"kubernetes.io/projected/e34869a6-9527-4873-9f0e-8b0e88e5d756-kube-api-access-9tmdx\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"e34869a6-9527-4873-9f0e-8b0e88e5d756\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 13 01:47:27.778822 master-0 kubenswrapper[19170]: I0313 01:47:27.778764 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97438ca6-ad0a-438f-9409-317d90cd7dbb-config-data\") pod \"nova-cell0-cell-mapping-sjvgp\" (UID: \"97438ca6-ad0a-438f-9409-317d90cd7dbb\") " pod="openstack/nova-cell0-cell-mapping-sjvgp" Mar 13 01:47:27.779105 master-0 kubenswrapper[19170]: I0313 01:47:27.779031 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e34869a6-9527-4873-9f0e-8b0e88e5d756-combined-ca-bundle\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"e34869a6-9527-4873-9f0e-8b0e88e5d756\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 13 01:47:27.788670 master-0 kubenswrapper[19170]: I0313 01:47:27.784701 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 13 01:47:27.788891 master-0 kubenswrapper[19170]: I0313 01:47:27.788856 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97438ca6-ad0a-438f-9409-317d90cd7dbb-config-data\") pod \"nova-cell0-cell-mapping-sjvgp\" (UID: 
\"97438ca6-ad0a-438f-9409-317d90cd7dbb\") " pod="openstack/nova-cell0-cell-mapping-sjvgp" Mar 13 01:47:27.799843 master-0 kubenswrapper[19170]: I0313 01:47:27.799785 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 01:47:27.802837 master-0 kubenswrapper[19170]: I0313 01:47:27.800781 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97438ca6-ad0a-438f-9409-317d90cd7dbb-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-sjvgp\" (UID: \"97438ca6-ad0a-438f-9409-317d90cd7dbb\") " pod="openstack/nova-cell0-cell-mapping-sjvgp" Mar 13 01:47:27.802837 master-0 kubenswrapper[19170]: I0313 01:47:27.801842 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97438ca6-ad0a-438f-9409-317d90cd7dbb-scripts\") pod \"nova-cell0-cell-mapping-sjvgp\" (UID: \"97438ca6-ad0a-438f-9409-317d90cd7dbb\") " pod="openstack/nova-cell0-cell-mapping-sjvgp" Mar 13 01:47:27.812433 master-0 kubenswrapper[19170]: I0313 01:47:27.812390 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-768lh\" (UniqueName: \"kubernetes.io/projected/97438ca6-ad0a-438f-9409-317d90cd7dbb-kube-api-access-768lh\") pod \"nova-cell0-cell-mapping-sjvgp\" (UID: \"97438ca6-ad0a-438f-9409-317d90cd7dbb\") " pod="openstack/nova-cell0-cell-mapping-sjvgp" Mar 13 01:47:27.902144 master-0 kubenswrapper[19170]: I0313 01:47:27.896296 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-sjvgp" Mar 13 01:47:27.927248 master-0 kubenswrapper[19170]: I0313 01:47:27.919107 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tmdx\" (UniqueName: \"kubernetes.io/projected/e34869a6-9527-4873-9f0e-8b0e88e5d756-kube-api-access-9tmdx\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"e34869a6-9527-4873-9f0e-8b0e88e5d756\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 13 01:47:27.927248 master-0 kubenswrapper[19170]: I0313 01:47:27.919218 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30cf9c76-52a7-490e-a6f6-aa85a909bd2f-config-data\") pod \"nova-scheduler-0\" (UID: \"30cf9c76-52a7-490e-a6f6-aa85a909bd2f\") " pod="openstack/nova-scheduler-0" Mar 13 01:47:27.927248 master-0 kubenswrapper[19170]: I0313 01:47:27.919308 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e34869a6-9527-4873-9f0e-8b0e88e5d756-combined-ca-bundle\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"e34869a6-9527-4873-9f0e-8b0e88e5d756\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 13 01:47:27.927248 master-0 kubenswrapper[19170]: I0313 01:47:27.919541 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e34869a6-9527-4873-9f0e-8b0e88e5d756-config-data\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"e34869a6-9527-4873-9f0e-8b0e88e5d756\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 13 01:47:27.927248 master-0 kubenswrapper[19170]: I0313 01:47:27.919595 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44t7z\" (UniqueName: 
\"kubernetes.io/projected/30cf9c76-52a7-490e-a6f6-aa85a909bd2f-kube-api-access-44t7z\") pod \"nova-scheduler-0\" (UID: \"30cf9c76-52a7-490e-a6f6-aa85a909bd2f\") " pod="openstack/nova-scheduler-0" Mar 13 01:47:27.927248 master-0 kubenswrapper[19170]: I0313 01:47:27.919675 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30cf9c76-52a7-490e-a6f6-aa85a909bd2f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"30cf9c76-52a7-490e-a6f6-aa85a909bd2f\") " pod="openstack/nova-scheduler-0" Mar 13 01:47:27.928198 master-0 kubenswrapper[19170]: I0313 01:47:27.928164 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e34869a6-9527-4873-9f0e-8b0e88e5d756-config-data\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"e34869a6-9527-4873-9f0e-8b0e88e5d756\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 13 01:47:27.930605 master-0 kubenswrapper[19170]: I0313 01:47:27.930555 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 01:47:27.931187 master-0 kubenswrapper[19170]: I0313 01:47:27.931141 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e34869a6-9527-4873-9f0e-8b0e88e5d756-combined-ca-bundle\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"e34869a6-9527-4873-9f0e-8b0e88e5d756\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 13 01:47:27.932077 master-0 kubenswrapper[19170]: I0313 01:47:27.932046 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 01:47:27.939893 master-0 kubenswrapper[19170]: I0313 01:47:27.937954 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 13 01:47:27.953124 master-0 kubenswrapper[19170]: I0313 01:47:27.953032 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 01:47:27.968653 master-0 kubenswrapper[19170]: I0313 01:47:27.966991 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 13 01:47:27.968653 master-0 kubenswrapper[19170]: I0313 01:47:27.968399 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tmdx\" (UniqueName: \"kubernetes.io/projected/e34869a6-9527-4873-9f0e-8b0e88e5d756-kube-api-access-9tmdx\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"e34869a6-9527-4873-9f0e-8b0e88e5d756\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 13 01:47:27.980333 master-0 kubenswrapper[19170]: I0313 01:47:27.971217 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 13 01:47:27.980333 master-0 kubenswrapper[19170]: I0313 01:47:27.975362 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0"
Mar 13 01:47:27.987596 master-0 kubenswrapper[19170]: I0313 01:47:27.982864 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 13 01:47:28.011721 master-0 kubenswrapper[19170]: I0313 01:47:28.010825 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 13 01:47:28.027063 master-0 kubenswrapper[19170]: I0313 01:47:28.027005 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30cf9c76-52a7-490e-a6f6-aa85a909bd2f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"30cf9c76-52a7-490e-a6f6-aa85a909bd2f\") " pod="openstack/nova-scheduler-0"
Mar 13 01:47:28.027248 master-0 kubenswrapper[19170]: I0313 01:47:28.027157 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0939bd46-550b-47b8-b313-eb2432087a7b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0939bd46-550b-47b8-b313-eb2432087a7b\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 13 01:47:28.027328 master-0 kubenswrapper[19170]: I0313 01:47:28.027310 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0939bd46-550b-47b8-b313-eb2432087a7b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0939bd46-550b-47b8-b313-eb2432087a7b\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 13 01:47:28.027473 master-0 kubenswrapper[19170]: I0313 01:47:28.027452 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30cf9c76-52a7-490e-a6f6-aa85a909bd2f-config-data\") pod \"nova-scheduler-0\" (UID: \"30cf9c76-52a7-490e-a6f6-aa85a909bd2f\") " pod="openstack/nova-scheduler-0"
Mar 13 01:47:28.027598 master-0 kubenswrapper[19170]: I0313 01:47:28.027570 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbkdp\" (UniqueName: \"kubernetes.io/projected/0939bd46-550b-47b8-b313-eb2432087a7b-kube-api-access-cbkdp\") pod \"nova-cell1-novncproxy-0\" (UID: \"0939bd46-550b-47b8-b313-eb2432087a7b\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 13 01:47:28.028116 master-0 kubenswrapper[19170]: I0313 01:47:28.028098 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44t7z\" (UniqueName: \"kubernetes.io/projected/30cf9c76-52a7-490e-a6f6-aa85a909bd2f-kube-api-access-44t7z\") pod \"nova-scheduler-0\" (UID: \"30cf9c76-52a7-490e-a6f6-aa85a909bd2f\") " pod="openstack/nova-scheduler-0"
Mar 13 01:47:28.051251 master-0 kubenswrapper[19170]: I0313 01:47:28.051174 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30cf9c76-52a7-490e-a6f6-aa85a909bd2f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"30cf9c76-52a7-490e-a6f6-aa85a909bd2f\") " pod="openstack/nova-scheduler-0"
Mar 13 01:47:28.054225 master-0 kubenswrapper[19170]: I0313 01:47:28.054100 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 13 01:47:28.057101 master-0 kubenswrapper[19170]: I0313 01:47:28.056410 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 13 01:47:28.058848 master-0 kubenswrapper[19170]: I0313 01:47:28.058799 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 13 01:47:28.065143 master-0 kubenswrapper[19170]: I0313 01:47:28.065050 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30cf9c76-52a7-490e-a6f6-aa85a909bd2f-config-data\") pod \"nova-scheduler-0\" (UID: \"30cf9c76-52a7-490e-a6f6-aa85a909bd2f\") " pod="openstack/nova-scheduler-0"
Mar 13 01:47:28.076756 master-0 kubenswrapper[19170]: I0313 01:47:28.076707 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44t7z\" (UniqueName: \"kubernetes.io/projected/30cf9c76-52a7-490e-a6f6-aa85a909bd2f-kube-api-access-44t7z\") pod \"nova-scheduler-0\" (UID: \"30cf9c76-52a7-490e-a6f6-aa85a909bd2f\") " pod="openstack/nova-scheduler-0"
Mar 13 01:47:28.156940 master-0 kubenswrapper[19170]: I0313 01:47:28.156819 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95380989-f152-40e9-82a4-3bc9c091a8db-logs\") pod \"nova-metadata-0\" (UID: \"95380989-f152-40e9-82a4-3bc9c091a8db\") " pod="openstack/nova-metadata-0"
Mar 13 01:47:28.157194 master-0 kubenswrapper[19170]: I0313 01:47:28.157160 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50d16291-5a6e-41fd-b5a5-94844221c494-logs\") pod \"nova-api-0\" (UID: \"50d16291-5a6e-41fd-b5a5-94844221c494\") " pod="openstack/nova-api-0"
Mar 13 01:47:28.157317 master-0 kubenswrapper[19170]: I0313 01:47:28.157304 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50d16291-5a6e-41fd-b5a5-94844221c494-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"50d16291-5a6e-41fd-b5a5-94844221c494\") " pod="openstack/nova-api-0"
Mar 13 01:47:28.163400 master-0 kubenswrapper[19170]: I0313 01:47:28.163222 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 13 01:47:28.164089 master-0 kubenswrapper[19170]: I0313 01:47:28.164048 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8c8j\" (UniqueName: \"kubernetes.io/projected/50d16291-5a6e-41fd-b5a5-94844221c494-kube-api-access-l8c8j\") pod \"nova-api-0\" (UID: \"50d16291-5a6e-41fd-b5a5-94844221c494\") " pod="openstack/nova-api-0"
Mar 13 01:47:28.169080 master-0 kubenswrapper[19170]: I0313 01:47:28.169020 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0939bd46-550b-47b8-b313-eb2432087a7b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0939bd46-550b-47b8-b313-eb2432087a7b\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 13 01:47:28.169482 master-0 kubenswrapper[19170]: I0313 01:47:28.169460 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0939bd46-550b-47b8-b313-eb2432087a7b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0939bd46-550b-47b8-b313-eb2432087a7b\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 13 01:47:28.169610 master-0 kubenswrapper[19170]: I0313 01:47:28.169595 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50d16291-5a6e-41fd-b5a5-94844221c494-config-data\") pod \"nova-api-0\" (UID: \"50d16291-5a6e-41fd-b5a5-94844221c494\") " pod="openstack/nova-api-0"
Mar 13 01:47:28.179210 master-0 kubenswrapper[19170]: I0313 01:47:28.175187 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0939bd46-550b-47b8-b313-eb2432087a7b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0939bd46-550b-47b8-b313-eb2432087a7b\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 13 01:47:28.179886 master-0 kubenswrapper[19170]: I0313 01:47:28.179846 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w4l9\" (UniqueName: \"kubernetes.io/projected/95380989-f152-40e9-82a4-3bc9c091a8db-kube-api-access-4w4l9\") pod \"nova-metadata-0\" (UID: \"95380989-f152-40e9-82a4-3bc9c091a8db\") " pod="openstack/nova-metadata-0"
Mar 13 01:47:28.180150 master-0 kubenswrapper[19170]: I0313 01:47:28.180129 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbkdp\" (UniqueName: \"kubernetes.io/projected/0939bd46-550b-47b8-b313-eb2432087a7b-kube-api-access-cbkdp\") pod \"nova-cell1-novncproxy-0\" (UID: \"0939bd46-550b-47b8-b313-eb2432087a7b\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 13 01:47:28.181007 master-0 kubenswrapper[19170]: I0313 01:47:28.180893 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95380989-f152-40e9-82a4-3bc9c091a8db-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"95380989-f152-40e9-82a4-3bc9c091a8db\") " pod="openstack/nova-metadata-0"
Mar 13 01:47:28.181427 master-0 kubenswrapper[19170]: I0313 01:47:28.181407 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95380989-f152-40e9-82a4-3bc9c091a8db-config-data\") pod \"nova-metadata-0\" (UID: \"95380989-f152-40e9-82a4-3bc9c091a8db\") " pod="openstack/nova-metadata-0"
Mar 13 01:47:28.185350 master-0 kubenswrapper[19170]: I0313 01:47:28.185316 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0939bd46-550b-47b8-b313-eb2432087a7b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0939bd46-550b-47b8-b313-eb2432087a7b\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 13 01:47:28.201348 master-0 kubenswrapper[19170]: I0313 01:47:28.195823 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58d8bd468f-mrpg6"]
Mar 13 01:47:28.201348 master-0 kubenswrapper[19170]: I0313 01:47:28.197806 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58d8bd468f-mrpg6"
Mar 13 01:47:28.220404 master-0 kubenswrapper[19170]: I0313 01:47:28.218587 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 13 01:47:28.224738 master-0 kubenswrapper[19170]: I0313 01:47:28.223440 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbkdp\" (UniqueName: \"kubernetes.io/projected/0939bd46-550b-47b8-b313-eb2432087a7b-kube-api-access-cbkdp\") pod \"nova-cell1-novncproxy-0\" (UID: \"0939bd46-550b-47b8-b313-eb2432087a7b\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 13 01:47:28.257102 master-0 kubenswrapper[19170]: I0313 01:47:28.250151 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58d8bd468f-mrpg6"]
Mar 13 01:47:28.295539 master-0 kubenswrapper[19170]: I0313 01:47:28.288981 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50d16291-5a6e-41fd-b5a5-94844221c494-config-data\") pod \"nova-api-0\" (UID: \"50d16291-5a6e-41fd-b5a5-94844221c494\") " pod="openstack/nova-api-0"
Mar 13 01:47:28.295539 master-0 kubenswrapper[19170]: I0313 01:47:28.289085 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7lml\" (UniqueName: \"kubernetes.io/projected/3b8e39a7-f183-4541-82e3-fdcdc6936300-kube-api-access-x7lml\") pod \"dnsmasq-dns-58d8bd468f-mrpg6\" (UID: \"3b8e39a7-f183-4541-82e3-fdcdc6936300\") " pod="openstack/dnsmasq-dns-58d8bd468f-mrpg6"
Mar 13 01:47:28.295539 master-0 kubenswrapper[19170]: I0313 01:47:28.289119 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w4l9\" (UniqueName: \"kubernetes.io/projected/95380989-f152-40e9-82a4-3bc9c091a8db-kube-api-access-4w4l9\") pod \"nova-metadata-0\" (UID: \"95380989-f152-40e9-82a4-3bc9c091a8db\") " pod="openstack/nova-metadata-0"
Mar 13 01:47:28.295539 master-0 kubenswrapper[19170]: I0313 01:47:28.289139 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b8e39a7-f183-4541-82e3-fdcdc6936300-ovsdbserver-nb\") pod \"dnsmasq-dns-58d8bd468f-mrpg6\" (UID: \"3b8e39a7-f183-4541-82e3-fdcdc6936300\") " pod="openstack/dnsmasq-dns-58d8bd468f-mrpg6"
Mar 13 01:47:28.295539 master-0 kubenswrapper[19170]: I0313 01:47:28.289172 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95380989-f152-40e9-82a4-3bc9c091a8db-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"95380989-f152-40e9-82a4-3bc9c091a8db\") " pod="openstack/nova-metadata-0"
Mar 13 01:47:28.295539 master-0 kubenswrapper[19170]: I0313 01:47:28.289193 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b8e39a7-f183-4541-82e3-fdcdc6936300-ovsdbserver-sb\") pod \"dnsmasq-dns-58d8bd468f-mrpg6\" (UID: \"3b8e39a7-f183-4541-82e3-fdcdc6936300\") " pod="openstack/dnsmasq-dns-58d8bd468f-mrpg6"
Mar 13 01:47:28.295539 master-0 kubenswrapper[19170]: I0313 01:47:28.289214 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b8e39a7-f183-4541-82e3-fdcdc6936300-dns-swift-storage-0\") pod \"dnsmasq-dns-58d8bd468f-mrpg6\" (UID: \"3b8e39a7-f183-4541-82e3-fdcdc6936300\") " pod="openstack/dnsmasq-dns-58d8bd468f-mrpg6"
Mar 13 01:47:28.295539 master-0 kubenswrapper[19170]: I0313 01:47:28.289242 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b8e39a7-f183-4541-82e3-fdcdc6936300-dns-svc\") pod \"dnsmasq-dns-58d8bd468f-mrpg6\" (UID: \"3b8e39a7-f183-4541-82e3-fdcdc6936300\") " pod="openstack/dnsmasq-dns-58d8bd468f-mrpg6"
Mar 13 01:47:28.295539 master-0 kubenswrapper[19170]: I0313 01:47:28.289263 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95380989-f152-40e9-82a4-3bc9c091a8db-config-data\") pod \"nova-metadata-0\" (UID: \"95380989-f152-40e9-82a4-3bc9c091a8db\") " pod="openstack/nova-metadata-0"
Mar 13 01:47:28.295539 master-0 kubenswrapper[19170]: I0313 01:47:28.289297 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95380989-f152-40e9-82a4-3bc9c091a8db-logs\") pod \"nova-metadata-0\" (UID: \"95380989-f152-40e9-82a4-3bc9c091a8db\") " pod="openstack/nova-metadata-0"
Mar 13 01:47:28.295539 master-0 kubenswrapper[19170]: I0313 01:47:28.289459 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50d16291-5a6e-41fd-b5a5-94844221c494-logs\") pod \"nova-api-0\" (UID: \"50d16291-5a6e-41fd-b5a5-94844221c494\") " pod="openstack/nova-api-0"
Mar 13 01:47:28.295539 master-0 kubenswrapper[19170]: I0313 01:47:28.289480 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50d16291-5a6e-41fd-b5a5-94844221c494-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"50d16291-5a6e-41fd-b5a5-94844221c494\") " pod="openstack/nova-api-0"
Mar 13 01:47:28.295539 master-0 kubenswrapper[19170]: I0313 01:47:28.289506 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b8e39a7-f183-4541-82e3-fdcdc6936300-config\") pod \"dnsmasq-dns-58d8bd468f-mrpg6\" (UID: \"3b8e39a7-f183-4541-82e3-fdcdc6936300\") " pod="openstack/dnsmasq-dns-58d8bd468f-mrpg6"
Mar 13 01:47:28.295539 master-0 kubenswrapper[19170]: I0313 01:47:28.289563 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8c8j\" (UniqueName: \"kubernetes.io/projected/50d16291-5a6e-41fd-b5a5-94844221c494-kube-api-access-l8c8j\") pod \"nova-api-0\" (UID: \"50d16291-5a6e-41fd-b5a5-94844221c494\") " pod="openstack/nova-api-0"
Mar 13 01:47:28.295539 master-0 kubenswrapper[19170]: I0313 01:47:28.294132 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50d16291-5a6e-41fd-b5a5-94844221c494-logs\") pod \"nova-api-0\" (UID: \"50d16291-5a6e-41fd-b5a5-94844221c494\") " pod="openstack/nova-api-0"
Mar 13 01:47:28.295539 master-0 kubenswrapper[19170]: I0313 01:47:28.294506 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95380989-f152-40e9-82a4-3bc9c091a8db-logs\") pod \"nova-metadata-0\" (UID: \"95380989-f152-40e9-82a4-3bc9c091a8db\") " pod="openstack/nova-metadata-0"
Mar 13 01:47:28.295539 master-0 kubenswrapper[19170]: I0313 01:47:28.295161 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95380989-f152-40e9-82a4-3bc9c091a8db-config-data\") pod \"nova-metadata-0\" (UID: \"95380989-f152-40e9-82a4-3bc9c091a8db\") " pod="openstack/nova-metadata-0"
Mar 13 01:47:28.298056 master-0 kubenswrapper[19170]: I0313 01:47:28.298012 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95380989-f152-40e9-82a4-3bc9c091a8db-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"95380989-f152-40e9-82a4-3bc9c091a8db\") " pod="openstack/nova-metadata-0"
Mar 13 01:47:28.310086 master-0 kubenswrapper[19170]: I0313 01:47:28.309748 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50d16291-5a6e-41fd-b5a5-94844221c494-config-data\") pod \"nova-api-0\" (UID: \"50d16291-5a6e-41fd-b5a5-94844221c494\") " pod="openstack/nova-api-0"
Mar 13 01:47:28.311171 master-0 kubenswrapper[19170]: I0313 01:47:28.310804 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w4l9\" (UniqueName: \"kubernetes.io/projected/95380989-f152-40e9-82a4-3bc9c091a8db-kube-api-access-4w4l9\") pod \"nova-metadata-0\" (UID: \"95380989-f152-40e9-82a4-3bc9c091a8db\") " pod="openstack/nova-metadata-0"
Mar 13 01:47:28.313719 master-0 kubenswrapper[19170]: I0313 01:47:28.311441 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50d16291-5a6e-41fd-b5a5-94844221c494-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"50d16291-5a6e-41fd-b5a5-94844221c494\") " pod="openstack/nova-api-0"
Mar 13 01:47:28.313719 master-0 kubenswrapper[19170]: I0313 01:47:28.312019 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8c8j\" (UniqueName: \"kubernetes.io/projected/50d16291-5a6e-41fd-b5a5-94844221c494-kube-api-access-l8c8j\") pod \"nova-api-0\" (UID: \"50d16291-5a6e-41fd-b5a5-94844221c494\") " pod="openstack/nova-api-0"
Mar 13 01:47:28.393664 master-0 kubenswrapper[19170]: I0313 01:47:28.387226 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 13 01:47:28.399204 master-0 kubenswrapper[19170]: I0313 01:47:28.398558 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b8e39a7-f183-4541-82e3-fdcdc6936300-config\") pod \"dnsmasq-dns-58d8bd468f-mrpg6\" (UID: \"3b8e39a7-f183-4541-82e3-fdcdc6936300\") " pod="openstack/dnsmasq-dns-58d8bd468f-mrpg6"
Mar 13 01:47:28.399204 master-0 kubenswrapper[19170]: I0313 01:47:28.398754 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7lml\" (UniqueName: \"kubernetes.io/projected/3b8e39a7-f183-4541-82e3-fdcdc6936300-kube-api-access-x7lml\") pod \"dnsmasq-dns-58d8bd468f-mrpg6\" (UID: \"3b8e39a7-f183-4541-82e3-fdcdc6936300\") " pod="openstack/dnsmasq-dns-58d8bd468f-mrpg6"
Mar 13 01:47:28.399204 master-0 kubenswrapper[19170]: I0313 01:47:28.398783 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b8e39a7-f183-4541-82e3-fdcdc6936300-ovsdbserver-nb\") pod \"dnsmasq-dns-58d8bd468f-mrpg6\" (UID: \"3b8e39a7-f183-4541-82e3-fdcdc6936300\") " pod="openstack/dnsmasq-dns-58d8bd468f-mrpg6"
Mar 13 01:47:28.399204 master-0 kubenswrapper[19170]: I0313 01:47:28.398818 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b8e39a7-f183-4541-82e3-fdcdc6936300-ovsdbserver-sb\") pod \"dnsmasq-dns-58d8bd468f-mrpg6\" (UID: \"3b8e39a7-f183-4541-82e3-fdcdc6936300\") " pod="openstack/dnsmasq-dns-58d8bd468f-mrpg6"
Mar 13 01:47:28.399204 master-0 kubenswrapper[19170]: I0313 01:47:28.398835 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b8e39a7-f183-4541-82e3-fdcdc6936300-dns-swift-storage-0\") pod \"dnsmasq-dns-58d8bd468f-mrpg6\" (UID: \"3b8e39a7-f183-4541-82e3-fdcdc6936300\") " pod="openstack/dnsmasq-dns-58d8bd468f-mrpg6"
Mar 13 01:47:28.399204 master-0 kubenswrapper[19170]: I0313 01:47:28.398863 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b8e39a7-f183-4541-82e3-fdcdc6936300-dns-svc\") pod \"dnsmasq-dns-58d8bd468f-mrpg6\" (UID: \"3b8e39a7-f183-4541-82e3-fdcdc6936300\") " pod="openstack/dnsmasq-dns-58d8bd468f-mrpg6"
Mar 13 01:47:28.402468 master-0 kubenswrapper[19170]: I0313 01:47:28.399681 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b8e39a7-f183-4541-82e3-fdcdc6936300-dns-svc\") pod \"dnsmasq-dns-58d8bd468f-mrpg6\" (UID: \"3b8e39a7-f183-4541-82e3-fdcdc6936300\") " pod="openstack/dnsmasq-dns-58d8bd468f-mrpg6"
Mar 13 01:47:28.402468 master-0 kubenswrapper[19170]: I0313 01:47:28.399714 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b8e39a7-f183-4541-82e3-fdcdc6936300-config\") pod \"dnsmasq-dns-58d8bd468f-mrpg6\" (UID: \"3b8e39a7-f183-4541-82e3-fdcdc6936300\") " pod="openstack/dnsmasq-dns-58d8bd468f-mrpg6"
Mar 13 01:47:28.402468 master-0 kubenswrapper[19170]: I0313 01:47:28.400586 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b8e39a7-f183-4541-82e3-fdcdc6936300-dns-swift-storage-0\") pod \"dnsmasq-dns-58d8bd468f-mrpg6\" (UID: \"3b8e39a7-f183-4541-82e3-fdcdc6936300\") " pod="openstack/dnsmasq-dns-58d8bd468f-mrpg6"
Mar 13 01:47:28.402468 master-0 kubenswrapper[19170]: I0313 01:47:28.401214 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b8e39a7-f183-4541-82e3-fdcdc6936300-ovsdbserver-nb\") pod \"dnsmasq-dns-58d8bd468f-mrpg6\" (UID: \"3b8e39a7-f183-4541-82e3-fdcdc6936300\") " pod="openstack/dnsmasq-dns-58d8bd468f-mrpg6"
Mar 13 01:47:28.410701 master-0 kubenswrapper[19170]: I0313 01:47:28.403875 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b8e39a7-f183-4541-82e3-fdcdc6936300-ovsdbserver-sb\") pod \"dnsmasq-dns-58d8bd468f-mrpg6\" (UID: \"3b8e39a7-f183-4541-82e3-fdcdc6936300\") " pod="openstack/dnsmasq-dns-58d8bd468f-mrpg6"
Mar 13 01:47:28.410701 master-0 kubenswrapper[19170]: I0313 01:47:28.406230 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 13 01:47:28.435442 master-0 kubenswrapper[19170]: I0313 01:47:28.421206 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7lml\" (UniqueName: \"kubernetes.io/projected/3b8e39a7-f183-4541-82e3-fdcdc6936300-kube-api-access-x7lml\") pod \"dnsmasq-dns-58d8bd468f-mrpg6\" (UID: \"3b8e39a7-f183-4541-82e3-fdcdc6936300\") " pod="openstack/dnsmasq-dns-58d8bd468f-mrpg6"
Mar 13 01:47:28.455653 master-0 kubenswrapper[19170]: I0313 01:47:28.454698 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 13 01:47:28.600796 master-0 kubenswrapper[19170]: I0313 01:47:28.599806 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58d8bd468f-mrpg6"
Mar 13 01:47:28.803409 master-0 kubenswrapper[19170]: I0313 01:47:28.803376 19170 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 13 01:47:28.820943 master-0 kubenswrapper[19170]: I0313 01:47:28.807102 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-compute-ironic-compute-0"]
Mar 13 01:47:28.834007 master-0 kubenswrapper[19170]: I0313 01:47:28.831666 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-sjvgp"]
Mar 13 01:47:29.012546 master-0 kubenswrapper[19170]: I0313 01:47:29.012492 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zplhz"]
Mar 13 01:47:29.017494 master-0 kubenswrapper[19170]: I0313 01:47:29.013972 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zplhz"
Mar 13 01:47:29.018672 master-0 kubenswrapper[19170]: I0313 01:47:29.018614 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Mar 13 01:47:29.028708 master-0 kubenswrapper[19170]: I0313 01:47:29.020982 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Mar 13 01:47:29.052456 master-0 kubenswrapper[19170]: I0313 01:47:29.052032 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zplhz"]
Mar 13 01:47:29.132653 master-0 kubenswrapper[19170]: I0313 01:47:29.131164 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 13 01:47:29.164604 master-0 kubenswrapper[19170]: I0313 01:47:29.164575 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v57bd\" (UniqueName: \"kubernetes.io/projected/5e43f0ed-2d83-4046-9ea9-f6335877b5d7-kube-api-access-v57bd\") pod \"nova-cell1-conductor-db-sync-zplhz\" (UID: \"5e43f0ed-2d83-4046-9ea9-f6335877b5d7\") " pod="openstack/nova-cell1-conductor-db-sync-zplhz"
Mar 13 01:47:29.164839 master-0 kubenswrapper[19170]: I0313 01:47:29.164822 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e43f0ed-2d83-4046-9ea9-f6335877b5d7-config-data\") pod \"nova-cell1-conductor-db-sync-zplhz\" (UID: \"5e43f0ed-2d83-4046-9ea9-f6335877b5d7\") " pod="openstack/nova-cell1-conductor-db-sync-zplhz"
Mar 13 01:47:29.165144 master-0 kubenswrapper[19170]: I0313 01:47:29.165129 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e43f0ed-2d83-4046-9ea9-f6335877b5d7-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zplhz\" (UID: \"5e43f0ed-2d83-4046-9ea9-f6335877b5d7\") " pod="openstack/nova-cell1-conductor-db-sync-zplhz"
Mar 13 01:47:29.165446 master-0 kubenswrapper[19170]: I0313 01:47:29.165408 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e43f0ed-2d83-4046-9ea9-f6335877b5d7-scripts\") pod \"nova-cell1-conductor-db-sync-zplhz\" (UID: \"5e43f0ed-2d83-4046-9ea9-f6335877b5d7\") " pod="openstack/nova-cell1-conductor-db-sync-zplhz"
Mar 13 01:47:29.270368 master-0 kubenswrapper[19170]: I0313 01:47:29.268058 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e43f0ed-2d83-4046-9ea9-f6335877b5d7-scripts\") pod \"nova-cell1-conductor-db-sync-zplhz\" (UID: \"5e43f0ed-2d83-4046-9ea9-f6335877b5d7\") " pod="openstack/nova-cell1-conductor-db-sync-zplhz"
Mar 13 01:47:29.270723 master-0 kubenswrapper[19170]: I0313 01:47:29.270685 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v57bd\" (UniqueName: \"kubernetes.io/projected/5e43f0ed-2d83-4046-9ea9-f6335877b5d7-kube-api-access-v57bd\") pod \"nova-cell1-conductor-db-sync-zplhz\" (UID: \"5e43f0ed-2d83-4046-9ea9-f6335877b5d7\") " pod="openstack/nova-cell1-conductor-db-sync-zplhz"
Mar 13 01:47:29.270934 master-0 kubenswrapper[19170]: I0313 01:47:29.270917 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e43f0ed-2d83-4046-9ea9-f6335877b5d7-config-data\") pod \"nova-cell1-conductor-db-sync-zplhz\" (UID: \"5e43f0ed-2d83-4046-9ea9-f6335877b5d7\") " pod="openstack/nova-cell1-conductor-db-sync-zplhz"
Mar 13 01:47:29.274532 master-0 kubenswrapper[19170]: I0313 01:47:29.274507 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e43f0ed-2d83-4046-9ea9-f6335877b5d7-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zplhz\" (UID: \"5e43f0ed-2d83-4046-9ea9-f6335877b5d7\") " pod="openstack/nova-cell1-conductor-db-sync-zplhz"
Mar 13 01:47:29.277677 master-0 kubenswrapper[19170]: I0313 01:47:29.275319 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e43f0ed-2d83-4046-9ea9-f6335877b5d7-config-data\") pod \"nova-cell1-conductor-db-sync-zplhz\" (UID: \"5e43f0ed-2d83-4046-9ea9-f6335877b5d7\") " pod="openstack/nova-cell1-conductor-db-sync-zplhz"
Mar 13 01:47:29.286672 master-0 kubenswrapper[19170]: I0313 01:47:29.278918 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e43f0ed-2d83-4046-9ea9-f6335877b5d7-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zplhz\" (UID: \"5e43f0ed-2d83-4046-9ea9-f6335877b5d7\") " pod="openstack/nova-cell1-conductor-db-sync-zplhz"
Mar 13 01:47:29.286672 master-0 kubenswrapper[19170]: I0313 01:47:29.271421 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e43f0ed-2d83-4046-9ea9-f6335877b5d7-scripts\") pod \"nova-cell1-conductor-db-sync-zplhz\" (UID: \"5e43f0ed-2d83-4046-9ea9-f6335877b5d7\") " pod="openstack/nova-cell1-conductor-db-sync-zplhz"
Mar 13 01:47:29.299569 master-0 kubenswrapper[19170]: I0313 01:47:29.292579 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v57bd\" (UniqueName: \"kubernetes.io/projected/5e43f0ed-2d83-4046-9ea9-f6335877b5d7-kube-api-access-v57bd\") pod \"nova-cell1-conductor-db-sync-zplhz\" (UID: \"5e43f0ed-2d83-4046-9ea9-f6335877b5d7\") " pod="openstack/nova-cell1-conductor-db-sync-zplhz"
Mar 13 01:47:29.388521 master-0 kubenswrapper[19170]: I0313 01:47:29.386870 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zplhz"
Mar 13 01:47:29.456484 master-0 kubenswrapper[19170]: I0313 01:47:29.453687 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 13 01:47:29.456484 master-0 kubenswrapper[19170]: I0313 01:47:29.453729 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 13 01:47:29.471618 master-0 kubenswrapper[19170]: I0313 01:47:29.466808 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 13 01:47:29.548640 master-0 kubenswrapper[19170]: I0313 01:47:29.548405 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"50d16291-5a6e-41fd-b5a5-94844221c494","Type":"ContainerStarted","Data":"714c6759c515ac0572dcecae630045e61f4f2653301a110ef251160b3d053108"}
Mar 13 01:47:29.551896 master-0 kubenswrapper[19170]: I0313 01:47:29.551848 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0939bd46-550b-47b8-b313-eb2432087a7b","Type":"ContainerStarted","Data":"3a6cae12d57f3bfdf8262947312d76d741b7f00285e5dcb241bb793ea7f44bc5"}
Mar 13 01:47:29.561851 master-0 kubenswrapper[19170]: I0313 01:47:29.561078 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-sjvgp" event={"ID":"97438ca6-ad0a-438f-9409-317d90cd7dbb","Type":"ContainerStarted","Data":"24537a61dca5e441e27af06beeb9675059ae55d6dc199fc20ccb44657da9c60a"}
Mar 13 01:47:29.561851 master-0 kubenswrapper[19170]: I0313 01:47:29.561136 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-sjvgp" event={"ID":"97438ca6-ad0a-438f-9409-317d90cd7dbb","Type":"ContainerStarted","Data":"bcef800d6b14ceefb69a8c6ce9a9d627ff1a020ce165c4a5df33f2f3f4247ca0"}
Mar 13 01:47:29.565168 master-0 kubenswrapper[19170]: I0313 01:47:29.563406 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"30cf9c76-52a7-490e-a6f6-aa85a909bd2f","Type":"ContainerStarted","Data":"31fc9d188cd1a07582ead03719cf5acd4f0463f4b0cd7878e344d9a5e02c2960"}
Mar 13 01:47:29.566344 master-0 kubenswrapper[19170]: I0313 01:47:29.566302 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58d8bd468f-mrpg6"]
Mar 13 01:47:29.581583 master-0 kubenswrapper[19170]: I0313 01:47:29.581494 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-sjvgp" podStartSLOduration=2.58147243 podStartE2EDuration="2.58147243s" podCreationTimestamp="2026-03-13 01:47:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:47:29.576215883 +0000 UTC m=+1710.384336843" watchObservedRunningTime="2026-03-13 01:47:29.58147243 +0000 UTC m=+1710.389593390"
Mar 13 01:47:29.582483 master-0 kubenswrapper[19170]: I0313 01:47:29.582439 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-compute-ironic-compute-0" event={"ID":"e34869a6-9527-4873-9f0e-8b0e88e5d756","Type":"ContainerStarted","Data":"d05d79ade48aa1fefe46d83c2601e4bb32b22346d3de6efb4b86de409e6c95e5"}
Mar 13 01:47:29.584502 master-0 kubenswrapper[19170]: I0313 01:47:29.584446 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"95380989-f152-40e9-82a4-3bc9c091a8db","Type":"ContainerStarted","Data":"23a3fd701dcc5ce60aad883d84eed8380c40f438ce26ea16796762b61d8069fe"}
Mar 13 01:47:29.965516 master-0 kubenswrapper[19170]: I0313 01:47:29.962564 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zplhz"]
Mar 13 01:47:30.600600 master-0 kubenswrapper[19170]: I0313 01:47:30.600515 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zplhz" event={"ID":"5e43f0ed-2d83-4046-9ea9-f6335877b5d7","Type":"ContainerStarted","Data":"6e021a193757c68567696e2ff4e5b5a09462e634ad7945c58b10c10edc217b49"}
Mar 13 01:47:30.600600 master-0 kubenswrapper[19170]: I0313 01:47:30.600578 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zplhz" event={"ID":"5e43f0ed-2d83-4046-9ea9-f6335877b5d7","Type":"ContainerStarted","Data":"300e9899dc8eaf79407243ff6b29752a3a4a91dac520e07ca734cdea7d3d8a22"}
Mar 13 01:47:30.618345 master-0 kubenswrapper[19170]: I0313 01:47:30.618297 19170 generic.go:334] "Generic (PLEG): container finished" podID="3b8e39a7-f183-4541-82e3-fdcdc6936300" containerID="e4e50e30401c7daf6dbdb803eec069e8420f727e3372775f53e3ef92415269a4" exitCode=0
Mar 13 01:47:30.618554 master-0 kubenswrapper[19170]: I0313 01:47:30.618483 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58d8bd468f-mrpg6" event={"ID":"3b8e39a7-f183-4541-82e3-fdcdc6936300","Type":"ContainerDied","Data":"e4e50e30401c7daf6dbdb803eec069e8420f727e3372775f53e3ef92415269a4"}
Mar 13 01:47:30.618554 master-0 kubenswrapper[19170]: I0313 01:47:30.618523 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58d8bd468f-mrpg6" event={"ID":"3b8e39a7-f183-4541-82e3-fdcdc6936300","Type":"ContainerStarted","Data":"3277f8844189125723e25cf659af9f9428374aa03e67229d1d432b156873f3fc"}
Mar 13 01:47:30.664663 master-0 kubenswrapper[19170]: I0313 01:47:30.658885 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-zplhz" podStartSLOduration=2.6588677 podStartE2EDuration="2.6588677s" podCreationTimestamp="2026-03-13 01:47:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:47:30.64098225 +0000 UTC m=+1711.449103220" watchObservedRunningTime="2026-03-13 01:47:30.6588677 +0000 UTC m=+1711.466988660"
Mar 13 01:47:32.120538 master-0 kubenswrapper[19170]: I0313 01:47:32.120407 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 13 01:47:32.150448 master-0 kubenswrapper[19170]: I0313 01:47:32.150384 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 13 01:47:33.681114 master-0 kubenswrapper[19170]: I0313 01:47:33.681054 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"50d16291-5a6e-41fd-b5a5-94844221c494","Type":"ContainerStarted","Data":"9ab855b6ccea0e4b50da87a089871f3e5ea7d3f3017fc6ef29512b8dc60220d4"}
Mar 13 01:47:33.681114 master-0 kubenswrapper[19170]: I0313 01:47:33.681109 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"50d16291-5a6e-41fd-b5a5-94844221c494","Type":"ContainerStarted","Data":"cf397ebb9ee1e6c87cd5016c642d8b71426433f2369e938527be051f3f0e9feb"}
Mar 13 01:47:33.683858 master-0 kubenswrapper[19170]: I0313 01:47:33.683818 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0939bd46-550b-47b8-b313-eb2432087a7b","Type":"ContainerStarted","Data":"ba45c2533f49b25b22ef9b62119062c185c04b577134457f4ed3e740cc303c42"}
Mar 13 01:47:33.684361 master-0 kubenswrapper[19170]: I0313 01:47:33.684084 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="0939bd46-550b-47b8-b313-eb2432087a7b" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://ba45c2533f49b25b22ef9b62119062c185c04b577134457f4ed3e740cc303c42" gracePeriod=30
Mar 13 01:47:33.687201 master-0 kubenswrapper[19170]: I0313 01:47:33.687060 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58d8bd468f-mrpg6" event={"ID":"3b8e39a7-f183-4541-82e3-fdcdc6936300","Type":"ContainerStarted","Data":"90e43b91d2a8801429ec9eb529c7980bb057d8cf813efa37bbf0b220d8777cb3"}
Mar 13 01:47:33.688253 master-0 kubenswrapper[19170]: I0313 01:47:33.688221 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58d8bd468f-mrpg6"
Mar 13 01:47:33.689971 master-0 kubenswrapper[19170]: I0313 01:47:33.689794 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"30cf9c76-52a7-490e-a6f6-aa85a909bd2f","Type":"ContainerStarted","Data":"b0024e27c8d6f14861295fdfdb760931d35616d615bd94daec5d348177e002fb"}
Mar 13 01:47:33.693644 master-0 kubenswrapper[19170]: I0313 01:47:33.693274 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"95380989-f152-40e9-82a4-3bc9c091a8db","Type":"ContainerStarted","Data":"22e64e73b9493a33b71085e3943f6667dcc1cbc6d4efbd747a8e5e4c0a9e5094"}
Mar 13 01:47:33.693644 master-0 kubenswrapper[19170]: I0313 01:47:33.693323 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0"
event={"ID":"95380989-f152-40e9-82a4-3bc9c091a8db","Type":"ContainerStarted","Data":"b83bca1065165f118c928d0c7ce24fa85522bddbc1c93c14c2dad378eb87d1b3"} Mar 13 01:47:33.693644 master-0 kubenswrapper[19170]: I0313 01:47:33.693411 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="95380989-f152-40e9-82a4-3bc9c091a8db" containerName="nova-metadata-log" containerID="cri-o://b83bca1065165f118c928d0c7ce24fa85522bddbc1c93c14c2dad378eb87d1b3" gracePeriod=30 Mar 13 01:47:33.693644 master-0 kubenswrapper[19170]: I0313 01:47:33.693523 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="95380989-f152-40e9-82a4-3bc9c091a8db" containerName="nova-metadata-metadata" containerID="cri-o://22e64e73b9493a33b71085e3943f6667dcc1cbc6d4efbd747a8e5e4c0a9e5094" gracePeriod=30 Mar 13 01:47:33.719725 master-0 kubenswrapper[19170]: I0313 01:47:33.719648 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.420656985 podStartE2EDuration="6.719618332s" podCreationTimestamp="2026-03-13 01:47:27 +0000 UTC" firstStartedPulling="2026-03-13 01:47:29.452930779 +0000 UTC m=+1710.261051739" lastFinishedPulling="2026-03-13 01:47:32.751892126 +0000 UTC m=+1713.560013086" observedRunningTime="2026-03-13 01:47:33.715287551 +0000 UTC m=+1714.523408511" watchObservedRunningTime="2026-03-13 01:47:33.719618332 +0000 UTC m=+1714.527739292" Mar 13 01:47:33.755340 master-0 kubenswrapper[19170]: I0313 01:47:33.755241 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.4551976890000002 podStartE2EDuration="6.755218836s" podCreationTimestamp="2026-03-13 01:47:27 +0000 UTC" firstStartedPulling="2026-03-13 01:47:29.452545338 +0000 UTC m=+1710.260666298" lastFinishedPulling="2026-03-13 01:47:32.752566485 +0000 UTC m=+1713.560687445" 
observedRunningTime="2026-03-13 01:47:33.745739862 +0000 UTC m=+1714.553860822" watchObservedRunningTime="2026-03-13 01:47:33.755218836 +0000 UTC m=+1714.563339796" Mar 13 01:47:33.793678 master-0 kubenswrapper[19170]: I0313 01:47:33.782553 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.495970279 podStartE2EDuration="5.782533479s" podCreationTimestamp="2026-03-13 01:47:28 +0000 UTC" firstStartedPulling="2026-03-13 01:47:29.472361342 +0000 UTC m=+1710.280482302" lastFinishedPulling="2026-03-13 01:47:32.758924542 +0000 UTC m=+1713.567045502" observedRunningTime="2026-03-13 01:47:33.775775041 +0000 UTC m=+1714.583896001" watchObservedRunningTime="2026-03-13 01:47:33.782533479 +0000 UTC m=+1714.590654449" Mar 13 01:47:33.824260 master-0 kubenswrapper[19170]: I0313 01:47:33.824160 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.189041614 podStartE2EDuration="6.824138072s" podCreationTimestamp="2026-03-13 01:47:27 +0000 UTC" firstStartedPulling="2026-03-13 01:47:29.120741868 +0000 UTC m=+1709.928862828" lastFinishedPulling="2026-03-13 01:47:32.755838326 +0000 UTC m=+1713.563959286" observedRunningTime="2026-03-13 01:47:33.796386517 +0000 UTC m=+1714.604507477" watchObservedRunningTime="2026-03-13 01:47:33.824138072 +0000 UTC m=+1714.632259032" Mar 13 01:47:33.833231 master-0 kubenswrapper[19170]: I0313 01:47:33.831341 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58d8bd468f-mrpg6" podStartSLOduration=5.831326093 podStartE2EDuration="5.831326093s" podCreationTimestamp="2026-03-13 01:47:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:47:33.819527843 +0000 UTC m=+1714.627648793" watchObservedRunningTime="2026-03-13 01:47:33.831326093 +0000 UTC 
m=+1714.639447053" Mar 13 01:47:34.709588 master-0 kubenswrapper[19170]: I0313 01:47:34.709532 19170 generic.go:334] "Generic (PLEG): container finished" podID="95380989-f152-40e9-82a4-3bc9c091a8db" containerID="b83bca1065165f118c928d0c7ce24fa85522bddbc1c93c14c2dad378eb87d1b3" exitCode=143 Mar 13 01:47:34.710610 master-0 kubenswrapper[19170]: I0313 01:47:34.710585 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"95380989-f152-40e9-82a4-3bc9c091a8db","Type":"ContainerDied","Data":"b83bca1065165f118c928d0c7ce24fa85522bddbc1c93c14c2dad378eb87d1b3"} Mar 13 01:47:36.752182 master-0 kubenswrapper[19170]: I0313 01:47:36.752111 19170 generic.go:334] "Generic (PLEG): container finished" podID="97438ca6-ad0a-438f-9409-317d90cd7dbb" containerID="24537a61dca5e441e27af06beeb9675059ae55d6dc199fc20ccb44657da9c60a" exitCode=0 Mar 13 01:47:36.753212 master-0 kubenswrapper[19170]: I0313 01:47:36.752189 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-sjvgp" event={"ID":"97438ca6-ad0a-438f-9409-317d90cd7dbb","Type":"ContainerDied","Data":"24537a61dca5e441e27af06beeb9675059ae55d6dc199fc20ccb44657da9c60a"} Mar 13 01:47:38.223022 master-0 kubenswrapper[19170]: I0313 01:47:38.222779 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 13 01:47:38.223022 master-0 kubenswrapper[19170]: I0313 01:47:38.222892 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 13 01:47:38.254167 master-0 kubenswrapper[19170]: I0313 01:47:38.253880 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 13 01:47:38.387866 master-0 kubenswrapper[19170]: I0313 01:47:38.387810 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 13 01:47:38.408123 master-0 kubenswrapper[19170]: I0313 
01:47:38.408057 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 01:47:38.408123 master-0 kubenswrapper[19170]: I0313 01:47:38.408121 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 01:47:38.455699 master-0 kubenswrapper[19170]: I0313 01:47:38.455615 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 13 01:47:38.455699 master-0 kubenswrapper[19170]: I0313 01:47:38.455707 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 13 01:47:38.603165 master-0 kubenswrapper[19170]: I0313 01:47:38.602957 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58d8bd468f-mrpg6" Mar 13 01:47:38.722376 master-0 kubenswrapper[19170]: I0313 01:47:38.722325 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cb659fff7-csvmp"] Mar 13 01:47:38.722613 master-0 kubenswrapper[19170]: I0313 01:47:38.722549 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5cb659fff7-csvmp" podUID="6ad5e2c1-f186-4654-ad97-73019827ec1f" containerName="dnsmasq-dns" containerID="cri-o://6ffb84adba1ceb53c3022fea83e3bf4e46f9cc99955912570a146eefc16114f5" gracePeriod=10 Mar 13 01:47:38.839518 master-0 kubenswrapper[19170]: I0313 01:47:38.839462 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 13 01:47:39.498439 master-0 kubenswrapper[19170]: I0313 01:47:39.493805 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="50d16291-5a6e-41fd-b5a5-94844221c494" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.128.1.11:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 01:47:39.498439 master-0 
kubenswrapper[19170]: I0313 01:47:39.494113 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="50d16291-5a6e-41fd-b5a5-94844221c494" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.128.1.11:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 01:47:41.329659 master-0 kubenswrapper[19170]: I0313 01:47:41.328920 19170 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5cb659fff7-csvmp" podUID="6ad5e2c1-f186-4654-ad97-73019827ec1f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.250:5353: connect: connection refused" Mar 13 01:47:43.232199 master-0 kubenswrapper[19170]: I0313 01:47:43.232147 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-sjvgp" Mar 13 01:47:43.380004 master-0 kubenswrapper[19170]: I0313 01:47:43.379587 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97438ca6-ad0a-438f-9409-317d90cd7dbb-combined-ca-bundle\") pod \"97438ca6-ad0a-438f-9409-317d90cd7dbb\" (UID: \"97438ca6-ad0a-438f-9409-317d90cd7dbb\") " Mar 13 01:47:43.380004 master-0 kubenswrapper[19170]: I0313 01:47:43.379669 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97438ca6-ad0a-438f-9409-317d90cd7dbb-scripts\") pod \"97438ca6-ad0a-438f-9409-317d90cd7dbb\" (UID: \"97438ca6-ad0a-438f-9409-317d90cd7dbb\") " Mar 13 01:47:43.380004 master-0 kubenswrapper[19170]: I0313 01:47:43.379748 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-768lh\" (UniqueName: \"kubernetes.io/projected/97438ca6-ad0a-438f-9409-317d90cd7dbb-kube-api-access-768lh\") pod \"97438ca6-ad0a-438f-9409-317d90cd7dbb\" (UID: \"97438ca6-ad0a-438f-9409-317d90cd7dbb\") " Mar 13 
01:47:43.380004 master-0 kubenswrapper[19170]: I0313 01:47:43.379785 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97438ca6-ad0a-438f-9409-317d90cd7dbb-config-data\") pod \"97438ca6-ad0a-438f-9409-317d90cd7dbb\" (UID: \"97438ca6-ad0a-438f-9409-317d90cd7dbb\") " Mar 13 01:47:43.388863 master-0 kubenswrapper[19170]: I0313 01:47:43.388770 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97438ca6-ad0a-438f-9409-317d90cd7dbb-kube-api-access-768lh" (OuterVolumeSpecName: "kube-api-access-768lh") pod "97438ca6-ad0a-438f-9409-317d90cd7dbb" (UID: "97438ca6-ad0a-438f-9409-317d90cd7dbb"). InnerVolumeSpecName "kube-api-access-768lh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:47:43.389183 master-0 kubenswrapper[19170]: I0313 01:47:43.389132 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97438ca6-ad0a-438f-9409-317d90cd7dbb-scripts" (OuterVolumeSpecName: "scripts") pod "97438ca6-ad0a-438f-9409-317d90cd7dbb" (UID: "97438ca6-ad0a-438f-9409-317d90cd7dbb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:47:43.408291 master-0 kubenswrapper[19170]: I0313 01:47:43.407378 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97438ca6-ad0a-438f-9409-317d90cd7dbb-config-data" (OuterVolumeSpecName: "config-data") pod "97438ca6-ad0a-438f-9409-317d90cd7dbb" (UID: "97438ca6-ad0a-438f-9409-317d90cd7dbb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:47:43.410957 master-0 kubenswrapper[19170]: I0313 01:47:43.410697 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97438ca6-ad0a-438f-9409-317d90cd7dbb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "97438ca6-ad0a-438f-9409-317d90cd7dbb" (UID: "97438ca6-ad0a-438f-9409-317d90cd7dbb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:47:43.483272 master-0 kubenswrapper[19170]: I0313 01:47:43.483229 19170 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97438ca6-ad0a-438f-9409-317d90cd7dbb-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:47:43.483272 master-0 kubenswrapper[19170]: I0313 01:47:43.483267 19170 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97438ca6-ad0a-438f-9409-317d90cd7dbb-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:47:43.483417 master-0 kubenswrapper[19170]: I0313 01:47:43.483280 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-768lh\" (UniqueName: \"kubernetes.io/projected/97438ca6-ad0a-438f-9409-317d90cd7dbb-kube-api-access-768lh\") on node \"master-0\" DevicePath \"\"" Mar 13 01:47:43.483417 master-0 kubenswrapper[19170]: I0313 01:47:43.483291 19170 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97438ca6-ad0a-438f-9409-317d90cd7dbb-config-data\") on node \"master-0\" DevicePath \"\"" Mar 13 01:47:43.586371 master-0 kubenswrapper[19170]: I0313 01:47:43.586329 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cb659fff7-csvmp" Mar 13 01:47:43.687467 master-0 kubenswrapper[19170]: I0313 01:47:43.687400 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ad5e2c1-f186-4654-ad97-73019827ec1f-dns-svc\") pod \"6ad5e2c1-f186-4654-ad97-73019827ec1f\" (UID: \"6ad5e2c1-f186-4654-ad97-73019827ec1f\") " Mar 13 01:47:43.687733 master-0 kubenswrapper[19170]: I0313 01:47:43.687477 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ad5e2c1-f186-4654-ad97-73019827ec1f-config\") pod \"6ad5e2c1-f186-4654-ad97-73019827ec1f\" (UID: \"6ad5e2c1-f186-4654-ad97-73019827ec1f\") " Mar 13 01:47:43.687733 master-0 kubenswrapper[19170]: I0313 01:47:43.687660 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6ad5e2c1-f186-4654-ad97-73019827ec1f-dns-swift-storage-0\") pod \"6ad5e2c1-f186-4654-ad97-73019827ec1f\" (UID: \"6ad5e2c1-f186-4654-ad97-73019827ec1f\") " Mar 13 01:47:43.687960 master-0 kubenswrapper[19170]: I0313 01:47:43.687929 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ad5e2c1-f186-4654-ad97-73019827ec1f-ovsdbserver-nb\") pod \"6ad5e2c1-f186-4654-ad97-73019827ec1f\" (UID: \"6ad5e2c1-f186-4654-ad97-73019827ec1f\") " Mar 13 01:47:43.688015 master-0 kubenswrapper[19170]: I0313 01:47:43.687965 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9h24\" (UniqueName: \"kubernetes.io/projected/6ad5e2c1-f186-4654-ad97-73019827ec1f-kube-api-access-x9h24\") pod \"6ad5e2c1-f186-4654-ad97-73019827ec1f\" (UID: \"6ad5e2c1-f186-4654-ad97-73019827ec1f\") " Mar 13 01:47:43.688015 master-0 kubenswrapper[19170]: I0313 01:47:43.687990 19170 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ad5e2c1-f186-4654-ad97-73019827ec1f-ovsdbserver-sb\") pod \"6ad5e2c1-f186-4654-ad97-73019827ec1f\" (UID: \"6ad5e2c1-f186-4654-ad97-73019827ec1f\") " Mar 13 01:47:43.696713 master-0 kubenswrapper[19170]: I0313 01:47:43.696670 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ad5e2c1-f186-4654-ad97-73019827ec1f-kube-api-access-x9h24" (OuterVolumeSpecName: "kube-api-access-x9h24") pod "6ad5e2c1-f186-4654-ad97-73019827ec1f" (UID: "6ad5e2c1-f186-4654-ad97-73019827ec1f"). InnerVolumeSpecName "kube-api-access-x9h24". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:47:43.743835 master-0 kubenswrapper[19170]: I0313 01:47:43.743770 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ad5e2c1-f186-4654-ad97-73019827ec1f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6ad5e2c1-f186-4654-ad97-73019827ec1f" (UID: "6ad5e2c1-f186-4654-ad97-73019827ec1f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:47:43.747283 master-0 kubenswrapper[19170]: I0313 01:47:43.747206 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ad5e2c1-f186-4654-ad97-73019827ec1f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6ad5e2c1-f186-4654-ad97-73019827ec1f" (UID: "6ad5e2c1-f186-4654-ad97-73019827ec1f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:47:43.756160 master-0 kubenswrapper[19170]: I0313 01:47:43.756102 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ad5e2c1-f186-4654-ad97-73019827ec1f-config" (OuterVolumeSpecName: "config") pod "6ad5e2c1-f186-4654-ad97-73019827ec1f" (UID: "6ad5e2c1-f186-4654-ad97-73019827ec1f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:47:43.756470 master-0 kubenswrapper[19170]: I0313 01:47:43.756422 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ad5e2c1-f186-4654-ad97-73019827ec1f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6ad5e2c1-f186-4654-ad97-73019827ec1f" (UID: "6ad5e2c1-f186-4654-ad97-73019827ec1f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:47:43.768682 master-0 kubenswrapper[19170]: I0313 01:47:43.768608 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ad5e2c1-f186-4654-ad97-73019827ec1f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6ad5e2c1-f186-4654-ad97-73019827ec1f" (UID: "6ad5e2c1-f186-4654-ad97-73019827ec1f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:47:43.790982 master-0 kubenswrapper[19170]: I0313 01:47:43.790912 19170 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ad5e2c1-f186-4654-ad97-73019827ec1f-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 13 01:47:43.790982 master-0 kubenswrapper[19170]: I0313 01:47:43.790965 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9h24\" (UniqueName: \"kubernetes.io/projected/6ad5e2c1-f186-4654-ad97-73019827ec1f-kube-api-access-x9h24\") on node \"master-0\" DevicePath \"\"" Mar 13 01:47:43.791262 master-0 kubenswrapper[19170]: I0313 01:47:43.791122 19170 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ad5e2c1-f186-4654-ad97-73019827ec1f-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 13 01:47:43.791262 master-0 kubenswrapper[19170]: I0313 01:47:43.791139 19170 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ad5e2c1-f186-4654-ad97-73019827ec1f-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 13 01:47:43.791262 master-0 kubenswrapper[19170]: I0313 01:47:43.791172 19170 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ad5e2c1-f186-4654-ad97-73019827ec1f-config\") on node \"master-0\" DevicePath \"\"" Mar 13 01:47:43.791262 master-0 kubenswrapper[19170]: I0313 01:47:43.791182 19170 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6ad5e2c1-f186-4654-ad97-73019827ec1f-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 13 01:47:43.878276 master-0 kubenswrapper[19170]: I0313 01:47:43.878186 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-compute-ironic-compute-0" 
event={"ID":"e34869a6-9527-4873-9f0e-8b0e88e5d756","Type":"ContainerStarted","Data":"7297ea65e742aa3eb8f1ead527e943b6128dae9387b3d72c1ec27ded283eb212"} Mar 13 01:47:43.878494 master-0 kubenswrapper[19170]: I0313 01:47:43.878362 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 13 01:47:43.882951 master-0 kubenswrapper[19170]: I0313 01:47:43.882921 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-sjvgp" Mar 13 01:47:43.883571 master-0 kubenswrapper[19170]: I0313 01:47:43.882936 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-sjvgp" event={"ID":"97438ca6-ad0a-438f-9409-317d90cd7dbb","Type":"ContainerDied","Data":"bcef800d6b14ceefb69a8c6ce9a9d627ff1a020ce165c4a5df33f2f3f4247ca0"} Mar 13 01:47:43.883643 master-0 kubenswrapper[19170]: I0313 01:47:43.883583 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcef800d6b14ceefb69a8c6ce9a9d627ff1a020ce165c4a5df33f2f3f4247ca0" Mar 13 01:47:43.884780 master-0 kubenswrapper[19170]: I0313 01:47:43.884754 19170 generic.go:334] "Generic (PLEG): container finished" podID="5e43f0ed-2d83-4046-9ea9-f6335877b5d7" containerID="6e021a193757c68567696e2ff4e5b5a09462e634ad7945c58b10c10edc217b49" exitCode=0 Mar 13 01:47:43.884842 master-0 kubenswrapper[19170]: I0313 01:47:43.884802 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zplhz" event={"ID":"5e43f0ed-2d83-4046-9ea9-f6335877b5d7","Type":"ContainerDied","Data":"6e021a193757c68567696e2ff4e5b5a09462e634ad7945c58b10c10edc217b49"} Mar 13 01:47:43.886325 master-0 kubenswrapper[19170]: I0313 01:47:43.886287 19170 generic.go:334] "Generic (PLEG): container finished" podID="6ad5e2c1-f186-4654-ad97-73019827ec1f" containerID="6ffb84adba1ceb53c3022fea83e3bf4e46f9cc99955912570a146eefc16114f5" exitCode=0 Mar 13 
01:47:43.886411 master-0 kubenswrapper[19170]: I0313 01:47:43.886334 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cb659fff7-csvmp" event={"ID":"6ad5e2c1-f186-4654-ad97-73019827ec1f","Type":"ContainerDied","Data":"6ffb84adba1ceb53c3022fea83e3bf4e46f9cc99955912570a146eefc16114f5"} Mar 13 01:47:43.886411 master-0 kubenswrapper[19170]: I0313 01:47:43.886353 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cb659fff7-csvmp" event={"ID":"6ad5e2c1-f186-4654-ad97-73019827ec1f","Type":"ContainerDied","Data":"a8e88a67773817de3a9147ecf71f7a97719d4078aae8a416902a7892695d25cc"} Mar 13 01:47:43.886411 master-0 kubenswrapper[19170]: I0313 01:47:43.886369 19170 scope.go:117] "RemoveContainer" containerID="6ffb84adba1ceb53c3022fea83e3bf4e46f9cc99955912570a146eefc16114f5" Mar 13 01:47:43.886535 master-0 kubenswrapper[19170]: I0313 01:47:43.886498 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cb659fff7-csvmp" Mar 13 01:47:43.893265 master-0 kubenswrapper[19170]: I0313 01:47:43.893206 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"3785df35-68b7-4d28-8b4a-39c3136ce823","Type":"ContainerDied","Data":"d319028a0b7715619e67bfebe10050b21f6cad8979aacfb32242dc7f9cd75781"} Mar 13 01:47:43.898574 master-0 kubenswrapper[19170]: I0313 01:47:43.893172 19170 generic.go:334] "Generic (PLEG): container finished" podID="3785df35-68b7-4d28-8b4a-39c3136ce823" containerID="d319028a0b7715619e67bfebe10050b21f6cad8979aacfb32242dc7f9cd75781" exitCode=0 Mar 13 01:47:43.907087 master-0 kubenswrapper[19170]: I0313 01:47:43.907011 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-compute-ironic-compute-0" podStartSLOduration=2.436580332 podStartE2EDuration="16.90699113s" podCreationTimestamp="2026-03-13 01:47:27 +0000 UTC" firstStartedPulling="2026-03-13 01:47:28.80332185 
+0000 UTC m=+1709.611442810" lastFinishedPulling="2026-03-13 01:47:43.273732628 +0000 UTC m=+1724.081853608" observedRunningTime="2026-03-13 01:47:43.903685968 +0000 UTC m=+1724.711806928" watchObservedRunningTime="2026-03-13 01:47:43.90699113 +0000 UTC m=+1724.715112090" Mar 13 01:47:43.915744 master-0 kubenswrapper[19170]: I0313 01:47:43.915347 19170 scope.go:117] "RemoveContainer" containerID="3d3cb55ea8897ea04d998738f71d02d274206c0fa60c7113f1c0b74c3a912f1f" Mar 13 01:47:43.936311 master-0 kubenswrapper[19170]: I0313 01:47:43.936257 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 13 01:47:43.967577 master-0 kubenswrapper[19170]: I0313 01:47:43.967455 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cb659fff7-csvmp"] Mar 13 01:47:43.970761 master-0 kubenswrapper[19170]: I0313 01:47:43.970713 19170 scope.go:117] "RemoveContainer" containerID="6ffb84adba1ceb53c3022fea83e3bf4e46f9cc99955912570a146eefc16114f5" Mar 13 01:47:43.971595 master-0 kubenswrapper[19170]: E0313 01:47:43.971542 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ffb84adba1ceb53c3022fea83e3bf4e46f9cc99955912570a146eefc16114f5\": container with ID starting with 6ffb84adba1ceb53c3022fea83e3bf4e46f9cc99955912570a146eefc16114f5 not found: ID does not exist" containerID="6ffb84adba1ceb53c3022fea83e3bf4e46f9cc99955912570a146eefc16114f5" Mar 13 01:47:43.971595 master-0 kubenswrapper[19170]: I0313 01:47:43.971574 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ffb84adba1ceb53c3022fea83e3bf4e46f9cc99955912570a146eefc16114f5"} err="failed to get container status \"6ffb84adba1ceb53c3022fea83e3bf4e46f9cc99955912570a146eefc16114f5\": rpc error: code = NotFound desc = could not find container \"6ffb84adba1ceb53c3022fea83e3bf4e46f9cc99955912570a146eefc16114f5\": 
container with ID starting with 6ffb84adba1ceb53c3022fea83e3bf4e46f9cc99955912570a146eefc16114f5 not found: ID does not exist" Mar 13 01:47:43.971595 master-0 kubenswrapper[19170]: I0313 01:47:43.971595 19170 scope.go:117] "RemoveContainer" containerID="3d3cb55ea8897ea04d998738f71d02d274206c0fa60c7113f1c0b74c3a912f1f" Mar 13 01:47:43.972478 master-0 kubenswrapper[19170]: E0313 01:47:43.972448 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d3cb55ea8897ea04d998738f71d02d274206c0fa60c7113f1c0b74c3a912f1f\": container with ID starting with 3d3cb55ea8897ea04d998738f71d02d274206c0fa60c7113f1c0b74c3a912f1f not found: ID does not exist" containerID="3d3cb55ea8897ea04d998738f71d02d274206c0fa60c7113f1c0b74c3a912f1f" Mar 13 01:47:43.972478 master-0 kubenswrapper[19170]: I0313 01:47:43.972472 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d3cb55ea8897ea04d998738f71d02d274206c0fa60c7113f1c0b74c3a912f1f"} err="failed to get container status \"3d3cb55ea8897ea04d998738f71d02d274206c0fa60c7113f1c0b74c3a912f1f\": rpc error: code = NotFound desc = could not find container \"3d3cb55ea8897ea04d998738f71d02d274206c0fa60c7113f1c0b74c3a912f1f\": container with ID starting with 3d3cb55ea8897ea04d998738f71d02d274206c0fa60c7113f1c0b74c3a912f1f not found: ID does not exist" Mar 13 01:47:43.983559 master-0 kubenswrapper[19170]: I0313 01:47:43.983502 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5cb659fff7-csvmp"] Mar 13 01:47:44.459095 master-0 kubenswrapper[19170]: I0313 01:47:44.458743 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 01:47:44.459795 master-0 kubenswrapper[19170]: I0313 01:47:44.459165 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="50d16291-5a6e-41fd-b5a5-94844221c494" containerName="nova-api-log" 
containerID="cri-o://cf397ebb9ee1e6c87cd5016c642d8b71426433f2369e938527be051f3f0e9feb" gracePeriod=30 Mar 13 01:47:44.459866 master-0 kubenswrapper[19170]: I0313 01:47:44.459822 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="50d16291-5a6e-41fd-b5a5-94844221c494" containerName="nova-api-api" containerID="cri-o://9ab855b6ccea0e4b50da87a089871f3e5ea7d3f3017fc6ef29512b8dc60220d4" gracePeriod=30 Mar 13 01:47:44.474352 master-0 kubenswrapper[19170]: I0313 01:47:44.473657 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 01:47:44.474352 master-0 kubenswrapper[19170]: I0313 01:47:44.473872 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="30cf9c76-52a7-490e-a6f6-aa85a909bd2f" containerName="nova-scheduler-scheduler" containerID="cri-o://b0024e27c8d6f14861295fdfdb760931d35616d615bd94daec5d348177e002fb" gracePeriod=30 Mar 13 01:47:44.926032 master-0 kubenswrapper[19170]: I0313 01:47:44.925975 19170 generic.go:334] "Generic (PLEG): container finished" podID="50d16291-5a6e-41fd-b5a5-94844221c494" containerID="cf397ebb9ee1e6c87cd5016c642d8b71426433f2369e938527be051f3f0e9feb" exitCode=143 Mar 13 01:47:44.926223 master-0 kubenswrapper[19170]: I0313 01:47:44.926047 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"50d16291-5a6e-41fd-b5a5-94844221c494","Type":"ContainerDied","Data":"cf397ebb9ee1e6c87cd5016c642d8b71426433f2369e938527be051f3f0e9feb"} Mar 13 01:47:44.935902 master-0 kubenswrapper[19170]: I0313 01:47:44.935794 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"3785df35-68b7-4d28-8b4a-39c3136ce823","Type":"ContainerStarted","Data":"32b9416c96956fdfc0fdb760ea5939af6f5b1381952da437807ec825b7c724fa"} Mar 13 01:47:45.367873 master-0 kubenswrapper[19170]: I0313 01:47:45.367823 19170 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zplhz" Mar 13 01:47:45.439413 master-0 kubenswrapper[19170]: I0313 01:47:45.439317 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ad5e2c1-f186-4654-ad97-73019827ec1f" path="/var/lib/kubelet/pods/6ad5e2c1-f186-4654-ad97-73019827ec1f/volumes" Mar 13 01:47:45.480648 master-0 kubenswrapper[19170]: I0313 01:47:45.476625 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v57bd\" (UniqueName: \"kubernetes.io/projected/5e43f0ed-2d83-4046-9ea9-f6335877b5d7-kube-api-access-v57bd\") pod \"5e43f0ed-2d83-4046-9ea9-f6335877b5d7\" (UID: \"5e43f0ed-2d83-4046-9ea9-f6335877b5d7\") " Mar 13 01:47:45.483681 master-0 kubenswrapper[19170]: I0313 01:47:45.482019 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e43f0ed-2d83-4046-9ea9-f6335877b5d7-scripts\") pod \"5e43f0ed-2d83-4046-9ea9-f6335877b5d7\" (UID: \"5e43f0ed-2d83-4046-9ea9-f6335877b5d7\") " Mar 13 01:47:45.483681 master-0 kubenswrapper[19170]: I0313 01:47:45.482116 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e43f0ed-2d83-4046-9ea9-f6335877b5d7-combined-ca-bundle\") pod \"5e43f0ed-2d83-4046-9ea9-f6335877b5d7\" (UID: \"5e43f0ed-2d83-4046-9ea9-f6335877b5d7\") " Mar 13 01:47:45.483681 master-0 kubenswrapper[19170]: I0313 01:47:45.482184 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e43f0ed-2d83-4046-9ea9-f6335877b5d7-config-data\") pod \"5e43f0ed-2d83-4046-9ea9-f6335877b5d7\" (UID: \"5e43f0ed-2d83-4046-9ea9-f6335877b5d7\") " Mar 13 01:47:45.491879 master-0 kubenswrapper[19170]: I0313 01:47:45.485608 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/5e43f0ed-2d83-4046-9ea9-f6335877b5d7-scripts" (OuterVolumeSpecName: "scripts") pod "5e43f0ed-2d83-4046-9ea9-f6335877b5d7" (UID: "5e43f0ed-2d83-4046-9ea9-f6335877b5d7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:47:45.492497 master-0 kubenswrapper[19170]: I0313 01:47:45.492431 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e43f0ed-2d83-4046-9ea9-f6335877b5d7-kube-api-access-v57bd" (OuterVolumeSpecName: "kube-api-access-v57bd") pod "5e43f0ed-2d83-4046-9ea9-f6335877b5d7" (UID: "5e43f0ed-2d83-4046-9ea9-f6335877b5d7"). InnerVolumeSpecName "kube-api-access-v57bd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:47:45.510806 master-0 kubenswrapper[19170]: I0313 01:47:45.510765 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e43f0ed-2d83-4046-9ea9-f6335877b5d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e43f0ed-2d83-4046-9ea9-f6335877b5d7" (UID: "5e43f0ed-2d83-4046-9ea9-f6335877b5d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:47:45.540741 master-0 kubenswrapper[19170]: I0313 01:47:45.540591 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e43f0ed-2d83-4046-9ea9-f6335877b5d7-config-data" (OuterVolumeSpecName: "config-data") pod "5e43f0ed-2d83-4046-9ea9-f6335877b5d7" (UID: "5e43f0ed-2d83-4046-9ea9-f6335877b5d7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:47:45.586602 master-0 kubenswrapper[19170]: I0313 01:47:45.586570 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v57bd\" (UniqueName: \"kubernetes.io/projected/5e43f0ed-2d83-4046-9ea9-f6335877b5d7-kube-api-access-v57bd\") on node \"master-0\" DevicePath \"\"" Mar 13 01:47:45.588835 master-0 kubenswrapper[19170]: I0313 01:47:45.588821 19170 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e43f0ed-2d83-4046-9ea9-f6335877b5d7-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:47:45.588986 master-0 kubenswrapper[19170]: I0313 01:47:45.588974 19170 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e43f0ed-2d83-4046-9ea9-f6335877b5d7-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:47:45.589083 master-0 kubenswrapper[19170]: I0313 01:47:45.589073 19170 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e43f0ed-2d83-4046-9ea9-f6335877b5d7-config-data\") on node \"master-0\" DevicePath \"\"" Mar 13 01:47:45.948522 master-0 kubenswrapper[19170]: I0313 01:47:45.948457 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zplhz" event={"ID":"5e43f0ed-2d83-4046-9ea9-f6335877b5d7","Type":"ContainerDied","Data":"300e9899dc8eaf79407243ff6b29752a3a4a91dac520e07ca734cdea7d3d8a22"} Mar 13 01:47:45.948522 master-0 kubenswrapper[19170]: I0313 01:47:45.948520 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="300e9899dc8eaf79407243ff6b29752a3a4a91dac520e07ca734cdea7d3d8a22" Mar 13 01:47:45.948799 master-0 kubenswrapper[19170]: I0313 01:47:45.948578 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zplhz" Mar 13 01:47:45.979679 master-0 kubenswrapper[19170]: I0313 01:47:45.978822 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"3785df35-68b7-4d28-8b4a-39c3136ce823","Type":"ContainerStarted","Data":"08ead1d344e2eebd173b071e3075247cbcd07d8ffd2e4d5bf7f3074e0189aae9"} Mar 13 01:47:45.979679 master-0 kubenswrapper[19170]: I0313 01:47:45.978883 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"3785df35-68b7-4d28-8b4a-39c3136ce823","Type":"ContainerStarted","Data":"4de41bd873783670d864c6c61a5caadd41a809ff5b66f048ca33f968c77c1876"} Mar 13 01:47:45.979679 master-0 kubenswrapper[19170]: I0313 01:47:45.978926 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-conductor-0" Mar 13 01:47:45.979679 master-0 kubenswrapper[19170]: I0313 01:47:45.978949 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-conductor-0" Mar 13 01:47:46.061833 master-0 kubenswrapper[19170]: I0313 01:47:46.061767 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-conductor-0" podStartSLOduration=80.640985801 podStartE2EDuration="1m49.06174906s" podCreationTimestamp="2026-03-13 01:45:57 +0000 UTC" firstStartedPulling="2026-03-13 01:46:07.854231175 +0000 UTC m=+1628.662352125" lastFinishedPulling="2026-03-13 01:46:36.274994424 +0000 UTC m=+1657.083115384" observedRunningTime="2026-03-13 01:47:46.034273242 +0000 UTC m=+1726.842394222" watchObservedRunningTime="2026-03-13 01:47:46.06174906 +0000 UTC m=+1726.869870010" Mar 13 01:47:46.125069 master-0 kubenswrapper[19170]: I0313 01:47:46.125015 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 13 01:47:46.125600 master-0 kubenswrapper[19170]: E0313 01:47:46.125577 19170 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="6ad5e2c1-f186-4654-ad97-73019827ec1f" containerName="dnsmasq-dns" Mar 13 01:47:46.125600 master-0 kubenswrapper[19170]: I0313 01:47:46.125595 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ad5e2c1-f186-4654-ad97-73019827ec1f" containerName="dnsmasq-dns" Mar 13 01:47:46.125699 master-0 kubenswrapper[19170]: E0313 01:47:46.125624 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97438ca6-ad0a-438f-9409-317d90cd7dbb" containerName="nova-manage" Mar 13 01:47:46.125699 master-0 kubenswrapper[19170]: I0313 01:47:46.125646 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="97438ca6-ad0a-438f-9409-317d90cd7dbb" containerName="nova-manage" Mar 13 01:47:46.125699 master-0 kubenswrapper[19170]: E0313 01:47:46.125677 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e43f0ed-2d83-4046-9ea9-f6335877b5d7" containerName="nova-cell1-conductor-db-sync" Mar 13 01:47:46.125699 master-0 kubenswrapper[19170]: I0313 01:47:46.125684 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e43f0ed-2d83-4046-9ea9-f6335877b5d7" containerName="nova-cell1-conductor-db-sync" Mar 13 01:47:46.125699 master-0 kubenswrapper[19170]: E0313 01:47:46.125699 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ad5e2c1-f186-4654-ad97-73019827ec1f" containerName="init" Mar 13 01:47:46.125852 master-0 kubenswrapper[19170]: I0313 01:47:46.125707 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ad5e2c1-f186-4654-ad97-73019827ec1f" containerName="init" Mar 13 01:47:46.126010 master-0 kubenswrapper[19170]: I0313 01:47:46.125990 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e43f0ed-2d83-4046-9ea9-f6335877b5d7" containerName="nova-cell1-conductor-db-sync" Mar 13 01:47:46.126049 master-0 kubenswrapper[19170]: I0313 01:47:46.126012 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="97438ca6-ad0a-438f-9409-317d90cd7dbb" 
containerName="nova-manage" Mar 13 01:47:46.126086 master-0 kubenswrapper[19170]: I0313 01:47:46.126050 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ad5e2c1-f186-4654-ad97-73019827ec1f" containerName="dnsmasq-dns" Mar 13 01:47:46.126879 master-0 kubenswrapper[19170]: I0313 01:47:46.126857 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 13 01:47:46.129657 master-0 kubenswrapper[19170]: I0313 01:47:46.129461 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 13 01:47:46.151687 master-0 kubenswrapper[19170]: I0313 01:47:46.148847 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 13 01:47:46.206240 master-0 kubenswrapper[19170]: I0313 01:47:46.205972 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/292b7686-310b-48db-956b-31a93e2fced2-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"292b7686-310b-48db-956b-31a93e2fced2\") " pod="openstack/nova-cell1-conductor-0" Mar 13 01:47:46.206240 master-0 kubenswrapper[19170]: I0313 01:47:46.206093 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/292b7686-310b-48db-956b-31a93e2fced2-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"292b7686-310b-48db-956b-31a93e2fced2\") " pod="openstack/nova-cell1-conductor-0" Mar 13 01:47:46.206240 master-0 kubenswrapper[19170]: I0313 01:47:46.206191 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cvfd\" (UniqueName: \"kubernetes.io/projected/292b7686-310b-48db-956b-31a93e2fced2-kube-api-access-7cvfd\") pod \"nova-cell1-conductor-0\" (UID: \"292b7686-310b-48db-956b-31a93e2fced2\") " 
pod="openstack/nova-cell1-conductor-0" Mar 13 01:47:46.308786 master-0 kubenswrapper[19170]: I0313 01:47:46.308699 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cvfd\" (UniqueName: \"kubernetes.io/projected/292b7686-310b-48db-956b-31a93e2fced2-kube-api-access-7cvfd\") pod \"nova-cell1-conductor-0\" (UID: \"292b7686-310b-48db-956b-31a93e2fced2\") " pod="openstack/nova-cell1-conductor-0" Mar 13 01:47:46.308995 master-0 kubenswrapper[19170]: I0313 01:47:46.308821 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/292b7686-310b-48db-956b-31a93e2fced2-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"292b7686-310b-48db-956b-31a93e2fced2\") " pod="openstack/nova-cell1-conductor-0" Mar 13 01:47:46.308995 master-0 kubenswrapper[19170]: I0313 01:47:46.308894 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/292b7686-310b-48db-956b-31a93e2fced2-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"292b7686-310b-48db-956b-31a93e2fced2\") " pod="openstack/nova-cell1-conductor-0" Mar 13 01:47:46.312769 master-0 kubenswrapper[19170]: I0313 01:47:46.312731 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/292b7686-310b-48db-956b-31a93e2fced2-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"292b7686-310b-48db-956b-31a93e2fced2\") " pod="openstack/nova-cell1-conductor-0" Mar 13 01:47:46.324025 master-0 kubenswrapper[19170]: I0313 01:47:46.323982 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cvfd\" (UniqueName: \"kubernetes.io/projected/292b7686-310b-48db-956b-31a93e2fced2-kube-api-access-7cvfd\") pod \"nova-cell1-conductor-0\" (UID: \"292b7686-310b-48db-956b-31a93e2fced2\") " pod="openstack/nova-cell1-conductor-0" Mar 13 
01:47:46.327309 master-0 kubenswrapper[19170]: I0313 01:47:46.327270 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/292b7686-310b-48db-956b-31a93e2fced2-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"292b7686-310b-48db-956b-31a93e2fced2\") " pod="openstack/nova-cell1-conductor-0" Mar 13 01:47:46.507837 master-0 kubenswrapper[19170]: I0313 01:47:46.507586 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 13 01:47:47.034439 master-0 kubenswrapper[19170]: I0313 01:47:47.034374 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 13 01:47:47.038656 master-0 kubenswrapper[19170]: W0313 01:47:47.038561 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod292b7686_310b_48db_956b_31a93e2fced2.slice/crio-ab6a6e7ab68505933ab4b932d6d94d53720c122c3e0b9dc219b7d1027cea6424 WatchSource:0}: Error finding container ab6a6e7ab68505933ab4b932d6d94d53720c122c3e0b9dc219b7d1027cea6424: Status 404 returned error can't find the container with id ab6a6e7ab68505933ab4b932d6d94d53720c122c3e0b9dc219b7d1027cea6424 Mar 13 01:47:47.707734 master-0 kubenswrapper[19170]: I0313 01:47:47.707082 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ironic-conductor-0" Mar 13 01:47:48.008671 master-0 kubenswrapper[19170]: I0313 01:47:48.008533 19170 generic.go:334] "Generic (PLEG): container finished" podID="50d16291-5a6e-41fd-b5a5-94844221c494" containerID="9ab855b6ccea0e4b50da87a089871f3e5ea7d3f3017fc6ef29512b8dc60220d4" exitCode=0 Mar 13 01:47:48.008671 master-0 kubenswrapper[19170]: I0313 01:47:48.008655 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"50d16291-5a6e-41fd-b5a5-94844221c494","Type":"ContainerDied","Data":"9ab855b6ccea0e4b50da87a089871f3e5ea7d3f3017fc6ef29512b8dc60220d4"} Mar 13 01:47:48.011113 master-0 kubenswrapper[19170]: I0313 01:47:48.011066 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"292b7686-310b-48db-956b-31a93e2fced2","Type":"ContainerStarted","Data":"4378e7df844ce1684d23ee3b20af4d7110d7188b22acce368b70d0389a1f9c88"} Mar 13 01:47:48.011177 master-0 kubenswrapper[19170]: I0313 01:47:48.011117 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"292b7686-310b-48db-956b-31a93e2fced2","Type":"ContainerStarted","Data":"ab6a6e7ab68505933ab4b932d6d94d53720c122c3e0b9dc219b7d1027cea6424"} Mar 13 01:47:48.012678 master-0 kubenswrapper[19170]: I0313 01:47:48.011525 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 13 01:47:48.044883 master-0 kubenswrapper[19170]: I0313 01:47:48.042258 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.042231901 podStartE2EDuration="2.042231901s" podCreationTimestamp="2026-03-13 01:47:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:47:48.026173572 +0000 UTC m=+1728.834294542" watchObservedRunningTime="2026-03-13 01:47:48.042231901 +0000 UTC m=+1728.850352871" Mar 13 01:47:48.071411 master-0 kubenswrapper[19170]: I0313 01:47:48.071362 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-conductor-0" Mar 13 01:47:48.154896 master-0 kubenswrapper[19170]: I0313 01:47:48.154846 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 01:47:48.227066 master-0 kubenswrapper[19170]: E0313 01:47:48.224065 19170 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b0024e27c8d6f14861295fdfdb760931d35616d615bd94daec5d348177e002fb" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 01:47:48.227066 master-0 kubenswrapper[19170]: E0313 01:47:48.225684 19170 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b0024e27c8d6f14861295fdfdb760931d35616d615bd94daec5d348177e002fb" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 01:47:48.227312 master-0 kubenswrapper[19170]: E0313 01:47:48.227109 19170 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b0024e27c8d6f14861295fdfdb760931d35616d615bd94daec5d348177e002fb" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 01:47:48.227312 master-0 kubenswrapper[19170]: E0313 01:47:48.227143 19170 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="30cf9c76-52a7-490e-a6f6-aa85a909bd2f" containerName="nova-scheduler-scheduler" Mar 13 01:47:48.265481 master-0 kubenswrapper[19170]: I0313 01:47:48.265351 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50d16291-5a6e-41fd-b5a5-94844221c494-combined-ca-bundle\") pod \"50d16291-5a6e-41fd-b5a5-94844221c494\" (UID: 
\"50d16291-5a6e-41fd-b5a5-94844221c494\") " Mar 13 01:47:48.265689 master-0 kubenswrapper[19170]: I0313 01:47:48.265563 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50d16291-5a6e-41fd-b5a5-94844221c494-logs\") pod \"50d16291-5a6e-41fd-b5a5-94844221c494\" (UID: \"50d16291-5a6e-41fd-b5a5-94844221c494\") " Mar 13 01:47:48.265689 master-0 kubenswrapper[19170]: I0313 01:47:48.265597 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50d16291-5a6e-41fd-b5a5-94844221c494-config-data\") pod \"50d16291-5a6e-41fd-b5a5-94844221c494\" (UID: \"50d16291-5a6e-41fd-b5a5-94844221c494\") " Mar 13 01:47:48.265786 master-0 kubenswrapper[19170]: I0313 01:47:48.265688 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8c8j\" (UniqueName: \"kubernetes.io/projected/50d16291-5a6e-41fd-b5a5-94844221c494-kube-api-access-l8c8j\") pod \"50d16291-5a6e-41fd-b5a5-94844221c494\" (UID: \"50d16291-5a6e-41fd-b5a5-94844221c494\") " Mar 13 01:47:48.266127 master-0 kubenswrapper[19170]: I0313 01:47:48.266085 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50d16291-5a6e-41fd-b5a5-94844221c494-logs" (OuterVolumeSpecName: "logs") pod "50d16291-5a6e-41fd-b5a5-94844221c494" (UID: "50d16291-5a6e-41fd-b5a5-94844221c494"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 01:47:48.267228 master-0 kubenswrapper[19170]: I0313 01:47:48.267202 19170 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50d16291-5a6e-41fd-b5a5-94844221c494-logs\") on node \"master-0\" DevicePath \"\"" Mar 13 01:47:48.268795 master-0 kubenswrapper[19170]: I0313 01:47:48.268759 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50d16291-5a6e-41fd-b5a5-94844221c494-kube-api-access-l8c8j" (OuterVolumeSpecName: "kube-api-access-l8c8j") pod "50d16291-5a6e-41fd-b5a5-94844221c494" (UID: "50d16291-5a6e-41fd-b5a5-94844221c494"). InnerVolumeSpecName "kube-api-access-l8c8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:47:48.298115 master-0 kubenswrapper[19170]: I0313 01:47:48.298055 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50d16291-5a6e-41fd-b5a5-94844221c494-config-data" (OuterVolumeSpecName: "config-data") pod "50d16291-5a6e-41fd-b5a5-94844221c494" (UID: "50d16291-5a6e-41fd-b5a5-94844221c494"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:47:48.299454 master-0 kubenswrapper[19170]: I0313 01:47:48.299402 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50d16291-5a6e-41fd-b5a5-94844221c494-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "50d16291-5a6e-41fd-b5a5-94844221c494" (UID: "50d16291-5a6e-41fd-b5a5-94844221c494"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:47:48.369702 master-0 kubenswrapper[19170]: I0313 01:47:48.369644 19170 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50d16291-5a6e-41fd-b5a5-94844221c494-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:47:48.369702 master-0 kubenswrapper[19170]: I0313 01:47:48.369693 19170 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50d16291-5a6e-41fd-b5a5-94844221c494-config-data\") on node \"master-0\" DevicePath \"\"" Mar 13 01:47:48.369702 master-0 kubenswrapper[19170]: I0313 01:47:48.369708 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8c8j\" (UniqueName: \"kubernetes.io/projected/50d16291-5a6e-41fd-b5a5-94844221c494-kube-api-access-l8c8j\") on node \"master-0\" DevicePath \"\"" Mar 13 01:47:49.042403 master-0 kubenswrapper[19170]: I0313 01:47:49.042342 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 01:47:49.046316 master-0 kubenswrapper[19170]: I0313 01:47:49.042529 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"50d16291-5a6e-41fd-b5a5-94844221c494","Type":"ContainerDied","Data":"714c6759c515ac0572dcecae630045e61f4f2653301a110ef251160b3d053108"} Mar 13 01:47:49.046537 master-0 kubenswrapper[19170]: I0313 01:47:49.046502 19170 scope.go:117] "RemoveContainer" containerID="9ab855b6ccea0e4b50da87a089871f3e5ea7d3f3017fc6ef29512b8dc60220d4" Mar 13 01:47:49.054437 master-0 kubenswrapper[19170]: I0313 01:47:49.054136 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ironic-conductor-0" Mar 13 01:47:49.096214 master-0 kubenswrapper[19170]: I0313 01:47:49.095847 19170 scope.go:117] "RemoveContainer" containerID="cf397ebb9ee1e6c87cd5016c642d8b71426433f2369e938527be051f3f0e9feb" Mar 13 01:47:49.146738 master-0 kubenswrapper[19170]: I0313 01:47:49.146672 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 01:47:49.164659 master-0 kubenswrapper[19170]: I0313 01:47:49.163407 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 13 01:47:49.180316 master-0 kubenswrapper[19170]: I0313 01:47:49.180247 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 13 01:47:49.180974 master-0 kubenswrapper[19170]: E0313 01:47:49.180936 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50d16291-5a6e-41fd-b5a5-94844221c494" containerName="nova-api-log" Mar 13 01:47:49.181031 master-0 kubenswrapper[19170]: I0313 01:47:49.180979 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="50d16291-5a6e-41fd-b5a5-94844221c494" containerName="nova-api-log" Mar 13 01:47:49.181095 master-0 kubenswrapper[19170]: E0313 01:47:49.181079 19170 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="50d16291-5a6e-41fd-b5a5-94844221c494" containerName="nova-api-api" Mar 13 01:47:49.181131 master-0 kubenswrapper[19170]: I0313 01:47:49.181095 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="50d16291-5a6e-41fd-b5a5-94844221c494" containerName="nova-api-api" Mar 13 01:47:49.181434 master-0 kubenswrapper[19170]: I0313 01:47:49.181405 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="50d16291-5a6e-41fd-b5a5-94844221c494" containerName="nova-api-log" Mar 13 01:47:49.181478 master-0 kubenswrapper[19170]: I0313 01:47:49.181439 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="50d16291-5a6e-41fd-b5a5-94844221c494" containerName="nova-api-api" Mar 13 01:47:49.183064 master-0 kubenswrapper[19170]: I0313 01:47:49.183024 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 01:47:49.187220 master-0 kubenswrapper[19170]: I0313 01:47:49.187034 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 13 01:47:49.191447 master-0 kubenswrapper[19170]: I0313 01:47:49.191391 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 01:47:49.294434 master-0 kubenswrapper[19170]: I0313 01:47:49.294267 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4jw5\" (UniqueName: \"kubernetes.io/projected/f61f4a6e-39e1-426c-9b4d-5a9535df29d8-kube-api-access-g4jw5\") pod \"nova-api-0\" (UID: \"f61f4a6e-39e1-426c-9b4d-5a9535df29d8\") " pod="openstack/nova-api-0" Mar 13 01:47:49.294645 master-0 kubenswrapper[19170]: I0313 01:47:49.294474 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f61f4a6e-39e1-426c-9b4d-5a9535df29d8-config-data\") pod \"nova-api-0\" (UID: \"f61f4a6e-39e1-426c-9b4d-5a9535df29d8\") " 
pod="openstack/nova-api-0" Mar 13 01:47:49.294874 master-0 kubenswrapper[19170]: I0313 01:47:49.294828 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f61f4a6e-39e1-426c-9b4d-5a9535df29d8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f61f4a6e-39e1-426c-9b4d-5a9535df29d8\") " pod="openstack/nova-api-0" Mar 13 01:47:49.294948 master-0 kubenswrapper[19170]: I0313 01:47:49.294921 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f61f4a6e-39e1-426c-9b4d-5a9535df29d8-logs\") pod \"nova-api-0\" (UID: \"f61f4a6e-39e1-426c-9b4d-5a9535df29d8\") " pod="openstack/nova-api-0" Mar 13 01:47:49.397847 master-0 kubenswrapper[19170]: I0313 01:47:49.397566 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f61f4a6e-39e1-426c-9b4d-5a9535df29d8-logs\") pod \"nova-api-0\" (UID: \"f61f4a6e-39e1-426c-9b4d-5a9535df29d8\") " pod="openstack/nova-api-0" Mar 13 01:47:49.397847 master-0 kubenswrapper[19170]: I0313 01:47:49.397778 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4jw5\" (UniqueName: \"kubernetes.io/projected/f61f4a6e-39e1-426c-9b4d-5a9535df29d8-kube-api-access-g4jw5\") pod \"nova-api-0\" (UID: \"f61f4a6e-39e1-426c-9b4d-5a9535df29d8\") " pod="openstack/nova-api-0" Mar 13 01:47:49.398260 master-0 kubenswrapper[19170]: I0313 01:47:49.397987 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f61f4a6e-39e1-426c-9b4d-5a9535df29d8-config-data\") pod \"nova-api-0\" (UID: \"f61f4a6e-39e1-426c-9b4d-5a9535df29d8\") " pod="openstack/nova-api-0" Mar 13 01:47:49.398260 master-0 kubenswrapper[19170]: I0313 01:47:49.398166 19170 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f61f4a6e-39e1-426c-9b4d-5a9535df29d8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f61f4a6e-39e1-426c-9b4d-5a9535df29d8\") " pod="openstack/nova-api-0" Mar 13 01:47:49.399543 master-0 kubenswrapper[19170]: I0313 01:47:49.399489 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f61f4a6e-39e1-426c-9b4d-5a9535df29d8-logs\") pod \"nova-api-0\" (UID: \"f61f4a6e-39e1-426c-9b4d-5a9535df29d8\") " pod="openstack/nova-api-0" Mar 13 01:47:49.407274 master-0 kubenswrapper[19170]: I0313 01:47:49.407233 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f61f4a6e-39e1-426c-9b4d-5a9535df29d8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f61f4a6e-39e1-426c-9b4d-5a9535df29d8\") " pod="openstack/nova-api-0" Mar 13 01:47:49.408312 master-0 kubenswrapper[19170]: I0313 01:47:49.408244 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f61f4a6e-39e1-426c-9b4d-5a9535df29d8-config-data\") pod \"nova-api-0\" (UID: \"f61f4a6e-39e1-426c-9b4d-5a9535df29d8\") " pod="openstack/nova-api-0" Mar 13 01:47:49.420604 master-0 kubenswrapper[19170]: I0313 01:47:49.420514 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4jw5\" (UniqueName: \"kubernetes.io/projected/f61f4a6e-39e1-426c-9b4d-5a9535df29d8-kube-api-access-g4jw5\") pod \"nova-api-0\" (UID: \"f61f4a6e-39e1-426c-9b4d-5a9535df29d8\") " pod="openstack/nova-api-0" Mar 13 01:47:49.454425 master-0 kubenswrapper[19170]: I0313 01:47:49.454325 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50d16291-5a6e-41fd-b5a5-94844221c494" path="/var/lib/kubelet/pods/50d16291-5a6e-41fd-b5a5-94844221c494/volumes" Mar 13 01:47:49.531361 master-0 
kubenswrapper[19170]: I0313 01:47:49.531289 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 01:47:49.999721 master-0 kubenswrapper[19170]: I0313 01:47:49.999687 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 01:47:50.084464 master-0 kubenswrapper[19170]: I0313 01:47:50.083197 19170 generic.go:334] "Generic (PLEG): container finished" podID="30cf9c76-52a7-490e-a6f6-aa85a909bd2f" containerID="b0024e27c8d6f14861295fdfdb760931d35616d615bd94daec5d348177e002fb" exitCode=0 Mar 13 01:47:50.084464 master-0 kubenswrapper[19170]: I0313 01:47:50.083517 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"30cf9c76-52a7-490e-a6f6-aa85a909bd2f","Type":"ContainerDied","Data":"b0024e27c8d6f14861295fdfdb760931d35616d615bd94daec5d348177e002fb"} Mar 13 01:47:50.084464 master-0 kubenswrapper[19170]: I0313 01:47:50.083747 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"30cf9c76-52a7-490e-a6f6-aa85a909bd2f","Type":"ContainerDied","Data":"31fc9d188cd1a07582ead03719cf5acd4f0463f4b0cd7878e344d9a5e02c2960"} Mar 13 01:47:50.084464 master-0 kubenswrapper[19170]: I0313 01:47:50.083795 19170 scope.go:117] "RemoveContainer" containerID="b0024e27c8d6f14861295fdfdb760931d35616d615bd94daec5d348177e002fb" Mar 13 01:47:50.085080 master-0 kubenswrapper[19170]: I0313 01:47:50.083826 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 01:47:50.091932 master-0 kubenswrapper[19170]: I0313 01:47:50.091877 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-conductor-0" Mar 13 01:47:50.114234 master-0 kubenswrapper[19170]: I0313 01:47:50.114130 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30cf9c76-52a7-490e-a6f6-aa85a909bd2f-config-data\") pod \"30cf9c76-52a7-490e-a6f6-aa85a909bd2f\" (UID: \"30cf9c76-52a7-490e-a6f6-aa85a909bd2f\") " Mar 13 01:47:50.114789 master-0 kubenswrapper[19170]: I0313 01:47:50.114732 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30cf9c76-52a7-490e-a6f6-aa85a909bd2f-combined-ca-bundle\") pod \"30cf9c76-52a7-490e-a6f6-aa85a909bd2f\" (UID: \"30cf9c76-52a7-490e-a6f6-aa85a909bd2f\") " Mar 13 01:47:50.114906 master-0 kubenswrapper[19170]: I0313 01:47:50.114815 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44t7z\" (UniqueName: \"kubernetes.io/projected/30cf9c76-52a7-490e-a6f6-aa85a909bd2f-kube-api-access-44t7z\") pod \"30cf9c76-52a7-490e-a6f6-aa85a909bd2f\" (UID: \"30cf9c76-52a7-490e-a6f6-aa85a909bd2f\") " Mar 13 01:47:50.136864 master-0 kubenswrapper[19170]: I0313 01:47:50.129900 19170 scope.go:117] "RemoveContainer" containerID="b0024e27c8d6f14861295fdfdb760931d35616d615bd94daec5d348177e002fb" Mar 13 01:47:50.136864 master-0 kubenswrapper[19170]: E0313 01:47:50.130876 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0024e27c8d6f14861295fdfdb760931d35616d615bd94daec5d348177e002fb\": container with ID starting with b0024e27c8d6f14861295fdfdb760931d35616d615bd94daec5d348177e002fb not found: ID does not exist" 
containerID="b0024e27c8d6f14861295fdfdb760931d35616d615bd94daec5d348177e002fb" Mar 13 01:47:50.136864 master-0 kubenswrapper[19170]: I0313 01:47:50.130968 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0024e27c8d6f14861295fdfdb760931d35616d615bd94daec5d348177e002fb"} err="failed to get container status \"b0024e27c8d6f14861295fdfdb760931d35616d615bd94daec5d348177e002fb\": rpc error: code = NotFound desc = could not find container \"b0024e27c8d6f14861295fdfdb760931d35616d615bd94daec5d348177e002fb\": container with ID starting with b0024e27c8d6f14861295fdfdb760931d35616d615bd94daec5d348177e002fb not found: ID does not exist" Mar 13 01:47:50.152958 master-0 kubenswrapper[19170]: I0313 01:47:50.149788 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30cf9c76-52a7-490e-a6f6-aa85a909bd2f-config-data" (OuterVolumeSpecName: "config-data") pod "30cf9c76-52a7-490e-a6f6-aa85a909bd2f" (UID: "30cf9c76-52a7-490e-a6f6-aa85a909bd2f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:47:50.156083 master-0 kubenswrapper[19170]: I0313 01:47:50.156040 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30cf9c76-52a7-490e-a6f6-aa85a909bd2f-kube-api-access-44t7z" (OuterVolumeSpecName: "kube-api-access-44t7z") pod "30cf9c76-52a7-490e-a6f6-aa85a909bd2f" (UID: "30cf9c76-52a7-490e-a6f6-aa85a909bd2f"). InnerVolumeSpecName "kube-api-access-44t7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:47:50.159950 master-0 kubenswrapper[19170]: I0313 01:47:50.159898 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30cf9c76-52a7-490e-a6f6-aa85a909bd2f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30cf9c76-52a7-490e-a6f6-aa85a909bd2f" (UID: "30cf9c76-52a7-490e-a6f6-aa85a909bd2f"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:47:50.178085 master-0 kubenswrapper[19170]: W0313 01:47:50.178028 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf61f4a6e_39e1_426c_9b4d_5a9535df29d8.slice/crio-9f43025aa3f2ff1e24a32c13ae0d7b92269e3f8e37b36dce4f2d3a698e26855f WatchSource:0}: Error finding container 9f43025aa3f2ff1e24a32c13ae0d7b92269e3f8e37b36dce4f2d3a698e26855f: Status 404 returned error can't find the container with id 9f43025aa3f2ff1e24a32c13ae0d7b92269e3f8e37b36dce4f2d3a698e26855f Mar 13 01:47:50.191300 master-0 kubenswrapper[19170]: I0313 01:47:50.186145 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 01:47:50.233747 master-0 kubenswrapper[19170]: I0313 01:47:50.233619 19170 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30cf9c76-52a7-490e-a6f6-aa85a909bd2f-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:47:50.233747 master-0 kubenswrapper[19170]: I0313 01:47:50.233681 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44t7z\" (UniqueName: \"kubernetes.io/projected/30cf9c76-52a7-490e-a6f6-aa85a909bd2f-kube-api-access-44t7z\") on node \"master-0\" DevicePath \"\"" Mar 13 01:47:50.233747 master-0 kubenswrapper[19170]: I0313 01:47:50.233693 19170 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30cf9c76-52a7-490e-a6f6-aa85a909bd2f-config-data\") on node \"master-0\" DevicePath \"\"" Mar 13 01:47:50.567675 master-0 kubenswrapper[19170]: I0313 01:47:50.566664 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 01:47:50.594186 master-0 kubenswrapper[19170]: I0313 01:47:50.586321 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-scheduler-0"] Mar 13 01:47:50.600674 master-0 kubenswrapper[19170]: I0313 01:47:50.599968 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 01:47:50.600674 master-0 kubenswrapper[19170]: E0313 01:47:50.600584 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30cf9c76-52a7-490e-a6f6-aa85a909bd2f" containerName="nova-scheduler-scheduler" Mar 13 01:47:50.600674 master-0 kubenswrapper[19170]: I0313 01:47:50.600599 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="30cf9c76-52a7-490e-a6f6-aa85a909bd2f" containerName="nova-scheduler-scheduler" Mar 13 01:47:50.600973 master-0 kubenswrapper[19170]: I0313 01:47:50.600867 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="30cf9c76-52a7-490e-a6f6-aa85a909bd2f" containerName="nova-scheduler-scheduler" Mar 13 01:47:50.607138 master-0 kubenswrapper[19170]: I0313 01:47:50.601586 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 01:47:50.607138 master-0 kubenswrapper[19170]: I0313 01:47:50.604162 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 13 01:47:50.614664 master-0 kubenswrapper[19170]: I0313 01:47:50.613963 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 01:47:50.745511 master-0 kubenswrapper[19170]: I0313 01:47:50.745446 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59wqz\" (UniqueName: \"kubernetes.io/projected/04fabfa9-7306-4692-8301-9ea765f3a452-kube-api-access-59wqz\") pod \"nova-scheduler-0\" (UID: \"04fabfa9-7306-4692-8301-9ea765f3a452\") " pod="openstack/nova-scheduler-0" Mar 13 01:47:50.745742 master-0 kubenswrapper[19170]: I0313 01:47:50.745585 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04fabfa9-7306-4692-8301-9ea765f3a452-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"04fabfa9-7306-4692-8301-9ea765f3a452\") " pod="openstack/nova-scheduler-0" Mar 13 01:47:50.745742 master-0 kubenswrapper[19170]: I0313 01:47:50.745700 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04fabfa9-7306-4692-8301-9ea765f3a452-config-data\") pod \"nova-scheduler-0\" (UID: \"04fabfa9-7306-4692-8301-9ea765f3a452\") " pod="openstack/nova-scheduler-0" Mar 13 01:47:50.848780 master-0 kubenswrapper[19170]: I0313 01:47:50.848390 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59wqz\" (UniqueName: \"kubernetes.io/projected/04fabfa9-7306-4692-8301-9ea765f3a452-kube-api-access-59wqz\") pod \"nova-scheduler-0\" (UID: \"04fabfa9-7306-4692-8301-9ea765f3a452\") " pod="openstack/nova-scheduler-0" Mar 13 01:47:50.848780 master-0 kubenswrapper[19170]: I0313 01:47:50.848595 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04fabfa9-7306-4692-8301-9ea765f3a452-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"04fabfa9-7306-4692-8301-9ea765f3a452\") " pod="openstack/nova-scheduler-0" Mar 13 01:47:50.848780 master-0 kubenswrapper[19170]: I0313 01:47:50.848718 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04fabfa9-7306-4692-8301-9ea765f3a452-config-data\") pod \"nova-scheduler-0\" (UID: \"04fabfa9-7306-4692-8301-9ea765f3a452\") " pod="openstack/nova-scheduler-0" Mar 13 01:47:50.854417 master-0 kubenswrapper[19170]: I0313 01:47:50.851878 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/04fabfa9-7306-4692-8301-9ea765f3a452-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"04fabfa9-7306-4692-8301-9ea765f3a452\") " pod="openstack/nova-scheduler-0" Mar 13 01:47:50.862674 master-0 kubenswrapper[19170]: I0313 01:47:50.861618 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04fabfa9-7306-4692-8301-9ea765f3a452-config-data\") pod \"nova-scheduler-0\" (UID: \"04fabfa9-7306-4692-8301-9ea765f3a452\") " pod="openstack/nova-scheduler-0" Mar 13 01:47:50.873755 master-0 kubenswrapper[19170]: I0313 01:47:50.871504 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59wqz\" (UniqueName: \"kubernetes.io/projected/04fabfa9-7306-4692-8301-9ea765f3a452-kube-api-access-59wqz\") pod \"nova-scheduler-0\" (UID: \"04fabfa9-7306-4692-8301-9ea765f3a452\") " pod="openstack/nova-scheduler-0" Mar 13 01:47:50.933061 master-0 kubenswrapper[19170]: I0313 01:47:50.933001 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 01:47:51.115923 master-0 kubenswrapper[19170]: I0313 01:47:51.114653 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f61f4a6e-39e1-426c-9b4d-5a9535df29d8","Type":"ContainerStarted","Data":"9dd88a61bbda3e448c9fccccdd4c9f368767fa4b330ac2bf32f0a0e6c88e8835"} Mar 13 01:47:51.115923 master-0 kubenswrapper[19170]: I0313 01:47:51.114712 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f61f4a6e-39e1-426c-9b4d-5a9535df29d8","Type":"ContainerStarted","Data":"dcce912749dec2e11833ef057a845fb51ff4b9343f59c5a4e8400e1211e8229a"} Mar 13 01:47:51.115923 master-0 kubenswrapper[19170]: I0313 01:47:51.114722 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f61f4a6e-39e1-426c-9b4d-5a9535df29d8","Type":"ContainerStarted","Data":"9f43025aa3f2ff1e24a32c13ae0d7b92269e3f8e37b36dce4f2d3a698e26855f"} Mar 13 01:47:51.151912 master-0 kubenswrapper[19170]: I0313 01:47:51.151802 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.151779167 podStartE2EDuration="2.151779167s" podCreationTimestamp="2026-03-13 01:47:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:47:51.136181771 +0000 UTC m=+1731.944302741" watchObservedRunningTime="2026-03-13 01:47:51.151779167 +0000 UTC m=+1731.959900127" Mar 13 01:47:51.374224 master-0 kubenswrapper[19170]: I0313 01:47:51.374157 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 01:47:51.438207 master-0 kubenswrapper[19170]: I0313 01:47:51.438059 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30cf9c76-52a7-490e-a6f6-aa85a909bd2f" path="/var/lib/kubelet/pods/30cf9c76-52a7-490e-a6f6-aa85a909bd2f/volumes" Mar 13 
01:47:52.137651 master-0 kubenswrapper[19170]: I0313 01:47:52.136813 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"04fabfa9-7306-4692-8301-9ea765f3a452","Type":"ContainerStarted","Data":"c315c477c537d361ab588a3b2eec0dc871612e75d623972f6446450f312f8d34"} Mar 13 01:47:52.137651 master-0 kubenswrapper[19170]: I0313 01:47:52.136870 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"04fabfa9-7306-4692-8301-9ea765f3a452","Type":"ContainerStarted","Data":"875f42b1da0d52586671c14734f17421d02b6667ff94389b4c44aa55bedb42bf"} Mar 13 01:47:52.181661 master-0 kubenswrapper[19170]: I0313 01:47:52.181288 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.181264838 podStartE2EDuration="2.181264838s" podCreationTimestamp="2026-03-13 01:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:47:52.164976863 +0000 UTC m=+1732.973097863" watchObservedRunningTime="2026-03-13 01:47:52.181264838 +0000 UTC m=+1732.989385808" Mar 13 01:47:55.933854 master-0 kubenswrapper[19170]: I0313 01:47:55.933746 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 13 01:47:56.557835 master-0 kubenswrapper[19170]: I0313 01:47:56.557784 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 13 01:47:59.532485 master-0 kubenswrapper[19170]: I0313 01:47:59.532164 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 01:47:59.532485 master-0 kubenswrapper[19170]: I0313 01:47:59.532245 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 01:48:00.613983 master-0 kubenswrapper[19170]: I0313 
01:48:00.613885 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f61f4a6e-39e1-426c-9b4d-5a9535df29d8" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.128.1.16:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 01:48:00.613983 master-0 kubenswrapper[19170]: I0313 01:48:00.613922 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f61f4a6e-39e1-426c-9b4d-5a9535df29d8" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.128.1.16:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 01:48:00.934030 master-0 kubenswrapper[19170]: I0313 01:48:00.933959 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 13 01:48:01.007444 master-0 kubenswrapper[19170]: I0313 01:48:01.007387 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 13 01:48:01.345052 master-0 kubenswrapper[19170]: I0313 01:48:01.344932 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 13 01:48:04.348584 master-0 kubenswrapper[19170]: I0313 01:48:04.348500 19170 generic.go:334] "Generic (PLEG): container finished" podID="95380989-f152-40e9-82a4-3bc9c091a8db" containerID="22e64e73b9493a33b71085e3943f6667dcc1cbc6d4efbd747a8e5e4c0a9e5094" exitCode=137 Mar 13 01:48:04.349267 master-0 kubenswrapper[19170]: I0313 01:48:04.348587 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"95380989-f152-40e9-82a4-3bc9c091a8db","Type":"ContainerDied","Data":"22e64e73b9493a33b71085e3943f6667dcc1cbc6d4efbd747a8e5e4c0a9e5094"} Mar 13 01:48:04.349267 master-0 kubenswrapper[19170]: I0313 01:48:04.348809 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"95380989-f152-40e9-82a4-3bc9c091a8db","Type":"ContainerDied","Data":"23a3fd701dcc5ce60aad883d84eed8380c40f438ce26ea16796762b61d8069fe"} Mar 13 01:48:04.349267 master-0 kubenswrapper[19170]: I0313 01:48:04.348838 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23a3fd701dcc5ce60aad883d84eed8380c40f438ce26ea16796762b61d8069fe" Mar 13 01:48:04.352234 master-0 kubenswrapper[19170]: I0313 01:48:04.352120 19170 generic.go:334] "Generic (PLEG): container finished" podID="0939bd46-550b-47b8-b313-eb2432087a7b" containerID="ba45c2533f49b25b22ef9b62119062c185c04b577134457f4ed3e740cc303c42" exitCode=137 Mar 13 01:48:04.352234 master-0 kubenswrapper[19170]: I0313 01:48:04.352169 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0939bd46-550b-47b8-b313-eb2432087a7b","Type":"ContainerDied","Data":"ba45c2533f49b25b22ef9b62119062c185c04b577134457f4ed3e740cc303c42"} Mar 13 01:48:04.352234 master-0 kubenswrapper[19170]: I0313 01:48:04.352202 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0939bd46-550b-47b8-b313-eb2432087a7b","Type":"ContainerDied","Data":"3a6cae12d57f3bfdf8262947312d76d741b7f00285e5dcb241bb793ea7f44bc5"} Mar 13 01:48:04.352234 master-0 kubenswrapper[19170]: I0313 01:48:04.352220 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a6cae12d57f3bfdf8262947312d76d741b7f00285e5dcb241bb793ea7f44bc5" Mar 13 01:48:04.389575 master-0 kubenswrapper[19170]: I0313 01:48:04.389491 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 01:48:04.402270 master-0 kubenswrapper[19170]: I0313 01:48:04.401280 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 01:48:04.450078 master-0 kubenswrapper[19170]: I0313 01:48:04.449967 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95380989-f152-40e9-82a4-3bc9c091a8db-config-data\") pod \"95380989-f152-40e9-82a4-3bc9c091a8db\" (UID: \"95380989-f152-40e9-82a4-3bc9c091a8db\") " Mar 13 01:48:04.450364 master-0 kubenswrapper[19170]: I0313 01:48:04.450147 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbkdp\" (UniqueName: \"kubernetes.io/projected/0939bd46-550b-47b8-b313-eb2432087a7b-kube-api-access-cbkdp\") pod \"0939bd46-550b-47b8-b313-eb2432087a7b\" (UID: \"0939bd46-550b-47b8-b313-eb2432087a7b\") " Mar 13 01:48:04.450364 master-0 kubenswrapper[19170]: I0313 01:48:04.450184 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95380989-f152-40e9-82a4-3bc9c091a8db-combined-ca-bundle\") pod \"95380989-f152-40e9-82a4-3bc9c091a8db\" (UID: \"95380989-f152-40e9-82a4-3bc9c091a8db\") " Mar 13 01:48:04.455211 master-0 kubenswrapper[19170]: I0313 01:48:04.451994 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4w4l9\" (UniqueName: \"kubernetes.io/projected/95380989-f152-40e9-82a4-3bc9c091a8db-kube-api-access-4w4l9\") pod \"95380989-f152-40e9-82a4-3bc9c091a8db\" (UID: \"95380989-f152-40e9-82a4-3bc9c091a8db\") " Mar 13 01:48:04.455211 master-0 kubenswrapper[19170]: I0313 01:48:04.452080 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0939bd46-550b-47b8-b313-eb2432087a7b-combined-ca-bundle\") pod \"0939bd46-550b-47b8-b313-eb2432087a7b\" (UID: \"0939bd46-550b-47b8-b313-eb2432087a7b\") " Mar 13 01:48:04.455211 master-0 kubenswrapper[19170]: I0313 
01:48:04.452258 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0939bd46-550b-47b8-b313-eb2432087a7b-config-data\") pod \"0939bd46-550b-47b8-b313-eb2432087a7b\" (UID: \"0939bd46-550b-47b8-b313-eb2432087a7b\") " Mar 13 01:48:04.455211 master-0 kubenswrapper[19170]: I0313 01:48:04.452357 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95380989-f152-40e9-82a4-3bc9c091a8db-logs\") pod \"95380989-f152-40e9-82a4-3bc9c091a8db\" (UID: \"95380989-f152-40e9-82a4-3bc9c091a8db\") " Mar 13 01:48:04.455211 master-0 kubenswrapper[19170]: I0313 01:48:04.453594 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95380989-f152-40e9-82a4-3bc9c091a8db-logs" (OuterVolumeSpecName: "logs") pod "95380989-f152-40e9-82a4-3bc9c091a8db" (UID: "95380989-f152-40e9-82a4-3bc9c091a8db"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 01:48:04.462419 master-0 kubenswrapper[19170]: I0313 01:48:04.462337 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0939bd46-550b-47b8-b313-eb2432087a7b-kube-api-access-cbkdp" (OuterVolumeSpecName: "kube-api-access-cbkdp") pod "0939bd46-550b-47b8-b313-eb2432087a7b" (UID: "0939bd46-550b-47b8-b313-eb2432087a7b"). InnerVolumeSpecName "kube-api-access-cbkdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:48:04.465144 master-0 kubenswrapper[19170]: I0313 01:48:04.465079 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95380989-f152-40e9-82a4-3bc9c091a8db-kube-api-access-4w4l9" (OuterVolumeSpecName: "kube-api-access-4w4l9") pod "95380989-f152-40e9-82a4-3bc9c091a8db" (UID: "95380989-f152-40e9-82a4-3bc9c091a8db"). InnerVolumeSpecName "kube-api-access-4w4l9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:48:04.488213 master-0 kubenswrapper[19170]: I0313 01:48:04.488131 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0939bd46-550b-47b8-b313-eb2432087a7b-config-data" (OuterVolumeSpecName: "config-data") pod "0939bd46-550b-47b8-b313-eb2432087a7b" (UID: "0939bd46-550b-47b8-b313-eb2432087a7b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:48:04.495469 master-0 kubenswrapper[19170]: I0313 01:48:04.495363 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0939bd46-550b-47b8-b313-eb2432087a7b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0939bd46-550b-47b8-b313-eb2432087a7b" (UID: "0939bd46-550b-47b8-b313-eb2432087a7b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:48:04.503792 master-0 kubenswrapper[19170]: I0313 01:48:04.503719 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95380989-f152-40e9-82a4-3bc9c091a8db-config-data" (OuterVolumeSpecName: "config-data") pod "95380989-f152-40e9-82a4-3bc9c091a8db" (UID: "95380989-f152-40e9-82a4-3bc9c091a8db"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:48:04.504668 master-0 kubenswrapper[19170]: I0313 01:48:04.504551 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95380989-f152-40e9-82a4-3bc9c091a8db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95380989-f152-40e9-82a4-3bc9c091a8db" (UID: "95380989-f152-40e9-82a4-3bc9c091a8db"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:48:04.554543 master-0 kubenswrapper[19170]: I0313 01:48:04.554474 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbkdp\" (UniqueName: \"kubernetes.io/projected/0939bd46-550b-47b8-b313-eb2432087a7b-kube-api-access-cbkdp\") on node \"master-0\" DevicePath \"\"" Mar 13 01:48:04.554543 master-0 kubenswrapper[19170]: I0313 01:48:04.554528 19170 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95380989-f152-40e9-82a4-3bc9c091a8db-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:48:04.554543 master-0 kubenswrapper[19170]: I0313 01:48:04.554542 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4w4l9\" (UniqueName: \"kubernetes.io/projected/95380989-f152-40e9-82a4-3bc9c091a8db-kube-api-access-4w4l9\") on node \"master-0\" DevicePath \"\"" Mar 13 01:48:04.554884 master-0 kubenswrapper[19170]: I0313 01:48:04.554555 19170 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0939bd46-550b-47b8-b313-eb2432087a7b-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:48:04.554884 master-0 kubenswrapper[19170]: I0313 01:48:04.554569 19170 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0939bd46-550b-47b8-b313-eb2432087a7b-config-data\") on node \"master-0\" DevicePath \"\"" Mar 13 01:48:04.554884 master-0 kubenswrapper[19170]: I0313 01:48:04.554582 19170 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95380989-f152-40e9-82a4-3bc9c091a8db-logs\") on node \"master-0\" DevicePath \"\"" Mar 13 01:48:04.554884 master-0 kubenswrapper[19170]: I0313 01:48:04.554618 19170 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/95380989-f152-40e9-82a4-3bc9c091a8db-config-data\") on node \"master-0\" DevicePath \"\"" Mar 13 01:48:05.367425 master-0 kubenswrapper[19170]: I0313 01:48:05.367366 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 01:48:05.369690 master-0 kubenswrapper[19170]: I0313 01:48:05.367437 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 01:48:05.479669 master-0 kubenswrapper[19170]: I0313 01:48:05.475192 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 01:48:05.485107 master-0 kubenswrapper[19170]: I0313 01:48:05.484866 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 01:48:05.550710 master-0 kubenswrapper[19170]: I0313 01:48:05.549575 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 01:48:05.588832 master-0 kubenswrapper[19170]: I0313 01:48:05.588726 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 01:48:05.589911 master-0 kubenswrapper[19170]: E0313 01:48:05.589883 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0939bd46-550b-47b8-b313-eb2432087a7b" containerName="nova-cell1-novncproxy-novncproxy" Mar 13 01:48:05.590066 master-0 kubenswrapper[19170]: I0313 01:48:05.590044 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="0939bd46-550b-47b8-b313-eb2432087a7b" containerName="nova-cell1-novncproxy-novncproxy" Mar 13 01:48:05.590193 master-0 kubenswrapper[19170]: E0313 01:48:05.590176 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95380989-f152-40e9-82a4-3bc9c091a8db" containerName="nova-metadata-metadata" Mar 13 01:48:05.590280 master-0 kubenswrapper[19170]: I0313 01:48:05.590266 19170 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="95380989-f152-40e9-82a4-3bc9c091a8db" containerName="nova-metadata-metadata" Mar 13 01:48:05.590417 master-0 kubenswrapper[19170]: E0313 01:48:05.590400 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95380989-f152-40e9-82a4-3bc9c091a8db" containerName="nova-metadata-log" Mar 13 01:48:05.590520 master-0 kubenswrapper[19170]: I0313 01:48:05.590501 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="95380989-f152-40e9-82a4-3bc9c091a8db" containerName="nova-metadata-log" Mar 13 01:48:05.591025 master-0 kubenswrapper[19170]: I0313 01:48:05.591001 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="95380989-f152-40e9-82a4-3bc9c091a8db" containerName="nova-metadata-metadata" Mar 13 01:48:05.591194 master-0 kubenswrapper[19170]: I0313 01:48:05.591169 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="0939bd46-550b-47b8-b313-eb2432087a7b" containerName="nova-cell1-novncproxy-novncproxy" Mar 13 01:48:05.591537 master-0 kubenswrapper[19170]: I0313 01:48:05.591510 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="95380989-f152-40e9-82a4-3bc9c091a8db" containerName="nova-metadata-log" Mar 13 01:48:05.592715 master-0 kubenswrapper[19170]: I0313 01:48:05.592685 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 01:48:05.594594 master-0 kubenswrapper[19170]: I0313 01:48:05.594543 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 13 01:48:05.596620 master-0 kubenswrapper[19170]: I0313 01:48:05.594892 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 13 01:48:05.596620 master-0 kubenswrapper[19170]: I0313 01:48:05.595187 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 13 01:48:05.601752 master-0 kubenswrapper[19170]: I0313 01:48:05.600883 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 01:48:05.612679 master-0 kubenswrapper[19170]: I0313 01:48:05.612617 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 01:48:05.634084 master-0 kubenswrapper[19170]: I0313 01:48:05.633804 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 13 01:48:05.637781 master-0 kubenswrapper[19170]: I0313 01:48:05.637738 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 01:48:05.641427 master-0 kubenswrapper[19170]: I0313 01:48:05.641123 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 13 01:48:05.642011 master-0 kubenswrapper[19170]: I0313 01:48:05.641966 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 13 01:48:05.660715 master-0 kubenswrapper[19170]: I0313 01:48:05.660561 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 01:48:05.689771 master-0 kubenswrapper[19170]: I0313 01:48:05.688929 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aa8db4a-1c6d-42fd-bb71-ff4664c74889-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1aa8db4a-1c6d-42fd-bb71-ff4664c74889\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 01:48:05.689771 master-0 kubenswrapper[19170]: I0313 01:48:05.689037 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/598c8651-f44d-4307-bdea-832b32b84007-config-data\") pod \"nova-metadata-0\" (UID: \"598c8651-f44d-4307-bdea-832b32b84007\") " pod="openstack/nova-metadata-0" Mar 13 01:48:05.692738 master-0 kubenswrapper[19170]: I0313 01:48:05.690995 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/598c8651-f44d-4307-bdea-832b32b84007-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"598c8651-f44d-4307-bdea-832b32b84007\") " pod="openstack/nova-metadata-0" Mar 13 01:48:05.692738 master-0 kubenswrapper[19170]: I0313 01:48:05.691064 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/598c8651-f44d-4307-bdea-832b32b84007-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"598c8651-f44d-4307-bdea-832b32b84007\") " pod="openstack/nova-metadata-0" Mar 13 01:48:05.692738 master-0 kubenswrapper[19170]: I0313 01:48:05.691174 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/598c8651-f44d-4307-bdea-832b32b84007-logs\") pod \"nova-metadata-0\" (UID: \"598c8651-f44d-4307-bdea-832b32b84007\") " pod="openstack/nova-metadata-0" Mar 13 01:48:05.692738 master-0 kubenswrapper[19170]: I0313 01:48:05.691270 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/1aa8db4a-1c6d-42fd-bb71-ff4664c74889-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1aa8db4a-1c6d-42fd-bb71-ff4664c74889\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 01:48:05.692738 master-0 kubenswrapper[19170]: I0313 01:48:05.691325 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/1aa8db4a-1c6d-42fd-bb71-ff4664c74889-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1aa8db4a-1c6d-42fd-bb71-ff4664c74889\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 01:48:05.692738 master-0 kubenswrapper[19170]: I0313 01:48:05.691370 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lm9v\" (UniqueName: \"kubernetes.io/projected/598c8651-f44d-4307-bdea-832b32b84007-kube-api-access-2lm9v\") pod \"nova-metadata-0\" (UID: \"598c8651-f44d-4307-bdea-832b32b84007\") " pod="openstack/nova-metadata-0" Mar 13 01:48:05.692738 master-0 kubenswrapper[19170]: I0313 01:48:05.691437 19170 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aa8db4a-1c6d-42fd-bb71-ff4664c74889-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1aa8db4a-1c6d-42fd-bb71-ff4664c74889\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 01:48:05.692738 master-0 kubenswrapper[19170]: I0313 01:48:05.691513 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7xdl\" (UniqueName: \"kubernetes.io/projected/1aa8db4a-1c6d-42fd-bb71-ff4664c74889-kube-api-access-h7xdl\") pod \"nova-cell1-novncproxy-0\" (UID: \"1aa8db4a-1c6d-42fd-bb71-ff4664c74889\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 01:48:05.794012 master-0 kubenswrapper[19170]: I0313 01:48:05.793952 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/598c8651-f44d-4307-bdea-832b32b84007-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"598c8651-f44d-4307-bdea-832b32b84007\") " pod="openstack/nova-metadata-0" Mar 13 01:48:05.794012 master-0 kubenswrapper[19170]: I0313 01:48:05.794016 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/598c8651-f44d-4307-bdea-832b32b84007-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"598c8651-f44d-4307-bdea-832b32b84007\") " pod="openstack/nova-metadata-0" Mar 13 01:48:05.794282 master-0 kubenswrapper[19170]: I0313 01:48:05.794068 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/598c8651-f44d-4307-bdea-832b32b84007-logs\") pod \"nova-metadata-0\" (UID: \"598c8651-f44d-4307-bdea-832b32b84007\") " pod="openstack/nova-metadata-0" Mar 13 01:48:05.794282 master-0 kubenswrapper[19170]: I0313 01:48:05.794232 19170 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/1aa8db4a-1c6d-42fd-bb71-ff4664c74889-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1aa8db4a-1c6d-42fd-bb71-ff4664c74889\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 01:48:05.794395 master-0 kubenswrapper[19170]: I0313 01:48:05.794305 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/1aa8db4a-1c6d-42fd-bb71-ff4664c74889-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1aa8db4a-1c6d-42fd-bb71-ff4664c74889\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 01:48:05.795611 master-0 kubenswrapper[19170]: I0313 01:48:05.794444 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lm9v\" (UniqueName: \"kubernetes.io/projected/598c8651-f44d-4307-bdea-832b32b84007-kube-api-access-2lm9v\") pod \"nova-metadata-0\" (UID: \"598c8651-f44d-4307-bdea-832b32b84007\") " pod="openstack/nova-metadata-0" Mar 13 01:48:05.795611 master-0 kubenswrapper[19170]: I0313 01:48:05.794501 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aa8db4a-1c6d-42fd-bb71-ff4664c74889-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1aa8db4a-1c6d-42fd-bb71-ff4664c74889\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 01:48:05.795611 master-0 kubenswrapper[19170]: I0313 01:48:05.794698 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7xdl\" (UniqueName: \"kubernetes.io/projected/1aa8db4a-1c6d-42fd-bb71-ff4664c74889-kube-api-access-h7xdl\") pod \"nova-cell1-novncproxy-0\" (UID: \"1aa8db4a-1c6d-42fd-bb71-ff4664c74889\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 01:48:05.795611 master-0 kubenswrapper[19170]: I0313 01:48:05.794919 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aa8db4a-1c6d-42fd-bb71-ff4664c74889-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1aa8db4a-1c6d-42fd-bb71-ff4664c74889\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 01:48:05.795611 master-0 kubenswrapper[19170]: I0313 01:48:05.795006 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/598c8651-f44d-4307-bdea-832b32b84007-config-data\") pod \"nova-metadata-0\" (UID: \"598c8651-f44d-4307-bdea-832b32b84007\") " pod="openstack/nova-metadata-0" Mar 13 01:48:05.795611 master-0 kubenswrapper[19170]: I0313 01:48:05.795344 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/598c8651-f44d-4307-bdea-832b32b84007-logs\") pod \"nova-metadata-0\" (UID: \"598c8651-f44d-4307-bdea-832b32b84007\") " pod="openstack/nova-metadata-0" Mar 13 01:48:05.799113 master-0 kubenswrapper[19170]: I0313 01:48:05.799058 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/1aa8db4a-1c6d-42fd-bb71-ff4664c74889-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1aa8db4a-1c6d-42fd-bb71-ff4664c74889\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 01:48:05.799271 master-0 kubenswrapper[19170]: I0313 01:48:05.799135 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/598c8651-f44d-4307-bdea-832b32b84007-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"598c8651-f44d-4307-bdea-832b32b84007\") " pod="openstack/nova-metadata-0" Mar 13 01:48:05.800486 master-0 kubenswrapper[19170]: I0313 01:48:05.800445 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/598c8651-f44d-4307-bdea-832b32b84007-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"598c8651-f44d-4307-bdea-832b32b84007\") " pod="openstack/nova-metadata-0" Mar 13 01:48:05.801212 master-0 kubenswrapper[19170]: I0313 01:48:05.801147 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/598c8651-f44d-4307-bdea-832b32b84007-config-data\") pod \"nova-metadata-0\" (UID: \"598c8651-f44d-4307-bdea-832b32b84007\") " pod="openstack/nova-metadata-0" Mar 13 01:48:05.801837 master-0 kubenswrapper[19170]: I0313 01:48:05.801787 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aa8db4a-1c6d-42fd-bb71-ff4664c74889-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1aa8db4a-1c6d-42fd-bb71-ff4664c74889\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 01:48:05.802317 master-0 kubenswrapper[19170]: I0313 01:48:05.802254 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/1aa8db4a-1c6d-42fd-bb71-ff4664c74889-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1aa8db4a-1c6d-42fd-bb71-ff4664c74889\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 01:48:05.807744 master-0 kubenswrapper[19170]: I0313 01:48:05.807683 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aa8db4a-1c6d-42fd-bb71-ff4664c74889-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1aa8db4a-1c6d-42fd-bb71-ff4664c74889\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 01:48:05.811768 master-0 kubenswrapper[19170]: I0313 01:48:05.811713 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lm9v\" (UniqueName: \"kubernetes.io/projected/598c8651-f44d-4307-bdea-832b32b84007-kube-api-access-2lm9v\") pod 
\"nova-metadata-0\" (UID: \"598c8651-f44d-4307-bdea-832b32b84007\") " pod="openstack/nova-metadata-0" Mar 13 01:48:05.814826 master-0 kubenswrapper[19170]: I0313 01:48:05.814756 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7xdl\" (UniqueName: \"kubernetes.io/projected/1aa8db4a-1c6d-42fd-bb71-ff4664c74889-kube-api-access-h7xdl\") pod \"nova-cell1-novncproxy-0\" (UID: \"1aa8db4a-1c6d-42fd-bb71-ff4664c74889\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 01:48:05.913571 master-0 kubenswrapper[19170]: I0313 01:48:05.913505 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 01:48:05.965302 master-0 kubenswrapper[19170]: I0313 01:48:05.965225 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 01:48:06.515267 master-0 kubenswrapper[19170]: W0313 01:48:06.507647 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1aa8db4a_1c6d_42fd_bb71_ff4664c74889.slice/crio-cf0a4e49c20f307d39dbd7737c2ee60126fe0660f3b228fdc7a0ee73cd647911 WatchSource:0}: Error finding container cf0a4e49c20f307d39dbd7737c2ee60126fe0660f3b228fdc7a0ee73cd647911: Status 404 returned error can't find the container with id cf0a4e49c20f307d39dbd7737c2ee60126fe0660f3b228fdc7a0ee73cd647911 Mar 13 01:48:06.520270 master-0 kubenswrapper[19170]: I0313 01:48:06.520212 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 01:48:06.551195 master-0 kubenswrapper[19170]: I0313 01:48:06.551152 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 01:48:06.552886 master-0 kubenswrapper[19170]: W0313 01:48:06.552846 19170 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod598c8651_f44d_4307_bdea_832b32b84007.slice/crio-7a6e0b568be197364a8e0b61cd95d01dcb36d7720cd6efd9e93faccf807166fe WatchSource:0}: Error finding container 7a6e0b568be197364a8e0b61cd95d01dcb36d7720cd6efd9e93faccf807166fe: Status 404 returned error can't find the container with id 7a6e0b568be197364a8e0b61cd95d01dcb36d7720cd6efd9e93faccf807166fe Mar 13 01:48:07.418070 master-0 kubenswrapper[19170]: I0313 01:48:07.417997 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"598c8651-f44d-4307-bdea-832b32b84007","Type":"ContainerStarted","Data":"4f3863ff9c2666d82760d69d01d1b7735a7b0ee872c1b99efb8eb9d281706fa6"} Mar 13 01:48:07.418070 master-0 kubenswrapper[19170]: I0313 01:48:07.418054 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"598c8651-f44d-4307-bdea-832b32b84007","Type":"ContainerStarted","Data":"0871168d0063712c94844acaa320a7fcb533f82e19f8c6607749c4b287241ff5"} Mar 13 01:48:07.418070 master-0 kubenswrapper[19170]: I0313 01:48:07.418067 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"598c8651-f44d-4307-bdea-832b32b84007","Type":"ContainerStarted","Data":"7a6e0b568be197364a8e0b61cd95d01dcb36d7720cd6efd9e93faccf807166fe"} Mar 13 01:48:07.451153 master-0 kubenswrapper[19170]: I0313 01:48:07.451059 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.451038692 podStartE2EDuration="2.451038692s" podCreationTimestamp="2026-03-13 01:48:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:48:07.442489563 +0000 UTC m=+1748.250610553" watchObservedRunningTime="2026-03-13 01:48:07.451038692 +0000 UTC m=+1748.259159652" Mar 13 01:48:07.474447 master-0 kubenswrapper[19170]: I0313 
01:48:07.472184 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0939bd46-550b-47b8-b313-eb2432087a7b" path="/var/lib/kubelet/pods/0939bd46-550b-47b8-b313-eb2432087a7b/volumes" Mar 13 01:48:07.474447 master-0 kubenswrapper[19170]: I0313 01:48:07.472962 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95380989-f152-40e9-82a4-3bc9c091a8db" path="/var/lib/kubelet/pods/95380989-f152-40e9-82a4-3bc9c091a8db/volumes" Mar 13 01:48:07.474447 master-0 kubenswrapper[19170]: I0313 01:48:07.473818 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1aa8db4a-1c6d-42fd-bb71-ff4664c74889","Type":"ContainerStarted","Data":"15d09515e8469b827bc0e6b4334ba773efc28fe76ff892361d60513437827db2"} Mar 13 01:48:07.474447 master-0 kubenswrapper[19170]: I0313 01:48:07.473849 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1aa8db4a-1c6d-42fd-bb71-ff4664c74889","Type":"ContainerStarted","Data":"cf0a4e49c20f307d39dbd7737c2ee60126fe0660f3b228fdc7a0ee73cd647911"} Mar 13 01:48:07.508548 master-0 kubenswrapper[19170]: I0313 01:48:07.508448 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.508425995 podStartE2EDuration="2.508425995s" podCreationTimestamp="2026-03-13 01:48:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:48:07.472134622 +0000 UTC m=+1748.280255602" watchObservedRunningTime="2026-03-13 01:48:07.508425995 +0000 UTC m=+1748.316546965" Mar 13 01:48:09.568363 master-0 kubenswrapper[19170]: I0313 01:48:09.568122 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 13 01:48:09.569115 master-0 kubenswrapper[19170]: I0313 01:48:09.568918 19170 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/nova-api-0" Mar 13 01:48:09.569115 master-0 kubenswrapper[19170]: I0313 01:48:09.568960 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 01:48:09.580065 master-0 kubenswrapper[19170]: I0313 01:48:09.579465 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 13 01:48:10.480358 master-0 kubenswrapper[19170]: I0313 01:48:10.480277 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 01:48:10.484425 master-0 kubenswrapper[19170]: I0313 01:48:10.484352 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 13 01:48:10.800722 master-0 kubenswrapper[19170]: I0313 01:48:10.800670 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-94597dfc-nlqdq"] Mar 13 01:48:10.805042 master-0 kubenswrapper[19170]: I0313 01:48:10.804435 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-94597dfc-nlqdq" Mar 13 01:48:10.874894 master-0 kubenswrapper[19170]: I0313 01:48:10.874825 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-94597dfc-nlqdq"] Mar 13 01:48:10.914302 master-0 kubenswrapper[19170]: I0313 01:48:10.914208 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 13 01:48:10.964475 master-0 kubenswrapper[19170]: I0313 01:48:10.964297 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/923ad663-dbaa-4e1c-ae76-a4d3b73abd09-dns-svc\") pod \"dnsmasq-dns-94597dfc-nlqdq\" (UID: \"923ad663-dbaa-4e1c-ae76-a4d3b73abd09\") " pod="openstack/dnsmasq-dns-94597dfc-nlqdq" Mar 13 01:48:10.965154 master-0 kubenswrapper[19170]: I0313 01:48:10.964608 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/923ad663-dbaa-4e1c-ae76-a4d3b73abd09-config\") pod \"dnsmasq-dns-94597dfc-nlqdq\" (UID: \"923ad663-dbaa-4e1c-ae76-a4d3b73abd09\") " pod="openstack/dnsmasq-dns-94597dfc-nlqdq" Mar 13 01:48:10.965978 master-0 kubenswrapper[19170]: I0313 01:48:10.965920 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 13 01:48:10.966047 master-0 kubenswrapper[19170]: I0313 01:48:10.965998 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4h5w\" (UniqueName: \"kubernetes.io/projected/923ad663-dbaa-4e1c-ae76-a4d3b73abd09-kube-api-access-q4h5w\") pod \"dnsmasq-dns-94597dfc-nlqdq\" (UID: \"923ad663-dbaa-4e1c-ae76-a4d3b73abd09\") " pod="openstack/dnsmasq-dns-94597dfc-nlqdq" Mar 13 01:48:10.966773 master-0 kubenswrapper[19170]: I0313 01:48:10.966365 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-metadata-0" Mar 13 01:48:10.970275 master-0 kubenswrapper[19170]: I0313 01:48:10.969877 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/923ad663-dbaa-4e1c-ae76-a4d3b73abd09-ovsdbserver-nb\") pod \"dnsmasq-dns-94597dfc-nlqdq\" (UID: \"923ad663-dbaa-4e1c-ae76-a4d3b73abd09\") " pod="openstack/dnsmasq-dns-94597dfc-nlqdq" Mar 13 01:48:10.970275 master-0 kubenswrapper[19170]: I0313 01:48:10.969943 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/923ad663-dbaa-4e1c-ae76-a4d3b73abd09-dns-swift-storage-0\") pod \"dnsmasq-dns-94597dfc-nlqdq\" (UID: \"923ad663-dbaa-4e1c-ae76-a4d3b73abd09\") " pod="openstack/dnsmasq-dns-94597dfc-nlqdq" Mar 13 01:48:10.970275 master-0 kubenswrapper[19170]: I0313 01:48:10.970196 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/923ad663-dbaa-4e1c-ae76-a4d3b73abd09-ovsdbserver-sb\") pod \"dnsmasq-dns-94597dfc-nlqdq\" (UID: \"923ad663-dbaa-4e1c-ae76-a4d3b73abd09\") " pod="openstack/dnsmasq-dns-94597dfc-nlqdq" Mar 13 01:48:11.073019 master-0 kubenswrapper[19170]: I0313 01:48:11.072847 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4h5w\" (UniqueName: \"kubernetes.io/projected/923ad663-dbaa-4e1c-ae76-a4d3b73abd09-kube-api-access-q4h5w\") pod \"dnsmasq-dns-94597dfc-nlqdq\" (UID: \"923ad663-dbaa-4e1c-ae76-a4d3b73abd09\") " pod="openstack/dnsmasq-dns-94597dfc-nlqdq" Mar 13 01:48:11.073019 master-0 kubenswrapper[19170]: I0313 01:48:11.072946 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/923ad663-dbaa-4e1c-ae76-a4d3b73abd09-ovsdbserver-nb\") pod 
\"dnsmasq-dns-94597dfc-nlqdq\" (UID: \"923ad663-dbaa-4e1c-ae76-a4d3b73abd09\") " pod="openstack/dnsmasq-dns-94597dfc-nlqdq" Mar 13 01:48:11.073019 master-0 kubenswrapper[19170]: I0313 01:48:11.072972 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/923ad663-dbaa-4e1c-ae76-a4d3b73abd09-dns-swift-storage-0\") pod \"dnsmasq-dns-94597dfc-nlqdq\" (UID: \"923ad663-dbaa-4e1c-ae76-a4d3b73abd09\") " pod="openstack/dnsmasq-dns-94597dfc-nlqdq" Mar 13 01:48:11.073331 master-0 kubenswrapper[19170]: I0313 01:48:11.073052 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/923ad663-dbaa-4e1c-ae76-a4d3b73abd09-ovsdbserver-sb\") pod \"dnsmasq-dns-94597dfc-nlqdq\" (UID: \"923ad663-dbaa-4e1c-ae76-a4d3b73abd09\") " pod="openstack/dnsmasq-dns-94597dfc-nlqdq" Mar 13 01:48:11.073331 master-0 kubenswrapper[19170]: I0313 01:48:11.073118 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/923ad663-dbaa-4e1c-ae76-a4d3b73abd09-dns-svc\") pod \"dnsmasq-dns-94597dfc-nlqdq\" (UID: \"923ad663-dbaa-4e1c-ae76-a4d3b73abd09\") " pod="openstack/dnsmasq-dns-94597dfc-nlqdq" Mar 13 01:48:11.074009 master-0 kubenswrapper[19170]: I0313 01:48:11.073973 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/923ad663-dbaa-4e1c-ae76-a4d3b73abd09-dns-svc\") pod \"dnsmasq-dns-94597dfc-nlqdq\" (UID: \"923ad663-dbaa-4e1c-ae76-a4d3b73abd09\") " pod="openstack/dnsmasq-dns-94597dfc-nlqdq" Mar 13 01:48:11.074090 master-0 kubenswrapper[19170]: I0313 01:48:11.074056 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/923ad663-dbaa-4e1c-ae76-a4d3b73abd09-dns-swift-storage-0\") pod 
\"dnsmasq-dns-94597dfc-nlqdq\" (UID: \"923ad663-dbaa-4e1c-ae76-a4d3b73abd09\") " pod="openstack/dnsmasq-dns-94597dfc-nlqdq" Mar 13 01:48:11.074404 master-0 kubenswrapper[19170]: I0313 01:48:11.074207 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/923ad663-dbaa-4e1c-ae76-a4d3b73abd09-config\") pod \"dnsmasq-dns-94597dfc-nlqdq\" (UID: \"923ad663-dbaa-4e1c-ae76-a4d3b73abd09\") " pod="openstack/dnsmasq-dns-94597dfc-nlqdq" Mar 13 01:48:11.074404 master-0 kubenswrapper[19170]: I0313 01:48:11.074222 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/923ad663-dbaa-4e1c-ae76-a4d3b73abd09-ovsdbserver-sb\") pod \"dnsmasq-dns-94597dfc-nlqdq\" (UID: \"923ad663-dbaa-4e1c-ae76-a4d3b73abd09\") " pod="openstack/dnsmasq-dns-94597dfc-nlqdq" Mar 13 01:48:11.075027 master-0 kubenswrapper[19170]: I0313 01:48:11.074907 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/923ad663-dbaa-4e1c-ae76-a4d3b73abd09-ovsdbserver-nb\") pod \"dnsmasq-dns-94597dfc-nlqdq\" (UID: \"923ad663-dbaa-4e1c-ae76-a4d3b73abd09\") " pod="openstack/dnsmasq-dns-94597dfc-nlqdq" Mar 13 01:48:11.075172 master-0 kubenswrapper[19170]: I0313 01:48:11.075070 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/923ad663-dbaa-4e1c-ae76-a4d3b73abd09-config\") pod \"dnsmasq-dns-94597dfc-nlqdq\" (UID: \"923ad663-dbaa-4e1c-ae76-a4d3b73abd09\") " pod="openstack/dnsmasq-dns-94597dfc-nlqdq" Mar 13 01:48:11.091215 master-0 kubenswrapper[19170]: I0313 01:48:11.091153 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4h5w\" (UniqueName: \"kubernetes.io/projected/923ad663-dbaa-4e1c-ae76-a4d3b73abd09-kube-api-access-q4h5w\") pod \"dnsmasq-dns-94597dfc-nlqdq\" (UID: 
\"923ad663-dbaa-4e1c-ae76-a4d3b73abd09\") " pod="openstack/dnsmasq-dns-94597dfc-nlqdq" Mar 13 01:48:11.160112 master-0 kubenswrapper[19170]: I0313 01:48:11.160052 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-94597dfc-nlqdq" Mar 13 01:48:11.704340 master-0 kubenswrapper[19170]: I0313 01:48:11.703697 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-94597dfc-nlqdq"] Mar 13 01:48:12.539920 master-0 kubenswrapper[19170]: I0313 01:48:12.539882 19170 generic.go:334] "Generic (PLEG): container finished" podID="923ad663-dbaa-4e1c-ae76-a4d3b73abd09" containerID="df63c1102112125cfab61a23ebdef561ea8c46b257611aa6ff14b937acbb841d" exitCode=0 Mar 13 01:48:12.540554 master-0 kubenswrapper[19170]: I0313 01:48:12.539992 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94597dfc-nlqdq" event={"ID":"923ad663-dbaa-4e1c-ae76-a4d3b73abd09","Type":"ContainerDied","Data":"df63c1102112125cfab61a23ebdef561ea8c46b257611aa6ff14b937acbb841d"} Mar 13 01:48:12.540615 master-0 kubenswrapper[19170]: I0313 01:48:12.540581 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94597dfc-nlqdq" event={"ID":"923ad663-dbaa-4e1c-ae76-a4d3b73abd09","Type":"ContainerStarted","Data":"2211b1b6dc09c837c02fa8a00515d79251549cee0074d78105d62726a52fdcc5"} Mar 13 01:48:13.105332 master-0 kubenswrapper[19170]: I0313 01:48:13.105170 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 01:48:13.556784 master-0 kubenswrapper[19170]: I0313 01:48:13.556574 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f61f4a6e-39e1-426c-9b4d-5a9535df29d8" containerName="nova-api-log" containerID="cri-o://dcce912749dec2e11833ef057a845fb51ff4b9343f59c5a4e8400e1211e8229a" gracePeriod=30 Mar 13 01:48:13.557801 master-0 kubenswrapper[19170]: I0313 01:48:13.557769 19170 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94597dfc-nlqdq" event={"ID":"923ad663-dbaa-4e1c-ae76-a4d3b73abd09","Type":"ContainerStarted","Data":"93584cfedd0f72a83344f190bc85cc85805f40404a096153436dd4107d9a1dfd"} Mar 13 01:48:13.557883 master-0 kubenswrapper[19170]: I0313 01:48:13.557808 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-94597dfc-nlqdq" Mar 13 01:48:13.558653 master-0 kubenswrapper[19170]: I0313 01:48:13.558154 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f61f4a6e-39e1-426c-9b4d-5a9535df29d8" containerName="nova-api-api" containerID="cri-o://9dd88a61bbda3e448c9fccccdd4c9f368767fa4b330ac2bf32f0a0e6c88e8835" gracePeriod=30 Mar 13 01:48:13.599996 master-0 kubenswrapper[19170]: I0313 01:48:13.599902 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-94597dfc-nlqdq" podStartSLOduration=3.59988201 podStartE2EDuration="3.59988201s" podCreationTimestamp="2026-03-13 01:48:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:48:13.588169953 +0000 UTC m=+1754.396290913" watchObservedRunningTime="2026-03-13 01:48:13.59988201 +0000 UTC m=+1754.408002970" Mar 13 01:48:14.573498 master-0 kubenswrapper[19170]: I0313 01:48:14.573429 19170 generic.go:334] "Generic (PLEG): container finished" podID="f61f4a6e-39e1-426c-9b4d-5a9535df29d8" containerID="dcce912749dec2e11833ef057a845fb51ff4b9343f59c5a4e8400e1211e8229a" exitCode=143 Mar 13 01:48:14.574050 master-0 kubenswrapper[19170]: I0313 01:48:14.573534 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f61f4a6e-39e1-426c-9b4d-5a9535df29d8","Type":"ContainerDied","Data":"dcce912749dec2e11833ef057a845fb51ff4b9343f59c5a4e8400e1211e8229a"} Mar 13 01:48:15.914207 master-0 kubenswrapper[19170]: I0313 
01:48:15.914082 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 13 01:48:15.952831 master-0 kubenswrapper[19170]: I0313 01:48:15.951333 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 13 01:48:15.971699 master-0 kubenswrapper[19170]: I0313 01:48:15.970568 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 13 01:48:15.971699 master-0 kubenswrapper[19170]: I0313 01:48:15.970673 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 13 01:48:16.658529 master-0 kubenswrapper[19170]: I0313 01:48:16.658445 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 13 01:48:16.982289 master-0 kubenswrapper[19170]: I0313 01:48:16.981816 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="598c8651-f44d-4307-bdea-832b32b84007" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.128.1.19:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 01:48:16.983406 master-0 kubenswrapper[19170]: I0313 01:48:16.983335 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="598c8651-f44d-4307-bdea-832b32b84007" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.128.1.19:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 01:48:17.008671 master-0 kubenswrapper[19170]: I0313 01:48:17.007694 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-hn6dr"] Mar 13 01:48:17.009410 master-0 kubenswrapper[19170]: I0313 01:48:17.009272 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hn6dr" Mar 13 01:48:17.015848 master-0 kubenswrapper[19170]: I0313 01:48:17.015798 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 13 01:48:17.017010 master-0 kubenswrapper[19170]: I0313 01:48:17.016372 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 13 01:48:17.048742 master-0 kubenswrapper[19170]: I0313 01:48:17.044778 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-host-discover-6llfb"] Mar 13 01:48:17.048742 master-0 kubenswrapper[19170]: I0313 01:48:17.046263 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-host-discover-6llfb" Mar 13 01:48:17.079442 master-0 kubenswrapper[19170]: I0313 01:48:17.075070 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-hn6dr"] Mar 13 01:48:17.128650 master-0 kubenswrapper[19170]: I0313 01:48:17.128569 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-host-discover-6llfb"] Mar 13 01:48:17.177304 master-0 kubenswrapper[19170]: I0313 01:48:17.177221 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afae2f07-106f-4fd9-a675-1dd722a9123f-combined-ca-bundle\") pod \"nova-cell1-host-discover-6llfb\" (UID: \"afae2f07-106f-4fd9-a675-1dd722a9123f\") " pod="openstack/nova-cell1-host-discover-6llfb" Mar 13 01:48:17.177502 master-0 kubenswrapper[19170]: I0313 01:48:17.177332 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpgtd\" (UniqueName: \"kubernetes.io/projected/18bae58b-c530-4053-add2-e0650c3fdbe5-kube-api-access-jpgtd\") pod \"nova-cell1-cell-mapping-hn6dr\" (UID: \"18bae58b-c530-4053-add2-e0650c3fdbe5\") 
" pod="openstack/nova-cell1-cell-mapping-hn6dr" Mar 13 01:48:17.177546 master-0 kubenswrapper[19170]: I0313 01:48:17.177479 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjjvq\" (UniqueName: \"kubernetes.io/projected/afae2f07-106f-4fd9-a675-1dd722a9123f-kube-api-access-pjjvq\") pod \"nova-cell1-host-discover-6llfb\" (UID: \"afae2f07-106f-4fd9-a675-1dd722a9123f\") " pod="openstack/nova-cell1-host-discover-6llfb" Mar 13 01:48:17.177908 master-0 kubenswrapper[19170]: I0313 01:48:17.177786 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afae2f07-106f-4fd9-a675-1dd722a9123f-scripts\") pod \"nova-cell1-host-discover-6llfb\" (UID: \"afae2f07-106f-4fd9-a675-1dd722a9123f\") " pod="openstack/nova-cell1-host-discover-6llfb" Mar 13 01:48:17.177961 master-0 kubenswrapper[19170]: I0313 01:48:17.177931 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18bae58b-c530-4053-add2-e0650c3fdbe5-config-data\") pod \"nova-cell1-cell-mapping-hn6dr\" (UID: \"18bae58b-c530-4053-add2-e0650c3fdbe5\") " pod="openstack/nova-cell1-cell-mapping-hn6dr" Mar 13 01:48:17.179274 master-0 kubenswrapper[19170]: I0313 01:48:17.178164 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afae2f07-106f-4fd9-a675-1dd722a9123f-config-data\") pod \"nova-cell1-host-discover-6llfb\" (UID: \"afae2f07-106f-4fd9-a675-1dd722a9123f\") " pod="openstack/nova-cell1-host-discover-6llfb" Mar 13 01:48:17.179274 master-0 kubenswrapper[19170]: I0313 01:48:17.178244 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/18bae58b-c530-4053-add2-e0650c3fdbe5-scripts\") pod \"nova-cell1-cell-mapping-hn6dr\" (UID: \"18bae58b-c530-4053-add2-e0650c3fdbe5\") " pod="openstack/nova-cell1-cell-mapping-hn6dr" Mar 13 01:48:17.179274 master-0 kubenswrapper[19170]: I0313 01:48:17.178362 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18bae58b-c530-4053-add2-e0650c3fdbe5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hn6dr\" (UID: \"18bae58b-c530-4053-add2-e0650c3fdbe5\") " pod="openstack/nova-cell1-cell-mapping-hn6dr" Mar 13 01:48:17.280687 master-0 kubenswrapper[19170]: I0313 01:48:17.280573 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18bae58b-c530-4053-add2-e0650c3fdbe5-config-data\") pod \"nova-cell1-cell-mapping-hn6dr\" (UID: \"18bae58b-c530-4053-add2-e0650c3fdbe5\") " pod="openstack/nova-cell1-cell-mapping-hn6dr" Mar 13 01:48:17.280863 master-0 kubenswrapper[19170]: I0313 01:48:17.280695 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afae2f07-106f-4fd9-a675-1dd722a9123f-config-data\") pod \"nova-cell1-host-discover-6llfb\" (UID: \"afae2f07-106f-4fd9-a675-1dd722a9123f\") " pod="openstack/nova-cell1-host-discover-6llfb" Mar 13 01:48:17.280863 master-0 kubenswrapper[19170]: I0313 01:48:17.280727 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18bae58b-c530-4053-add2-e0650c3fdbe5-scripts\") pod \"nova-cell1-cell-mapping-hn6dr\" (UID: \"18bae58b-c530-4053-add2-e0650c3fdbe5\") " pod="openstack/nova-cell1-cell-mapping-hn6dr" Mar 13 01:48:17.280863 master-0 kubenswrapper[19170]: I0313 01:48:17.280765 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/18bae58b-c530-4053-add2-e0650c3fdbe5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hn6dr\" (UID: \"18bae58b-c530-4053-add2-e0650c3fdbe5\") " pod="openstack/nova-cell1-cell-mapping-hn6dr" Mar 13 01:48:17.280863 master-0 kubenswrapper[19170]: I0313 01:48:17.280840 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afae2f07-106f-4fd9-a675-1dd722a9123f-combined-ca-bundle\") pod \"nova-cell1-host-discover-6llfb\" (UID: \"afae2f07-106f-4fd9-a675-1dd722a9123f\") " pod="openstack/nova-cell1-host-discover-6llfb" Mar 13 01:48:17.281001 master-0 kubenswrapper[19170]: I0313 01:48:17.280862 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpgtd\" (UniqueName: \"kubernetes.io/projected/18bae58b-c530-4053-add2-e0650c3fdbe5-kube-api-access-jpgtd\") pod \"nova-cell1-cell-mapping-hn6dr\" (UID: \"18bae58b-c530-4053-add2-e0650c3fdbe5\") " pod="openstack/nova-cell1-cell-mapping-hn6dr" Mar 13 01:48:17.281001 master-0 kubenswrapper[19170]: I0313 01:48:17.280893 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjjvq\" (UniqueName: \"kubernetes.io/projected/afae2f07-106f-4fd9-a675-1dd722a9123f-kube-api-access-pjjvq\") pod \"nova-cell1-host-discover-6llfb\" (UID: \"afae2f07-106f-4fd9-a675-1dd722a9123f\") " pod="openstack/nova-cell1-host-discover-6llfb" Mar 13 01:48:17.281001 master-0 kubenswrapper[19170]: I0313 01:48:17.280941 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afae2f07-106f-4fd9-a675-1dd722a9123f-scripts\") pod \"nova-cell1-host-discover-6llfb\" (UID: \"afae2f07-106f-4fd9-a675-1dd722a9123f\") " pod="openstack/nova-cell1-host-discover-6llfb" Mar 13 01:48:17.291743 master-0 kubenswrapper[19170]: I0313 01:48:17.285211 19170 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afae2f07-106f-4fd9-a675-1dd722a9123f-combined-ca-bundle\") pod \"nova-cell1-host-discover-6llfb\" (UID: \"afae2f07-106f-4fd9-a675-1dd722a9123f\") " pod="openstack/nova-cell1-host-discover-6llfb" Mar 13 01:48:17.291743 master-0 kubenswrapper[19170]: I0313 01:48:17.286080 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afae2f07-106f-4fd9-a675-1dd722a9123f-scripts\") pod \"nova-cell1-host-discover-6llfb\" (UID: \"afae2f07-106f-4fd9-a675-1dd722a9123f\") " pod="openstack/nova-cell1-host-discover-6llfb" Mar 13 01:48:17.291743 master-0 kubenswrapper[19170]: I0313 01:48:17.289868 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 01:48:17.291743 master-0 kubenswrapper[19170]: I0313 01:48:17.289871 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18bae58b-c530-4053-add2-e0650c3fdbe5-config-data\") pod \"nova-cell1-cell-mapping-hn6dr\" (UID: \"18bae58b-c530-4053-add2-e0650c3fdbe5\") " pod="openstack/nova-cell1-cell-mapping-hn6dr" Mar 13 01:48:17.291743 master-0 kubenswrapper[19170]: I0313 01:48:17.290411 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afae2f07-106f-4fd9-a675-1dd722a9123f-config-data\") pod \"nova-cell1-host-discover-6llfb\" (UID: \"afae2f07-106f-4fd9-a675-1dd722a9123f\") " pod="openstack/nova-cell1-host-discover-6llfb" Mar 13 01:48:17.293172 master-0 kubenswrapper[19170]: I0313 01:48:17.293152 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18bae58b-c530-4053-add2-e0650c3fdbe5-scripts\") pod \"nova-cell1-cell-mapping-hn6dr\" (UID: \"18bae58b-c530-4053-add2-e0650c3fdbe5\") " 
pod="openstack/nova-cell1-cell-mapping-hn6dr" Mar 13 01:48:17.296480 master-0 kubenswrapper[19170]: I0313 01:48:17.296449 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18bae58b-c530-4053-add2-e0650c3fdbe5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hn6dr\" (UID: \"18bae58b-c530-4053-add2-e0650c3fdbe5\") " pod="openstack/nova-cell1-cell-mapping-hn6dr" Mar 13 01:48:17.298771 master-0 kubenswrapper[19170]: I0313 01:48:17.298745 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpgtd\" (UniqueName: \"kubernetes.io/projected/18bae58b-c530-4053-add2-e0650c3fdbe5-kube-api-access-jpgtd\") pod \"nova-cell1-cell-mapping-hn6dr\" (UID: \"18bae58b-c530-4053-add2-e0650c3fdbe5\") " pod="openstack/nova-cell1-cell-mapping-hn6dr" Mar 13 01:48:17.301982 master-0 kubenswrapper[19170]: I0313 01:48:17.301965 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjjvq\" (UniqueName: \"kubernetes.io/projected/afae2f07-106f-4fd9-a675-1dd722a9123f-kube-api-access-pjjvq\") pod \"nova-cell1-host-discover-6llfb\" (UID: \"afae2f07-106f-4fd9-a675-1dd722a9123f\") " pod="openstack/nova-cell1-host-discover-6llfb" Mar 13 01:48:17.394205 master-0 kubenswrapper[19170]: I0313 01:48:17.393728 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hn6dr" Mar 13 01:48:17.415507 master-0 kubenswrapper[19170]: I0313 01:48:17.415011 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-host-discover-6llfb" Mar 13 01:48:17.425099 master-0 kubenswrapper[19170]: I0313 01:48:17.425065 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f61f4a6e-39e1-426c-9b4d-5a9535df29d8-combined-ca-bundle\") pod \"f61f4a6e-39e1-426c-9b4d-5a9535df29d8\" (UID: \"f61f4a6e-39e1-426c-9b4d-5a9535df29d8\") " Mar 13 01:48:17.425430 master-0 kubenswrapper[19170]: I0313 01:48:17.425412 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4jw5\" (UniqueName: \"kubernetes.io/projected/f61f4a6e-39e1-426c-9b4d-5a9535df29d8-kube-api-access-g4jw5\") pod \"f61f4a6e-39e1-426c-9b4d-5a9535df29d8\" (UID: \"f61f4a6e-39e1-426c-9b4d-5a9535df29d8\") " Mar 13 01:48:17.425576 master-0 kubenswrapper[19170]: I0313 01:48:17.425563 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f61f4a6e-39e1-426c-9b4d-5a9535df29d8-logs\") pod \"f61f4a6e-39e1-426c-9b4d-5a9535df29d8\" (UID: \"f61f4a6e-39e1-426c-9b4d-5a9535df29d8\") " Mar 13 01:48:17.425802 master-0 kubenswrapper[19170]: I0313 01:48:17.425782 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f61f4a6e-39e1-426c-9b4d-5a9535df29d8-config-data\") pod \"f61f4a6e-39e1-426c-9b4d-5a9535df29d8\" (UID: \"f61f4a6e-39e1-426c-9b4d-5a9535df29d8\") " Mar 13 01:48:17.426980 master-0 kubenswrapper[19170]: I0313 01:48:17.426458 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f61f4a6e-39e1-426c-9b4d-5a9535df29d8-logs" (OuterVolumeSpecName: "logs") pod "f61f4a6e-39e1-426c-9b4d-5a9535df29d8" (UID: "f61f4a6e-39e1-426c-9b4d-5a9535df29d8"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 01:48:17.430795 master-0 kubenswrapper[19170]: I0313 01:48:17.430752 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f61f4a6e-39e1-426c-9b4d-5a9535df29d8-kube-api-access-g4jw5" (OuterVolumeSpecName: "kube-api-access-g4jw5") pod "f61f4a6e-39e1-426c-9b4d-5a9535df29d8" (UID: "f61f4a6e-39e1-426c-9b4d-5a9535df29d8"). InnerVolumeSpecName "kube-api-access-g4jw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:48:17.463745 master-0 kubenswrapper[19170]: I0313 01:48:17.462586 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f61f4a6e-39e1-426c-9b4d-5a9535df29d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f61f4a6e-39e1-426c-9b4d-5a9535df29d8" (UID: "f61f4a6e-39e1-426c-9b4d-5a9535df29d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:48:17.465835 master-0 kubenswrapper[19170]: I0313 01:48:17.465786 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f61f4a6e-39e1-426c-9b4d-5a9535df29d8-config-data" (OuterVolumeSpecName: "config-data") pod "f61f4a6e-39e1-426c-9b4d-5a9535df29d8" (UID: "f61f4a6e-39e1-426c-9b4d-5a9535df29d8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:48:17.536675 master-0 kubenswrapper[19170]: I0313 01:48:17.536357 19170 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f61f4a6e-39e1-426c-9b4d-5a9535df29d8-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:48:17.536675 master-0 kubenswrapper[19170]: I0313 01:48:17.536401 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4jw5\" (UniqueName: \"kubernetes.io/projected/f61f4a6e-39e1-426c-9b4d-5a9535df29d8-kube-api-access-g4jw5\") on node \"master-0\" DevicePath \"\"" Mar 13 01:48:17.536675 master-0 kubenswrapper[19170]: I0313 01:48:17.536415 19170 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f61f4a6e-39e1-426c-9b4d-5a9535df29d8-logs\") on node \"master-0\" DevicePath \"\"" Mar 13 01:48:17.536675 master-0 kubenswrapper[19170]: I0313 01:48:17.536426 19170 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f61f4a6e-39e1-426c-9b4d-5a9535df29d8-config-data\") on node \"master-0\" DevicePath \"\"" Mar 13 01:48:17.652386 master-0 kubenswrapper[19170]: I0313 01:48:17.652328 19170 generic.go:334] "Generic (PLEG): container finished" podID="f61f4a6e-39e1-426c-9b4d-5a9535df29d8" containerID="9dd88a61bbda3e448c9fccccdd4c9f368767fa4b330ac2bf32f0a0e6c88e8835" exitCode=0 Mar 13 01:48:17.652828 master-0 kubenswrapper[19170]: I0313 01:48:17.652791 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 01:48:17.653295 master-0 kubenswrapper[19170]: I0313 01:48:17.653260 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f61f4a6e-39e1-426c-9b4d-5a9535df29d8","Type":"ContainerDied","Data":"9dd88a61bbda3e448c9fccccdd4c9f368767fa4b330ac2bf32f0a0e6c88e8835"} Mar 13 01:48:17.653344 master-0 kubenswrapper[19170]: I0313 01:48:17.653299 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f61f4a6e-39e1-426c-9b4d-5a9535df29d8","Type":"ContainerDied","Data":"9f43025aa3f2ff1e24a32c13ae0d7b92269e3f8e37b36dce4f2d3a698e26855f"} Mar 13 01:48:17.653344 master-0 kubenswrapper[19170]: I0313 01:48:17.653316 19170 scope.go:117] "RemoveContainer" containerID="9dd88a61bbda3e448c9fccccdd4c9f368767fa4b330ac2bf32f0a0e6c88e8835" Mar 13 01:48:17.686999 master-0 kubenswrapper[19170]: I0313 01:48:17.686954 19170 scope.go:117] "RemoveContainer" containerID="dcce912749dec2e11833ef057a845fb51ff4b9343f59c5a4e8400e1211e8229a" Mar 13 01:48:17.744698 master-0 kubenswrapper[19170]: I0313 01:48:17.742539 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 01:48:17.777748 master-0 kubenswrapper[19170]: I0313 01:48:17.777536 19170 scope.go:117] "RemoveContainer" containerID="9dd88a61bbda3e448c9fccccdd4c9f368767fa4b330ac2bf32f0a0e6c88e8835" Mar 13 01:48:17.780134 master-0 kubenswrapper[19170]: E0313 01:48:17.780080 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dd88a61bbda3e448c9fccccdd4c9f368767fa4b330ac2bf32f0a0e6c88e8835\": container with ID starting with 9dd88a61bbda3e448c9fccccdd4c9f368767fa4b330ac2bf32f0a0e6c88e8835 not found: ID does not exist" containerID="9dd88a61bbda3e448c9fccccdd4c9f368767fa4b330ac2bf32f0a0e6c88e8835" Mar 13 01:48:17.780580 master-0 kubenswrapper[19170]: I0313 01:48:17.780153 19170 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dd88a61bbda3e448c9fccccdd4c9f368767fa4b330ac2bf32f0a0e6c88e8835"} err="failed to get container status \"9dd88a61bbda3e448c9fccccdd4c9f368767fa4b330ac2bf32f0a0e6c88e8835\": rpc error: code = NotFound desc = could not find container \"9dd88a61bbda3e448c9fccccdd4c9f368767fa4b330ac2bf32f0a0e6c88e8835\": container with ID starting with 9dd88a61bbda3e448c9fccccdd4c9f368767fa4b330ac2bf32f0a0e6c88e8835 not found: ID does not exist" Mar 13 01:48:17.780580 master-0 kubenswrapper[19170]: I0313 01:48:17.780576 19170 scope.go:117] "RemoveContainer" containerID="dcce912749dec2e11833ef057a845fb51ff4b9343f59c5a4e8400e1211e8229a" Mar 13 01:48:17.781050 master-0 kubenswrapper[19170]: E0313 01:48:17.781000 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcce912749dec2e11833ef057a845fb51ff4b9343f59c5a4e8400e1211e8229a\": container with ID starting with dcce912749dec2e11833ef057a845fb51ff4b9343f59c5a4e8400e1211e8229a not found: ID does not exist" containerID="dcce912749dec2e11833ef057a845fb51ff4b9343f59c5a4e8400e1211e8229a" Mar 13 01:48:17.781094 master-0 kubenswrapper[19170]: I0313 01:48:17.781052 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcce912749dec2e11833ef057a845fb51ff4b9343f59c5a4e8400e1211e8229a"} err="failed to get container status \"dcce912749dec2e11833ef057a845fb51ff4b9343f59c5a4e8400e1211e8229a\": rpc error: code = NotFound desc = could not find container \"dcce912749dec2e11833ef057a845fb51ff4b9343f59c5a4e8400e1211e8229a\": container with ID starting with dcce912749dec2e11833ef057a845fb51ff4b9343f59c5a4e8400e1211e8229a not found: ID does not exist" Mar 13 01:48:17.784381 master-0 kubenswrapper[19170]: I0313 01:48:17.784328 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 13 01:48:17.805746 master-0 kubenswrapper[19170]: I0313 
01:48:17.803508 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 13 01:48:17.805746 master-0 kubenswrapper[19170]: E0313 01:48:17.804059 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f61f4a6e-39e1-426c-9b4d-5a9535df29d8" containerName="nova-api-log" Mar 13 01:48:17.805746 master-0 kubenswrapper[19170]: I0313 01:48:17.804073 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="f61f4a6e-39e1-426c-9b4d-5a9535df29d8" containerName="nova-api-log" Mar 13 01:48:17.805746 master-0 kubenswrapper[19170]: E0313 01:48:17.804097 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f61f4a6e-39e1-426c-9b4d-5a9535df29d8" containerName="nova-api-api" Mar 13 01:48:17.805746 master-0 kubenswrapper[19170]: I0313 01:48:17.804105 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="f61f4a6e-39e1-426c-9b4d-5a9535df29d8" containerName="nova-api-api" Mar 13 01:48:17.805746 master-0 kubenswrapper[19170]: I0313 01:48:17.804360 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="f61f4a6e-39e1-426c-9b4d-5a9535df29d8" containerName="nova-api-log" Mar 13 01:48:17.805746 master-0 kubenswrapper[19170]: I0313 01:48:17.804389 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="f61f4a6e-39e1-426c-9b4d-5a9535df29d8" containerName="nova-api-api" Mar 13 01:48:17.805746 master-0 kubenswrapper[19170]: I0313 01:48:17.805661 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 01:48:17.816573 master-0 kubenswrapper[19170]: I0313 01:48:17.808225 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 13 01:48:17.816573 master-0 kubenswrapper[19170]: I0313 01:48:17.808269 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 13 01:48:17.816573 master-0 kubenswrapper[19170]: I0313 01:48:17.808579 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 13 01:48:17.827801 master-0 kubenswrapper[19170]: I0313 01:48:17.822077 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 01:48:17.847589 master-0 kubenswrapper[19170]: I0313 01:48:17.846710 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k646t\" (UniqueName: \"kubernetes.io/projected/367f092d-6e58-4bd9-8451-6c162c45dec9-kube-api-access-k646t\") pod \"nova-api-0\" (UID: \"367f092d-6e58-4bd9-8451-6c162c45dec9\") " pod="openstack/nova-api-0" Mar 13 01:48:17.847589 master-0 kubenswrapper[19170]: I0313 01:48:17.846783 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/367f092d-6e58-4bd9-8451-6c162c45dec9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"367f092d-6e58-4bd9-8451-6c162c45dec9\") " pod="openstack/nova-api-0" Mar 13 01:48:17.847589 master-0 kubenswrapper[19170]: I0313 01:48:17.846927 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/367f092d-6e58-4bd9-8451-6c162c45dec9-logs\") pod \"nova-api-0\" (UID: \"367f092d-6e58-4bd9-8451-6c162c45dec9\") " pod="openstack/nova-api-0" Mar 13 01:48:17.847589 master-0 kubenswrapper[19170]: I0313 01:48:17.846990 19170 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/367f092d-6e58-4bd9-8451-6c162c45dec9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"367f092d-6e58-4bd9-8451-6c162c45dec9\") " pod="openstack/nova-api-0" Mar 13 01:48:17.847589 master-0 kubenswrapper[19170]: I0313 01:48:17.847012 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/367f092d-6e58-4bd9-8451-6c162c45dec9-config-data\") pod \"nova-api-0\" (UID: \"367f092d-6e58-4bd9-8451-6c162c45dec9\") " pod="openstack/nova-api-0" Mar 13 01:48:17.847589 master-0 kubenswrapper[19170]: I0313 01:48:17.847039 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/367f092d-6e58-4bd9-8451-6c162c45dec9-public-tls-certs\") pod \"nova-api-0\" (UID: \"367f092d-6e58-4bd9-8451-6c162c45dec9\") " pod="openstack/nova-api-0" Mar 13 01:48:17.935651 master-0 kubenswrapper[19170]: I0313 01:48:17.935602 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-host-discover-6llfb"] Mar 13 01:48:17.951485 master-0 kubenswrapper[19170]: I0313 01:48:17.950881 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/367f092d-6e58-4bd9-8451-6c162c45dec9-logs\") pod \"nova-api-0\" (UID: \"367f092d-6e58-4bd9-8451-6c162c45dec9\") " pod="openstack/nova-api-0" Mar 13 01:48:17.951678 master-0 kubenswrapper[19170]: I0313 01:48:17.951448 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/367f092d-6e58-4bd9-8451-6c162c45dec9-logs\") pod \"nova-api-0\" (UID: \"367f092d-6e58-4bd9-8451-6c162c45dec9\") " pod="openstack/nova-api-0" Mar 13 01:48:17.951906 master-0 kubenswrapper[19170]: I0313 
01:48:17.951609 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/367f092d-6e58-4bd9-8451-6c162c45dec9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"367f092d-6e58-4bd9-8451-6c162c45dec9\") " pod="openstack/nova-api-0" Mar 13 01:48:17.951906 master-0 kubenswrapper[19170]: I0313 01:48:17.951856 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/367f092d-6e58-4bd9-8451-6c162c45dec9-config-data\") pod \"nova-api-0\" (UID: \"367f092d-6e58-4bd9-8451-6c162c45dec9\") " pod="openstack/nova-api-0" Mar 13 01:48:17.952023 master-0 kubenswrapper[19170]: I0313 01:48:17.951993 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/367f092d-6e58-4bd9-8451-6c162c45dec9-public-tls-certs\") pod \"nova-api-0\" (UID: \"367f092d-6e58-4bd9-8451-6c162c45dec9\") " pod="openstack/nova-api-0" Mar 13 01:48:17.953036 master-0 kubenswrapper[19170]: I0313 01:48:17.953006 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k646t\" (UniqueName: \"kubernetes.io/projected/367f092d-6e58-4bd9-8451-6c162c45dec9-kube-api-access-k646t\") pod \"nova-api-0\" (UID: \"367f092d-6e58-4bd9-8451-6c162c45dec9\") " pod="openstack/nova-api-0" Mar 13 01:48:17.953100 master-0 kubenswrapper[19170]: I0313 01:48:17.953063 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/367f092d-6e58-4bd9-8451-6c162c45dec9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"367f092d-6e58-4bd9-8451-6c162c45dec9\") " pod="openstack/nova-api-0" Mar 13 01:48:17.958778 master-0 kubenswrapper[19170]: I0313 01:48:17.956374 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/367f092d-6e58-4bd9-8451-6c162c45dec9-public-tls-certs\") pod \"nova-api-0\" (UID: \"367f092d-6e58-4bd9-8451-6c162c45dec9\") " pod="openstack/nova-api-0" Mar 13 01:48:17.958778 master-0 kubenswrapper[19170]: I0313 01:48:17.956829 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/367f092d-6e58-4bd9-8451-6c162c45dec9-config-data\") pod \"nova-api-0\" (UID: \"367f092d-6e58-4bd9-8451-6c162c45dec9\") " pod="openstack/nova-api-0" Mar 13 01:48:17.958778 master-0 kubenswrapper[19170]: I0313 01:48:17.957611 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/367f092d-6e58-4bd9-8451-6c162c45dec9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"367f092d-6e58-4bd9-8451-6c162c45dec9\") " pod="openstack/nova-api-0" Mar 13 01:48:17.964346 master-0 kubenswrapper[19170]: I0313 01:48:17.964321 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/367f092d-6e58-4bd9-8451-6c162c45dec9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"367f092d-6e58-4bd9-8451-6c162c45dec9\") " pod="openstack/nova-api-0" Mar 13 01:48:17.978815 master-0 kubenswrapper[19170]: I0313 01:48:17.978701 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k646t\" (UniqueName: \"kubernetes.io/projected/367f092d-6e58-4bd9-8451-6c162c45dec9-kube-api-access-k646t\") pod \"nova-api-0\" (UID: \"367f092d-6e58-4bd9-8451-6c162c45dec9\") " pod="openstack/nova-api-0" Mar 13 01:48:18.041665 master-0 kubenswrapper[19170]: I0313 01:48:18.039822 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-hn6dr"] Mar 13 01:48:18.053936 master-0 kubenswrapper[19170]: W0313 01:48:18.053805 19170 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18bae58b_c530_4053_add2_e0650c3fdbe5.slice/crio-787d130b263585566b7a8f35861c6200cb451003d95493536941513d59dc64d5 WatchSource:0}: Error finding container 787d130b263585566b7a8f35861c6200cb451003d95493536941513d59dc64d5: Status 404 returned error can't find the container with id 787d130b263585566b7a8f35861c6200cb451003d95493536941513d59dc64d5 Mar 13 01:48:18.187284 master-0 kubenswrapper[19170]: I0313 01:48:18.187174 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 01:48:18.682008 master-0 kubenswrapper[19170]: I0313 01:48:18.681129 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hn6dr" event={"ID":"18bae58b-c530-4053-add2-e0650c3fdbe5","Type":"ContainerStarted","Data":"e23951b36e04ccdb8be1d12074f3f357efa90eefd1b8df83b67972f41b332d49"} Mar 13 01:48:18.682008 master-0 kubenswrapper[19170]: I0313 01:48:18.681200 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hn6dr" event={"ID":"18bae58b-c530-4053-add2-e0650c3fdbe5","Type":"ContainerStarted","Data":"787d130b263585566b7a8f35861c6200cb451003d95493536941513d59dc64d5"} Mar 13 01:48:18.687550 master-0 kubenswrapper[19170]: I0313 01:48:18.686452 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 01:48:18.687550 master-0 kubenswrapper[19170]: I0313 01:48:18.687008 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-6llfb" event={"ID":"afae2f07-106f-4fd9-a675-1dd722a9123f","Type":"ContainerStarted","Data":"1fe9cdc889c464d4fe6f9730a6c9f32f8c80f3f74efdb841006f5f289d154a17"} Mar 13 01:48:18.687550 master-0 kubenswrapper[19170]: I0313 01:48:18.687080 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-6llfb" 
event={"ID":"afae2f07-106f-4fd9-a675-1dd722a9123f","Type":"ContainerStarted","Data":"f20b56d5e4148fa55f431509301e405aa2fb68e1836d212e752368e273d2d982"} Mar 13 01:48:18.716522 master-0 kubenswrapper[19170]: I0313 01:48:18.716433 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-hn6dr" podStartSLOduration=2.7164077669999998 podStartE2EDuration="2.716407767s" podCreationTimestamp="2026-03-13 01:48:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:48:18.70077652 +0000 UTC m=+1759.508897500" watchObservedRunningTime="2026-03-13 01:48:18.716407767 +0000 UTC m=+1759.524528737" Mar 13 01:48:18.754075 master-0 kubenswrapper[19170]: I0313 01:48:18.753790 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-host-discover-6llfb" podStartSLOduration=2.753768431 podStartE2EDuration="2.753768431s" podCreationTimestamp="2026-03-13 01:48:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:48:18.731398086 +0000 UTC m=+1759.539519056" watchObservedRunningTime="2026-03-13 01:48:18.753768431 +0000 UTC m=+1759.561889401" Mar 13 01:48:19.462751 master-0 kubenswrapper[19170]: I0313 01:48:19.452604 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f61f4a6e-39e1-426c-9b4d-5a9535df29d8" path="/var/lib/kubelet/pods/f61f4a6e-39e1-426c-9b4d-5a9535df29d8/volumes" Mar 13 01:48:19.703990 master-0 kubenswrapper[19170]: I0313 01:48:19.703922 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"367f092d-6e58-4bd9-8451-6c162c45dec9","Type":"ContainerStarted","Data":"f91ae2c495e2d6f909049feb6b167db793ce841307ad54261152f8be9ff7e995"} Mar 13 01:48:19.703990 master-0 kubenswrapper[19170]: I0313 01:48:19.703995 19170 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"367f092d-6e58-4bd9-8451-6c162c45dec9","Type":"ContainerStarted","Data":"2ce33c761fb90ed94e107a62c17a74b4a84e05e06985a7b9436e3cde6cb9b66a"} Mar 13 01:48:19.704456 master-0 kubenswrapper[19170]: I0313 01:48:19.704012 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"367f092d-6e58-4bd9-8451-6c162c45dec9","Type":"ContainerStarted","Data":"931e0ae93d41b023f245c4f63dadf7dd98b2eef761c484e18c993af1e8300c39"} Mar 13 01:48:19.767442 master-0 kubenswrapper[19170]: I0313 01:48:19.767263 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.767240536 podStartE2EDuration="2.767240536s" podCreationTimestamp="2026-03-13 01:48:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:48:19.746586079 +0000 UTC m=+1760.554707079" watchObservedRunningTime="2026-03-13 01:48:19.767240536 +0000 UTC m=+1760.575361506" Mar 13 01:48:20.721128 master-0 kubenswrapper[19170]: I0313 01:48:20.721060 19170 generic.go:334] "Generic (PLEG): container finished" podID="afae2f07-106f-4fd9-a675-1dd722a9123f" containerID="1fe9cdc889c464d4fe6f9730a6c9f32f8c80f3f74efdb841006f5f289d154a17" exitCode=0 Mar 13 01:48:20.722480 master-0 kubenswrapper[19170]: I0313 01:48:20.722440 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-6llfb" event={"ID":"afae2f07-106f-4fd9-a675-1dd722a9123f","Type":"ContainerDied","Data":"1fe9cdc889c464d4fe6f9730a6c9f32f8c80f3f74efdb841006f5f289d154a17"} Mar 13 01:48:21.163230 master-0 kubenswrapper[19170]: I0313 01:48:21.163126 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-94597dfc-nlqdq" Mar 13 01:48:21.313657 master-0 kubenswrapper[19170]: I0313 01:48:21.313580 19170 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58d8bd468f-mrpg6"] Mar 13 01:48:21.314309 master-0 kubenswrapper[19170]: I0313 01:48:21.314204 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58d8bd468f-mrpg6" podUID="3b8e39a7-f183-4541-82e3-fdcdc6936300" containerName="dnsmasq-dns" containerID="cri-o://90e43b91d2a8801429ec9eb529c7980bb057d8cf813efa37bbf0b220d8777cb3" gracePeriod=10 Mar 13 01:48:21.745040 master-0 kubenswrapper[19170]: I0313 01:48:21.734934 19170 generic.go:334] "Generic (PLEG): container finished" podID="3b8e39a7-f183-4541-82e3-fdcdc6936300" containerID="90e43b91d2a8801429ec9eb529c7980bb057d8cf813efa37bbf0b220d8777cb3" exitCode=0 Mar 13 01:48:21.745040 master-0 kubenswrapper[19170]: I0313 01:48:21.734978 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58d8bd468f-mrpg6" event={"ID":"3b8e39a7-f183-4541-82e3-fdcdc6936300","Type":"ContainerDied","Data":"90e43b91d2a8801429ec9eb529c7980bb057d8cf813efa37bbf0b220d8777cb3"} Mar 13 01:48:21.968713 master-0 kubenswrapper[19170]: I0313 01:48:21.968027 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58d8bd468f-mrpg6" Mar 13 01:48:22.063936 master-0 kubenswrapper[19170]: I0313 01:48:22.063823 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b8e39a7-f183-4541-82e3-fdcdc6936300-ovsdbserver-nb\") pod \"3b8e39a7-f183-4541-82e3-fdcdc6936300\" (UID: \"3b8e39a7-f183-4541-82e3-fdcdc6936300\") " Mar 13 01:48:22.063936 master-0 kubenswrapper[19170]: I0313 01:48:22.063918 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b8e39a7-f183-4541-82e3-fdcdc6936300-dns-swift-storage-0\") pod \"3b8e39a7-f183-4541-82e3-fdcdc6936300\" (UID: \"3b8e39a7-f183-4541-82e3-fdcdc6936300\") " Mar 13 01:48:22.064153 master-0 kubenswrapper[19170]: I0313 01:48:22.063942 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7lml\" (UniqueName: \"kubernetes.io/projected/3b8e39a7-f183-4541-82e3-fdcdc6936300-kube-api-access-x7lml\") pod \"3b8e39a7-f183-4541-82e3-fdcdc6936300\" (UID: \"3b8e39a7-f183-4541-82e3-fdcdc6936300\") " Mar 13 01:48:22.064153 master-0 kubenswrapper[19170]: I0313 01:48:22.063968 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b8e39a7-f183-4541-82e3-fdcdc6936300-config\") pod \"3b8e39a7-f183-4541-82e3-fdcdc6936300\" (UID: \"3b8e39a7-f183-4541-82e3-fdcdc6936300\") " Mar 13 01:48:22.064153 master-0 kubenswrapper[19170]: I0313 01:48:22.064086 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b8e39a7-f183-4541-82e3-fdcdc6936300-ovsdbserver-sb\") pod \"3b8e39a7-f183-4541-82e3-fdcdc6936300\" (UID: \"3b8e39a7-f183-4541-82e3-fdcdc6936300\") " Mar 13 01:48:22.064254 master-0 kubenswrapper[19170]: I0313 01:48:22.064183 19170 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b8e39a7-f183-4541-82e3-fdcdc6936300-dns-svc\") pod \"3b8e39a7-f183-4541-82e3-fdcdc6936300\" (UID: \"3b8e39a7-f183-4541-82e3-fdcdc6936300\") " Mar 13 01:48:22.069520 master-0 kubenswrapper[19170]: I0313 01:48:22.069452 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b8e39a7-f183-4541-82e3-fdcdc6936300-kube-api-access-x7lml" (OuterVolumeSpecName: "kube-api-access-x7lml") pod "3b8e39a7-f183-4541-82e3-fdcdc6936300" (UID: "3b8e39a7-f183-4541-82e3-fdcdc6936300"). InnerVolumeSpecName "kube-api-access-x7lml". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:48:22.187982 master-0 kubenswrapper[19170]: I0313 01:48:22.187934 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7lml\" (UniqueName: \"kubernetes.io/projected/3b8e39a7-f183-4541-82e3-fdcdc6936300-kube-api-access-x7lml\") on node \"master-0\" DevicePath \"\"" Mar 13 01:48:22.193680 master-0 kubenswrapper[19170]: I0313 01:48:22.192151 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b8e39a7-f183-4541-82e3-fdcdc6936300-config" (OuterVolumeSpecName: "config") pod "3b8e39a7-f183-4541-82e3-fdcdc6936300" (UID: "3b8e39a7-f183-4541-82e3-fdcdc6936300"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:48:22.196087 master-0 kubenswrapper[19170]: I0313 01:48:22.196039 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b8e39a7-f183-4541-82e3-fdcdc6936300-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3b8e39a7-f183-4541-82e3-fdcdc6936300" (UID: "3b8e39a7-f183-4541-82e3-fdcdc6936300"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:48:22.198839 master-0 kubenswrapper[19170]: I0313 01:48:22.198749 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b8e39a7-f183-4541-82e3-fdcdc6936300-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3b8e39a7-f183-4541-82e3-fdcdc6936300" (UID: "3b8e39a7-f183-4541-82e3-fdcdc6936300"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:48:22.202253 master-0 kubenswrapper[19170]: I0313 01:48:22.202195 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b8e39a7-f183-4541-82e3-fdcdc6936300-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3b8e39a7-f183-4541-82e3-fdcdc6936300" (UID: "3b8e39a7-f183-4541-82e3-fdcdc6936300"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:48:22.213721 master-0 kubenswrapper[19170]: I0313 01:48:22.213651 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b8e39a7-f183-4541-82e3-fdcdc6936300-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3b8e39a7-f183-4541-82e3-fdcdc6936300" (UID: "3b8e39a7-f183-4541-82e3-fdcdc6936300"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:48:22.249318 master-0 kubenswrapper[19170]: I0313 01:48:22.249270 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-host-discover-6llfb" Mar 13 01:48:22.301720 master-0 kubenswrapper[19170]: I0313 01:48:22.301615 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjjvq\" (UniqueName: \"kubernetes.io/projected/afae2f07-106f-4fd9-a675-1dd722a9123f-kube-api-access-pjjvq\") pod \"afae2f07-106f-4fd9-a675-1dd722a9123f\" (UID: \"afae2f07-106f-4fd9-a675-1dd722a9123f\") " Mar 13 01:48:22.301943 master-0 kubenswrapper[19170]: I0313 01:48:22.301822 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afae2f07-106f-4fd9-a675-1dd722a9123f-config-data\") pod \"afae2f07-106f-4fd9-a675-1dd722a9123f\" (UID: \"afae2f07-106f-4fd9-a675-1dd722a9123f\") " Mar 13 01:48:22.301943 master-0 kubenswrapper[19170]: I0313 01:48:22.301892 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afae2f07-106f-4fd9-a675-1dd722a9123f-scripts\") pod \"afae2f07-106f-4fd9-a675-1dd722a9123f\" (UID: \"afae2f07-106f-4fd9-a675-1dd722a9123f\") " Mar 13 01:48:22.302014 master-0 kubenswrapper[19170]: I0313 01:48:22.301974 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afae2f07-106f-4fd9-a675-1dd722a9123f-combined-ca-bundle\") pod \"afae2f07-106f-4fd9-a675-1dd722a9123f\" (UID: \"afae2f07-106f-4fd9-a675-1dd722a9123f\") " Mar 13 01:48:22.302581 master-0 kubenswrapper[19170]: I0313 01:48:22.302554 19170 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b8e39a7-f183-4541-82e3-fdcdc6936300-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 13 01:48:22.302581 master-0 kubenswrapper[19170]: I0313 01:48:22.302575 19170 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/3b8e39a7-f183-4541-82e3-fdcdc6936300-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 13 01:48:22.302707 master-0 kubenswrapper[19170]: I0313 01:48:22.302587 19170 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b8e39a7-f183-4541-82e3-fdcdc6936300-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 13 01:48:22.302707 master-0 kubenswrapper[19170]: I0313 01:48:22.302598 19170 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b8e39a7-f183-4541-82e3-fdcdc6936300-config\") on node \"master-0\" DevicePath \"\"" Mar 13 01:48:22.302707 master-0 kubenswrapper[19170]: I0313 01:48:22.302607 19170 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b8e39a7-f183-4541-82e3-fdcdc6936300-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 13 01:48:22.304652 master-0 kubenswrapper[19170]: I0313 01:48:22.304597 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afae2f07-106f-4fd9-a675-1dd722a9123f-kube-api-access-pjjvq" (OuterVolumeSpecName: "kube-api-access-pjjvq") pod "afae2f07-106f-4fd9-a675-1dd722a9123f" (UID: "afae2f07-106f-4fd9-a675-1dd722a9123f"). InnerVolumeSpecName "kube-api-access-pjjvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:48:22.306040 master-0 kubenswrapper[19170]: I0313 01:48:22.306003 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afae2f07-106f-4fd9-a675-1dd722a9123f-scripts" (OuterVolumeSpecName: "scripts") pod "afae2f07-106f-4fd9-a675-1dd722a9123f" (UID: "afae2f07-106f-4fd9-a675-1dd722a9123f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:48:22.328104 master-0 kubenswrapper[19170]: I0313 01:48:22.327972 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afae2f07-106f-4fd9-a675-1dd722a9123f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "afae2f07-106f-4fd9-a675-1dd722a9123f" (UID: "afae2f07-106f-4fd9-a675-1dd722a9123f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:48:22.339785 master-0 kubenswrapper[19170]: I0313 01:48:22.339732 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afae2f07-106f-4fd9-a675-1dd722a9123f-config-data" (OuterVolumeSpecName: "config-data") pod "afae2f07-106f-4fd9-a675-1dd722a9123f" (UID: "afae2f07-106f-4fd9-a675-1dd722a9123f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:48:22.405468 master-0 kubenswrapper[19170]: I0313 01:48:22.405021 19170 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afae2f07-106f-4fd9-a675-1dd722a9123f-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:48:22.405468 master-0 kubenswrapper[19170]: I0313 01:48:22.405067 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjjvq\" (UniqueName: \"kubernetes.io/projected/afae2f07-106f-4fd9-a675-1dd722a9123f-kube-api-access-pjjvq\") on node \"master-0\" DevicePath \"\"" Mar 13 01:48:22.405468 master-0 kubenswrapper[19170]: I0313 01:48:22.405081 19170 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afae2f07-106f-4fd9-a675-1dd722a9123f-config-data\") on node \"master-0\" DevicePath \"\"" Mar 13 01:48:22.405468 master-0 kubenswrapper[19170]: I0313 01:48:22.405092 19170 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/afae2f07-106f-4fd9-a675-1dd722a9123f-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:48:22.748592 master-0 kubenswrapper[19170]: I0313 01:48:22.748533 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58d8bd468f-mrpg6" event={"ID":"3b8e39a7-f183-4541-82e3-fdcdc6936300","Type":"ContainerDied","Data":"3277f8844189125723e25cf659af9f9428374aa03e67229d1d432b156873f3fc"} Mar 13 01:48:22.749193 master-0 kubenswrapper[19170]: I0313 01:48:22.748602 19170 scope.go:117] "RemoveContainer" containerID="90e43b91d2a8801429ec9eb529c7980bb057d8cf813efa37bbf0b220d8777cb3" Mar 13 01:48:22.749193 master-0 kubenswrapper[19170]: I0313 01:48:22.748601 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58d8bd468f-mrpg6" Mar 13 01:48:22.751005 master-0 kubenswrapper[19170]: I0313 01:48:22.750873 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-6llfb" event={"ID":"afae2f07-106f-4fd9-a675-1dd722a9123f","Type":"ContainerDied","Data":"f20b56d5e4148fa55f431509301e405aa2fb68e1836d212e752368e273d2d982"} Mar 13 01:48:22.751005 master-0 kubenswrapper[19170]: I0313 01:48:22.750893 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f20b56d5e4148fa55f431509301e405aa2fb68e1836d212e752368e273d2d982" Mar 13 01:48:22.751005 master-0 kubenswrapper[19170]: I0313 01:48:22.750941 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-host-discover-6llfb" Mar 13 01:48:22.819436 master-0 kubenswrapper[19170]: I0313 01:48:22.817716 19170 scope.go:117] "RemoveContainer" containerID="e4e50e30401c7daf6dbdb803eec069e8420f727e3372775f53e3ef92415269a4" Mar 13 01:48:22.824313 master-0 kubenswrapper[19170]: I0313 01:48:22.824260 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58d8bd468f-mrpg6"] Mar 13 01:48:22.854039 master-0 kubenswrapper[19170]: I0313 01:48:22.853987 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58d8bd468f-mrpg6"] Mar 13 01:48:23.440097 master-0 kubenswrapper[19170]: I0313 01:48:23.439978 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b8e39a7-f183-4541-82e3-fdcdc6936300" path="/var/lib/kubelet/pods/3b8e39a7-f183-4541-82e3-fdcdc6936300/volumes" Mar 13 01:48:23.766960 master-0 kubenswrapper[19170]: I0313 01:48:23.766910 19170 generic.go:334] "Generic (PLEG): container finished" podID="18bae58b-c530-4053-add2-e0650c3fdbe5" containerID="e23951b36e04ccdb8be1d12074f3f357efa90eefd1b8df83b67972f41b332d49" exitCode=0 Mar 13 01:48:23.767616 master-0 kubenswrapper[19170]: I0313 01:48:23.767033 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hn6dr" event={"ID":"18bae58b-c530-4053-add2-e0650c3fdbe5","Type":"ContainerDied","Data":"e23951b36e04ccdb8be1d12074f3f357efa90eefd1b8df83b67972f41b332d49"} Mar 13 01:48:25.321248 master-0 kubenswrapper[19170]: I0313 01:48:25.321209 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hn6dr" Mar 13 01:48:25.386016 master-0 kubenswrapper[19170]: I0313 01:48:25.385958 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18bae58b-c530-4053-add2-e0650c3fdbe5-scripts\") pod \"18bae58b-c530-4053-add2-e0650c3fdbe5\" (UID: \"18bae58b-c530-4053-add2-e0650c3fdbe5\") " Mar 13 01:48:25.390862 master-0 kubenswrapper[19170]: I0313 01:48:25.390803 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18bae58b-c530-4053-add2-e0650c3fdbe5-scripts" (OuterVolumeSpecName: "scripts") pod "18bae58b-c530-4053-add2-e0650c3fdbe5" (UID: "18bae58b-c530-4053-add2-e0650c3fdbe5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:48:25.487479 master-0 kubenswrapper[19170]: I0313 01:48:25.487345 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18bae58b-c530-4053-add2-e0650c3fdbe5-combined-ca-bundle\") pod \"18bae58b-c530-4053-add2-e0650c3fdbe5\" (UID: \"18bae58b-c530-4053-add2-e0650c3fdbe5\") " Mar 13 01:48:25.487479 master-0 kubenswrapper[19170]: I0313 01:48:25.487409 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpgtd\" (UniqueName: \"kubernetes.io/projected/18bae58b-c530-4053-add2-e0650c3fdbe5-kube-api-access-jpgtd\") pod \"18bae58b-c530-4053-add2-e0650c3fdbe5\" (UID: \"18bae58b-c530-4053-add2-e0650c3fdbe5\") " Mar 13 01:48:25.488217 master-0 kubenswrapper[19170]: I0313 01:48:25.488173 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18bae58b-c530-4053-add2-e0650c3fdbe5-config-data\") pod \"18bae58b-c530-4053-add2-e0650c3fdbe5\" (UID: \"18bae58b-c530-4053-add2-e0650c3fdbe5\") " Mar 13 01:48:25.489219 master-0 
kubenswrapper[19170]: I0313 01:48:25.488926 19170 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18bae58b-c530-4053-add2-e0650c3fdbe5-scripts\") on node \"master-0\" DevicePath \"\"" Mar 13 01:48:25.492104 master-0 kubenswrapper[19170]: I0313 01:48:25.491855 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18bae58b-c530-4053-add2-e0650c3fdbe5-kube-api-access-jpgtd" (OuterVolumeSpecName: "kube-api-access-jpgtd") pod "18bae58b-c530-4053-add2-e0650c3fdbe5" (UID: "18bae58b-c530-4053-add2-e0650c3fdbe5"). InnerVolumeSpecName "kube-api-access-jpgtd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:48:25.514645 master-0 kubenswrapper[19170]: I0313 01:48:25.514594 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18bae58b-c530-4053-add2-e0650c3fdbe5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18bae58b-c530-4053-add2-e0650c3fdbe5" (UID: "18bae58b-c530-4053-add2-e0650c3fdbe5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:48:25.541678 master-0 kubenswrapper[19170]: I0313 01:48:25.541457 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18bae58b-c530-4053-add2-e0650c3fdbe5-config-data" (OuterVolumeSpecName: "config-data") pod "18bae58b-c530-4053-add2-e0650c3fdbe5" (UID: "18bae58b-c530-4053-add2-e0650c3fdbe5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:48:25.592448 master-0 kubenswrapper[19170]: I0313 01:48:25.592399 19170 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18bae58b-c530-4053-add2-e0650c3fdbe5-config-data\") on node \"master-0\" DevicePath \"\"" Mar 13 01:48:25.592448 master-0 kubenswrapper[19170]: I0313 01:48:25.592452 19170 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18bae58b-c530-4053-add2-e0650c3fdbe5-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:48:25.592592 master-0 kubenswrapper[19170]: I0313 01:48:25.592469 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpgtd\" (UniqueName: \"kubernetes.io/projected/18bae58b-c530-4053-add2-e0650c3fdbe5-kube-api-access-jpgtd\") on node \"master-0\" DevicePath \"\"" Mar 13 01:48:25.818783 master-0 kubenswrapper[19170]: I0313 01:48:25.813881 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hn6dr" event={"ID":"18bae58b-c530-4053-add2-e0650c3fdbe5","Type":"ContainerDied","Data":"787d130b263585566b7a8f35861c6200cb451003d95493536941513d59dc64d5"} Mar 13 01:48:25.818783 master-0 kubenswrapper[19170]: I0313 01:48:25.813953 19170 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="787d130b263585566b7a8f35861c6200cb451003d95493536941513d59dc64d5" Mar 13 01:48:25.845838 master-0 kubenswrapper[19170]: I0313 01:48:25.814068 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hn6dr" Mar 13 01:48:25.972392 master-0 kubenswrapper[19170]: I0313 01:48:25.972339 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 13 01:48:25.984413 master-0 kubenswrapper[19170]: I0313 01:48:25.984355 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 13 01:48:25.993767 master-0 kubenswrapper[19170]: I0313 01:48:25.993703 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 13 01:48:26.037740 master-0 kubenswrapper[19170]: I0313 01:48:26.037009 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 01:48:26.037740 master-0 kubenswrapper[19170]: I0313 01:48:26.037253 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="367f092d-6e58-4bd9-8451-6c162c45dec9" containerName="nova-api-log" containerID="cri-o://2ce33c761fb90ed94e107a62c17a74b4a84e05e06985a7b9436e3cde6cb9b66a" gracePeriod=30 Mar 13 01:48:26.037740 master-0 kubenswrapper[19170]: I0313 01:48:26.037386 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="367f092d-6e58-4bd9-8451-6c162c45dec9" containerName="nova-api-api" containerID="cri-o://f91ae2c495e2d6f909049feb6b167db793ce841307ad54261152f8be9ff7e995" gracePeriod=30 Mar 13 01:48:26.060809 master-0 kubenswrapper[19170]: I0313 01:48:26.060747 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 01:48:26.061031 master-0 kubenswrapper[19170]: I0313 01:48:26.061007 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="04fabfa9-7306-4692-8301-9ea765f3a452" containerName="nova-scheduler-scheduler" 
containerID="cri-o://c315c477c537d361ab588a3b2eec0dc871612e75d623972f6446450f312f8d34" gracePeriod=30 Mar 13 01:48:26.103433 master-0 kubenswrapper[19170]: I0313 01:48:26.102893 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 01:48:26.759824 master-0 kubenswrapper[19170]: I0313 01:48:26.759747 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 01:48:26.838602 master-0 kubenswrapper[19170]: I0313 01:48:26.838535 19170 generic.go:334] "Generic (PLEG): container finished" podID="367f092d-6e58-4bd9-8451-6c162c45dec9" containerID="f91ae2c495e2d6f909049feb6b167db793ce841307ad54261152f8be9ff7e995" exitCode=0 Mar 13 01:48:26.838602 master-0 kubenswrapper[19170]: I0313 01:48:26.838576 19170 generic.go:334] "Generic (PLEG): container finished" podID="367f092d-6e58-4bd9-8451-6c162c45dec9" containerID="2ce33c761fb90ed94e107a62c17a74b4a84e05e06985a7b9436e3cde6cb9b66a" exitCode=143 Mar 13 01:48:26.839056 master-0 kubenswrapper[19170]: I0313 01:48:26.839003 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"367f092d-6e58-4bd9-8451-6c162c45dec9","Type":"ContainerDied","Data":"f91ae2c495e2d6f909049feb6b167db793ce841307ad54261152f8be9ff7e995"} Mar 13 01:48:26.839117 master-0 kubenswrapper[19170]: I0313 01:48:26.839062 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"367f092d-6e58-4bd9-8451-6c162c45dec9","Type":"ContainerDied","Data":"2ce33c761fb90ed94e107a62c17a74b4a84e05e06985a7b9436e3cde6cb9b66a"} Mar 13 01:48:26.839117 master-0 kubenswrapper[19170]: I0313 01:48:26.839065 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 01:48:26.839117 master-0 kubenswrapper[19170]: I0313 01:48:26.839091 19170 scope.go:117] "RemoveContainer" containerID="f91ae2c495e2d6f909049feb6b167db793ce841307ad54261152f8be9ff7e995" Mar 13 01:48:26.839266 master-0 kubenswrapper[19170]: I0313 01:48:26.839079 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"367f092d-6e58-4bd9-8451-6c162c45dec9","Type":"ContainerDied","Data":"931e0ae93d41b023f245c4f63dadf7dd98b2eef761c484e18c993af1e8300c39"} Mar 13 01:48:26.846763 master-0 kubenswrapper[19170]: I0313 01:48:26.846706 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 13 01:48:26.865285 master-0 kubenswrapper[19170]: I0313 01:48:26.865233 19170 scope.go:117] "RemoveContainer" containerID="2ce33c761fb90ed94e107a62c17a74b4a84e05e06985a7b9436e3cde6cb9b66a" Mar 13 01:48:26.916046 master-0 kubenswrapper[19170]: I0313 01:48:26.915191 19170 scope.go:117] "RemoveContainer" containerID="f91ae2c495e2d6f909049feb6b167db793ce841307ad54261152f8be9ff7e995" Mar 13 01:48:26.925665 master-0 kubenswrapper[19170]: E0313 01:48:26.918895 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f91ae2c495e2d6f909049feb6b167db793ce841307ad54261152f8be9ff7e995\": container with ID starting with f91ae2c495e2d6f909049feb6b167db793ce841307ad54261152f8be9ff7e995 not found: ID does not exist" containerID="f91ae2c495e2d6f909049feb6b167db793ce841307ad54261152f8be9ff7e995" Mar 13 01:48:26.925665 master-0 kubenswrapper[19170]: I0313 01:48:26.918946 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f91ae2c495e2d6f909049feb6b167db793ce841307ad54261152f8be9ff7e995"} err="failed to get container status \"f91ae2c495e2d6f909049feb6b167db793ce841307ad54261152f8be9ff7e995\": rpc error: code = NotFound desc = could not 
find container \"f91ae2c495e2d6f909049feb6b167db793ce841307ad54261152f8be9ff7e995\": container with ID starting with f91ae2c495e2d6f909049feb6b167db793ce841307ad54261152f8be9ff7e995 not found: ID does not exist" Mar 13 01:48:26.925665 master-0 kubenswrapper[19170]: I0313 01:48:26.918975 19170 scope.go:117] "RemoveContainer" containerID="2ce33c761fb90ed94e107a62c17a74b4a84e05e06985a7b9436e3cde6cb9b66a" Mar 13 01:48:26.925665 master-0 kubenswrapper[19170]: E0313 01:48:26.922101 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ce33c761fb90ed94e107a62c17a74b4a84e05e06985a7b9436e3cde6cb9b66a\": container with ID starting with 2ce33c761fb90ed94e107a62c17a74b4a84e05e06985a7b9436e3cde6cb9b66a not found: ID does not exist" containerID="2ce33c761fb90ed94e107a62c17a74b4a84e05e06985a7b9436e3cde6cb9b66a" Mar 13 01:48:26.925665 master-0 kubenswrapper[19170]: I0313 01:48:26.922139 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ce33c761fb90ed94e107a62c17a74b4a84e05e06985a7b9436e3cde6cb9b66a"} err="failed to get container status \"2ce33c761fb90ed94e107a62c17a74b4a84e05e06985a7b9436e3cde6cb9b66a\": rpc error: code = NotFound desc = could not find container \"2ce33c761fb90ed94e107a62c17a74b4a84e05e06985a7b9436e3cde6cb9b66a\": container with ID starting with 2ce33c761fb90ed94e107a62c17a74b4a84e05e06985a7b9436e3cde6cb9b66a not found: ID does not exist" Mar 13 01:48:26.925665 master-0 kubenswrapper[19170]: I0313 01:48:26.922162 19170 scope.go:117] "RemoveContainer" containerID="f91ae2c495e2d6f909049feb6b167db793ce841307ad54261152f8be9ff7e995" Mar 13 01:48:26.925665 master-0 kubenswrapper[19170]: I0313 01:48:26.922526 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f91ae2c495e2d6f909049feb6b167db793ce841307ad54261152f8be9ff7e995"} err="failed to get container status 
\"f91ae2c495e2d6f909049feb6b167db793ce841307ad54261152f8be9ff7e995\": rpc error: code = NotFound desc = could not find container \"f91ae2c495e2d6f909049feb6b167db793ce841307ad54261152f8be9ff7e995\": container with ID starting with f91ae2c495e2d6f909049feb6b167db793ce841307ad54261152f8be9ff7e995 not found: ID does not exist" Mar 13 01:48:26.925665 master-0 kubenswrapper[19170]: I0313 01:48:26.922544 19170 scope.go:117] "RemoveContainer" containerID="2ce33c761fb90ed94e107a62c17a74b4a84e05e06985a7b9436e3cde6cb9b66a" Mar 13 01:48:26.925665 master-0 kubenswrapper[19170]: I0313 01:48:26.922943 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ce33c761fb90ed94e107a62c17a74b4a84e05e06985a7b9436e3cde6cb9b66a"} err="failed to get container status \"2ce33c761fb90ed94e107a62c17a74b4a84e05e06985a7b9436e3cde6cb9b66a\": rpc error: code = NotFound desc = could not find container \"2ce33c761fb90ed94e107a62c17a74b4a84e05e06985a7b9436e3cde6cb9b66a\": container with ID starting with 2ce33c761fb90ed94e107a62c17a74b4a84e05e06985a7b9436e3cde6cb9b66a not found: ID does not exist" Mar 13 01:48:26.933466 master-0 kubenswrapper[19170]: I0313 01:48:26.932799 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/367f092d-6e58-4bd9-8451-6c162c45dec9-config-data\") pod \"367f092d-6e58-4bd9-8451-6c162c45dec9\" (UID: \"367f092d-6e58-4bd9-8451-6c162c45dec9\") " Mar 13 01:48:26.933466 master-0 kubenswrapper[19170]: I0313 01:48:26.932886 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/367f092d-6e58-4bd9-8451-6c162c45dec9-public-tls-certs\") pod \"367f092d-6e58-4bd9-8451-6c162c45dec9\" (UID: \"367f092d-6e58-4bd9-8451-6c162c45dec9\") " Mar 13 01:48:26.933466 master-0 kubenswrapper[19170]: I0313 01:48:26.932922 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/367f092d-6e58-4bd9-8451-6c162c45dec9-logs\") pod \"367f092d-6e58-4bd9-8451-6c162c45dec9\" (UID: \"367f092d-6e58-4bd9-8451-6c162c45dec9\") " Mar 13 01:48:26.933466 master-0 kubenswrapper[19170]: I0313 01:48:26.933001 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/367f092d-6e58-4bd9-8451-6c162c45dec9-combined-ca-bundle\") pod \"367f092d-6e58-4bd9-8451-6c162c45dec9\" (UID: \"367f092d-6e58-4bd9-8451-6c162c45dec9\") " Mar 13 01:48:26.933466 master-0 kubenswrapper[19170]: I0313 01:48:26.933047 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k646t\" (UniqueName: \"kubernetes.io/projected/367f092d-6e58-4bd9-8451-6c162c45dec9-kube-api-access-k646t\") pod \"367f092d-6e58-4bd9-8451-6c162c45dec9\" (UID: \"367f092d-6e58-4bd9-8451-6c162c45dec9\") " Mar 13 01:48:26.933466 master-0 kubenswrapper[19170]: I0313 01:48:26.933097 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/367f092d-6e58-4bd9-8451-6c162c45dec9-internal-tls-certs\") pod \"367f092d-6e58-4bd9-8451-6c162c45dec9\" (UID: \"367f092d-6e58-4bd9-8451-6c162c45dec9\") " Mar 13 01:48:26.934576 master-0 kubenswrapper[19170]: I0313 01:48:26.934405 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/367f092d-6e58-4bd9-8451-6c162c45dec9-logs" (OuterVolumeSpecName: "logs") pod "367f092d-6e58-4bd9-8451-6c162c45dec9" (UID: "367f092d-6e58-4bd9-8451-6c162c45dec9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 01:48:26.937146 master-0 kubenswrapper[19170]: I0313 01:48:26.937064 19170 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/367f092d-6e58-4bd9-8451-6c162c45dec9-logs\") on node \"master-0\" DevicePath \"\"" Mar 13 01:48:26.970148 master-0 kubenswrapper[19170]: I0313 01:48:26.966922 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/367f092d-6e58-4bd9-8451-6c162c45dec9-config-data" (OuterVolumeSpecName: "config-data") pod "367f092d-6e58-4bd9-8451-6c162c45dec9" (UID: "367f092d-6e58-4bd9-8451-6c162c45dec9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:48:26.970148 master-0 kubenswrapper[19170]: I0313 01:48:26.968981 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/367f092d-6e58-4bd9-8451-6c162c45dec9-kube-api-access-k646t" (OuterVolumeSpecName: "kube-api-access-k646t") pod "367f092d-6e58-4bd9-8451-6c162c45dec9" (UID: "367f092d-6e58-4bd9-8451-6c162c45dec9"). InnerVolumeSpecName "kube-api-access-k646t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:48:26.984762 master-0 kubenswrapper[19170]: I0313 01:48:26.984703 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/367f092d-6e58-4bd9-8451-6c162c45dec9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "367f092d-6e58-4bd9-8451-6c162c45dec9" (UID: "367f092d-6e58-4bd9-8451-6c162c45dec9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:48:26.999590 master-0 kubenswrapper[19170]: I0313 01:48:26.999539 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/367f092d-6e58-4bd9-8451-6c162c45dec9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "367f092d-6e58-4bd9-8451-6c162c45dec9" (UID: "367f092d-6e58-4bd9-8451-6c162c45dec9"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:48:27.014145 master-0 kubenswrapper[19170]: I0313 01:48:27.014102 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/367f092d-6e58-4bd9-8451-6c162c45dec9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "367f092d-6e58-4bd9-8451-6c162c45dec9" (UID: "367f092d-6e58-4bd9-8451-6c162c45dec9"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:48:27.040399 master-0 kubenswrapper[19170]: I0313 01:48:27.040339 19170 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/367f092d-6e58-4bd9-8451-6c162c45dec9-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:48:27.040399 master-0 kubenswrapper[19170]: I0313 01:48:27.040383 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k646t\" (UniqueName: \"kubernetes.io/projected/367f092d-6e58-4bd9-8451-6c162c45dec9-kube-api-access-k646t\") on node \"master-0\" DevicePath \"\"" Mar 13 01:48:27.040399 master-0 kubenswrapper[19170]: I0313 01:48:27.040395 19170 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/367f092d-6e58-4bd9-8451-6c162c45dec9-internal-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 13 01:48:27.040399 master-0 kubenswrapper[19170]: I0313 01:48:27.040406 19170 reconciler_common.go:293] "Volume detached for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/367f092d-6e58-4bd9-8451-6c162c45dec9-config-data\") on node \"master-0\" DevicePath \"\"" Mar 13 01:48:27.040399 master-0 kubenswrapper[19170]: I0313 01:48:27.040417 19170 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/367f092d-6e58-4bd9-8451-6c162c45dec9-public-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 13 01:48:27.188320 master-0 kubenswrapper[19170]: I0313 01:48:27.187466 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 01:48:27.209830 master-0 kubenswrapper[19170]: I0313 01:48:27.209769 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 13 01:48:27.224600 master-0 kubenswrapper[19170]: I0313 01:48:27.224532 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 13 01:48:27.225306 master-0 kubenswrapper[19170]: E0313 01:48:27.225275 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afae2f07-106f-4fd9-a675-1dd722a9123f" containerName="nova-manage" Mar 13 01:48:27.225376 master-0 kubenswrapper[19170]: I0313 01:48:27.225310 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="afae2f07-106f-4fd9-a675-1dd722a9123f" containerName="nova-manage" Mar 13 01:48:27.225376 master-0 kubenswrapper[19170]: E0313 01:48:27.225333 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="367f092d-6e58-4bd9-8451-6c162c45dec9" containerName="nova-api-log" Mar 13 01:48:27.225376 master-0 kubenswrapper[19170]: I0313 01:48:27.225344 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="367f092d-6e58-4bd9-8451-6c162c45dec9" containerName="nova-api-log" Mar 13 01:48:27.225467 master-0 kubenswrapper[19170]: E0313 01:48:27.225378 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b8e39a7-f183-4541-82e3-fdcdc6936300" containerName="dnsmasq-dns" Mar 13 01:48:27.225467 master-0 
kubenswrapper[19170]: I0313 01:48:27.225395 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b8e39a7-f183-4541-82e3-fdcdc6936300" containerName="dnsmasq-dns" Mar 13 01:48:27.225467 master-0 kubenswrapper[19170]: E0313 01:48:27.225418 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="367f092d-6e58-4bd9-8451-6c162c45dec9" containerName="nova-api-api" Mar 13 01:48:27.225467 master-0 kubenswrapper[19170]: I0313 01:48:27.225429 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="367f092d-6e58-4bd9-8451-6c162c45dec9" containerName="nova-api-api" Mar 13 01:48:27.225467 master-0 kubenswrapper[19170]: E0313 01:48:27.225463 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b8e39a7-f183-4541-82e3-fdcdc6936300" containerName="init" Mar 13 01:48:27.225622 master-0 kubenswrapper[19170]: I0313 01:48:27.225475 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b8e39a7-f183-4541-82e3-fdcdc6936300" containerName="init" Mar 13 01:48:27.225622 master-0 kubenswrapper[19170]: E0313 01:48:27.225532 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18bae58b-c530-4053-add2-e0650c3fdbe5" containerName="nova-manage" Mar 13 01:48:27.225622 master-0 kubenswrapper[19170]: I0313 01:48:27.225545 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="18bae58b-c530-4053-add2-e0650c3fdbe5" containerName="nova-manage" Mar 13 01:48:27.225983 master-0 kubenswrapper[19170]: I0313 01:48:27.225946 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b8e39a7-f183-4541-82e3-fdcdc6936300" containerName="dnsmasq-dns" Mar 13 01:48:27.226031 master-0 kubenswrapper[19170]: I0313 01:48:27.226004 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="18bae58b-c530-4053-add2-e0650c3fdbe5" containerName="nova-manage" Mar 13 01:48:27.226031 master-0 kubenswrapper[19170]: I0313 01:48:27.226020 19170 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="367f092d-6e58-4bd9-8451-6c162c45dec9" containerName="nova-api-log" Mar 13 01:48:27.226108 master-0 kubenswrapper[19170]: I0313 01:48:27.226077 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="afae2f07-106f-4fd9-a675-1dd722a9123f" containerName="nova-manage" Mar 13 01:48:27.226144 master-0 kubenswrapper[19170]: I0313 01:48:27.226107 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="367f092d-6e58-4bd9-8451-6c162c45dec9" containerName="nova-api-api" Mar 13 01:48:27.228144 master-0 kubenswrapper[19170]: I0313 01:48:27.228102 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 01:48:27.230651 master-0 kubenswrapper[19170]: I0313 01:48:27.230431 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 13 01:48:27.230651 master-0 kubenswrapper[19170]: I0313 01:48:27.230483 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 13 01:48:27.231865 master-0 kubenswrapper[19170]: I0313 01:48:27.230765 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 13 01:48:27.236456 master-0 kubenswrapper[19170]: I0313 01:48:27.236416 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 01:48:27.266026 master-0 kubenswrapper[19170]: I0313 01:48:27.245976 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e000e71b-9e22-4aa0-889b-c3d5a386476a-logs\") pod \"nova-api-0\" (UID: \"e000e71b-9e22-4aa0-889b-c3d5a386476a\") " pod="openstack/nova-api-0" Mar 13 01:48:27.266026 master-0 kubenswrapper[19170]: I0313 01:48:27.246057 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e000e71b-9e22-4aa0-889b-c3d5a386476a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e000e71b-9e22-4aa0-889b-c3d5a386476a\") " pod="openstack/nova-api-0" Mar 13 01:48:27.266026 master-0 kubenswrapper[19170]: I0313 01:48:27.246142 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e000e71b-9e22-4aa0-889b-c3d5a386476a-public-tls-certs\") pod \"nova-api-0\" (UID: \"e000e71b-9e22-4aa0-889b-c3d5a386476a\") " pod="openstack/nova-api-0" Mar 13 01:48:27.266026 master-0 kubenswrapper[19170]: I0313 01:48:27.246287 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e000e71b-9e22-4aa0-889b-c3d5a386476a-config-data\") pod \"nova-api-0\" (UID: \"e000e71b-9e22-4aa0-889b-c3d5a386476a\") " pod="openstack/nova-api-0" Mar 13 01:48:27.266026 master-0 kubenswrapper[19170]: I0313 01:48:27.246537 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e000e71b-9e22-4aa0-889b-c3d5a386476a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e000e71b-9e22-4aa0-889b-c3d5a386476a\") " pod="openstack/nova-api-0" Mar 13 01:48:27.266026 master-0 kubenswrapper[19170]: I0313 01:48:27.246606 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqjnp\" (UniqueName: \"kubernetes.io/projected/e000e71b-9e22-4aa0-889b-c3d5a386476a-kube-api-access-gqjnp\") pod \"nova-api-0\" (UID: \"e000e71b-9e22-4aa0-889b-c3d5a386476a\") " pod="openstack/nova-api-0" Mar 13 01:48:27.348874 master-0 kubenswrapper[19170]: I0313 01:48:27.348814 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e000e71b-9e22-4aa0-889b-c3d5a386476a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e000e71b-9e22-4aa0-889b-c3d5a386476a\") " pod="openstack/nova-api-0" Mar 13 01:48:27.348874 master-0 kubenswrapper[19170]: I0313 01:48:27.348884 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e000e71b-9e22-4aa0-889b-c3d5a386476a-public-tls-certs\") pod \"nova-api-0\" (UID: \"e000e71b-9e22-4aa0-889b-c3d5a386476a\") " pod="openstack/nova-api-0" Mar 13 01:48:27.349124 master-0 kubenswrapper[19170]: I0313 01:48:27.348932 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e000e71b-9e22-4aa0-889b-c3d5a386476a-config-data\") pod \"nova-api-0\" (UID: \"e000e71b-9e22-4aa0-889b-c3d5a386476a\") " pod="openstack/nova-api-0" Mar 13 01:48:27.349124 master-0 kubenswrapper[19170]: I0313 01:48:27.349009 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e000e71b-9e22-4aa0-889b-c3d5a386476a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e000e71b-9e22-4aa0-889b-c3d5a386476a\") " pod="openstack/nova-api-0" Mar 13 01:48:27.349124 master-0 kubenswrapper[19170]: I0313 01:48:27.349035 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqjnp\" (UniqueName: \"kubernetes.io/projected/e000e71b-9e22-4aa0-889b-c3d5a386476a-kube-api-access-gqjnp\") pod \"nova-api-0\" (UID: \"e000e71b-9e22-4aa0-889b-c3d5a386476a\") " pod="openstack/nova-api-0" Mar 13 01:48:27.349124 master-0 kubenswrapper[19170]: I0313 01:48:27.349110 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e000e71b-9e22-4aa0-889b-c3d5a386476a-logs\") pod \"nova-api-0\" (UID: \"e000e71b-9e22-4aa0-889b-c3d5a386476a\") " 
pod="openstack/nova-api-0" Mar 13 01:48:27.351258 master-0 kubenswrapper[19170]: I0313 01:48:27.350595 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e000e71b-9e22-4aa0-889b-c3d5a386476a-logs\") pod \"nova-api-0\" (UID: \"e000e71b-9e22-4aa0-889b-c3d5a386476a\") " pod="openstack/nova-api-0" Mar 13 01:48:27.355540 master-0 kubenswrapper[19170]: I0313 01:48:27.353152 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e000e71b-9e22-4aa0-889b-c3d5a386476a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e000e71b-9e22-4aa0-889b-c3d5a386476a\") " pod="openstack/nova-api-0" Mar 13 01:48:27.355540 master-0 kubenswrapper[19170]: I0313 01:48:27.353446 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e000e71b-9e22-4aa0-889b-c3d5a386476a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e000e71b-9e22-4aa0-889b-c3d5a386476a\") " pod="openstack/nova-api-0" Mar 13 01:48:27.355540 master-0 kubenswrapper[19170]: I0313 01:48:27.353703 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e000e71b-9e22-4aa0-889b-c3d5a386476a-config-data\") pod \"nova-api-0\" (UID: \"e000e71b-9e22-4aa0-889b-c3d5a386476a\") " pod="openstack/nova-api-0" Mar 13 01:48:27.355540 master-0 kubenswrapper[19170]: I0313 01:48:27.354010 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e000e71b-9e22-4aa0-889b-c3d5a386476a-public-tls-certs\") pod \"nova-api-0\" (UID: \"e000e71b-9e22-4aa0-889b-c3d5a386476a\") " pod="openstack/nova-api-0" Mar 13 01:48:27.368786 master-0 kubenswrapper[19170]: I0313 01:48:27.368753 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqjnp\" (UniqueName: 
\"kubernetes.io/projected/e000e71b-9e22-4aa0-889b-c3d5a386476a-kube-api-access-gqjnp\") pod \"nova-api-0\" (UID: \"e000e71b-9e22-4aa0-889b-c3d5a386476a\") " pod="openstack/nova-api-0" Mar 13 01:48:27.443121 master-0 kubenswrapper[19170]: I0313 01:48:27.442944 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="367f092d-6e58-4bd9-8451-6c162c45dec9" path="/var/lib/kubelet/pods/367f092d-6e58-4bd9-8451-6c162c45dec9/volumes" Mar 13 01:48:27.608142 master-0 kubenswrapper[19170]: I0313 01:48:27.608072 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 01:48:27.851671 master-0 kubenswrapper[19170]: I0313 01:48:27.851386 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="598c8651-f44d-4307-bdea-832b32b84007" containerName="nova-metadata-log" containerID="cri-o://0871168d0063712c94844acaa320a7fcb533f82e19f8c6607749c4b287241ff5" gracePeriod=30 Mar 13 01:48:27.851671 master-0 kubenswrapper[19170]: I0313 01:48:27.851429 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="598c8651-f44d-4307-bdea-832b32b84007" containerName="nova-metadata-metadata" containerID="cri-o://4f3863ff9c2666d82760d69d01d1b7735a7b0ee872c1b99efb8eb9d281706fa6" gracePeriod=30 Mar 13 01:48:28.143366 master-0 kubenswrapper[19170]: W0313 01:48:28.143308 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode000e71b_9e22_4aa0_889b_c3d5a386476a.slice/crio-1f00d08f12ea2840ef1a44b99209265c059be347a49d39bbcbe77b0235a46339 WatchSource:0}: Error finding container 1f00d08f12ea2840ef1a44b99209265c059be347a49d39bbcbe77b0235a46339: Status 404 returned error can't find the container with id 1f00d08f12ea2840ef1a44b99209265c059be347a49d39bbcbe77b0235a46339 Mar 13 01:48:28.149567 master-0 kubenswrapper[19170]: I0313 01:48:28.149463 
19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 01:48:28.882043 master-0 kubenswrapper[19170]: I0313 01:48:28.881853 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e000e71b-9e22-4aa0-889b-c3d5a386476a","Type":"ContainerStarted","Data":"e6828b06d663eeafd4194c3b568acdfb0ed0b343f07af44db93287551a41682b"} Mar 13 01:48:28.882043 master-0 kubenswrapper[19170]: I0313 01:48:28.881983 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e000e71b-9e22-4aa0-889b-c3d5a386476a","Type":"ContainerStarted","Data":"bd988c63614ae0f4e44139273bd25a2574b6f8574975d022e66ca28bf929ae75"} Mar 13 01:48:28.882704 master-0 kubenswrapper[19170]: I0313 01:48:28.882059 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e000e71b-9e22-4aa0-889b-c3d5a386476a","Type":"ContainerStarted","Data":"1f00d08f12ea2840ef1a44b99209265c059be347a49d39bbcbe77b0235a46339"} Mar 13 01:48:28.885266 master-0 kubenswrapper[19170]: I0313 01:48:28.884682 19170 generic.go:334] "Generic (PLEG): container finished" podID="598c8651-f44d-4307-bdea-832b32b84007" containerID="0871168d0063712c94844acaa320a7fcb533f82e19f8c6607749c4b287241ff5" exitCode=143 Mar 13 01:48:28.885266 master-0 kubenswrapper[19170]: I0313 01:48:28.884752 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"598c8651-f44d-4307-bdea-832b32b84007","Type":"ContainerDied","Data":"0871168d0063712c94844acaa320a7fcb533f82e19f8c6607749c4b287241ff5"} Mar 13 01:48:28.937066 master-0 kubenswrapper[19170]: I0313 01:48:28.936981 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.936958621 podStartE2EDuration="1.936958621s" podCreationTimestamp="2026-03-13 01:48:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-03-13 01:48:28.905106901 +0000 UTC m=+1769.713227921" watchObservedRunningTime="2026-03-13 01:48:28.936958621 +0000 UTC m=+1769.745079591" Mar 13 01:48:29.900863 master-0 kubenswrapper[19170]: I0313 01:48:29.899201 19170 generic.go:334] "Generic (PLEG): container finished" podID="04fabfa9-7306-4692-8301-9ea765f3a452" containerID="c315c477c537d361ab588a3b2eec0dc871612e75d623972f6446450f312f8d34" exitCode=0 Mar 13 01:48:29.900863 master-0 kubenswrapper[19170]: I0313 01:48:29.899981 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"04fabfa9-7306-4692-8301-9ea765f3a452","Type":"ContainerDied","Data":"c315c477c537d361ab588a3b2eec0dc871612e75d623972f6446450f312f8d34"} Mar 13 01:48:30.060008 master-0 kubenswrapper[19170]: I0313 01:48:30.059952 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 01:48:30.245621 master-0 kubenswrapper[19170]: I0313 01:48:30.245565 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04fabfa9-7306-4692-8301-9ea765f3a452-config-data\") pod \"04fabfa9-7306-4692-8301-9ea765f3a452\" (UID: \"04fabfa9-7306-4692-8301-9ea765f3a452\") " Mar 13 01:48:30.245943 master-0 kubenswrapper[19170]: I0313 01:48:30.245894 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04fabfa9-7306-4692-8301-9ea765f3a452-combined-ca-bundle\") pod \"04fabfa9-7306-4692-8301-9ea765f3a452\" (UID: \"04fabfa9-7306-4692-8301-9ea765f3a452\") " Mar 13 01:48:30.246019 master-0 kubenswrapper[19170]: I0313 01:48:30.245956 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59wqz\" (UniqueName: \"kubernetes.io/projected/04fabfa9-7306-4692-8301-9ea765f3a452-kube-api-access-59wqz\") pod 
\"04fabfa9-7306-4692-8301-9ea765f3a452\" (UID: \"04fabfa9-7306-4692-8301-9ea765f3a452\") " Mar 13 01:48:30.252722 master-0 kubenswrapper[19170]: I0313 01:48:30.252613 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04fabfa9-7306-4692-8301-9ea765f3a452-kube-api-access-59wqz" (OuterVolumeSpecName: "kube-api-access-59wqz") pod "04fabfa9-7306-4692-8301-9ea765f3a452" (UID: "04fabfa9-7306-4692-8301-9ea765f3a452"). InnerVolumeSpecName "kube-api-access-59wqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:48:30.279397 master-0 kubenswrapper[19170]: I0313 01:48:30.279318 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04fabfa9-7306-4692-8301-9ea765f3a452-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04fabfa9-7306-4692-8301-9ea765f3a452" (UID: "04fabfa9-7306-4692-8301-9ea765f3a452"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:48:30.305594 master-0 kubenswrapper[19170]: I0313 01:48:30.305523 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04fabfa9-7306-4692-8301-9ea765f3a452-config-data" (OuterVolumeSpecName: "config-data") pod "04fabfa9-7306-4692-8301-9ea765f3a452" (UID: "04fabfa9-7306-4692-8301-9ea765f3a452"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:48:30.349700 master-0 kubenswrapper[19170]: I0313 01:48:30.349544 19170 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04fabfa9-7306-4692-8301-9ea765f3a452-config-data\") on node \"master-0\" DevicePath \"\"" Mar 13 01:48:30.349700 master-0 kubenswrapper[19170]: I0313 01:48:30.349612 19170 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04fabfa9-7306-4692-8301-9ea765f3a452-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:48:30.349700 master-0 kubenswrapper[19170]: I0313 01:48:30.349661 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59wqz\" (UniqueName: \"kubernetes.io/projected/04fabfa9-7306-4692-8301-9ea765f3a452-kube-api-access-59wqz\") on node \"master-0\" DevicePath \"\"" Mar 13 01:48:30.916823 master-0 kubenswrapper[19170]: I0313 01:48:30.916760 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"04fabfa9-7306-4692-8301-9ea765f3a452","Type":"ContainerDied","Data":"875f42b1da0d52586671c14734f17421d02b6667ff94389b4c44aa55bedb42bf"} Mar 13 01:48:30.917557 master-0 kubenswrapper[19170]: I0313 01:48:30.916865 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 01:48:30.917697 master-0 kubenswrapper[19170]: I0313 01:48:30.917513 19170 scope.go:117] "RemoveContainer" containerID="c315c477c537d361ab588a3b2eec0dc871612e75d623972f6446450f312f8d34" Mar 13 01:48:30.985473 master-0 kubenswrapper[19170]: I0313 01:48:30.981761 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 01:48:31.010460 master-0 kubenswrapper[19170]: I0313 01:48:31.008720 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 01:48:31.042685 master-0 kubenswrapper[19170]: I0313 01:48:31.025718 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 01:48:31.042685 master-0 kubenswrapper[19170]: E0313 01:48:31.026485 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04fabfa9-7306-4692-8301-9ea765f3a452" containerName="nova-scheduler-scheduler" Mar 13 01:48:31.042685 master-0 kubenswrapper[19170]: I0313 01:48:31.026507 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="04fabfa9-7306-4692-8301-9ea765f3a452" containerName="nova-scheduler-scheduler" Mar 13 01:48:31.042685 master-0 kubenswrapper[19170]: I0313 01:48:31.026912 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="04fabfa9-7306-4692-8301-9ea765f3a452" containerName="nova-scheduler-scheduler" Mar 13 01:48:31.042685 master-0 kubenswrapper[19170]: I0313 01:48:31.027974 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 01:48:31.042685 master-0 kubenswrapper[19170]: I0313 01:48:31.039024 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 01:48:31.042685 master-0 kubenswrapper[19170]: I0313 01:48:31.039283 19170 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="598c8651-f44d-4307-bdea-832b32b84007" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.128.1.19:8775/\": read tcp 10.128.0.2:34308->10.128.1.19:8775: read: connection reset by peer" Mar 13 01:48:31.042685 master-0 kubenswrapper[19170]: I0313 01:48:31.039754 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 13 01:48:31.042685 master-0 kubenswrapper[19170]: I0313 01:48:31.040721 19170 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="598c8651-f44d-4307-bdea-832b32b84007" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.128.1.19:8775/\": read tcp 10.128.0.2:34312->10.128.1.19:8775: read: connection reset by peer" Mar 13 01:48:31.172589 master-0 kubenswrapper[19170]: I0313 01:48:31.171984 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vccvg\" (UniqueName: \"kubernetes.io/projected/3e472963-c028-416e-9256-15c87cd45e3f-kube-api-access-vccvg\") pod \"nova-scheduler-0\" (UID: \"3e472963-c028-416e-9256-15c87cd45e3f\") " pod="openstack/nova-scheduler-0" Mar 13 01:48:31.172589 master-0 kubenswrapper[19170]: I0313 01:48:31.172361 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e472963-c028-416e-9256-15c87cd45e3f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3e472963-c028-416e-9256-15c87cd45e3f\") " 
pod="openstack/nova-scheduler-0" Mar 13 01:48:31.172874 master-0 kubenswrapper[19170]: I0313 01:48:31.172673 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e472963-c028-416e-9256-15c87cd45e3f-config-data\") pod \"nova-scheduler-0\" (UID: \"3e472963-c028-416e-9256-15c87cd45e3f\") " pod="openstack/nova-scheduler-0" Mar 13 01:48:31.281040 master-0 kubenswrapper[19170]: I0313 01:48:31.278875 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e472963-c028-416e-9256-15c87cd45e3f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3e472963-c028-416e-9256-15c87cd45e3f\") " pod="openstack/nova-scheduler-0" Mar 13 01:48:31.281040 master-0 kubenswrapper[19170]: I0313 01:48:31.279043 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e472963-c028-416e-9256-15c87cd45e3f-config-data\") pod \"nova-scheduler-0\" (UID: \"3e472963-c028-416e-9256-15c87cd45e3f\") " pod="openstack/nova-scheduler-0" Mar 13 01:48:31.281040 master-0 kubenswrapper[19170]: I0313 01:48:31.279221 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vccvg\" (UniqueName: \"kubernetes.io/projected/3e472963-c028-416e-9256-15c87cd45e3f-kube-api-access-vccvg\") pod \"nova-scheduler-0\" (UID: \"3e472963-c028-416e-9256-15c87cd45e3f\") " pod="openstack/nova-scheduler-0" Mar 13 01:48:31.292044 master-0 kubenswrapper[19170]: I0313 01:48:31.291920 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e472963-c028-416e-9256-15c87cd45e3f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3e472963-c028-416e-9256-15c87cd45e3f\") " pod="openstack/nova-scheduler-0" Mar 13 01:48:31.299915 master-0 kubenswrapper[19170]: 
I0313 01:48:31.299848 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e472963-c028-416e-9256-15c87cd45e3f-config-data\") pod \"nova-scheduler-0\" (UID: \"3e472963-c028-416e-9256-15c87cd45e3f\") " pod="openstack/nova-scheduler-0" Mar 13 01:48:31.307232 master-0 kubenswrapper[19170]: I0313 01:48:31.307165 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vccvg\" (UniqueName: \"kubernetes.io/projected/3e472963-c028-416e-9256-15c87cd45e3f-kube-api-access-vccvg\") pod \"nova-scheduler-0\" (UID: \"3e472963-c028-416e-9256-15c87cd45e3f\") " pod="openstack/nova-scheduler-0" Mar 13 01:48:31.446022 master-0 kubenswrapper[19170]: I0313 01:48:31.445821 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04fabfa9-7306-4692-8301-9ea765f3a452" path="/var/lib/kubelet/pods/04fabfa9-7306-4692-8301-9ea765f3a452/volumes" Mar 13 01:48:31.449030 master-0 kubenswrapper[19170]: I0313 01:48:31.448982 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 01:48:31.589531 master-0 kubenswrapper[19170]: I0313 01:48:31.589489 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 01:48:31.698712 master-0 kubenswrapper[19170]: I0313 01:48:31.696405 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/598c8651-f44d-4307-bdea-832b32b84007-config-data\") pod \"598c8651-f44d-4307-bdea-832b32b84007\" (UID: \"598c8651-f44d-4307-bdea-832b32b84007\") " Mar 13 01:48:31.698712 master-0 kubenswrapper[19170]: I0313 01:48:31.696474 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/598c8651-f44d-4307-bdea-832b32b84007-nova-metadata-tls-certs\") pod \"598c8651-f44d-4307-bdea-832b32b84007\" (UID: \"598c8651-f44d-4307-bdea-832b32b84007\") " Mar 13 01:48:31.698712 master-0 kubenswrapper[19170]: I0313 01:48:31.696565 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/598c8651-f44d-4307-bdea-832b32b84007-combined-ca-bundle\") pod \"598c8651-f44d-4307-bdea-832b32b84007\" (UID: \"598c8651-f44d-4307-bdea-832b32b84007\") " Mar 13 01:48:31.698712 master-0 kubenswrapper[19170]: I0313 01:48:31.696597 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/598c8651-f44d-4307-bdea-832b32b84007-logs\") pod \"598c8651-f44d-4307-bdea-832b32b84007\" (UID: \"598c8651-f44d-4307-bdea-832b32b84007\") " Mar 13 01:48:31.698712 master-0 kubenswrapper[19170]: I0313 01:48:31.696684 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lm9v\" (UniqueName: \"kubernetes.io/projected/598c8651-f44d-4307-bdea-832b32b84007-kube-api-access-2lm9v\") pod \"598c8651-f44d-4307-bdea-832b32b84007\" (UID: \"598c8651-f44d-4307-bdea-832b32b84007\") " Mar 13 01:48:31.698712 master-0 kubenswrapper[19170]: I0313 01:48:31.698227 19170 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/598c8651-f44d-4307-bdea-832b32b84007-logs" (OuterVolumeSpecName: "logs") pod "598c8651-f44d-4307-bdea-832b32b84007" (UID: "598c8651-f44d-4307-bdea-832b32b84007"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 01:48:31.702251 master-0 kubenswrapper[19170]: I0313 01:48:31.701327 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/598c8651-f44d-4307-bdea-832b32b84007-kube-api-access-2lm9v" (OuterVolumeSpecName: "kube-api-access-2lm9v") pod "598c8651-f44d-4307-bdea-832b32b84007" (UID: "598c8651-f44d-4307-bdea-832b32b84007"). InnerVolumeSpecName "kube-api-access-2lm9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:48:31.726790 master-0 kubenswrapper[19170]: I0313 01:48:31.726685 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/598c8651-f44d-4307-bdea-832b32b84007-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "598c8651-f44d-4307-bdea-832b32b84007" (UID: "598c8651-f44d-4307-bdea-832b32b84007"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:48:31.731095 master-0 kubenswrapper[19170]: I0313 01:48:31.730953 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/598c8651-f44d-4307-bdea-832b32b84007-config-data" (OuterVolumeSpecName: "config-data") pod "598c8651-f44d-4307-bdea-832b32b84007" (UID: "598c8651-f44d-4307-bdea-832b32b84007"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:48:31.794491 master-0 kubenswrapper[19170]: I0313 01:48:31.794410 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/598c8651-f44d-4307-bdea-832b32b84007-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "598c8651-f44d-4307-bdea-832b32b84007" (UID: "598c8651-f44d-4307-bdea-832b32b84007"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:48:31.799165 master-0 kubenswrapper[19170]: I0313 01:48:31.799056 19170 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/598c8651-f44d-4307-bdea-832b32b84007-config-data\") on node \"master-0\" DevicePath \"\"" Mar 13 01:48:31.799165 master-0 kubenswrapper[19170]: I0313 01:48:31.799101 19170 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/598c8651-f44d-4307-bdea-832b32b84007-nova-metadata-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 13 01:48:31.799165 master-0 kubenswrapper[19170]: I0313 01:48:31.799119 19170 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/598c8651-f44d-4307-bdea-832b32b84007-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 01:48:31.799165 master-0 kubenswrapper[19170]: I0313 01:48:31.799132 19170 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/598c8651-f44d-4307-bdea-832b32b84007-logs\") on node \"master-0\" DevicePath \"\"" Mar 13 01:48:31.799165 master-0 kubenswrapper[19170]: I0313 01:48:31.799150 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lm9v\" (UniqueName: \"kubernetes.io/projected/598c8651-f44d-4307-bdea-832b32b84007-kube-api-access-2lm9v\") on node \"master-0\" DevicePath \"\"" Mar 13 01:48:31.944763 
master-0 kubenswrapper[19170]: I0313 01:48:31.944701 19170 generic.go:334] "Generic (PLEG): container finished" podID="598c8651-f44d-4307-bdea-832b32b84007" containerID="4f3863ff9c2666d82760d69d01d1b7735a7b0ee872c1b99efb8eb9d281706fa6" exitCode=0 Mar 13 01:48:31.944763 master-0 kubenswrapper[19170]: I0313 01:48:31.944757 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"598c8651-f44d-4307-bdea-832b32b84007","Type":"ContainerDied","Data":"4f3863ff9c2666d82760d69d01d1b7735a7b0ee872c1b99efb8eb9d281706fa6"} Mar 13 01:48:31.945290 master-0 kubenswrapper[19170]: I0313 01:48:31.944788 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"598c8651-f44d-4307-bdea-832b32b84007","Type":"ContainerDied","Data":"7a6e0b568be197364a8e0b61cd95d01dcb36d7720cd6efd9e93faccf807166fe"} Mar 13 01:48:31.945290 master-0 kubenswrapper[19170]: I0313 01:48:31.944807 19170 scope.go:117] "RemoveContainer" containerID="4f3863ff9c2666d82760d69d01d1b7735a7b0ee872c1b99efb8eb9d281706fa6" Mar 13 01:48:31.945290 master-0 kubenswrapper[19170]: I0313 01:48:31.944948 19170 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 01:48:31.985652 master-0 kubenswrapper[19170]: I0313 01:48:31.985569 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 01:48:32.004249 master-0 kubenswrapper[19170]: I0313 01:48:32.004153 19170 scope.go:117] "RemoveContainer" containerID="0871168d0063712c94844acaa320a7fcb533f82e19f8c6607749c4b287241ff5" Mar 13 01:48:32.007305 master-0 kubenswrapper[19170]: I0313 01:48:32.006795 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 01:48:32.069843 master-0 kubenswrapper[19170]: W0313 01:48:32.069782 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e472963_c028_416e_9256_15c87cd45e3f.slice/crio-afed8f3683c5b0af8dcd8e2b7b723b618f6cce407ac188cb254f2f2c560258da WatchSource:0}: Error finding container afed8f3683c5b0af8dcd8e2b7b723b618f6cce407ac188cb254f2f2c560258da: Status 404 returned error can't find the container with id afed8f3683c5b0af8dcd8e2b7b723b618f6cce407ac188cb254f2f2c560258da Mar 13 01:48:32.083546 master-0 kubenswrapper[19170]: I0313 01:48:32.083408 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 01:48:32.083615 master-0 kubenswrapper[19170]: I0313 01:48:32.083580 19170 scope.go:117] "RemoveContainer" containerID="4f3863ff9c2666d82760d69d01d1b7735a7b0ee872c1b99efb8eb9d281706fa6" Mar 13 01:48:32.084093 master-0 kubenswrapper[19170]: E0313 01:48:32.084055 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f3863ff9c2666d82760d69d01d1b7735a7b0ee872c1b99efb8eb9d281706fa6\": container with ID starting with 4f3863ff9c2666d82760d69d01d1b7735a7b0ee872c1b99efb8eb9d281706fa6 not found: ID does not exist" containerID="4f3863ff9c2666d82760d69d01d1b7735a7b0ee872c1b99efb8eb9d281706fa6" Mar 13 01:48:32.084163 
master-0 kubenswrapper[19170]: I0313 01:48:32.084098 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f3863ff9c2666d82760d69d01d1b7735a7b0ee872c1b99efb8eb9d281706fa6"} err="failed to get container status \"4f3863ff9c2666d82760d69d01d1b7735a7b0ee872c1b99efb8eb9d281706fa6\": rpc error: code = NotFound desc = could not find container \"4f3863ff9c2666d82760d69d01d1b7735a7b0ee872c1b99efb8eb9d281706fa6\": container with ID starting with 4f3863ff9c2666d82760d69d01d1b7735a7b0ee872c1b99efb8eb9d281706fa6 not found: ID does not exist" Mar 13 01:48:32.084163 master-0 kubenswrapper[19170]: I0313 01:48:32.084130 19170 scope.go:117] "RemoveContainer" containerID="0871168d0063712c94844acaa320a7fcb533f82e19f8c6607749c4b287241ff5" Mar 13 01:48:32.084536 master-0 kubenswrapper[19170]: E0313 01:48:32.084504 19170 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0871168d0063712c94844acaa320a7fcb533f82e19f8c6607749c4b287241ff5\": container with ID starting with 0871168d0063712c94844acaa320a7fcb533f82e19f8c6607749c4b287241ff5 not found: ID does not exist" containerID="0871168d0063712c94844acaa320a7fcb533f82e19f8c6607749c4b287241ff5" Mar 13 01:48:32.084608 master-0 kubenswrapper[19170]: I0313 01:48:32.084537 19170 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0871168d0063712c94844acaa320a7fcb533f82e19f8c6607749c4b287241ff5"} err="failed to get container status \"0871168d0063712c94844acaa320a7fcb533f82e19f8c6607749c4b287241ff5\": rpc error: code = NotFound desc = could not find container \"0871168d0063712c94844acaa320a7fcb533f82e19f8c6607749c4b287241ff5\": container with ID starting with 0871168d0063712c94844acaa320a7fcb533f82e19f8c6607749c4b287241ff5 not found: ID does not exist" Mar 13 01:48:32.099169 master-0 kubenswrapper[19170]: I0313 01:48:32.099117 19170 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-metadata-0"] Mar 13 01:48:32.099767 master-0 kubenswrapper[19170]: E0313 01:48:32.099734 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="598c8651-f44d-4307-bdea-832b32b84007" containerName="nova-metadata-metadata" Mar 13 01:48:32.099767 master-0 kubenswrapper[19170]: I0313 01:48:32.099761 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="598c8651-f44d-4307-bdea-832b32b84007" containerName="nova-metadata-metadata" Mar 13 01:48:32.099860 master-0 kubenswrapper[19170]: E0313 01:48:32.099786 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="598c8651-f44d-4307-bdea-832b32b84007" containerName="nova-metadata-log" Mar 13 01:48:32.099860 master-0 kubenswrapper[19170]: I0313 01:48:32.099796 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="598c8651-f44d-4307-bdea-832b32b84007" containerName="nova-metadata-log" Mar 13 01:48:32.100143 master-0 kubenswrapper[19170]: I0313 01:48:32.100113 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="598c8651-f44d-4307-bdea-832b32b84007" containerName="nova-metadata-log" Mar 13 01:48:32.100182 master-0 kubenswrapper[19170]: I0313 01:48:32.100153 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="598c8651-f44d-4307-bdea-832b32b84007" containerName="nova-metadata-metadata" Mar 13 01:48:32.101881 master-0 kubenswrapper[19170]: I0313 01:48:32.101847 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 01:48:32.103446 master-0 kubenswrapper[19170]: I0313 01:48:32.103399 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 13 01:48:32.104275 master-0 kubenswrapper[19170]: I0313 01:48:32.104237 19170 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 13 01:48:32.113871 master-0 kubenswrapper[19170]: I0313 01:48:32.113815 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 01:48:32.228760 master-0 kubenswrapper[19170]: I0313 01:48:32.228621 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/643d51fa-dae4-490e-a6ee-8bfa66a56855-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"643d51fa-dae4-490e-a6ee-8bfa66a56855\") " pod="openstack/nova-metadata-0" Mar 13 01:48:32.228939 master-0 kubenswrapper[19170]: I0313 01:48:32.228817 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/643d51fa-dae4-490e-a6ee-8bfa66a56855-logs\") pod \"nova-metadata-0\" (UID: \"643d51fa-dae4-490e-a6ee-8bfa66a56855\") " pod="openstack/nova-metadata-0" Mar 13 01:48:32.228939 master-0 kubenswrapper[19170]: I0313 01:48:32.228876 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpx2s\" (UniqueName: \"kubernetes.io/projected/643d51fa-dae4-490e-a6ee-8bfa66a56855-kube-api-access-qpx2s\") pod \"nova-metadata-0\" (UID: \"643d51fa-dae4-490e-a6ee-8bfa66a56855\") " pod="openstack/nova-metadata-0" Mar 13 01:48:32.228939 master-0 kubenswrapper[19170]: I0313 01:48:32.228924 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/643d51fa-dae4-490e-a6ee-8bfa66a56855-config-data\") pod \"nova-metadata-0\" (UID: \"643d51fa-dae4-490e-a6ee-8bfa66a56855\") " pod="openstack/nova-metadata-0" Mar 13 01:48:32.229095 master-0 kubenswrapper[19170]: I0313 01:48:32.229001 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/643d51fa-dae4-490e-a6ee-8bfa66a56855-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"643d51fa-dae4-490e-a6ee-8bfa66a56855\") " pod="openstack/nova-metadata-0" Mar 13 01:48:32.331489 master-0 kubenswrapper[19170]: I0313 01:48:32.331412 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/643d51fa-dae4-490e-a6ee-8bfa66a56855-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"643d51fa-dae4-490e-a6ee-8bfa66a56855\") " pod="openstack/nova-metadata-0" Mar 13 01:48:32.331750 master-0 kubenswrapper[19170]: I0313 01:48:32.331538 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/643d51fa-dae4-490e-a6ee-8bfa66a56855-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"643d51fa-dae4-490e-a6ee-8bfa66a56855\") " pod="openstack/nova-metadata-0" Mar 13 01:48:32.332372 master-0 kubenswrapper[19170]: I0313 01:48:32.332289 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/643d51fa-dae4-490e-a6ee-8bfa66a56855-logs\") pod \"nova-metadata-0\" (UID: \"643d51fa-dae4-490e-a6ee-8bfa66a56855\") " pod="openstack/nova-metadata-0" Mar 13 01:48:32.332548 master-0 kubenswrapper[19170]: I0313 01:48:32.332329 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/643d51fa-dae4-490e-a6ee-8bfa66a56855-logs\") pod \"nova-metadata-0\" (UID: 
\"643d51fa-dae4-490e-a6ee-8bfa66a56855\") " pod="openstack/nova-metadata-0" Mar 13 01:48:32.332606 master-0 kubenswrapper[19170]: I0313 01:48:32.332541 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpx2s\" (UniqueName: \"kubernetes.io/projected/643d51fa-dae4-490e-a6ee-8bfa66a56855-kube-api-access-qpx2s\") pod \"nova-metadata-0\" (UID: \"643d51fa-dae4-490e-a6ee-8bfa66a56855\") " pod="openstack/nova-metadata-0" Mar 13 01:48:32.334023 master-0 kubenswrapper[19170]: I0313 01:48:32.333980 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/643d51fa-dae4-490e-a6ee-8bfa66a56855-config-data\") pod \"nova-metadata-0\" (UID: \"643d51fa-dae4-490e-a6ee-8bfa66a56855\") " pod="openstack/nova-metadata-0" Mar 13 01:48:32.335251 master-0 kubenswrapper[19170]: I0313 01:48:32.335218 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/643d51fa-dae4-490e-a6ee-8bfa66a56855-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"643d51fa-dae4-490e-a6ee-8bfa66a56855\") " pod="openstack/nova-metadata-0" Mar 13 01:48:32.335251 master-0 kubenswrapper[19170]: I0313 01:48:32.335231 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/643d51fa-dae4-490e-a6ee-8bfa66a56855-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"643d51fa-dae4-490e-a6ee-8bfa66a56855\") " pod="openstack/nova-metadata-0" Mar 13 01:48:32.342952 master-0 kubenswrapper[19170]: I0313 01:48:32.342894 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/643d51fa-dae4-490e-a6ee-8bfa66a56855-config-data\") pod \"nova-metadata-0\" (UID: \"643d51fa-dae4-490e-a6ee-8bfa66a56855\") " pod="openstack/nova-metadata-0" Mar 13 01:48:32.353918 master-0 
kubenswrapper[19170]: I0313 01:48:32.353841 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpx2s\" (UniqueName: \"kubernetes.io/projected/643d51fa-dae4-490e-a6ee-8bfa66a56855-kube-api-access-qpx2s\") pod \"nova-metadata-0\" (UID: \"643d51fa-dae4-490e-a6ee-8bfa66a56855\") " pod="openstack/nova-metadata-0" Mar 13 01:48:32.540915 master-0 kubenswrapper[19170]: I0313 01:48:32.540810 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 01:48:32.963944 master-0 kubenswrapper[19170]: I0313 01:48:32.963866 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3e472963-c028-416e-9256-15c87cd45e3f","Type":"ContainerStarted","Data":"a9df04b643c6ea535d7b5dfc88b15708f9c3a8696ce319427d0e0739e3400062"} Mar 13 01:48:32.963944 master-0 kubenswrapper[19170]: I0313 01:48:32.963942 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3e472963-c028-416e-9256-15c87cd45e3f","Type":"ContainerStarted","Data":"afed8f3683c5b0af8dcd8e2b7b723b618f6cce407ac188cb254f2f2c560258da"} Mar 13 01:48:32.987491 master-0 kubenswrapper[19170]: I0313 01:48:32.987398 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.9873770029999998 podStartE2EDuration="2.987377003s" podCreationTimestamp="2026-03-13 01:48:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:48:32.982006923 +0000 UTC m=+1773.790127893" watchObservedRunningTime="2026-03-13 01:48:32.987377003 +0000 UTC m=+1773.795497973" Mar 13 01:48:33.081309 master-0 kubenswrapper[19170]: W0313 01:48:33.081246 19170 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod643d51fa_dae4_490e_a6ee_8bfa66a56855.slice/crio-968e78b26e73fed3c2dd55c0f22a2604356c58135d10a7e65e43238fce5c2733 WatchSource:0}: Error finding container 968e78b26e73fed3c2dd55c0f22a2604356c58135d10a7e65e43238fce5c2733: Status 404 returned error can't find the container with id 968e78b26e73fed3c2dd55c0f22a2604356c58135d10a7e65e43238fce5c2733 Mar 13 01:48:33.084395 master-0 kubenswrapper[19170]: I0313 01:48:33.084310 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 01:48:33.438993 master-0 kubenswrapper[19170]: I0313 01:48:33.438925 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="598c8651-f44d-4307-bdea-832b32b84007" path="/var/lib/kubelet/pods/598c8651-f44d-4307-bdea-832b32b84007/volumes" Mar 13 01:48:33.981879 master-0 kubenswrapper[19170]: I0313 01:48:33.981590 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"643d51fa-dae4-490e-a6ee-8bfa66a56855","Type":"ContainerStarted","Data":"f7019a90b230fc479266bf4a5cbca753cae9074124007e5c853c201f4c4987a6"} Mar 13 01:48:33.981879 master-0 kubenswrapper[19170]: I0313 01:48:33.981677 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"643d51fa-dae4-490e-a6ee-8bfa66a56855","Type":"ContainerStarted","Data":"87b8cf68643aa8df34ff54d4c02320b00c0cb625fe0851a15f9403f05967c5cd"} Mar 13 01:48:33.981879 master-0 kubenswrapper[19170]: I0313 01:48:33.981697 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"643d51fa-dae4-490e-a6ee-8bfa66a56855","Type":"ContainerStarted","Data":"968e78b26e73fed3c2dd55c0f22a2604356c58135d10a7e65e43238fce5c2733"} Mar 13 01:48:34.030649 master-0 kubenswrapper[19170]: I0313 01:48:34.030561 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" 
podStartSLOduration=3.030546278 podStartE2EDuration="3.030546278s" podCreationTimestamp="2026-03-13 01:48:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:48:34.015400185 +0000 UTC m=+1774.823521195" watchObservedRunningTime="2026-03-13 01:48:34.030546278 +0000 UTC m=+1774.838667238" Mar 13 01:48:36.449352 master-0 kubenswrapper[19170]: I0313 01:48:36.449270 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 13 01:48:37.542220 master-0 kubenswrapper[19170]: I0313 01:48:37.542124 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 13 01:48:37.542220 master-0 kubenswrapper[19170]: I0313 01:48:37.542224 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 13 01:48:37.608827 master-0 kubenswrapper[19170]: I0313 01:48:37.608732 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 01:48:37.608827 master-0 kubenswrapper[19170]: I0313 01:48:37.608834 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 01:48:38.623857 master-0 kubenswrapper[19170]: I0313 01:48:38.623781 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e000e71b-9e22-4aa0-889b-c3d5a386476a" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.128.1.24:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 01:48:38.624368 master-0 kubenswrapper[19170]: I0313 01:48:38.624130 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e000e71b-9e22-4aa0-889b-c3d5a386476a" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.128.1.24:8774/\": net/http: request canceled 
(Client.Timeout exceeded while awaiting headers)" Mar 13 01:48:41.449551 master-0 kubenswrapper[19170]: I0313 01:48:41.449498 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 13 01:48:41.497502 master-0 kubenswrapper[19170]: I0313 01:48:41.497416 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 13 01:48:42.160615 master-0 kubenswrapper[19170]: I0313 01:48:42.160391 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 13 01:48:42.542197 master-0 kubenswrapper[19170]: I0313 01:48:42.542059 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 13 01:48:42.542197 master-0 kubenswrapper[19170]: I0313 01:48:42.542149 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 13 01:48:43.559841 master-0 kubenswrapper[19170]: I0313 01:48:43.559773 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="643d51fa-dae4-490e-a6ee-8bfa66a56855" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.128.1.26:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 01:48:43.560307 master-0 kubenswrapper[19170]: I0313 01:48:43.559797 19170 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="643d51fa-dae4-490e-a6ee-8bfa66a56855" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.128.1.26:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 01:48:47.620615 master-0 kubenswrapper[19170]: I0313 01:48:47.620513 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 13 01:48:47.621853 master-0 kubenswrapper[19170]: I0313 
01:48:47.621451 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 01:48:47.630273 master-0 kubenswrapper[19170]: I0313 01:48:47.630185 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 13 01:48:47.632261 master-0 kubenswrapper[19170]: I0313 01:48:47.632208 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 13 01:48:48.211321 master-0 kubenswrapper[19170]: I0313 01:48:48.211246 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 01:48:48.223137 master-0 kubenswrapper[19170]: I0313 01:48:48.223076 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 13 01:48:52.547872 master-0 kubenswrapper[19170]: I0313 01:48:52.547813 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 13 01:48:52.551272 master-0 kubenswrapper[19170]: I0313 01:48:52.551214 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 13 01:48:52.555765 master-0 kubenswrapper[19170]: I0313 01:48:52.555727 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 13 01:48:53.295761 master-0 kubenswrapper[19170]: I0313 01:48:53.295630 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 13 01:49:20.282370 master-0 kubenswrapper[19170]: I0313 01:49:20.282275 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["sushy-emulator/sushy-emulator-6dd6777c94-qlhhb"] Mar 13 01:49:20.283549 master-0 kubenswrapper[19170]: I0313 01:49:20.282552 19170 kuberuntime_container.go:808] "Killing container with a grace period" pod="sushy-emulator/sushy-emulator-6dd6777c94-qlhhb" podUID="b3d69657-4bb5-4150-9376-e37c53ec5bf2" 
containerName="sushy-emulator" containerID="cri-o://2f7ea09545ce9bd992ab81bb3bf154ae230cc3b93405fbd6c9d47f0a87db6c9d" gracePeriod=30 Mar 13 01:49:20.770664 master-0 kubenswrapper[19170]: I0313 01:49:20.770568 19170 generic.go:334] "Generic (PLEG): container finished" podID="b3d69657-4bb5-4150-9376-e37c53ec5bf2" containerID="2f7ea09545ce9bd992ab81bb3bf154ae230cc3b93405fbd6c9d47f0a87db6c9d" exitCode=0 Mar 13 01:49:20.770955 master-0 kubenswrapper[19170]: I0313 01:49:20.770661 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-6dd6777c94-qlhhb" event={"ID":"b3d69657-4bb5-4150-9376-e37c53ec5bf2","Type":"ContainerDied","Data":"2f7ea09545ce9bd992ab81bb3bf154ae230cc3b93405fbd6c9d47f0a87db6c9d"} Mar 13 01:49:21.135836 master-0 kubenswrapper[19170]: I0313 01:49:21.135289 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/sushy-emulator-6dd6777c94-qlhhb" Mar 13 01:49:21.251298 master-0 kubenswrapper[19170]: I0313 01:49:21.251242 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["sushy-emulator/sushy-emulator-6759f57b8c-pwxtv"] Mar 13 01:49:21.252139 master-0 kubenswrapper[19170]: E0313 01:49:21.251913 19170 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d69657-4bb5-4150-9376-e37c53ec5bf2" containerName="sushy-emulator" Mar 13 01:49:21.252139 master-0 kubenswrapper[19170]: I0313 01:49:21.251932 19170 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d69657-4bb5-4150-9376-e37c53ec5bf2" containerName="sushy-emulator" Mar 13 01:49:21.253615 master-0 kubenswrapper[19170]: I0313 01:49:21.253588 19170 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3d69657-4bb5-4150-9376-e37c53ec5bf2" containerName="sushy-emulator" Mar 13 01:49:21.254503 master-0 kubenswrapper[19170]: I0313 01:49:21.254459 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/sushy-emulator-6759f57b8c-pwxtv" Mar 13 01:49:21.263565 master-0 kubenswrapper[19170]: I0313 01:49:21.263509 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-6759f57b8c-pwxtv"] Mar 13 01:49:21.312769 master-0 kubenswrapper[19170]: I0313 01:49:21.312541 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/b3d69657-4bb5-4150-9376-e37c53ec5bf2-sushy-emulator-config\") pod \"b3d69657-4bb5-4150-9376-e37c53ec5bf2\" (UID: \"b3d69657-4bb5-4150-9376-e37c53ec5bf2\") " Mar 13 01:49:21.312769 master-0 kubenswrapper[19170]: I0313 01:49:21.312744 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/b3d69657-4bb5-4150-9376-e37c53ec5bf2-os-client-config\") pod \"b3d69657-4bb5-4150-9376-e37c53ec5bf2\" (UID: \"b3d69657-4bb5-4150-9376-e37c53ec5bf2\") " Mar 13 01:49:21.313453 master-0 kubenswrapper[19170]: I0313 01:49:21.313085 19170 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkw9d\" (UniqueName: \"kubernetes.io/projected/b3d69657-4bb5-4150-9376-e37c53ec5bf2-kube-api-access-lkw9d\") pod \"b3d69657-4bb5-4150-9376-e37c53ec5bf2\" (UID: \"b3d69657-4bb5-4150-9376-e37c53ec5bf2\") " Mar 13 01:49:21.315894 master-0 kubenswrapper[19170]: I0313 01:49:21.315848 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3d69657-4bb5-4150-9376-e37c53ec5bf2-sushy-emulator-config" (OuterVolumeSpecName: "sushy-emulator-config") pod "b3d69657-4bb5-4150-9376-e37c53ec5bf2" (UID: "b3d69657-4bb5-4150-9376-e37c53ec5bf2"). InnerVolumeSpecName "sushy-emulator-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 01:49:21.318738 master-0 kubenswrapper[19170]: I0313 01:49:21.318677 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3d69657-4bb5-4150-9376-e37c53ec5bf2-kube-api-access-lkw9d" (OuterVolumeSpecName: "kube-api-access-lkw9d") pod "b3d69657-4bb5-4150-9376-e37c53ec5bf2" (UID: "b3d69657-4bb5-4150-9376-e37c53ec5bf2"). InnerVolumeSpecName "kube-api-access-lkw9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 01:49:21.319981 master-0 kubenswrapper[19170]: I0313 01:49:21.319941 19170 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3d69657-4bb5-4150-9376-e37c53ec5bf2-os-client-config" (OuterVolumeSpecName: "os-client-config") pod "b3d69657-4bb5-4150-9376-e37c53ec5bf2" (UID: "b3d69657-4bb5-4150-9376-e37c53ec5bf2"). InnerVolumeSpecName "os-client-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 01:49:21.417100 master-0 kubenswrapper[19170]: I0313 01:49:21.416965 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/4333aaeb-1228-4f8e-9664-112c0841aace-sushy-emulator-config\") pod \"sushy-emulator-6759f57b8c-pwxtv\" (UID: \"4333aaeb-1228-4f8e-9664-112c0841aace\") " pod="sushy-emulator/sushy-emulator-6759f57b8c-pwxtv" Mar 13 01:49:21.417428 master-0 kubenswrapper[19170]: I0313 01:49:21.417121 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/4333aaeb-1228-4f8e-9664-112c0841aace-os-client-config\") pod \"sushy-emulator-6759f57b8c-pwxtv\" (UID: \"4333aaeb-1228-4f8e-9664-112c0841aace\") " pod="sushy-emulator/sushy-emulator-6759f57b8c-pwxtv" Mar 13 01:49:21.417428 master-0 kubenswrapper[19170]: I0313 01:49:21.417213 19170 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94gv9\" (UniqueName: \"kubernetes.io/projected/4333aaeb-1228-4f8e-9664-112c0841aace-kube-api-access-94gv9\") pod \"sushy-emulator-6759f57b8c-pwxtv\" (UID: \"4333aaeb-1228-4f8e-9664-112c0841aace\") " pod="sushy-emulator/sushy-emulator-6759f57b8c-pwxtv" Mar 13 01:49:21.417428 master-0 kubenswrapper[19170]: I0313 01:49:21.417421 19170 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkw9d\" (UniqueName: \"kubernetes.io/projected/b3d69657-4bb5-4150-9376-e37c53ec5bf2-kube-api-access-lkw9d\") on node \"master-0\" DevicePath \"\"" Mar 13 01:49:21.417743 master-0 kubenswrapper[19170]: I0313 01:49:21.417441 19170 reconciler_common.go:293] "Volume detached for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/b3d69657-4bb5-4150-9376-e37c53ec5bf2-sushy-emulator-config\") on node \"master-0\" DevicePath \"\"" Mar 13 01:49:21.417743 master-0 kubenswrapper[19170]: I0313 01:49:21.417454 19170 reconciler_common.go:293] "Volume detached for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/b3d69657-4bb5-4150-9376-e37c53ec5bf2-os-client-config\") on node \"master-0\" DevicePath \"\"" Mar 13 01:49:21.521746 master-0 kubenswrapper[19170]: I0313 01:49:21.521658 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/4333aaeb-1228-4f8e-9664-112c0841aace-sushy-emulator-config\") pod \"sushy-emulator-6759f57b8c-pwxtv\" (UID: \"4333aaeb-1228-4f8e-9664-112c0841aace\") " pod="sushy-emulator/sushy-emulator-6759f57b8c-pwxtv" Mar 13 01:49:21.521973 master-0 kubenswrapper[19170]: I0313 01:49:21.521826 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/4333aaeb-1228-4f8e-9664-112c0841aace-os-client-config\") pod \"sushy-emulator-6759f57b8c-pwxtv\" (UID: 
\"4333aaeb-1228-4f8e-9664-112c0841aace\") " pod="sushy-emulator/sushy-emulator-6759f57b8c-pwxtv" Mar 13 01:49:21.521973 master-0 kubenswrapper[19170]: I0313 01:49:21.521941 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94gv9\" (UniqueName: \"kubernetes.io/projected/4333aaeb-1228-4f8e-9664-112c0841aace-kube-api-access-94gv9\") pod \"sushy-emulator-6759f57b8c-pwxtv\" (UID: \"4333aaeb-1228-4f8e-9664-112c0841aace\") " pod="sushy-emulator/sushy-emulator-6759f57b8c-pwxtv" Mar 13 01:49:21.522543 master-0 kubenswrapper[19170]: I0313 01:49:21.522503 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/4333aaeb-1228-4f8e-9664-112c0841aace-sushy-emulator-config\") pod \"sushy-emulator-6759f57b8c-pwxtv\" (UID: \"4333aaeb-1228-4f8e-9664-112c0841aace\") " pod="sushy-emulator/sushy-emulator-6759f57b8c-pwxtv" Mar 13 01:49:21.530132 master-0 kubenswrapper[19170]: I0313 01:49:21.529503 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/4333aaeb-1228-4f8e-9664-112c0841aace-os-client-config\") pod \"sushy-emulator-6759f57b8c-pwxtv\" (UID: \"4333aaeb-1228-4f8e-9664-112c0841aace\") " pod="sushy-emulator/sushy-emulator-6759f57b8c-pwxtv" Mar 13 01:49:21.539015 master-0 kubenswrapper[19170]: I0313 01:49:21.538972 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94gv9\" (UniqueName: \"kubernetes.io/projected/4333aaeb-1228-4f8e-9664-112c0841aace-kube-api-access-94gv9\") pod \"sushy-emulator-6759f57b8c-pwxtv\" (UID: \"4333aaeb-1228-4f8e-9664-112c0841aace\") " pod="sushy-emulator/sushy-emulator-6759f57b8c-pwxtv" Mar 13 01:49:21.584003 master-0 kubenswrapper[19170]: I0313 01:49:21.583898 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/sushy-emulator-6759f57b8c-pwxtv" Mar 13 01:49:21.783623 master-0 kubenswrapper[19170]: I0313 01:49:21.783526 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-6dd6777c94-qlhhb" event={"ID":"b3d69657-4bb5-4150-9376-e37c53ec5bf2","Type":"ContainerDied","Data":"628343cacd9c5dca97375e171dd357059766e14203e16e04468cbfa4fa53e281"} Mar 13 01:49:21.783623 master-0 kubenswrapper[19170]: I0313 01:49:21.783569 19170 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/sushy-emulator-6dd6777c94-qlhhb" Mar 13 01:49:21.783623 master-0 kubenswrapper[19170]: I0313 01:49:21.783603 19170 scope.go:117] "RemoveContainer" containerID="2f7ea09545ce9bd992ab81bb3bf154ae230cc3b93405fbd6c9d47f0a87db6c9d" Mar 13 01:49:21.869172 master-0 kubenswrapper[19170]: I0313 01:49:21.869022 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["sushy-emulator/sushy-emulator-6dd6777c94-qlhhb"] Mar 13 01:49:21.890149 master-0 kubenswrapper[19170]: I0313 01:49:21.890068 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["sushy-emulator/sushy-emulator-6dd6777c94-qlhhb"] Mar 13 01:49:22.164733 master-0 kubenswrapper[19170]: I0313 01:49:22.162928 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-6759f57b8c-pwxtv"] Mar 13 01:49:22.798469 master-0 kubenswrapper[19170]: I0313 01:49:22.798404 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-6759f57b8c-pwxtv" event={"ID":"4333aaeb-1228-4f8e-9664-112c0841aace","Type":"ContainerStarted","Data":"90ce6aa513c3fe553ef076c4d1824bd4a149b98b3273d0589491bd515fd97c83"} Mar 13 01:49:22.798469 master-0 kubenswrapper[19170]: I0313 01:49:22.798453 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-6759f57b8c-pwxtv" 
event={"ID":"4333aaeb-1228-4f8e-9664-112c0841aace","Type":"ContainerStarted","Data":"3a24f787490397bf63688c38d6209138816460ac59c5ca097af407c5dc8ae636"} Mar 13 01:49:22.824625 master-0 kubenswrapper[19170]: I0313 01:49:22.824556 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="sushy-emulator/sushy-emulator-6759f57b8c-pwxtv" podStartSLOduration=1.824540248 podStartE2EDuration="1.824540248s" podCreationTimestamp="2026-03-13 01:49:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 01:49:22.822325036 +0000 UTC m=+1823.630446006" watchObservedRunningTime="2026-03-13 01:49:22.824540248 +0000 UTC m=+1823.632661208" Mar 13 01:49:23.435434 master-0 kubenswrapper[19170]: I0313 01:49:23.435375 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3d69657-4bb5-4150-9376-e37c53ec5bf2" path="/var/lib/kubelet/pods/b3d69657-4bb5-4150-9376-e37c53ec5bf2/volumes" Mar 13 01:49:31.585057 master-0 kubenswrapper[19170]: I0313 01:49:31.584932 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="sushy-emulator/sushy-emulator-6759f57b8c-pwxtv" Mar 13 01:49:31.585680 master-0 kubenswrapper[19170]: I0313 01:49:31.585512 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="sushy-emulator/sushy-emulator-6759f57b8c-pwxtv" Mar 13 01:49:31.600883 master-0 kubenswrapper[19170]: I0313 01:49:31.600827 19170 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="sushy-emulator/sushy-emulator-6759f57b8c-pwxtv" Mar 13 01:49:31.947654 master-0 kubenswrapper[19170]: I0313 01:49:31.947183 19170 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="sushy-emulator/sushy-emulator-6759f57b8c-pwxtv" Mar 13 01:51:06.046266 master-0 kubenswrapper[19170]: I0313 01:51:06.045959 19170 scope.go:117] "RemoveContainer" 
containerID="f30e075fdc163bf702368c63564fd8f9b7a7728988ba44dc27a1e1e7797a8b4b" Mar 13 01:51:06.109014 master-0 kubenswrapper[19170]: I0313 01:51:06.108967 19170 scope.go:117] "RemoveContainer" containerID="085e90ae6d978797e245149b313e40d43e85b75b81158858d76603f04cefddab" Mar 13 01:51:06.136806 master-0 kubenswrapper[19170]: I0313 01:51:06.136758 19170 scope.go:117] "RemoveContainer" containerID="d5fa74686248cc290070e6e4e8e692e240e2b869fad245862ba0fa15e0ba6388" Mar 13 01:52:06.282222 master-0 kubenswrapper[19170]: I0313 01:52:06.282154 19170 scope.go:117] "RemoveContainer" containerID="a64028f80ad60b57b1351892013ed1c1fbf2f482c5391221d690745b5adcc42a" Mar 13 01:54:06.556921 master-0 kubenswrapper[19170]: I0313 01:54:06.556842 19170 scope.go:117] "RemoveContainer" containerID="22e64e73b9493a33b71085e3943f6667dcc1cbc6d4efbd747a8e5e4c0a9e5094" Mar 13 01:54:06.593512 master-0 kubenswrapper[19170]: I0313 01:54:06.593467 19170 scope.go:117] "RemoveContainer" containerID="b83bca1065165f118c928d0c7ce24fa85522bddbc1c93c14c2dad378eb87d1b3" Mar 13 01:54:06.622892 master-0 kubenswrapper[19170]: I0313 01:54:06.622782 19170 scope.go:117] "RemoveContainer" containerID="ba45c2533f49b25b22ef9b62119062c185c04b577134457f4ed3e740cc303c42" Mar 13 01:54:30.096758 master-0 kubenswrapper[19170]: I0313 01:54:30.096691 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-d458-account-create-update-g46pf"] Mar 13 01:54:30.112759 master-0 kubenswrapper[19170]: I0313 01:54:30.112704 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-d458-account-create-update-g46pf"] Mar 13 01:54:31.034759 master-0 kubenswrapper[19170]: I0313 01:54:31.034705 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-bk4qg"] Mar 13 01:54:31.046759 master-0 kubenswrapper[19170]: I0313 01:54:31.046706 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-jwmhl"] Mar 13 01:54:31.059319 master-0 
kubenswrapper[19170]: I0313 01:54:31.059242 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-bk4qg"] Mar 13 01:54:31.070513 master-0 kubenswrapper[19170]: I0313 01:54:31.070446 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-jwmhl"] Mar 13 01:54:31.438211 master-0 kubenswrapper[19170]: I0313 01:54:31.438126 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68585267-5574-4c61-98c1-9ddeb2015743" path="/var/lib/kubelet/pods/68585267-5574-4c61-98c1-9ddeb2015743/volumes" Mar 13 01:54:31.439843 master-0 kubenswrapper[19170]: I0313 01:54:31.439391 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95fca356-91a3-4e52-aa79-57ad8399523e" path="/var/lib/kubelet/pods/95fca356-91a3-4e52-aa79-57ad8399523e/volumes" Mar 13 01:54:31.443542 master-0 kubenswrapper[19170]: I0313 01:54:31.443488 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9f33b5a-2935-456d-9895-8fd0285664a5" path="/var/lib/kubelet/pods/a9f33b5a-2935-456d-9895-8fd0285664a5/volumes" Mar 13 01:54:32.060183 master-0 kubenswrapper[19170]: I0313 01:54:32.060070 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-msnk6"] Mar 13 01:54:32.074290 master-0 kubenswrapper[19170]: I0313 01:54:32.074200 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-4e02-account-create-update-rvgbp"] Mar 13 01:54:32.088437 master-0 kubenswrapper[19170]: I0313 01:54:32.088321 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-9356-account-create-update-vvlhq"] Mar 13 01:54:32.100644 master-0 kubenswrapper[19170]: I0313 01:54:32.100568 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-msnk6"] Mar 13 01:54:32.116538 master-0 kubenswrapper[19170]: I0313 01:54:32.116467 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/keystone-9356-account-create-update-vvlhq"] Mar 13 01:54:32.126999 master-0 kubenswrapper[19170]: I0313 01:54:32.126924 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-4e02-account-create-update-rvgbp"] Mar 13 01:54:33.438540 master-0 kubenswrapper[19170]: I0313 01:54:33.438409 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20df490e-cdef-41c3-b9c2-b9724a4d5ac9" path="/var/lib/kubelet/pods/20df490e-cdef-41c3-b9c2-b9724a4d5ac9/volumes" Mar 13 01:54:33.441815 master-0 kubenswrapper[19170]: I0313 01:54:33.441779 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86eb32e2-6a29-42bf-9b09-2f6cbe41a0eb" path="/var/lib/kubelet/pods/86eb32e2-6a29-42bf-9b09-2f6cbe41a0eb/volumes" Mar 13 01:54:33.443437 master-0 kubenswrapper[19170]: I0313 01:54:33.442995 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c35d0626-7620-40f1-b7ca-d6eef1cc775e" path="/var/lib/kubelet/pods/c35d0626-7620-40f1-b7ca-d6eef1cc775e/volumes" Mar 13 01:54:34.764246 master-0 kubenswrapper[19170]: I0313 01:54:34.764168 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6f9zs/must-gather-9nq4g"] Mar 13 01:54:34.766331 master-0 kubenswrapper[19170]: I0313 01:54:34.766303 19170 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6f9zs/must-gather-9nq4g" Mar 13 01:54:34.771265 master-0 kubenswrapper[19170]: I0313 01:54:34.771217 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6f9zs"/"kube-root-ca.crt" Mar 13 01:54:34.771494 master-0 kubenswrapper[19170]: I0313 01:54:34.771387 19170 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6f9zs"/"openshift-service-ca.crt" Mar 13 01:54:34.798878 master-0 kubenswrapper[19170]: I0313 01:54:34.798815 19170 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6f9zs/must-gather-9dr9t"] Mar 13 01:54:34.801407 master-0 kubenswrapper[19170]: I0313 01:54:34.801373 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6f9zs/must-gather-9dr9t" Mar 13 01:54:34.823667 master-0 kubenswrapper[19170]: I0313 01:54:34.823589 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6f9zs/must-gather-9nq4g"] Mar 13 01:54:34.846010 master-0 kubenswrapper[19170]: I0313 01:54:34.845960 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6f9zs/must-gather-9dr9t"] Mar 13 01:54:34.886957 master-0 kubenswrapper[19170]: I0313 01:54:34.886797 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l59rr\" (UniqueName: \"kubernetes.io/projected/256d050a-1b69-4446-8431-27de31e69beb-kube-api-access-l59rr\") pod \"must-gather-9nq4g\" (UID: \"256d050a-1b69-4446-8431-27de31e69beb\") " pod="openshift-must-gather-6f9zs/must-gather-9nq4g" Mar 13 01:54:34.886957 master-0 kubenswrapper[19170]: I0313 01:54:34.886917 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/256d050a-1b69-4446-8431-27de31e69beb-must-gather-output\") pod \"must-gather-9nq4g\" (UID: 
\"256d050a-1b69-4446-8431-27de31e69beb\") " pod="openshift-must-gather-6f9zs/must-gather-9nq4g" Mar 13 01:54:34.989184 master-0 kubenswrapper[19170]: I0313 01:54:34.989118 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf4zt\" (UniqueName: \"kubernetes.io/projected/3f0d1c1f-8471-40d7-b8c0-10d4a655e958-kube-api-access-cf4zt\") pod \"must-gather-9dr9t\" (UID: \"3f0d1c1f-8471-40d7-b8c0-10d4a655e958\") " pod="openshift-must-gather-6f9zs/must-gather-9dr9t" Mar 13 01:54:34.990529 master-0 kubenswrapper[19170]: I0313 01:54:34.990098 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l59rr\" (UniqueName: \"kubernetes.io/projected/256d050a-1b69-4446-8431-27de31e69beb-kube-api-access-l59rr\") pod \"must-gather-9nq4g\" (UID: \"256d050a-1b69-4446-8431-27de31e69beb\") " pod="openshift-must-gather-6f9zs/must-gather-9nq4g" Mar 13 01:54:34.990529 master-0 kubenswrapper[19170]: I0313 01:54:34.990305 19170 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3f0d1c1f-8471-40d7-b8c0-10d4a655e958-must-gather-output\") pod \"must-gather-9dr9t\" (UID: \"3f0d1c1f-8471-40d7-b8c0-10d4a655e958\") " pod="openshift-must-gather-6f9zs/must-gather-9dr9t" Mar 13 01:54:34.990529 master-0 kubenswrapper[19170]: I0313 01:54:34.990496 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/256d050a-1b69-4446-8431-27de31e69beb-must-gather-output\") pod \"must-gather-9nq4g\" (UID: \"256d050a-1b69-4446-8431-27de31e69beb\") " pod="openshift-must-gather-6f9zs/must-gather-9nq4g" Mar 13 01:54:34.990983 master-0 kubenswrapper[19170]: I0313 01:54:34.990942 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/256d050a-1b69-4446-8431-27de31e69beb-must-gather-output\") pod \"must-gather-9nq4g\" (UID: \"256d050a-1b69-4446-8431-27de31e69beb\") " pod="openshift-must-gather-6f9zs/must-gather-9nq4g" Mar 13 01:54:35.010397 master-0 kubenswrapper[19170]: I0313 01:54:35.010337 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l59rr\" (UniqueName: \"kubernetes.io/projected/256d050a-1b69-4446-8431-27de31e69beb-kube-api-access-l59rr\") pod \"must-gather-9nq4g\" (UID: \"256d050a-1b69-4446-8431-27de31e69beb\") " pod="openshift-must-gather-6f9zs/must-gather-9nq4g" Mar 13 01:54:35.082973 master-0 kubenswrapper[19170]: I0313 01:54:35.082831 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6f9zs/must-gather-9nq4g" Mar 13 01:54:35.092323 master-0 kubenswrapper[19170]: I0313 01:54:35.092278 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cf4zt\" (UniqueName: \"kubernetes.io/projected/3f0d1c1f-8471-40d7-b8c0-10d4a655e958-kube-api-access-cf4zt\") pod \"must-gather-9dr9t\" (UID: \"3f0d1c1f-8471-40d7-b8c0-10d4a655e958\") " pod="openshift-must-gather-6f9zs/must-gather-9dr9t" Mar 13 01:54:35.092482 master-0 kubenswrapper[19170]: I0313 01:54:35.092402 19170 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3f0d1c1f-8471-40d7-b8c0-10d4a655e958-must-gather-output\") pod \"must-gather-9dr9t\" (UID: \"3f0d1c1f-8471-40d7-b8c0-10d4a655e958\") " pod="openshift-must-gather-6f9zs/must-gather-9dr9t" Mar 13 01:54:35.092944 master-0 kubenswrapper[19170]: I0313 01:54:35.092894 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3f0d1c1f-8471-40d7-b8c0-10d4a655e958-must-gather-output\") pod \"must-gather-9dr9t\" (UID: \"3f0d1c1f-8471-40d7-b8c0-10d4a655e958\") " 
pod="openshift-must-gather-6f9zs/must-gather-9dr9t" Mar 13 01:54:35.118527 master-0 kubenswrapper[19170]: I0313 01:54:35.118468 19170 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf4zt\" (UniqueName: \"kubernetes.io/projected/3f0d1c1f-8471-40d7-b8c0-10d4a655e958-kube-api-access-cf4zt\") pod \"must-gather-9dr9t\" (UID: \"3f0d1c1f-8471-40d7-b8c0-10d4a655e958\") " pod="openshift-must-gather-6f9zs/must-gather-9dr9t" Mar 13 01:54:35.125712 master-0 kubenswrapper[19170]: I0313 01:54:35.125656 19170 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6f9zs/must-gather-9dr9t" Mar 13 01:54:35.657423 master-0 kubenswrapper[19170]: I0313 01:54:35.657288 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6f9zs/must-gather-9nq4g"] Mar 13 01:54:35.660343 master-0 kubenswrapper[19170]: W0313 01:54:35.660275 19170 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod256d050a_1b69_4446_8431_27de31e69beb.slice/crio-d3fcfa962b5726a74c81ee98ae500191951410d349a4680b9d190f80acb4df10 WatchSource:0}: Error finding container d3fcfa962b5726a74c81ee98ae500191951410d349a4680b9d190f80acb4df10: Status 404 returned error can't find the container with id d3fcfa962b5726a74c81ee98ae500191951410d349a4680b9d190f80acb4df10 Mar 13 01:54:35.663284 master-0 kubenswrapper[19170]: I0313 01:54:35.663221 19170 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 01:54:35.794483 master-0 kubenswrapper[19170]: I0313 01:54:35.794408 19170 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6f9zs/must-gather-9dr9t"] Mar 13 01:54:36.244671 master-0 kubenswrapper[19170]: I0313 01:54:36.244576 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6f9zs/must-gather-9nq4g" 
event={"ID":"256d050a-1b69-4446-8431-27de31e69beb","Type":"ContainerStarted","Data":"d3fcfa962b5726a74c81ee98ae500191951410d349a4680b9d190f80acb4df10"} Mar 13 01:54:36.246764 master-0 kubenswrapper[19170]: I0313 01:54:36.246716 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6f9zs/must-gather-9dr9t" event={"ID":"3f0d1c1f-8471-40d7-b8c0-10d4a655e958","Type":"ContainerStarted","Data":"84fbe5c108da3861f302c16eb560efb96e616d02494980b08530fdc8442991d7"} Mar 13 01:54:38.244653 master-0 kubenswrapper[19170]: I0313 01:54:38.244558 19170 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-spk4p"] Mar 13 01:54:38.270241 master-0 kubenswrapper[19170]: I0313 01:54:38.270173 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6f9zs/must-gather-9dr9t" event={"ID":"3f0d1c1f-8471-40d7-b8c0-10d4a655e958","Type":"ContainerStarted","Data":"b33cfdab198391bcfba48f77163e6dccd49549b7a00849de079aaf38bf286a46"} Mar 13 01:54:38.270241 master-0 kubenswrapper[19170]: I0313 01:54:38.270226 19170 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6f9zs/must-gather-9dr9t" event={"ID":"3f0d1c1f-8471-40d7-b8c0-10d4a655e958","Type":"ContainerStarted","Data":"f3fa12b012ca8811aeeb646f9e56e04c9d080d75bb0fe114f626a429495e0194"} Mar 13 01:54:38.500005 master-0 kubenswrapper[19170]: I0313 01:54:38.499793 19170 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-spk4p"] Mar 13 01:54:38.540179 master-0 kubenswrapper[19170]: I0313 01:54:38.540013 19170 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6f9zs/must-gather-9dr9t" podStartSLOduration=3.215110647 podStartE2EDuration="4.539969744s" podCreationTimestamp="2026-03-13 01:54:34 +0000 UTC" firstStartedPulling="2026-03-13 01:54:35.796611781 +0000 UTC m=+2136.604732741" lastFinishedPulling="2026-03-13 01:54:37.121470888 +0000 UTC 
m=+2137.929591838" observedRunningTime="2026-03-13 01:54:38.522712469 +0000 UTC m=+2139.330833489" watchObservedRunningTime="2026-03-13 01:54:38.539969744 +0000 UTC m=+2139.348090714" Mar 13 01:54:39.442545 master-0 kubenswrapper[19170]: I0313 01:54:39.442465 19170 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63edfcff-8055-449f-9dd1-3d210972a805" path="/var/lib/kubelet/pods/63edfcff-8055-449f-9dd1-3d210972a805/volumes" Mar 13 01:54:40.991967 master-0 kubenswrapper[19170]: I0313 01:54:40.991890 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-version_cluster-version-operator-8c9c967c7-4d6fw_85149f21-7ba8-4891-82ef-0fef3d5d7863/cluster-version-operator/0.log" Mar 13 01:54:44.086192 master-0 kubenswrapper[19170]: I0313 01:54:44.086155 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-rcf6z_82e70dc1-5983-4831-9b27-9771974d4f47/nmstate-console-plugin/0.log" Mar 13 01:54:44.154696 master-0 kubenswrapper[19170]: I0313 01:54:44.154621 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-72q4d_87f190ab-00ff-47fd-8392-3185fc8bab6f/nmstate-handler/0.log" Mar 13 01:54:44.178413 master-0 kubenswrapper[19170]: I0313 01:54:44.178361 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-g2t7x_5c56b3e6-e84b-4dae-8005-3d0af50aadfb/nmstate-metrics/0.log" Mar 13 01:54:44.189720 master-0 kubenswrapper[19170]: I0313 01:54:44.188618 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-g2t7x_5c56b3e6-e84b-4dae-8005-3d0af50aadfb/kube-rbac-proxy/0.log" Mar 13 01:54:44.231533 master-0 kubenswrapper[19170]: I0313 01:54:44.230426 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-25zp4_1e6870fc-49d7-41a3-b757-2c0eb0afb67d/nmstate-operator/0.log" Mar 13 
01:54:44.267333 master-0 kubenswrapper[19170]: I0313 01:54:44.267297 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-mgg76_28e38789-7aec-4807-bab0-f2cf3f316573/nmstate-webhook/0.log" Mar 13 01:54:44.311651 master-0 kubenswrapper[19170]: I0313 01:54:44.310619 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-667wg_fd247073-2d90-4297-b745-d3b906c5f27d/controller/0.log" Mar 13 01:54:44.334681 master-0 kubenswrapper[19170]: I0313 01:54:44.333464 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-667wg_fd247073-2d90-4297-b745-d3b906c5f27d/kube-rbac-proxy/0.log" Mar 13 01:54:44.376854 master-0 kubenswrapper[19170]: I0313 01:54:44.376807 19170 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-pfmr9_1ef92c3a-7b62-42e8-909b-1cadf7157035/controller/0.log"